The text below is one of the notes from the StoneDrop development log. If you want a list of only the technical articles, see the Articles page.
The Rendering of INSIDE

Key points from Playdead's talk about the techniques used in INSIDE (powered by Unity) that allowed them to raise visual quality while improving rendering performance.


Video 1. Low Complexity, High Fidelity: The Rendering of INSIDE. December 2016, GDC

Rendering:

- deferred rendering is used: base pass (depth, normals), lights (shadows, lighting), volume lighting, final (apply lighting), translucency (plus decals and volume-lighting resolve), post-effects (temporal anti-aliasing (TAA), wide glow blur at 1/4 resolution, HDR glow blur, multi-post-effect compositing and distortion)

- cap (clamp) the fog so that bright light sources can still shine through it

- fake atmospheric scattering with a wide screen-space glow

- for emissive materials - narrow glow in post-effect stage with LDR->HDR color remapping

- for smooth chromatic aberration - use a radial blur with 3+ jittered samples, tinting each sample by a bilinearly sampled 3x1 texture containing red, green and blue pixels (see the chromatic-aberration sketch after this list)

- to avoid clipping while color grading - use smooth minimum levels: a smoothed curve that blends between the y = x and y = 1 lines (see the levels sketch after this list).
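
A minimal sketch of the chromatic-aberration idea in Unity-style HLSL: a few radial-blur samples, each tinted by a bilinear lookup into a 3x1 red/green/blue ramp so the tint varies smoothly along the blur. The names (_RgbRamp, _AberrationStrength, _Jitter) and the normalization are assumptions, not the game's actual code.

sampler2D _MainTex;         // scene color
sampler2D _RgbRamp;         // assumed 3x1 texture: red, green and blue pixels, bilinear filtering
float _AberrationStrength;  // assumed strength, e.g. 0.01
float _Jitter;              // assumed per-pixel jitter in [0,1), e.g. from blue noise

float3 ChromaticAberration(float2 uv)
{
    const int SAMPLES = 3;              // the talk suggests 3+ samples
    float2 toCenter = uv - 0.5;         // radial direction of the blur
    float3 color = 0.0;
    float3 weight = 0.0;

    for (int i = 0; i < SAMPLES; i++)
    {
        // Position along the blur, jittered to hide the low sample count.
        float t = (i + _Jitter) / SAMPLES;
        // Bilinear fetch from the 3x1 ramp gives a smoothly varying RGB tint.
        float3 tint = tex2D(_RgbRamp, float2(t, 0.5)).rgb;
        float2 offset = toCenter * _AberrationStrength * (t - 0.5);
        color += tint * tex2D(_MainTex, uv + offset).rgb;
        weight += tint;
    }
    return color / weight;              // normalize so overall brightness is preserved
}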
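
And a sketch of the "smooth minimum levels" curve used to avoid a hard cap during color grading. The talk only says the curve blends smoothly between y = x and y = 1; the knee width and the particular quadratic used here are assumptions.

float _Knee;   // assumed blend width, e.g. 0.2

// Soft alternative to min(x, 1.0): identity below (1 - _Knee), then a quadratic
// that reaches y = 1 with zero slope, so bright gradients are compressed, not clipped.
float SmoothMinLevels(float x)
{
    float start = 1.0 - _Knee;                       // where the curve starts to bend
    if (x <= start)
        return x;                                    // identity region: y = x
    float t = saturate((x - start) / (2.0 * _Knee)); // 0..1 across the knee
    return start + 2.0 * _Knee * (t - 0.5 * t * t);  // slope 1 at t = 0, slope 0 at t = 1
}

float3 SmoothMinLevels3(float3 c)
{
    return float3(SmoothMinLevels(c.r), SmoothMinLevels(c.g), SmoothMinLevels(c.b));
}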

 

Volumetric lighting:

- raymarching with only 3-9 samples at 1/2 resolution (see the raymarching sketch after this list)

- the ray start is jittered with a blue-noise function (high-pass-filtered white noise with a uniform distribution)

- fog is rendered inside volumes that are the intersections of predetermined boxes and the camera frustum (recomputed each frame)

- each volume is rendered in 2 passes: write the front-face depths; then read them to determine optimal ray start and end points and perform the raymarching

- use 1/2 resolution to improve performance: the 2nd pass outputs light intensity (8 bits) and max depth (24 bits); a 3rd pass does depth-aware upsampling with a noise blur (to break up the 1/2 resolution structure)

- temporal anti-aliasing removes noise artifacts (if 1/2 resolution structure is broken)

- use animated cookie textures to fake caustics

- cookies and low-resolution, 16-bit shadow maps can be used (it's a low-frequency effect, after all).
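
A rough sketch of the half-resolution fog raymarch described above, in Unity-style HLSL. The texture names, the SampleLight helper (shadow map plus cookie lookup) and the treatment of the stored depths as view-ray distances are all assumptions made to keep the example self-contained.

sampler2D _FrontFaceDepth;  // assumed: front-face distances of the fog volume, written in pass 1
sampler2D _BlueNoise;       // assumed: tiled blue-noise texture used to jitter the ray start
float4 _NoiseScaleOffset;   // assumed: scales/offsets the noise UVs, animated every frame for TAA

// Assumed stand-in for the fog light lookup (shadow map + animated caustics cookie).
float SampleLight(float3 worldPos)
{
    return 1.0; // placeholder
}

// Pass 2, run at 1/2 resolution: march the view ray through the fog volume.
float RaymarchFog(float2 uv, float3 camPosWS, float3 rayDirWS, float backFaceDist, float sceneDist)
{
    const int STEPS = 6;                                  // the talk uses only 3-9 samples
    float rayStart = tex2D(_FrontFaceDepth, uv).r;        // output of pass 1
    float rayEnd   = min(backFaceDist, sceneDist);        // stop at scene geometry or the box

    // Jitter the start point with blue noise to hide the low step count;
    // the animated noise is later averaged away by TAA.
    float jitter  = tex2D(_BlueNoise, uv * _NoiseScaleOffset.xy + _NoiseScaleOffset.zw).r;
    float stepLen = (rayEnd - rayStart) / STEPS;
    float dist    = rayStart + jitter * stepLen;

    float intensity = 0.0;
    for (int i = 0; i < STEPS; i++)
    {
        intensity += SampleLight(camPosWS + rayDirWS * dist);
        dist += stepLen;
    }
    // The real pass also writes max depth so a 3rd pass can do depth-aware upsampling.
    return intensity * stepLen;
}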

 

Dithering:

- the eye can perceive ~14 bits of color, so 8 bits is not enough; higher precision increases bandwidth and sRGB has an "interesting" implementation => add noise to remove banding

- Bayer matrix introduces easily noticeable patterns

- uniform white noise is easily modulated by the signal

- triangular 2-bit noise is not modulated but has a higher error

- blue noise is a tradeoff between the two

- precalculate blue noise into a texture (blue noise remapped to a triangular distribution) to increase texture cache coherency (see the dithering sketch after this list)

- dither everything: the lighting pass to remove banding in light intensity, the final pass to remove subtle banding in every gradient, the translucency pass to avoid banding from multiple blending operations, post-effects to avoid banding from the wide glow, and the base pass to avoid banding in the normals

- animate the noise to avoid noticeable screen-space patterns and to let TAA remove the noise.
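
A small sketch of the dithering step in Unity-style HLSL, assuming an 8-bit target and a blue-noise texture. The talk bakes the triangular remap into the texture itself; here it is shown explicitly with the usual uniform-to-triangular trick. Names and the 1-LSB amplitude are assumptions.

sampler2D _BlueNoise;   // assumed: tiled blue-noise texture, values roughly uniform in [0,1)
float2 _NoiseOffset;    // assumed: changed every frame so TAA averages the noise away

// Remap a uniform value in [0,1) to a triangular distribution in [-1,1).
// In practice this remap can be precomputed into the noise texture.
float UniformToTriangular(float u)
{
    float v = 2.0 * u - 1.0;
    return sign(v) * (1.0 - sqrt(1.0 - abs(v)));
}

// Add ~1 LSB of triangular noise before the value is quantized to 8 bits.
float3 Dither8Bit(float3 color, float2 uv)
{
    float noise = tex2D(_BlueNoise, uv + _NoiseOffset).r;
    return color + UniformToTriangular(noise) / 255.0;
}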

 

Custom lighting:

- double Lambert: LdotN = LdotN * _Hardness + 1.0 - _Hardness (see the sketch after this list)

- the deferred light pass makes it possible to subtract or multiply light and to do fake ambient occlusion (AO) with "black" lights; special cases: point AO source, sphere, box on a plane, projected decals for column-like objects

- speed up the projection math for decals: move the decal ray to object space and back-transform it to decal object space in the vertex shader to avoid matrix multiplications in the fragment shader.
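
The double-Lambert term above, wrapped into a tiny HLSL helper. The formula is the one from the talk; the final saturate and the parameter name are assumptions about how it would be plugged into a lighting function.

float _Hardness;   // 1.0 = standard Lambert; lower values wrap the light further around the object

// Remapped N.L term: softens the transition from lit to unlit when _Hardness < 1.
float DoubleLambert(float3 normalWS, float3 lightDirWS)
{
    float ndotl = dot(normalWS, lightDirWS);
    ndotl = ndotl * _Hardness + 1.0 - _Hardness;   // formula from the talk
    return saturate(ndotl);                        // clamp to [0,1] is an assumption
}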

 

Effects:

- use decals to set up screen-space reflections (SSR) - an easy setup for deferred rendering

- to avoid marching outside the screen in SSR you can sometimes fake the normals and drop their x component

- for small occluders in SSR you can introduce the concept of hard walls during reflection marching - use the sample from the other side of the occluder if the ray tries to go behind it

- blue noise and jittering for start point in SSR marching

- planar camera reflections for large water surfaces - different rendering order when inside/outside the water; for border cases - render multiple times using stencil buffer

- use fake lights and fake upper lighting for fog particles

- use wobbling (with a tiled swirl-noise texture) on fog particles to hide the particle boundaries

- for fire, use color grading (black-white remapped to black-red-orange-yellow-white) after blending (into a separate buffer, for example an HDR mask)

- use a flipbook for fire flames, selecting rows sequentially and columns randomly to avoid visible loops and hitches; fade between flipbooks with a vertical gradient (plus noise) (see the flipbook sketch after this list)

- for lens flares you can sample the depth buffer instead of raytracing back to the light source, and sample not at the source position but with a small offset towards the corners of the flare quad

- for the rain effect - a mesh with individual drops animated in the vertex shader (height comes from the fractional part of time) and positions randomized from the integer part of time (see the rain sketch after this list)

- there are tiled textures in water-wave forms - just recall the stone bricks on a paved roadway.
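
A sketch of the fire-flipbook UV selection in Unity-style HLSL: rows advance sequentially with time, and the column is re-randomized each time the rows wrap around, which is one way to read the "rows sequentially, columns randomly" advice. Grid size, speed and the hash are assumptions; _Time is Unity's built-in time vector.

sampler2D _FlameFlipbook;  // assumed: atlas of flame frames
float2 _GridSize;          // assumed: (columns, rows) of the atlas, e.g. (4, 4)
float _FramesPerSecond;    // assumed playback speed

// Cheap 1D hash, an assumed stand-in for whatever randomization the game uses.
float Hash1(float x)
{
    return frac(sin(x * 12.9898) * 43758.5453);
}

float4 SampleFlame(float2 uv)   // uv is the 0..1 quad UV of one flame
{
    float frame = _Time.y * _FramesPerSecond;           // _Time.y = seconds (Unity built-in)
    float row   = fmod(floor(frame), _GridSize.y);      // rows play back sequentially
    float cycle = floor(frame / _GridSize.y);           // which loop through the rows we are on
    float col   = floor(Hash1(cycle) * _GridSize.x);    // column picked randomly per loop

    float2 cellUV = (uv + float2(col, row)) / _GridSize;
    return tex2D(_FlameFlipbook, cellUV);
}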
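
And a sketch of the rain-drop vertex animation: the fractional part of time drives each drop's fall and the integer part reseeds its horizontal position, so one static mesh loops forever. The hash, the per-drop id channel and all parameter names are assumptions.

float _DropAreaSize;   // assumed: horizontal extent of the rain volume
float _FallHeight;     // assumed: vertical distance a drop falls per cycle
float _Speed;          // assumed: fall cycles per second

// Cheap 2D hash, an assumed stand-in for the game's randomization.
float2 Hash2(float2 p)
{
    return frac(sin(float2(dot(p, float2(127.1, 311.7)),
                           dot(p, float2(269.5, 183.3)))) * 43758.5453);
}

// Called per vertex; every vertex of a drop carries the same dropId (e.g. in a UV channel).
float3 AnimateDrop(float3 basePosOS, float dropId)
{
    float t     = _Time.y * _Speed + dropId;   // _Time.y = seconds (Unity built-in); offset per drop
    float cycle = floor(t);                    // integer part: reseeds the drop's position
    float fall  = frac(t);                     // fractional part: progress of the current fall

    float2 xz = (Hash2(float2(dropId, cycle)) - 0.5) * _DropAreaSize;
    float  y  = (1.0 - fall) * _FallHeight;    // drop moves from top to bottom each cycle

    return basePosOS + float3(xz.x, y, xz.y);
}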

 

UPD June 2017: by the way, Playdead has released the source code of their TAA implementation, and VR support was recently added: github.com/playdea...[1]

 

Link 1: https://github.com/playdeadgames/temporal
