Channel: Timothy Lottes

Thoughts on Display Color Calibration for Games

On Apple products across the board, the factory tonal configuration is Gamma 2.2, not sRGB. Using an sRGB backbuffer is totally useless; instead, whatever shader converts from linear high dynamic range to the display target needs to do the pow() manually. Typically this step is manual anyway, because that is required to properly dither the floating point color to 8-bit per channel output. On the plus side, Apple products are so well calibrated and matched, even between desktop and mobile, that anyone with a color calibrated authoring pipeline can target the hardware and the consumer will experience the artist's intent. This is simply awesome.
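As a rough illustration of that manual encode step, here is a minimal C++ sketch (plain C++ rather than shader code, with illustrative names; dithering is covered further below):

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>

// Manual Gamma 2.2 encode of a tonemapped linear value, the way a shader
// would do it just before writing an 8-bit channel. On a Gamma 2.2 display
// an sRGB backbuffer would apply the wrong curve, so the pow() is done by hand.
static uint8_t encodeGamma22(float linear)
{
    float c = std::clamp(linear, 0.0f, 1.0f);  // tonemapped, linear light
    float display = std::pow(c, 1.0f / 2.2f);  // manual Gamma 2.2 encode
    return (uint8_t)(display * 255.0f + 0.5f); // round to 8 bits (no dither yet)
}
```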

On the PC side, and from what I could see in a very small sampling of the fragmented Android space, sRGB is a better match (than Gamma 2.2) to default device factory calibration. This is not surprising, given that both sRGB and the Rec.709 (and later) HDTV standards adopted a linear segment close to black. The idea is that the linear segment enables a better perceptual distribution given a fixed set of bits.
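For reference, the sRGB transfer function is the standard piecewise curve below; only the formula itself comes from the spec, the C++ wrapping is illustrative:

```cpp
#include <cmath>

// sRGB encode: a short linear segment near black spliced onto a gamma curve.
static float linearToSrgb(float c)
{
    if (c <= 0.0031308f)
        return 12.92f * c;                              // linear segment near black
    return 1.055f * std::pow(c, 1.0f / 2.4f) - 0.055f;  // gamma segment
}
```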

The disadvantage of encodings like sRGB, which mix both a linear segment and a gamma curve into the tonal curve, is that "correct" manual dithering can be more expensive (because the conversion itself is much more expensive). Given that all realtime digital content should use temporal dithering to avoid output banding, Apple's choice of a fixed Gamma 2.2 seems like a much better choice. However...
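Here is a minimal sketch of what "correct" manual dithering looks like, assuming the linearToSrgb() encode from above and whatever per-pixel, per-frame noise source the engine already has; the point is that the encode must run in the shader before quantization, so a pure power curve is the cheaper case:

```cpp
#include <algorithm>
#include <cstdint>

// Dither to 8 bits: encode to display space manually, add noise in [0,1),
// then truncate. linearToSrgb() is the encode sketched above; for a Gamma 2.2
// target it would be a single pow() instead.
static uint8_t ditherTo8Bit(float linear, float noise01)
{
    float display = linearToSrgb(std::clamp(linear, 0.0f, 1.0f));
    float v = display * 255.0f + noise01;  // per-frame noise breaks up banding
    if (v > 255.0f) v = 255.0f;
    return (uint8_t)v;                     // truncate, not round
}
```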

On the Topic of Banding
TN panels are often 6-bit/channel, with temporal dithering. Plasma hits the other end of the extreme (maybe 1 or 2-bit/channel?) with extreme temporal dithering (600 Hz). In both these cases, applications need to dither manually beyond the exact amount required for 8-bit output (the display's dither is too conservative). Also in both these cases, the application's temporal dither can mix in bad ways with the display's temporal dithering. My current feeling is that the correct solution to this problem is to replace the "correct" temporal dither with a film grain that has a gamma response (like film), applied in the linear HDR colorspace. This film grain would have a minimum amount even in the light areas, large enough to serve as the temporal dither (to remove banding on the worst case target). The film grain would also be between 1.5 and 2 pixels in size, so that it does not conflict with a display's 1-pixel-sized temporal dithering. The end result of this is that sRGB is again a fine target (with the dither handled by the grain in linear space, the hardware sRGB conversion can do the encode), and Gamma 2.2 requires extra shader overhead.
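A hypothetical sketch of that film-grain idea follows; all constants (grain floor, strength, response shape) are illustrative assumptions rather than values from the post, and the noise input is assumed to be fetched at roughly half resolution with bilinear filtering so the grain lands at about 1.5-2 pixels:

```cpp
#include <algorithm>
#include <cmath>

// Film grain applied in the linear HDR colorspace, replacing the per-pixel
// temporal dither. The grain amount falls off toward the highlights (a rough
// film-like response) but never drops below a floor large enough to still
// act as the temporal dither.
static float applyFilmGrain(float linearHdr, float grain /* [-1,1], ~half-res, per frame */)
{
    const float grainFloor = 0.004f; // minimum grain, enough to break banding (illustrative)
    const float grainScale = 0.03f;  // overall grain strength (illustrative)
    float bright = std::pow(std::clamp(linearHdr, 0.0f, 1.0f), 0.5f); // rough film-like response
    float amount = std::max(grainFloor, grainScale * (1.0f - bright));
    return std::max(0.0f, linearHdr + amount * grain);
}
```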

White Point Calibration
It seems best to just target the D65 (daylight filtered 6500K) white point of sRGB, knowing that: (a.) displays will be +/- around that value, (b.) the mind automatically adapts to small differences in white point, and (c.) the white point will shift +/- toward the darks as well, even on a given display. The cause of (c.) is that even if the display is calibrated to D65, the native black point of the display typically is not D65, and the only way to fix that is to raise the black level (adding intensity to some channels, reducing contrast), which is not something OEMs and users want.

Simple Production Calibration Goals
So the goal of simple display calibration is to get the {R,G,B} LUTs to provide a D65 white through the entire gray scale, with the sRGB tonal curve, with the exception that somewhere in the darks the very dark grey colors will start to shift toward the native black point's color tint, and then terminate at something which is not fully black. The color gamut of the display ultimately decides saturation scaling, which changes per type of display.
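A hypothetical sketch of what that per-channel calibration target might look like; the fade threshold and the idea of linearly easing into the measured native black are assumptions for illustration, not a prescribed procedure:

```cpp
#include <cmath>

// Target response (in linear light) for one channel of the calibration LUT:
// follow the sRGB curve through the gray scale, but below fadeStart ease
// toward the display's measured native black level, which carries the
// native black tint and is not fully black.
static float calibrationTarget(float input01, float nativeBlack /* measured, per channel */)
{
    const float fadeStart = 0.05f;                     // where the shift toward native black begins (illustrative)
    float srgbLinear = (input01 <= 0.04045f)
        ? input01 / 12.92f
        : std::pow((input01 + 0.055f) / 1.055f, 2.4f); // sRGB decode to linear light
    if (input01 >= fadeStart)
        return srgbLinear;                             // normal sRGB tonal curve at D65
    float atFade = std::pow((fadeStart + 0.055f) / 1.055f, 2.4f);
    float t = input01 / fadeStart;                     // 0 at black, 1 at fadeStart
    return nativeBlack + t * (atFade - nativeBlack);   // ease into the native black point
}
```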

Simple In-Game Controls
Display gamma is the wild west, so a game at least needs some user-adjustable gamma. Something like "move the slider until the dark symbol is just barely visible". Still not sure if a user-adjustable offset is required as well.
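A minimal sketch of such a control, assuming the adjustment is a single exponent applied on top of the nominal display encode; the slider range and default are illustrative:

```cpp
#include <cmath>

// User gamma adjustment layered on the already-encoded display value.
// userGamma = 1.0 is neutral; the "dark symbol just barely visible" test
// effectively picks the exponent that compensates the user's actual display.
static float applyUserGamma(float displayEncoded, float userGamma /* e.g. 0.7 .. 1.4 */)
{
    return std::pow(displayEncoded, 1.0f / userGamma);
}
```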
