I'm going to start with some reference material here, then later get to how this relates to physically based shading in games. If you are a photographer and you start with unedited raw captures from your camera, then you already understand the visual example: what light physically renders in real life is a lot different from what your mind sees.
EDIT: If you don't have an sRGB color calibrated monitor, then these images are going to look wrong, sorry. Just noticed how bad the images look on a cheap Android device (fail). Unfortunately I only have machines with calibrated monitors (double fail)...
First, a Visual Example
Here is what the raw unedited capture looks like of a scene shot around sundown on the lava fields of the Big Island in Hawaii. This is not what I would describe as a good landscape shot professionally, but it is good enough to make the point at hand. The only adjustments done here are the application of the camera's auto white balance recorded at the time of capture, and a color conversion from the camera's native linear colorspace to sRGB for saving as a PNG. No adjustment was done to contrast or saturation or anything else. Take note of the lack of contrast, the lack of saturation, the camera's auto white point not understanding the color of the sunset, etc. The real physically correct image looks "dull" and "flat".
Below is an example edited version which contains a white point adjustment, a tone curve to bring out contrast, and a little increased saturation masked to low-saturation areas of the photo (the vibrancy control in Aperture). This is closer to what I remember the scene actually looking like, and very close to how it felt to be in this location around sunset. Notice how the difference between the darker old and lighter newer lava flows is now visible. Notice how the sunset skybox reflection is clearly seen in the rock, and how the lighting direction is now obvious from the contrast between the warm tones of the sunset and the cool tones of the rest of the sky reflecting in the scene. Side note: I'm apologizing early for the clipping of the blacks in this image. A good edit of this photo would not have this problem, but I used Aperture, and it lacks good black control when using a correct linear curves adjustment.
Relating to Physically Based Shading in Games
The post processing step has the function of translating a physically lit scene into something closer to what the human mind would perceive. The typical art pipeline does not decouple the physical lighting from the post processing. Often it is the environment artist's job to work under some standard tone-mapping and attempt to place lights to bring out the feeling of the scene. An interesting complement to this would be to also work with post processing OFF and attempt to match unedited raw camera reference material, then later tune the post processing, and repeat this cycle.
Post processing is a critical step in reaching the desired feeling of the scene. A few games have employed adaptive exposure control, but I'm not sure if anyone is doing adaptive tone-mapping in the blacks, and this might be useful in games. From the photography perspective, when working with scenes of large dynamic range, control over the toe of the tone-mapping curve is very important. The aim is to squash the darks and the highlights to leave as much contrast as possible for the mid-tones, without making the scene look too unnatural. Games have this easy compared to print: the dynamic range of luster photo paper is bad, and the dynamic range of canvas is horribly bad. Below is the tone curve from the above edited image. It would take a lot more fine control over the darks to avoid the incorrect black clipping in my example.
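As a concrete reference for what explicit toe control looks like, here is a minimal sketch of the filmic tone curve John Hable published for Uncharted 2, with the widely published constants rather than anything tuned to the image above; the D, E, and F terms shape the toe independently of the A and B shoulder terms.

# Minimal sketch of Hable's filmic curve (widely published Uncharted 2
# constants, not values tuned to this image). D, E, and F shape the
# toe (the darks) independently of the A and B shoulder terms.
def hable(x, A=0.15, B=0.50, C=0.10, D=0.20, E=0.02, F=0.30):
    return ((x * (A * x + C * B) + D * E) /
            (x * (A * x + B) + D * F)) - E / F

def tonemap(x, exposure=2.0, linear_white=11.2):
    # Normalize so the chosen white point maps exactly to 1.0.
    return hable(exposure * x) / hable(linear_white)

# Raising E crushes the toe harder; lowering E lifts the blacks.
for v in (0.01, 0.05, 0.25, 1.0, 4.0):
    print(v, tonemap(v))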
Something important to note: really good darks adjustment is not something which can be done well in color grading with the practically sized tiny 16^3 or 32^3 3D textures used in games, even in the sRGB colorspace. With a 32^3 lookup, adjacent lattice points sit roughly 8 of the 256 8-bit sRGB steps apart, so any curve detail in the darks finer than that gets interpolated away.
Surface Format Choice
When working in a linear colorspace, the 10-bit and 11-bit formats do not offer enough precision for a lossless conversion to 8-bit/channel sRGB, even for just the 0.0 to 1.0 range. Integer formats lose precision in the darks and will produce a certain amount of banding during conversion. Floating-point formats lose precision in the lights and will also produce banding. The sRGB BANDING column below states that, in the worst case, one step in the linear format jumps N values in the 8-bit sRGB conversion (N:1).
COLORSPACE & FORMAT | sRGB BANDING | LOCATION OF BANDING
Linear UNORM10      | 4:1          | blacks (0 to 0.13)
Linear UNORM11      | 2:1          | blacks (0 to 0.06)
Linear FP10/64512.0 | 3:1          | whites (0.39 to 1.0)
Linear FP11/65024.0 | 2:1          | whites (0.75 to 1.0)
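For the curious, the UNORM rows can be reproduced with a small brute-force check; here is a minimal sketch, assuming round-to-nearest in the final 8-bit conversion (the FP rows can be checked the same way by also enumerating those formats' representable values).

import numpy as np

def srgb_encode(c):
    # Linear to sRGB transfer function (IEC 61966-2-1).
    return np.where(c <= 0.0031308,
                    12.92 * c,
                    1.055 * np.power(c, 1.0 / 2.4) - 0.055)

def worst_skip(linear_values):
    # Convert every representable linear value to an 8-bit sRGB code,
    # then find the largest jump between consecutive codes; a jump of
    # N means N-1 codes in between can never be produced (N:1 banding).
    codes = np.unique(np.round(srgb_encode(linear_values) * 255.0))
    return int(np.max(np.diff(codes)))

print(worst_skip(np.arange(1024) / 1023.0))  # UNORM10: prints 4
print(worst_skip(np.arange(2048) / 2047.0))  # UNORM11: prints 2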
FP16 has more than enough range to work without banding. Assuming a direct linear FP16 to 8-bit/channel sRGB conversion, the following table shows the total dynamic range given certain scale factors, and provides the worst-case precision after the conversion. In this case 1:N means N FP16 values map to 1 8-bit sRGB value; N larger than 1 describes cases of increased precision.
LINEAR FP16 SCALING | sRGB PRECISION | DYNAMIC RANGE
FP16*5348.0         | 1:1            | 1:350M
FP16*2604.0         | 1:2            | 1:170M
FP16*1284.0         | 1:4            | 1:84M
FP16*638.0          | 1:8            | 1:41M
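Here is a minimal sketch of one way to arrive at the numbers above (this assumes the scale is applied to the FP16 value before the sRGB encode, so the worst case sits where the darkest sRGB code meets FP16's smallest denormal step, and the range tops out at FP16's max of 65504 times the scale).

# Worst case: the darkest sRGB code spans (1/255)/12.92 of white in
# linear, and with white at 1/scale in FP16 that width is measured
# against FP16's smallest denormal step of 2^-24.
for scale in (5348.0, 2604.0, 1284.0, 638.0):
    darkest_code_width = (1.0 / 255.0) / 12.92 / scale  # in FP16 units
    values_per_code = darkest_code_width / 2.0 ** -24
    dynamic_range = 65504.0 * scale
    print("FP16*%.0f  1:%d  1:%dM" %
          (scale, round(values_per_code), dynamic_range / 1e6))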
Note that 8-bit sRGB does not have enough precision by itself to reproduce an image on today's high-contrast displays without showing banding, so games still need to temporally dither using some kind of film grain, even if they don't want the grain to be visible beyond just removing the banding seen with a 24-bit/pixel display.
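A minimal sketch of the idea, assuming nothing fancier than uniform noise re-rolled each frame before quantization.

import numpy as np

# Plain rounding turns a shallow ramp into visible steps; adding a
# fresh +/-0.5-code uniform noise pattern each frame makes any single
# frame grainy, but the temporal average converges on the true ramp.
rng = np.random.default_rng(0)
ramp = np.linspace(0.0, 4.0 / 255.0, 1024) * 255.0  # in 8-bit codes

banded = np.round(ramp)
frames = [np.round(ramp + rng.uniform(-0.5, 0.5, ramp.shape))
          for _ in range(64)]
dithered = np.mean(frames, axis=0)

print(np.max(np.abs(banded - ramp)))    # ~0.5 code of banding error
print(np.max(np.abs(dithered - ramp)))  # much smaller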
Highlight Compression and Saturation
Another area which needs some R&D time for games: saturation as a function of high dynamic range in combination with highlight compression. Game developers are starting to get a better grasp on the inseparable bond between tone-mapping, contrast, and saturation. Early tone-mapping operators typically botched contrast and saturation in the darks, but this is thankfully changing. However, what is the best thing to do for highlights? Pure primaries tone-map to pure hues, but anything with some amount of all primaries will at some point tone-map to white, losing all color cues. The problem is that physically correct bright lights don't often align with the exact display RGB primaries, and colors don't really clip to "white" often in human vision. On top of this, the highlight compression provided by tone-mapping acts like a desaturation operator on really bright colors. Perhaps tone-mapping should include something to saturate based on intensity to work around this problem, instead of artists needing to always select lights which are closer to the primaries in order to maintain color with really bright HDR lighting?
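One direction to explore, sketched below under the assumption that preserving the RGB ratios of a bright color through the tone-map is a reasonable stand-in for saturating by intensity: tone-map luminance and reapply the original ratios so bright colors keep their hue, then blend with the per-channel result, which behaves better for very saturated values that would otherwise land outside 0 to 1.

import numpy as np

# Sketch of hue-preserving highlight compression (an illustration,
# not a recommendation): per-channel tone-mapping squeezes a 4:2:1
# color toward 1:1:1 (white); mapping luminance and reapplying the
# original ratios keeps the color, at the cost of possibly pushing
# saturated channels above 1.0, hence the blend control.
def luminance(c):
    return float(np.dot(c, [0.2126, 0.7152, 0.0722]))  # Rec. 709

def reinhard(x):
    return x / (1.0 + x)  # simple operator, just for illustration

def tonemap(c, hue_preserve=0.8):
    per_channel = reinhard(c)          # desaturates brights
    lum = luminance(c)
    hue_kept = c * (reinhard(lum) / max(lum, 1e-6))
    return (1.0 - hue_preserve) * per_channel + hue_preserve * hue_kept

c = np.array([2.0, 1.0, 0.5])        # a bright 4:2:1 orange
print(reinhard(c))                   # ratios squeezed toward white
print(tonemap(c, hue_preserve=1.0))  # keeps the 4:2:1 ratios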