
Black Friday vs Display Technology

My wife and I are in the Toronto area in Canada visiting family for the US Thanksgiving holiday. Naturally food and shopping resulted in a trip to Costco and the mall, where I ran across an Apple store, and managed to entertain myself by getting back up to speed again with the state of 4K...

I'm Lost, Where is the 4K?
In some Markham Apple store, I tried one of these 27" 5K iMacs. Was very confused at first: looking at the desktop photo backgrounds, everything is up-sampled. But the rendered text in the UI, that was sharp, actually too sharp. Had to find out what GPU was driving this 5K display. Opened Safari, went to www.apple.com, and found the page with the iMac tech specs. Couldn't read the text. Solved that problem by literally sticking my face to the screen. AMD GPU, awesome.

Second stop, Costco: everything is 4K, and yet nothing shows 4K content. Seems like out of desperation they were attempting to show some time-lapse videos built from DSLR still frames (easier to get high resolution from stills than from video). Except even those looked massively over-processed and up-sampled (looked like less than Canon 5D Mark I source resolution). All the Blu-ray content looks like someone accidentally set the median filter strength to 1000%. All the live digital TV content looks like "larger-than-life" video compression artifacts post-processed by some haloing sharpening up-sampling filter. There is a tremendous opportunity for improvement here: drive the 4K TV from a PC and do better image processing.

OK, So Let's Face Reality: Consumers Have Never Seen Real 4K Video Content
Well, except for IMAX in the theater. Scratch that, consumers haven't even seen "real" 2K content (again, other than at the movie theater). Artifact-free 2K content simply doesn't exist at the bit-rates people stream at. Down-sample to 540p and things look massively better.

What About Photographs?
The highest-end "prosumer" Canon 5DS's 8688x5792 output isn't truly 8K. It is under 4K by the time it is down-sampled enough to remove artifacts. At a minimum, the Bayer sampling pattern technically results in something around 8K/2 in pixel area, but typical Bayer reconstruction introduces serious chroma aliasing artifacts. Go see for yourself: zoom into the white-on-black text pixels in dpreview's studio scene web app. And that is just ideal lighting, at the sharpest ideal aperture, on studio stills. Try keeping pixel-perfect focus everywhere in hard lighting in a real scene with an 8K 35mm full-frame camera... Anyway, most consumers have relatively low resolution cameras in comparison. Even the Canon 5D Mark III is only 5760x3840, which is well under 4K after down-sampling to reach pixel perfection. Ultimately a Canon 5D Mark III still looks better full screen on a 2K panel, because the down-sampling removes all the artifacts which the 4K panel makes crystal clear.
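To put rough numbers on the "8K/2 in pixel area" claim, here is a back-of-envelope sketch in Python. The 1/sqrt(2) per-axis factor is just the half-pixel-area assumption; the extra down-sampling needed to hide demosaic chroma aliasing would push the usable numbers lower still.

    # Half the pixel area (Bayer) means roughly 1/sqrt(2) of the resolution per axis.
    # This is an upper bound; cleaning up demosaic chroma aliasing costs more on top.
    def bayer_effective(w, h):
        s = 0.5 ** 0.5
        return round(w * s), round(h * s)

    print(bayer_effective(8688, 5792))  # Canon 5DS         -> about (6143, 4096)
    print(bayer_effective(5760, 3840))  # Canon 5D Mark III -> about (4073, 2715)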

4K Artifact Amplification Machine Problem
Judging by actual results on Black Friday, no one is doing a good job of real-time video super-resolution in 4K TVs. It is like trying to squeeze blood from a stone, more so given that the source is typically suffering from severe compression artifacts. The HDTV industry needs its own form of the "organic" food trend: please bring back some non-synthetic image processing. The technology to do a good job of image enlargement was actually perfected a long time ago; it is called the slide projector. Hint: use a DOF-style filter to up-sample.
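A minimal sketch of what I mean by a DOF-style up-sample, with a Gaussian standing in for the projector's defocus kernel (grayscale, pure Python, no optimization; sigma is just a tuning knob I picked):

    import math

    # Reconstruct each output pixel from the source with a soft, circularly
    # symmetric kernel instead of a sharp bilinear/box fetch.
    def soft_upsample(src, src_w, src_h, dst_w, dst_h, sigma=0.6):
        dst = [0.0] * (dst_w * dst_h)
        r = max(1, int(math.ceil(3.0 * sigma)))        # kernel radius in source pixels
        for dy in range(dst_h):
            sy = (dy + 0.5) * src_h / dst_h - 0.5      # output center in source space
            for dx in range(dst_w):
                sx = (dx + 0.5) * src_w / dst_w - 0.5
                acc = wsum = 0.0
                for y in range(int(sy) - r, int(sy) + r + 1):
                    for x in range(int(sx) - r, int(sx) + r + 1):
                        if 0 <= x < src_w and 0 <= y < src_h:
                            d2 = (x - sx) ** 2 + (y - sy) ** 2
                            w = math.exp(-d2 / (2.0 * sigma * sigma))
                            acc += src[y * src_w + x] * w
                            wsum += w
                dst[dy * dst_w + dx] = acc / wsum if wsum > 0.0 else 0.0
        return dst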

4K and Rendered Content
What would sell a 4K TV to me: a Pixar movie rendered natively at 4K, played back from losslessly compressed video. That is how I define "real" content.

What about games? I've been too busy to try Battlefront yet (other than the beta), and I'm on vacation now without access to my machine (1st world problems). But there are some perf reviews for 4K online. Looks like the Fury X has the top single-GPU score at 45 fps. Given there are no 45 Hz 4K panels, that at best translates to 30 Hz without tearing. Seems like variable refresh is an even more important feature for 4K than it was at 1080p; this is something IMO TV OEMs should get on board with. Personally speaking, 60 Hz is the minimum I'm willing to accept for an FPS, so I'm probably only going to play at 1080p down-sampled to 540p (for the anti-aliasing) on a CRT scanning out around 85 Hz (to provide a window to avoid v-sync misses).
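The 45-fps-becomes-30-Hz math, in case it isn't obvious: on a fixed-refresh panel with v-sync, each frame's presentation time rounds up to a whole number of refresh intervals.

    import math

    refresh_ms = 1000.0 / 60.0                                    # fixed 60 Hz panel
    frame_ms = 1000.0 / 45.0                                      # ~22.2 ms of GPU work
    presented_ms = math.ceil(frame_ms / refresh_ms) * refresh_ms  # rounds up to 33.3 ms
    print(1000.0 / presented_ms)                                  # -> 30 fps effective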

Up-Sampling Quality and Panel Resolution
The display industry seems to have adopted a one-highest-resolution-fits-all model, which is quite scary, because for gamers like myself, FPS and anti-aliasing quality are the most important metrics. Panel resolution beyond the capacity to drive perfect pixels is actually what destroys quality, because it is impossible to up-sample to that resolution with good quality.

A CRT can handle variable resolutions because the reconstruction filtering is effectively at infinite resolution: the beam has a nice filtered falloff which blends scan-lines perfectly. 1080p up-sampled on a 4K panel will never look great in comparison, because the filtering quality is limited by alignment to two square display pixels. 4K, however, will probably provide a better-quality 540p. Stylized up-sampling is only going to get more important as resolutions increase. Which reminds me, I need to release the optimized versions of all my CRT and stylized up-sampling algorithms...
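Not those optimized versions, but the core of the scan-line idea fits in a few lines. This toy sampler weights nearby source scan-lines with a Gaussian beam profile (beam_sigma is a made-up knob) so the vertical reconstruction is continuous rather than snapped to the display grid:

    import math

    # Sample a grayscale source at normalized coords (u, v) in [0,1), CRT style:
    # linear horizontally (kept simple for brevity), Gaussian beam falloff vertically.
    def crt_scanline_sample(src, src_w, src_h, u, v, beam_sigma=0.35):
        x = u * src_w - 0.5
        x0 = int(math.floor(x)); fx = x - x0
        y = v * src_h - 0.5
        acc = wsum = 0.0
        for line in range(int(math.floor(y)) - 1, int(math.floor(y)) + 3):
            if 0 <= line < src_h:
                d = y - line                              # distance to scan-line center
                w = math.exp(-(d * d) / (2.0 * beam_sigma * beam_sigma))
                a = src[line * src_w + max(0, min(src_w - 1, x0))]
                b = src[line * src_w + max(0, min(src_w - 1, x0 + 1))]
                acc += (a * (1.0 - fx) + b * fx) * w
                wsum += w
        return acc / wsum if wsum > 0.0 else 0.0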

CRT vs LCD or OLED in the Context of HDR
Perhaps it might be possible to say that HDR would finally bring forward something superior to CRTs. Except there is one problem: display persistence. A typical 400 nit PC LCD driven at 120 Hz has an 8.33 ms refresh. Dropping this to a 1 ms low-persistence strobed frame would result in roughly 1/8 the brightness (aka it would act like a 50 nit panel). With LCDs there is also the problem of a mismatch between the strobe and the scanning pattern (which changes the pixel value), resulting in ghosting towards the top and bottom of the screen. Add on top that large displays like TV panels have power-per-screen-area problems resulting in global dimming. So I highly doubt LCD displaces the CRT any time soon, even in the realm of 1000 nit panels.
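A quick sanity check on the persistence arithmetic, using the numbers above (average brightness scales with the fraction of the refresh the frame is actually lit):

    full_nits = 400.0                 # sample-and-hold brightness
    refresh_ms = 1000.0 / 120.0       # 8.33 ms per frame at 120 Hz
    strobe_ms = 1.0                   # low-persistence strobe length
    print(full_nits * strobe_ms / refresh_ms)   # -> 48 nits, i.e. roughly a 50 nit panel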

OLED is a wild card. Works great for low persistence in VR, but what about large panels? Seems like the WRGB LG 4K OLED panel is the one option right now (it is apparently in the Panasonic as well). Based on displaymate.com results, OLED is not there yet. Blacks should be awesome, and better than CRTs, but according to hdtvtest.co.uk's review of the latest LG OLED, there is a serious problem with near-black vertical banding after 200+ hours of use. Also, with only around 400-some nits peak and more global-dimming issues than typical LCDs, it looks like large-panel low persistence is going to be a problem for now. Hopefully this tech gets improved and they eventually release a 1080p panel which does low persistence at as low as 80 Hz.

Reading the tech specs of these 4K HDR TVs paints a clear picture of what HDR means for Black Friday consumers. Around a 400-500 nit peak, just like current PC panels, except PC panels don't have global dimming problems. The newest TV LCD panels seem to have a 1-stop ANSI contrast advantage, perhaps from less back-light leakage with larger pixels. Screen reflectance has been dropping (an important step towards better blacks). TV LCD panels are approaching the P3 gamut; PC panels have been approaching the Adobe RGB gamut. Both are similar in area: Adobe RGB mostly adds green area to the sRGB/Rec709 primaries, where P3 adds less green but more red. So ultimately, if you grab a Black Friday OLED and don't get the near-black 200+ hour problem, HDR translates to literally "better shadows".
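A rough check of the "similar in area" claim, using the published CIE xy chromaticities of the primaries (shoelace formula on the gamut triangle; xy area is a crude proxy, but fine for this comparison):

    # Triangle area in CIE xy from the three primary chromaticities.
    def area(r, g, b):
        return 0.5 * abs(r[0]*(g[1]-b[1]) + g[0]*(b[1]-r[1]) + b[0]*(r[1]-g[1]))

    rec709 = area((0.640, 0.330), (0.300, 0.600), (0.150, 0.060))
    adobe  = area((0.640, 0.330), (0.210, 0.710), (0.150, 0.060))
    p3     = area((0.680, 0.320), (0.265, 0.690), (0.150, 0.060))
    print(rec709, adobe, p3)   # ~0.112, ~0.151, ~0.152 -> Adobe RGB and P3 nearly tie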

Rec 2020 Gamuts and Metamerism
The "Observer Variability in Color Image Matching on a LCD monitor and a Laser Projector" paper is a great read. The laser projector is a Microvision, the same tech as in the Pico Pro, and has a huge Rec 2020-like gamut. The route to this gamut is via really narrow-band primaries. As the primaries get narrower and move towards the extremes of the visible spectrum, human perception of the color generated by the RGB signal becomes quite divergent; see Figure 6. Note that it is impossible to calibrate this problem away with a measurement tool. The only way to fix it is via manual visual adjustment by the user: probably by selecting between 2 or 3 stages of 3x3 swatches on the screen. And note that such a manual "calibration" would only be good for one user... anyone else looking at the screen is probably seeing a really strangely tinted image.

HDR and Resolution and Anti-Aliasing
While it will still take maybe 5 to 10 years before the industry realizes this, HDR + high resolution has already killed off the current screen-grid-aligned shading. Let's start with the concept of "Nyquist frequency" in the context of still and moving images. For a given display resolution, stills can have 2x the spatial resolution of video if they either align features to pixels (aka text rendering) or accept that pixel-sized features not aligned to a pixel center will disappear as they align with a pixel border. LCDs adopted "square pixels" and amplified the spatial sharpness of pixel-aligned text, and this works to the disadvantage of moving video. Video without temporal aliasing can only safely resolve a 2-pixel-wide edge at full contrast, as one-pixel edges under proper filtering would disappear as they move towards a pixel border (causing temporal aliasing). So contrast, as a function of frequency of detail, needs to approach zero as features approach 1 pixel in width. HDR can greatly amplify temporal aliasing, making this filtering many times more important.
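A crude box-pixel model of the 1-pixel vs 2-pixel rule: integrate a bright bar over each display pixel and watch its peak as the bar slides by half a pixel. The 1-pixel bar's peak flickers between full and half value as it moves (proper filtering would make it fade out instead); the 2-pixel bar holds full contrast either way.

    # Bar of given width sliding across unit box pixels; returns peak pixel value.
    def peak(width, offset, pixels=8):
        def overlap(px):                  # bar covers [offset, offset + width)
            lo, hi = max(px, offset), min(px + 1.0, offset + width)
            return max(0.0, hi - lo)
        return max(overlap(p) for p in range(pixels))

    print(peak(1.0, 3.0), peak(1.0, 3.5))   # 1-px bar: 1.0 aligned, 0.5 straddling
    print(peak(2.0, 3.0), peak(2.0, 3.5))   # 2-px bar: 1.0 either way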

Screen-grid-aligned shading via the raster pipe requires N samples to represent N gradations between pixels. LDR white on black requires roughly 64 samples/pixel to avoid visible temporal aliasing, the worst aliasing in that case being around 1/64 of white (possible to mask the remaining temporal aliasing in film grain). With HDR, the number of samples scales with the contrast between the lightest and darkest sample. Improving this situation requires factoring in the fractional coverage of the shaded sample. And simply counting post-z sample coverage won't work (not enough samples for HDR). Maybe using the barycentric distance to triangle edges to compute a high-precision coverage might be able to improve things...
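To make the scaling concrete, assuming the goal stays "worst-case step no bigger than 1/64 of LDR white":

    import math

    # N coverage samples give N+1 gradations across an edge, so the worst-case
    # step on a bright/dark edge is (edge contrast) / N. Hold the step to the
    # LDR target and the sample count grows linearly with contrast.
    def samples_per_pixel(edge_contrast, target_step=1.0 / 64.0):
        return math.ceil(edge_contrast / target_step)

    print(samples_per_pixel(1.0))     # LDR white on black               -> 64
    print(samples_per_pixel(100.0))   # emitter 100x brighter than white -> 6400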

The other aspect of graphics pipelines which needs to evolve is the gap between the high-frequency MSAA resolve filter and low-frequency bloom. The MSAA resolve filter cannot afford to get wide enough to properly resolve an HDR signal: the higher the contrast, the larger the resolve filter kernel must be. An MSAA resolve without temporal aliasing in LDR requires a 2-pixel window. With HDR, the 1/luma bias gets used, which creates a wrong image. The correct way is to factor the larger-than-2-pixel window into a bloom filter which starts at pixel frequency (instead of, say, half-pixel frequency).
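For reference, here is the kind of 1/luma-biased resolve I'm talking about, sketched on grayscale samples (the exact weight varies by engine; 1/(1+luma) is one common choice). It kills the flicker on an HDR edge precisely by throwing away energy, which is why the result is wrong:

    # Plain box resolve vs a luma-weighted resolve over one pixel's samples.
    def resolve_plain(samples):
        return sum(samples) / len(samples)

    def resolve_luma_weighted(samples):
        w = [1.0 / (1.0 + s) for s in samples]     # bright samples weigh less
        return sum(s * wi for s, wi in zip(samples, w)) / sum(w)

    edge = [100.0, 100.0, 0.01, 0.01]              # half-covered 100x-white emitter
    print(resolve_plain(edge), resolve_luma_weighted(edge))   # ~50 vs ~1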

But these are really just bandages; shading screen-aligned samples doesn't scale. Switching to object/texture-space shading with sub-pixel-precision reconstruction is the only way to decouple resolution from the problem. And after reaching that point, the video contrast rule of approaching zero contrast around 1-pixel-wide features starts to work to a major advantage, as it reduces the number of shaded samples required across the image...
