Peh said:
SvennoJ said:

I wish my cable company would catch on. Still sending 720p/1080i mpeg-2 shit and calling it HD. My 4K tv does an awesome job rendering all the compression artifacts!

1080p was never fully utilized. Modern 1080p tvs were perfectly capable of displaying 4:4:4 video, instead all video is still chroma subsampled to 4:2:0, which quarters the color resolution. 4K blu-ray only has 1920x1080 pixels for color (normal blu-ray 960x540). So even without the resolution increase you can still get 4x the color resolution on a 1080p tv with 4K blu-ray or streaming. (if that was supported, it's not)

Why does 4K look better on 1080p monitors than 1080p
 
https://www.youtube.com/watch?time_continue=2&v=kIf9h2Gkm_U

I guess we have to wait until 8K tvs to get full color resolution at 4K!

That "720p/1080i mpeg-2" broadcast format is set in stone and won't change anymore.

"1080p was never fully utilized. Modern 1080p tvs were perfectly capable of displaying 4:4:4 video, instead all video is still chroma subsampled to 4:2:0, which quarters the color resolution. 4K blu-ray only has 1920x1080 pixels for color (normal blu-ray 960x540). So even without the resolution increase you can still get 4x the color resolution on a 1080p tv with 4K blu-ray or streaming. (if that was supported, it's not)"

You get 1080p on Blu-ray, but you won't see 4:4:4 chroma in video. One reason is the much higher bandwidth and storage it would use; the other is that the difference is subjectively too small to notice. To fully utilize 4:4:4 chroma at Full HD, use a PC and play video games on it :)
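The chroma numbers quoted above follow directly from how 4:2:0 subsampling works (the chroma planes are stored at half the luma width and half the luma height). A minimal sketch of that arithmetic, assuming standard 4:2:0 and 4:4:4 layouts:

```python
# Chroma-plane resolution for a given luma resolution and subsampling scheme.
# 4:2:0 halves the chroma resolution in both dimensions (a quarter of the samples);
# 4:4:4 keeps full colour resolution.

def chroma_plane(width, height, subsampling="4:2:0"):
    """Return the (width, height) of the chroma planes for a given luma size."""
    if subsampling == "4:4:4":
        return width, height            # full colour resolution
    if subsampling == "4:2:0":
        return width // 2, height // 2  # quarter the colour samples
    raise ValueError(f"unhandled subsampling: {subsampling}")

print(chroma_plane(1920, 1080))  # Blu-ray 4:2:0    -> (960, 540)
print(chroma_plane(3840, 2160))  # 4K Blu-ray 4:2:0 -> (1920, 1080)
```

Which is exactly the point made above: a 4K 4:2:0 source carries a full 1920x1080 of colour information, four times what a 1080p 4:2:0 source carries.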

4K 60 Hz 4:4:4 chroma at 10 bit is only possible with HDMI 2.1 or higher. HDMI 2.0 doesn't have the bandwidth for that image quality.
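A back-of-the-envelope check of that HDMI claim. This ignores blanking intervals (so real requirements are even higher); the 18 Gbit/s link rate and 8b/10b encoding are the published HDMI 2.0 figures:

```python
# Raw video payload rate vs. what HDMI 2.0 can actually carry.
# 18 Gbit/s raw link rate with 8b/10b coding leaves ~14.4 Gbit/s of data.

def video_payload_gbps(width, height, fps, bits_per_channel, channels=3):
    """Uncompressed payload rate in Gbit/s, ignoring blanking."""
    return width * height * fps * bits_per_channel * channels / 1e9

hdmi20_effective = 18.0 * 8 / 10  # ~14.4 Gbit/s usable

for bits in (8, 10):
    need = video_payload_gbps(3840, 2160, 60, bits)
    verdict = "fits" if need <= hdmi20_effective else "does not fit"
    print(f"4K60 4:4:4 {bits}-bit needs ~{need:.1f} Gbit/s -> {verdict} in HDMI 2.0")
```

The 8-bit case squeezes in (~11.9 Gbit/s), but 10-bit needs ~14.9 Gbit/s before blanking overhead, so it's already over the limit; hence HDMI 2.1.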

I have been playing video games in 4:4:4 chroma on a 1080p projector since 2007. True, you need a big screen to see the difference, and my 1080p TV from that time could not display 4:4:4 and converted the RGB signal to 4:2:0. My projector could definitely benefit from 4:4:4 video, yet I haven't found anything that supports it besides downsampled 4K YouTube videos like the one linked above. Games always looked more detailed color-wise on the projector, until my 4K set, which has better color reproduction than my projector.

The difference is not subjectively too small to see; people praise the color of 4K Blu-ray even more than the sharpness. A lot of early 4K Blu-rays were upscaled from 2K masters (some still are), so all you were really getting was 4 times the color resolution compared to the Blu-ray version. I would say the difference from 4x the color detail is bigger than that from the expanded color space and 10-bit color.

Anyway, HDR stole the show, as that's what's really visible. HDR kind of negates the 10-bit benefit, as HDR10 simply stretches out the brightness scale: roughly 4 times the brightness range, fit to a scale of 0 to over 1000 nits instead of the old maximums of 50 fL for a bright living room experience (about 170 nits) down to 15 fL for home/cinema projection (51 nits), for which movies were calibrated. (Cinema open gate used to be 14 fL, 47 nits.)
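The nit figures in the brackets above are just unit conversions: one foot-lambert is about 3.426 cd/m² (nits). A quick sketch of that arithmetic, using the fL values from the post:

```python
# Convert the foot-lambert calibration targets mentioned above into nits.
# 1 fL ~= 3.426 cd/m^2 (nits) is the standard conversion factor.

FL_TO_NITS = 3.426

targets = [
    ("bright living room", 50),      # ~170 nits
    ("home/cinema projection", 15),  # ~51 nits
    ("cinema open gate", 14),        # ~48 nits
]

for label, fl in targets:
    print(f"{label}: {fl} fL ~= {fl * FL_TO_NITS:.0f} nits")
```

Against those old targets, a 1000+ nit HDR scale really is several times the calibrated brightness range.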

The next big thing will be HLG (Hybrid Log-Gamma) for HDR, with a better logarithmic curve; the downside is that current TVs are not compatible.
https://www.whathifi.com/advice/hybrid-log-gamma-explained-new-hdr-tv-broadcast-format

Anyway, 10-bit 4K displays are great: the higher resolution gives upscaled material much more to work with, so there are fewer visible scaling artifacts, plus the 10-bit panel gives a higher range for contrast and color adjustments, which is also a form of scaling. It also fixes the screen-door effect of larger TVs. That was never a problem with the projector, as it used a smart way of focusing the image to eliminate screen door, which you can't really do with TVs. Yet there's still life in the good old 1080p displays.