So, I was completely wrong about 4K


gcwy said:
There's still a lot of scepticism and plenty of misconceptions around it. 4K isn't a fad like 3D. It's the inevitable next step forward in the progression of video quality.

I wish my cable company would catch on. Still sending 720p/1080i mpeg-2 shit and calling it HD. My 4K tv does an awesome job rendering all the compression artifacts!

1080p was never fully utilized. Modern 1080p TVs were perfectly capable of displaying 4:4:4 video; instead, all video is still chroma subsampled to 4:2:0, which quarters the color resolution. 4K Blu-ray only has 1920x1080 pixels for color (normal Blu-ray: 960x540). So even without the resolution increase you could get 4x the color resolution on a 1080p TV with 4K Blu-ray or streaming (if that were supported, which it isn't).

Why does 4K look better on 1080p monitors than 1080p

https://www.youtube.com/watch?time_continue=2&v=kIf9h2Gkm_U

I guess we have to wait until 8K tvs to get full color resolution at 4K!
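The chroma numbers above are easy to check for yourself. This is just illustrative arithmetic (not from any spec document), assuming the usual convention that 4:2:0 halves the chroma plane in both dimensions while 4:4:4 keeps it at full resolution:

```python
# Illustrative sketch of chroma (color) resolution under different
# subsampling schemes. Assumes 4:2:0 halves the chroma plane in both
# dimensions and 4:4:4 keeps it at full resolution.

def chroma_resolution(width, height, scheme):
    """Return the (width, height) of the chroma planes for a given scheme."""
    if scheme == "4:4:4":
        return (width, height)            # full color resolution
    if scheme == "4:2:0":
        return (width // 2, height // 2)  # halved horizontally and vertically
    raise ValueError(f"unsupported scheme: {scheme}")

# 4K Blu-ray video is 4:2:0, so its color information is only 1920x1080...
print(chroma_resolution(3840, 2160, "4:2:0"))  # (1920, 1080)
# ...while a regular 1080p Blu-ray carries just 960x540 pixels of color.
print(chroma_resolution(1920, 1080, "4:2:0"))  # (960, 540)
```

Which is the whole point: a 4:2:0 4K source carries exactly a 1080p screen's worth of color information.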



Azuren said:
kowenicki said:
I have LG OLED 65 2017 model. Simply stunning.

1) Which one?

2) Please tell me you don't do heavy gaming on that

3) Make sure you shut it off every three or so hours.

1. OLED 65C7V (cool fact: all panels on 2017 LGs are the same; as you move up the range you just pay for the nicer surround - crazy)

2. I don't do heavy gaming full stop.

3. I won't be getting burn in, see 2 above.



I'm not really here!

Link: Shipment History Since 1995


Biggerboat1 said:
Azuren said:

First of all, that chart is from a site that believes image retention isn't a super-important stat for a screen used for gaming or as a computer monitor. They give great objective information on TVs, but they tend to be full of shit when it comes to more subjective information (which in this instance would be anything that can't be measured).

 

Second of all, yeah, a 55" isn't for ten feet away. Like I just said. That's why the industry standard isn't 55" now, it's 65". 55" is still popular, though, because many people live in apartments where the sitting area is 6-8ft away from the TV. Not everyone lives in a house with a living room that gives them a 10ft distance from the couch.

 

Third of all, if he says he can see a difference, then his eyes are probably just better.

 

And finally, if you didn't have an agenda you would have put all the pieces together to learn why 4K is a thing. I can spell it out for you, though: 4K is so we can have bigger screens in smaller spaces without losing pixel density and succumbing to the "screen door" effect. That's it. Yes, it also means more detailed images, but the biggest thing is maintaining high pixel densities at larger sizes. Have you ever seen a 100" 1080p TV? The pixel structure looks like a fucking Lite-Brite.

 

I really don't see anything that I've written warranting you having to spell anything out tbh...

I understand why 4K is a thing, and I look forward to buying a 77" OLED somewhere down the line when prices aren't quite so insane.

If you don't like the chart that I've quoted then chuck the term 'resolution vs viewing distance chart' in Google and you'll find countless others showing pretty much the same breakdown of info. They surely can't all be wrong...?

I don't have an agenda, I own a 4k telly for God's sake...

My issue is that Sony and Microsoft are pushing the 4K message hard at the mainstream when there's only a tiny percentage of gamers who'll really benefit in a meaningful way. Those who sit very close to a 55" or own a 65" plus, oh, and apparently have fantastic eyesight! The Venn diagram is becoming ever more teeny-tiny...

Finally, watching Netflix or Amazon or whatever makes sense, as most people have unlimited Internet packages - even if it's only a small improvement, why not! But gaming requires a whopping 4x the graphical power. That's a crazy amount of extra power and/or money for a gain that's proportionate to your TV size, & imo you'd be nuts to go that route unless you're sitting crazy close or have a 65" plus.

You can argue charts all you want on the viewing distance business, but most people will still argue that those charts are bullshit. The more you talk about that chart rather than your own experience, the more I suspect you've never tested it yourself.

 

As far as whether or not Sony and MS should be pushing this, it's up to the consumer. And they've decided they want higher resolution instead of higher frame rates.

 

Also, your last statement is essentially the same as your second-to-last statement... Not quite sure how to respond to it.
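For what it's worth, the distance argument behind those charts can be sanity-checked with simple trigonometry. A rough sketch, assuming a 16:9 panel and the common rule of thumb that 20/20 vision resolves about one arcminute of detail (both are assumptions, not measurements):

```python
import math

# Rough sketch: beyond what distance does a single pixel shrink below
# the ~1 arcminute detail limit usually quoted for 20/20 vision?
# Assumes a 16:9 panel; the 1-arcminute figure is a rule of thumb.

def pixel_resolve_distance_ft(diagonal_in, vertical_pixels):
    """Distance (feet) beyond which individual pixels can't be resolved."""
    height_in = diagonal_in * 9 / math.hypot(16, 9)   # panel height in inches
    pixel_in = height_in / vertical_pixels            # one pixel's height
    one_arcmin = math.radians(1 / 60)                 # ~0.000291 rad
    return pixel_in / math.tan(one_arcmin) / 12       # inches -> feet

print(round(pixel_resolve_distance_ft(55, 2160), 1))  # 55" 4K: ~3.6 ft
print(round(pixel_resolve_distance_ft(55, 1080), 1))  # 55" 1080p: ~7.2 ft
```

By this crude measure, someone six feet from a 55" set is past the point where 4K pixel structure is visible but still close enough to resolve 1080p's, which is roughly the crossover region those viewing-distance charts plot.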



Watch me stream games and hunt trophies on my Twitch channel!

Check out my Twitch Channel!:

www.twitch.tv/AzurenGames

kowenicki said:
Azuren said:

1) Which one?

2) Please tell me you don't do heavy gaming on that

3) Make sure you shut it off every three or so hours.

1. OLED 65C7V (cool fact: all panels on 2017 LGs are the same; as you move up the range you just pay for the nicer surround - crazy)

2. I don't do heavy gaming full stop.

3. I won't be getting burn in, see 2 above.

1. I run an audio video department, I'm aware of the similarities between the different 2017 LG OLEDs. It's the same panel for all five, but different form factors and audio systems.

 

2. Do you heavily watch News or Sports? Because those also have a lot of static images. 

 

3. Even if you don't heavily play video games on the TV, LG still recommends powering down for a refresh cycle every three hours. Channel logos, TV guides, and even LG's own logo have all been known to burn into OLEDs over time.




Azuren said:
Biggerboat1 said:

 

I really don't see anything that I've written warranting you having to spell anything out tbh...

I understand why 4K is a thing, and I look forward to buying a 77" OLED somewhere down the line when prices aren't quite so insane.

If you don't like the chart that I've quoted then chuck the term 'resolution vs viewing distance chart' in Google and you'll find countless others showing pretty much the same breakdown of info. They surely can't all be wrong...?

I don't have an agenda, I own a 4k telly for God's sake...

My issue is that Sony and Microsoft are pushing the 4K message hard at the mainstream when there's only a tiny percentage of gamers who'll really benefit in a meaningful way. Those who sit very close to a 55" or own a 65" plus, oh, and apparently have fantastic eyesight! The Venn diagram is becoming ever more teeny-tiny...

Finally, watching Netflix or Amazon or whatever makes sense, as most people have unlimited Internet packages - even if it's only a small improvement, why not! But gaming requires a whopping 4x the graphical power. That's a crazy amount of extra power and/or money for a gain that's proportionate to your TV size, & imo you'd be nuts to go that route unless you're sitting crazy close or have a 65" plus.

You can argue charts all you want on the viewing distance business, but most people will still argue that those charts are bullshit. The more you talk about that chart rather than your own experience, the more I suspect you've never tested it yourself.

 

As far as whether or not Sony and MS should be pushing this, it's up to the consumer. And they've decided they want higher resolution instead of higher frame rates.

 

Also, your last statement is essentially the same as your second-to-last statement... Not quite sure how to respond to it.

My sofa is about 5 feet from my TV, but since you sit back when viewing I'm actually about 6 feet away. The difference I've noticed is minimal. I've never worn glasses or contacts, got the all-clear on my last eye test about 18 months ago, and as I mentioned, have a job that relies on at least a reasonable level of visual acuity. There's my personal experience, much of which I actually covered in my original post.

I can say one thing and you can say the opposite, except that I can actually point to some outside sources; all you can do is call BS on both my view and the charts...

It's absolutely not up to the consumer what MS & Sony push, & I must have missed the referendum on resolution vs frame rate you refer to.

The reason they are pushing 4K is that it's the easiest way to flog an updated system and have developers improve the fidelity of their games - simply increase the resolution to take advantage of the extra horsepower - done! In reality the extra power of the Pro & X could be harnessed to far greater effect by adding/improving effects/textures/models, but that is too much hard work for too small a userbase for most devs to bother with.

My point will be proven when the PS5 & X2 come out and render most of their games below 4K. And why do you think that will be?

I'm far from alone on this - have a read through the entirety of this thread and you'll see the overall theme is quite a tepid reception to 4K as it stands. And I'd hazard a guess that some of those seemingly impressed with their new sets are benefiting more from the overall PQ improvements over the TVs they're replacing than from the jump to 4K specifically.

Finally, my last and second-to-last statements in my previous post are variations on the same point, but they're consistent, so I'm not sure why that would pose an issue for you responding. That doesn't make any sense.




In other news: "Top scientists and mathematicians have discovered that better is better".



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

Azuren said:
kowenicki said:

1. OLED 65C7V (cool fact: all panels on 2017 LGs are the same; as you move up the range you just pay for the nicer surround - crazy)

2. I don't do heavy gaming full stop.

3. I won't be getting burn in, see 2 above.

1. I run an audio video department, I'm aware of the similarities between the different 2017 LG OLEDs. It's the same panel for all five, but different form factors and audio systems.

 

2. Do you heavily watch News or Sports? Because those also have a lot of static images. 

 

3. Even if you don't heavily play video games on the TV, LG still recommends powering down for a refresh cycle every three hours. Channel logos, TV guides, and even LG's own logo have all been known to burn into OLEDs over time.

 

I've had another OLED for a few years and yes, I do watch a lot of sports. Never had any issues whatsoever. It's a slightly overblown problem imo. Image retention certainly happens in some conditions (news channels are the worst culprits with their rolling news banners), but it 'washes away' on newer OLEDs under normal use. Most UK sports channels have a slightly translucent logo or score/time overlay now, and they remove it during replays etc., which helps.

I even had plasmas in the past (where it was much more of a potential issue) and I didn't have a problem with burn-in there either.





SvennoJ said:
gcwy said:
There's still a lot of scepticism and plenty of misconceptions around it. 4K isn't a fad like 3D. It's the inevitable next step forward in the progression of video quality.

I wish my cable company would catch on. Still sending 720p/1080i mpeg-2 shit and calling it HD. My 4K tv does an awesome job rendering all the compression artifacts!

1080p was never fully utilized. Modern 1080p TVs were perfectly capable of displaying 4:4:4 video; instead, all video is still chroma subsampled to 4:2:0, which quarters the color resolution. 4K Blu-ray only has 1920x1080 pixels for color (normal Blu-ray: 960x540). So even without the resolution increase you could get 4x the color resolution on a 1080p TV with 4K Blu-ray or streaming (if that were supported, which it isn't).

Why does 4K look better on 1080p monitors than 1080p
 
https://www.youtube.com/watch?time_continue=2&v=kIf9h2Gkm_U

I guess we have to wait until 8K tvs to get full color resolution at 4K!

That "720p/1080i mpeg-2" is set in stone and won't change anymore. 

"1080p was never fully utilized. Modern 1080p TVs were perfectly capable of displaying 4:4:4 video; instead, all video is still chroma subsampled to 4:2:0, which quarters the color resolution. 4K Blu-ray only has 1920x1080 pixels for color (normal Blu-ray: 960x540). So even without the resolution increase you could get 4x the color resolution on a 1080p TV with 4K Blu-ray or streaming (if that were supported, which it isn't)."

You get 1080p on Blu-ray, but you won't see chroma 4:4:4 in video. One reason could be the much higher bandwidth and storage space usage, and the other that the difference is subjectively far too small to see. To fully utilize chroma 4:4:4 on Full HD, use a PC and play video games on it :)

4K 60 Hz chroma 4:4:4 at 10-bit is only possible with HDMI 2.1 or higher. HDMI 2.0 cannot manage that image quality.
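That bandwidth claim checks out on the back of an envelope. A sketch, using ~14.4 and ~42.7 Gbit/s as the nominal effective data rates of HDMI 2.0 and 2.1 (an assumption for illustration, not a spec quote) and ignoring blanking intervals, which only make things worse:

```python
# Back-of-the-envelope: raw pixel data rate for a given video mode,
# ignoring blanking intervals (real links need even more than this).
# The HDMI figures below are nominal effective data rates, assumed
# here for illustration.

HDMI_2_0_GBPS = 14.4   # ~18 Gbit/s raw minus 8b/10b encoding overhead
HDMI_2_1_GBPS = 42.7   # ~48 Gbit/s raw minus 16b/18b encoding overhead

def pixel_rate_gbps(width, height, fps, bits_per_channel, channels=3):
    """Active pixel payload in Gbit/s (no blanking, no encoding overhead)."""
    return width * height * fps * bits_per_channel * channels / 1e9

rate = pixel_rate_gbps(3840, 2160, 60, 10)   # 4K60, 10-bit, 4:4:4
print(f"{rate:.2f} Gbit/s")                  # ~14.93 Gbit/s
print(rate > HDMI_2_0_GBPS)                  # True: too much for HDMI 2.0
print(rate < HDMI_2_1_GBPS)                  # True: fits within HDMI 2.1
```

Even the pixel payload alone exceeds HDMI 2.0's effective rate, before counting blanking, which is why 4K60 10-bit over HDMI 2.0 has to drop to 4:2:2 or 4:2:0.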



Intel Core i7 8700K | 32 GB DDR 4 PC 3200 | ROG STRIX Z370-F Gaming | Nvidia RTX 2080 Ti 11GB VRAM | Asus PG27UQ gaming on 3840 x 2160 @120 Hz GSYNC HDR| HTC Vive Pro :3

Reached PC Masterrace level.

Peh said:
SvennoJ said:

I wish my cable company would catch on. Still sending 720p/1080i mpeg-2 shit and calling it HD. My 4K tv does an awesome job rendering all the compression artifacts!

1080p was never fully utilized. Modern 1080p TVs were perfectly capable of displaying 4:4:4 video; instead, all video is still chroma subsampled to 4:2:0, which quarters the color resolution. 4K Blu-ray only has 1920x1080 pixels for color (normal Blu-ray: 960x540). So even without the resolution increase you could get 4x the color resolution on a 1080p TV with 4K Blu-ray or streaming (if that were supported, which it isn't).

Why does 4K look better on 1080p monitors than 1080p
 
https://www.youtube.com/watch?time_continue=2&v=kIf9h2Gkm_U

I guess we have to wait until 8K tvs to get full color resolution at 4K!

That "720p/1080i mpeg-2" is set in stone and won't change anymore. 

"1080p was never fully utilized. Modern 1080p TVs were perfectly capable of displaying 4:4:4 video; instead, all video is still chroma subsampled to 4:2:0, which quarters the color resolution. 4K Blu-ray only has 1920x1080 pixels for color (normal Blu-ray: 960x540). So even without the resolution increase you could get 4x the color resolution on a 1080p TV with 4K Blu-ray or streaming (if that were supported, which it isn't)."

You get 1080p on Blu-ray, but you won't see chroma 4:4:4 in video. One reason could be the much higher bandwidth and storage space usage, and the other that the difference is subjectively far too small to see. To fully utilize chroma 4:4:4 on Full HD, use a PC and play video games on it :)

4K 60 Hz chroma 4:4:4 at 10-bit is only possible with HDMI 2.1 or higher. HDMI 2.0 cannot manage that image quality.

I have been playing video games in chroma 4:4:4 on a 1080p projector since 2007. True, you need a big screen to see the difference, and my 1080p TV from that time could not display 4:4:4 and converted the RGB signal to 4:2:0. My projector could definitely benefit from 4:4:4 video, yet I haven't found anything that supports it besides downsampled 4K YouTube videos like the one in that link above. Games always looked more detailed color-wise on the projector, until my 4K set, which has better color reproduction than my projector.

The difference is not subjectively too small to see; people praise the color of 4K Blu-ray even more than the sharpness. A lot of early 4K Blu-rays were upscaled from 2K masters (some still are), so all you were really getting was 4 times the color resolution compared to the Blu-ray version. I would say 4x the color detail makes a bigger difference than the expanded color space and 10-bit color. Anyway, HDR stole the show, as that's what's really visible. HDR kinda negates the 10-bit benefit, as HDR10 simply changes the brightness scale by stretching it out: 4 times the brightness range fit to a scale of 0 to over 1000 nits, instead of the old maximums of 50 fL for a bright living room experience (about 170 nits) down to 15 fL for home cinema projection (51 nits), which movies were calibrated for. (Cinema open gate used to be 14 fL, about 48 nits.)

The next big thing will be HLG (hybrid log-gamma) HDR, which uses a better logarithmic curve; the downside is that current TVs are not compatible.
https://www.whathifi.com/advice/hybrid-log-gamma-explained-new-hdr-tv-broadcast-format

Anyway, 10-bit 4K displays are great: the higher resolution means upscaled material has much more to work with (fewer visible scaling artifacts), plus there's a wider range for contrast and color adjustments with the 10-bit panel, which is also a form of scaling. It also fixes the screen-door effect of larger TVs. That was not a problem with the projector, as it used a smart way of focusing the image to eliminate screen door, which you can't really do with TVs. Yet there's still life in the good old 1080p displays.
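Those footlambert figures convert to nits with a single constant (1 fL ≈ 3.426 cd/m²); a quick sketch of the arithmetic:

```python
# Quick unit check of the brightness figures above.
# 1 footlambert (fL) = 1/pi candela per square foot ≈ 3.426 cd/m² (nits).

FL_TO_NITS = 3.426

def fl_to_nits(footlamberts):
    """Convert footlamberts to nits (cd/m^2)."""
    return footlamberts * FL_TO_NITS

for fl in (50, 15, 14):
    print(f"{fl} fL = {fl_to_nits(fl):.0f} nits")
# 50 fL ≈ 171 nits (the "about 170" bright living-room target)
# 15 fL ≈ 51 nits  (home cinema projection)
# 14 fL ≈ 48 nits  (old cinema open-gate level)
```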



Puppyroach said:

I have been extremely skeptical about the industry's transition to 4K from the start and felt that it was overhyped and not as visible a change as people claimed.

Well, I was utterly and completely wrong on every front. I actually told myself a week ago that I would buy an Xbox One X and a 4K TV since I needed a new TV anyway.

I have now used my computer, played games and watched movies on my 49" Philips 6482 with my computer and Xbox One X for a couple of days and can only say.... wow. The pixel density and sharpness of the image is visible right away, and it is the first time since I got a 360 back in 2007 that I have felt amazed by the image quality.

And HDR is as impressive as the new resolution. Old games like Ninja Gaiden Black feel like remastered games now, they look so amazing. And HDR makes every image more vivid and vibrant.

Any more people that have made the complete jump? Experiences?

Unless I have a 72-inch television minimum and all of my major video sources are in 4K, it's just not worth it at this point.