
HDR or 4K gaming? You have to choose one.

 

Poll: I choose...

HDR: 12 votes (44.44%)

4K: 7 votes (25.93%)

Neither impresses me: 8 votes (29.63%)

Total: 27 votes
LegitHyperbole said:

I've played some games where the brightness peaks, like a fade to a white screen or an explosion/fire effect, and gone "whoa, too bright", mostly in the evening or at night I suppose. I don't think going past 500 nits would be a good idea in the headset, but then I've yet to see what 250 nits looks like in it.

With TV it's different since you're focused on a rectangle in a dark room. (I assume you play with low ambient light to be able to see the shadow detail).

VR fills half of your vision (110 degrees out of 220), so your eyes are better prepared to adjust. But yes, the screen suddenly fading to bright white after a dark scene (Moss is guilty of that) can hurt. It's like being used to the dark and then opening a door into full sunlight. That doesn't happen often in real life, and VR games shouldn't do it too often either.

However, your pupils actually adjusting to the brightness does add to the immersion. Dawn, sunset and night scenes in GT7 look very realistic, yet full daytime still feels a bit off: your eyes are still in low-light mode while the screen tries to portray a bright sunny day.

Night scenes can also take advantage of higher nits, like how fire and fireworks never look the same on TV as they do outside. They already look a lot better in GT7 on PSVR2, but still don't compare to watching a fireworks display in person.

But yes, there would need to be some guidelines not to go from 20 nits to full screen 1,000 nits without warning!
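
Something like this maybe, just as a rough illustration (a made-up Python sketch; the limit value and function name are mine, not from any real standard):

```python
# Hypothetical sketch: cap how fast full-screen brightness may rise per frame,
# so a dark scene can't jump straight from ~20 nits to a 1,000-nit white flash.
MAX_RISE_NITS_PER_FRAME = 40.0  # assumed comfort limit, purely illustrative

def limit_brightness(previous_nits: float, target_nits: float) -> float:
    """Return the brightness to actually show this frame."""
    if target_nits <= previous_nits:
        return target_nits  # dimming can happen immediately
    rise = target_nits - previous_nits
    return previous_nits + min(rise, MAX_RISE_NITS_PER_FRAME)

# A fade-to-white from a 20-nit night scene now takes ~25 frames (~0.4 s at
# 60 fps) to reach 1,000 nits instead of hitting it in a single frame.
level = 20.0
for _ in range(30):
    level = limit_brightness(level, 1000.0)
print(level)  # 1000.0
```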



LegitHyperbole said:
OneTime said:

Well... I went to Amazon, and while HDR comes with most TV sets, it's certainly not on the cheapest.  So it's clearly not standard yet...

Maybe so, but on consoles and for the games being released it's standard. 1080p was the standard for years, but you could still buy 720p TVs, and I believe you still can.

Not really standard yet. A lot of games still fake it, and on PC HDR is still a mess.

To get the full effect, textures have to be captured in 10-bit Rec. 2020 color with a 12-bit render pipeline. 8-bit is still the standard, with fake HDR applied at the end of the pipeline :/
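
Just to put a rough number on why the bit depth matters, here's a little Python sketch I threw together (my own illustration using the SMPTE ST 2084 / PQ curve that HDR10 uses, treating both bit depths as if they carried the same PQ signal just to show the quantization gap, so not any particular game's pipeline):

```python
# Rough illustration: encode absolute luminance with the SMPTE ST 2084 (PQ)
# curve used by HDR10 and count the code values available at 8 vs 10 bits.
def pq_encode(nits: float) -> float:
    """Map absolute luminance (0..10,000 nits) to a normalized PQ signal (0..1)."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    y = max(nits, 0.0) / 10000.0
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

for bits in (8, 10):
    steps = (1 << bits) - 1
    lo = round(pq_encode(0.1) * steps)     # dim shadow detail
    hi = round(pq_encode(1000.0) * steps)  # bright highlight
    print(f"{bits}-bit: {hi - lo} code values between 0.1 and 1,000 nits")
# 8-bit leaves under 180 steps across that range (hence the banding);
# 10-bit gives roughly four times as many.
```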

Then you have HDR10, HDR10+, Dolby Vision, HLG as the current 'standards'.

I still find it pretty rare to encounter a great HDR implementation. I wish Digital Foundry would put more focus on that instead of pixel and frame rate counting.



My eyes can't tell the difference.



http://www.youtube.com/watch?v=F1gWECYYOSo

Please Watch/Share this video so it gets shown in Hollywood.

LegitHyperbole said:
Chrkeller said:

Optimal for me is 1440p, HDR and 80 to 120 fps.

4k is overkill unless someone has a 75 inch TV. 

And make no mistake, 4k takes a huge amount of resources.  I get 40 to 60 fps in TLoU at 4k.  At 1440p I get an almost locked 120 fps.

Higher fps is absolutely game changing.  It impacts gameplay.  My accuracy in games, like RE4, is so much higher at 120 fps.  

Yep, 4k is definitely for TV console gaming, but I'd say it depends on distance. 4k on a 43" is still a massive improvement, but you really need 50"+ to start seeing the gains. I'm 10 feet from a 50" and find it's a great sweet spot, but alas, that mostly applies to TV content, because the Pro usually renders at 1300p or 1800p.

4k is great when it can be pulled off.  But it requires some power.  A lot of power tbh.

If I were going to rank fidelity impact:

1) panel quality (OLED by itself is a massive upgrade)

2) fps, because it directly impacts gameplay via accuracy 

3) ultra settings via lighting, particles, shadows, textures, volumetric, etc.

4) resolution 
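
On the distance point from the quote above, the usual back-of-the-envelope math (my own sketch, assuming a 16:9 panel and the often-quoted ~60 pixels per degree limit of 20/20 vision) looks like this:

```python
# Rough pixels-per-degree estimate for a 50" 16:9 panel viewed from 10 feet,
# the setup mentioned above. ~60 px/degree is a common rule of thumb for the
# point where extra resolution stops being visible to 20/20 eyes.
import math

diagonal_in = 50.0
distance_in = 120.0                                    # 10 feet
width_in = diagonal_in * 16 / math.hypot(16, 9)        # ~43.6" wide

deg_per_in = math.degrees(2 * math.atan(0.5 / distance_in))  # angle of 1" of screen

for name, horizontal_px in (("1080p", 1920), ("4K", 3840)):
    ppd = (horizontal_px / width_in) / deg_per_in
    print(f"{name}: ~{ppd:.0f} pixels per degree at {distance_in / 12:.0f} feet")
# Prints roughly 92 ppd for 1080p and 185 ppd for 4K, which is why the gain
# shrinks fast with distance and resolution ends up last on the list.
```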



In movies, the increased bit depth of HDR gives (on my TV) better colouring, but the actual high dynamic range, while nice in certain scenes, isn't something I care that much about. Translated to games, both points are moot and not a big deal, so that's why I don't care about HDR.

4K is just more pixels, and at a normal viewing distance it's just a waste of resources.

Spend the computational power on better (smarter) enemies/environments, more entities (stuff in the game), etc. instead.



Spindel said:

In movies, the increased bit depth of HDR gives (on my TV) better colouring, but the actual high dynamic range, while nice in certain scenes, isn't something I care that much about. Translated to games, both points are moot and not a big deal, so that's why I don't care about HDR.

4K is just more pixels, and at a normal viewing distance it's just a waste of resources.

Spend the computational power on better (smarter) enemies/environments, more entities (stuff in the game), etc. instead.

And particle effects like in Ghost of Tsushima. 



Signalstar said:

My eyes can't tell the difference.

Perhaps a bad display; there's an obvious difference with HDR right away, but you really have to use 4k for a while and then go back to 1080p before it really hits home.



Well, my capture card only does 4k 60, so I guess that. I dunno which one looks better.



I am Iron Man

Whichever gives me 60 FPS.



HDR.
When I switch between Blu-rays and 4K Blu-rays of the same movie, I notice the lack of HDR on the Blu-ray more than the lower resolution.
But shoot, give me locked 60 FPS over either HDR or 4K.



Lifetime Sales Predictions 

Switch: 161 million (was 73 million, then 96 million, then 113 million, then 125 million, then 144 million, then 151 million, then 156 million)

PS5: 115 million (was 105 million) Xbox Series S/X: 48 million (was 60 million, then 67 million, then 57 million)

PS4: 120 mil (was 100 then 130 million, then 122 million) Xbox One: 51 mil (was 50 then 55 mil)

3DS: 75.5 mil (was 73, then 77 million)

"Let go your earthly tether, enter the void, empty and become wind." - Guru Laghima