You can only choose one.
Shit image quality, great graphics.    |  9 | 34.62%
Shit graphics, pristine image quality. | 17 | 65.38%
Total:                                 | 26 |
Pemalite said:
Depends on the type of HDR. |
Lmao. Say you don't game in HDR without saying you don't game in HDR.
curl-6 said: Sitting 3 meters back from a 42 inch screen as I do, stuff like Ace Combat 7 on Switch at 720p still looks fine. Image treatment is a major factor though; AC7 is relatively clean without a lot of post-processing and such, whereas games that throw around a lot of post-processing tend to look blurrier at the same pixel count, so there's more to image quality than just resolution. 4K is a waste of resources for consoles anyway, in my opinion, when 1440p and even a well treated 1080p still look good enough. |
The only thing that's a bigger waste of resources for console couch gaming is ray tracing, but even if it weren't so intensive, it would still be a more noticeable improvement than 4K detail at couch distance.
LegitHyperbole said:
Idk how. I still far exceed 20/20 vision and I got out a measuring tape from my 50 inch and used the PS5 menus as the perfect example of detail. If you're saying you notice a clear difference at 4 feet, you're lying to yourself. Between 3 and 4 feet is where it melts away and doesn't matter very much. Between 4 and 5 feet it matters basically to the point of nothing. At my usual 10 feet back there is zero difference.

It's like having a 1440p phone. I had a 6 inch 1440p phone for about 2 years and I "upgraded" to a 1080p phone at 5.2 inches, which most phones are now and for good reason, and that looked clearer and sharper, or at the very least no less sharp than the waste of battery power 1440p was. To that point, 60fps vs 120fps is a noticeable difference if you're paying attention, but it really matters not for daily use; only when you put your intent on it and switch between the two will you actually notice it. Perhaps that is what you are doing: switching between the two in the settings and having your attention on the difference makes it feel like it's a bigger difference than it is.

I would suggest you get someone to randomise the settings while you're not looking, go no closer than 4 feet from your large monitor and see if you can see the difference in an unbiased, controlled manner, and after a few attempts you'll see that you are guessing.

Edit: I will admit, there is a slight difference to rounded edges; looking at some big blocky text, the roundness of the lettering is apparent, but it's not enough to notice if I wasn't switching back and forth between 1080p and 2160p going out of my way to check. |
The big difference between 4 ft from a big monitor and 10 ft from a larger TV is that you can easily lean in to a monitor to get a closer look at details.
Leaning in a foot from 4 feet gets you a quarter of the way closer; leaning in a foot from 10 feet only gets you a tenth of the way closer.
Higher resolution also reduces aliasing, just like supersampling: rendering at 1440p and displaying on a 1080p TV looks better than rendering at native resolution.
So if you're sensitive to aliasing, then yes, higher resolution trumps everything.
The monitor also matters; 1080p on a 1440p monitor looks like crap. Downsampling makes things look better, more natural, with less aliasing. Upscaling always makes it look worse.
If it's a 4K TV/monitor it gets kinda iffy. 1080p simply doubles to 4K, while 1440p smears every pixel over 1.5 pixels. 1080p on 4K will look 'blocky' (a clean 1:2 upscale), while 1440p looks more blurry. Then again, if you can't stand aliasing, the blur from a 1:1.5 upscale can look nicer, as it's basically another smoothing pass.
1080p on a 1440p monitor is a 1:1.333 upscale, the worst of the bunch.
Too bad CRT hit a dead end; it was much better tech at making all kinds of different resolutions and refresh rates look good. Modern displays kinda demand to be fed their native resolution, anything else looks bad.
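To put some numbers on those ratios, here's a minimal Python sketch (my own illustration, not from any engine or display spec) that works out the per-axis scale factor for each render/panel combination and flags whether it's a clean integer scale:

# Per-axis upscale factor for common render resolution / panel combinations.
# Integer factors (e.g. 1080p -> 2160p) map each source pixel to a clean
# block of panel pixels; fractional factors (1.5x, 1.333x) smear pixels
# across panel pixels and add blur.

RESOLUTIONS = {"1080p": 1080, "1440p": 1440, "2160p (4K)": 2160}

def upscale_factor(render_height: int, panel_height: int) -> float:
    """Vertical (and, for 16:9, horizontal) scale factor from render to panel."""
    return panel_height / render_height

for render_name, render_h in RESOLUTIONS.items():
    for panel_name, panel_h in RESOLUTIONS.items():
        if panel_h <= render_h:
            continue
        factor = upscale_factor(render_h, panel_h)
        kind = "integer" if factor.is_integer() else "fractional"
        print(f"{render_name} on a {panel_name} panel: 1:{factor:.3f} ({kind} scale)")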
As for guessing, experiments have determined that people can guess 'correctly' up to 90 pixels per degree. (As in more often right than wrong when asked to pick the better looking picture)
For a 50 inch TV, you should be able to guess correctly up to 4.9 ft from the TV for 4K, 7.3ft for 1440p, 9.8ft for 1080p.
For a large 43 inch Monitor, it's up to 4.2ft for 4K, 6.3ft for 1440p, 8.4ft for 1080p.
So you're both correct: you can't see any difference from 10 ft, and Pemalite should still be able to tell the difference between 1440p and 4K at 4 ft.
However, if it melts away for you between 3 and 4 ft from a 50 inch screen, you don't have better than 20/20 vision ;)
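For what it's worth, here's a rough Python sketch of how those distances fall out of the 90 pixels-per-degree figure, assuming 16:9 panels; the threshold itself is just the number quoted above, and real results vary by person and content:

import math

# Distance (in feet) at which a 16:9 panel of a given diagonal reaches
# 90 pixels per degree -- roughly the point beyond which people stop
# picking the higher-resolution picture reliably.

PPD_LIMIT = 90  # pixels per degree (assumed threshold from the post above)
ASPECT = (16, 9)

def max_useful_distance_ft(diagonal_in: float, horizontal_pixels: int) -> float:
    """Distance at which the panel hits PPD_LIMIT pixels per degree."""
    diag_units = math.hypot(*ASPECT)
    width_in = diagonal_in * ASPECT[0] / diag_units
    ppi = horizontal_pixels / width_in
    # One degree of visual angle covers 2 * d * tan(0.5 deg) inches at distance d.
    inches_per_degree_per_inch_of_distance = 2 * math.tan(math.radians(0.5))
    distance_in = PPD_LIMIT / (ppi * inches_per_degree_per_inch_of_distance)
    return distance_in / 12

for diagonal in (50, 43):
    for name, px in (("4K", 3840), ("1440p", 2560), ("1080p", 1920)):
        print(f'{diagonal}" {name}: ~{max_useful_distance_ft(diagonal, px):.1f} ft')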
LegitHyperbole said: Idk how. I still far exceed 20/20 vision and I got out a measuring tape from my 50 inch and used the PS5 menus as the perfect example of detail. If you're saying you notice a clear difference at 4 feet, you're lying to yourself. Between 3 and 4 feet is where it melts away and doesn't matter very much. Between 4 and 5 feet it matters basically to the point of nothing. At my usual 10 feet back there is zero difference.

It's like having a 1440p phone. I had a 6 inch 1440p phone for about 2 years and I "upgraded" to a 1080p phone at 5.2 inches, which most phones are now and for good reason, and that looked clearer and sharper, or at the very least no less sharp than the waste of battery power 1440p was. To that point, 60fps vs 120fps is a noticeable difference if you're paying attention, but it really matters not for daily use; only when you put your intent on it and switch between the two will you actually notice it. Perhaps that is what you are doing: switching between the two in the settings and having your attention on the difference makes it feel like it's a bigger difference than it is.

I would suggest you get someone to randomise the settings while you're not looking, go no closer than 4 feet from your large monitor and see if you can see the difference in an unbiased, controlled manner, and after a few attempts you'll see that you are guessing.

Edit: I will admit, there is a slight difference to rounded edges; looking at some big blocky text, the roundness of the lettering is apparent, but it's not enough to notice if I wasn't switching back and forth between 1080p and 2160p going out of my way to check. |
My vision also exceeds 20/20.
My Television is an 85" OLED.
My computer monitor is a 43" IPS overclocked to 144hz.
At 4-5 feet you can see the difference in aliasing between 1440P and 1080P on both my displays. (Although I need to sit significantly further back with my OLED due to size.)
As for phones... I can tell a slight difference in clarity between 1080P and 1440P on my Galaxy S24 Ultra, but it's not significant; the battery life and system responsiveness gained by rendering at 1080P rather than 1440P on my phone are worth it.
This chart breaks down the benefits of resolutions over distance per panel size in a much more coherent manner than our back and forth unscientific anecdote exchanges.
Basically at 5 feet... 1440P has benefits over 1080P on my 43" panel.
...But to also be fair, 5 feet is an arbitrary number; I sit at about an arm's length and run multiple quadrants... So there are visible gains from 1440P to 2160P... and I would even benefit from 8K.
However, panel resolution is only part of the equation; things like pixel crawl, breakup, aliasing, shimmer and more can present themselves even in "optimal resolution over distance" scenarios.
Rasterization is not always pixel accurate... For example, shadows and other effects are often rendered at a fraction of the output resolution in games to conserve rendering budget.
Textures also have their own independent resolution, with many textures being 16k these days, some even 32k or 64k; the higher those resolutions, the better they look even on a 2k (1080P) display.
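As a toy illustration of that budget trade-off, here's a small Python sketch with hypothetical buffer scales (the fractions are made up for the example, not taken from any particular game or engine):

# Toy example: effect buffers sized as a fraction of the output resolution,
# a common way to spend rendering budget where it's most visible.
# The scales below are hypothetical illustration values only.

OUTPUT = (3840, 2160)  # 4K output frame

BUFFER_SCALES = {
    "main color": 1.0,
    "shadow map": 0.5,
    "volumetrics": 0.25,
    "screen-space reflections": 0.5,
}

for name, scale in BUFFER_SCALES.items():
    w, h = int(OUTPUT[0] * scale), int(OUTPUT[1] * scale)
    share = (w * h) / (OUTPUT[0] * OUTPUT[1]) * 100
    print(f"{name}: {w}x{h} ({share:.0f}% of output pixel count)")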
Anyhow, I have shared my preference for resolution to drive clarity; if you disagree or don't like it... that's really your prerogative. My personal preferences won't change on this matter, I like what I like.
--::{PC Gaming Master Race}::--
To get back on the topic.
Image quality: R&C ToD at 3840x2160
Or graphics: Alan Wake (PS5 Pro) at 1400x787
Anyway, I played more of Hitman 3 on PSVR2 and not having proper reflections is very distracting. It doesn't help that the setting in Paris is a fashion show with lots of mirrors and other reflective surfaces around, which all use a cheap smear trick to reach 60 fps with good image quality. Also, when you watch the crowd you can see the selective lower-frequency updates 'rolling' through it. While the draw distance and resolution clarity are great, it reveals other shortcuts that take you out of the immersion.
Next, the Amalfi Coast is very cool to run around in the town in VR, but it never feels anywhere close to as real as the made-up village in RE: Village. The geometry and objects are less detailed than Uncharted 3, and the lighting is very basic, with exaggerated fake HDR when entering/leaving darker areas. Stable, clean image, but not very immersive.
My edible actually started to kick in during the Amalfi Coast, and where that altered state of mind made RE8 look even more realistic, with the feeling I was really there, Hitman turned more into a cartoon representation. (Also cool to walk through a cartoon, or rather get chased while running around in a hazmat suit after I got spotted and got horribly lost running away lol)
Anyway for stylized games, image quality rules.
For games representing real world (or realistic) locations, graphics need to sell the illusion.
Rise of the Ronin is the perfect example: it isn't a great looking game (not bad, just not really good), but the image quality, aside from some shimmering on hair and slight pop-in, is pretty damn good. I think I'm leaning towards image quality, because if you're looking through a layer of transparent oil you can't see the graphics underneath.
I've been looking at Mario Kart World too and noticed there is a lot of aliasing, which isn't on for a first party title in the modern age; they have Hitman looking as sharp as anything and then you see technical remnants from before the 8th gen. There is no excuse for aliasing anymore, and I hold Atlus to that as well, because if Team Ninja can get rid of the aliasing issues that plagued their games, anyone can.
Image Quality by a mile. It's more important that the game looks clean than that it has all the bells and whistles while being a blurry, noisy mess. In fact, the more detailed your graphics are, the more important a high resolution becomes, so all that detail doesn't turn into a visual mess.
I can play a PS2 game emulated at 1080p no problem, but modern games dropping as low as sub-720p is gross, no matter how much graphical effects you throw at it. Which is why I never touched Doom or Wolfenstein on the Switch.
Having said that there's a limit. I don't need everything to be native 4k at the expense of graphical features either.
Definitely image quality. Something like Half-Life 2 run at a really high resolution is gonna look nicer than a modern game that has aliasing and visual artifacts everywhere ruining the image. Upscaling tech will increasingly help improve image quality over time, since now you can get a game running at 960p upscaled to 1440p that looks significantly nicer than a native 1080p image. Eventually it'll get good enough that even low-end hardware will be able to handle 4K decently, even if the internal resolution is really low.
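To put rough numbers on that 960p-to-1440p scenario, here's a quick Python sketch of the pixel counts involved (simple arithmetic only; the actual result depends entirely on the upscaler):

# Pixel-count comparison for the upscaling scenario above (16:9 frames).
# Rough arithmetic only; image quality depends on the reconstruction, not just counts.

def pixels(height: int, aspect: float = 16 / 9) -> int:
    """Total pixel count of a 16:9 frame with the given height."""
    return int(round(height * aspect)) * height

internal = pixels(960)      # what the GPU actually renders
output = pixels(1440)       # what the upscaler reconstructs to
native_1080 = pixels(1080)  # the comparison point

print(f"960p internal:  {internal:,} px")
print(f"1440p output:   {output:,} px ({internal / output:.0%} actually rendered)")
print(f"1080p native:   {native_1080:,} px ({internal / native_1080:.0%} of that rendered at 960p)")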