LegitHyperbole said:
Idk how. I still far exceed 20/20 vision, and I got out a measuring tape from my 50 inch and used the PS5 menus as the perfect example of detail. If you're saying you notice a clear difference at 4 feet, you're lying to yourself. Between 3 and 4 feet is where it melts away and stops mattering very much. Between 4 and 5 feet it matters basically not at all. At my usual 10 feet back there is zero difference.

It's like having a 1440p phone. I had a 6 inch 1440p phone for about 2 years and "upgraded" to a 1080p phone at 5.2 inches, which is what most phones are now and for good reason, and it looked clearer and sharper, or at the very least no less sharp than the battery-wasting 1440p. To that point, 60fps vs 120fps is a noticeable difference if you're paying attention, but it hardly matters for daily use; only when you focus on it and switch between the two will you actually notice it. Perhaps that is what you are doing: switching between the two in the settings with your attention on the difference makes it feel like a bigger difference than it is. I would suggest you get someone to randomise the settings while you're not looking, stand no closer than 4 feet from your large monitor, and see if you can spot the difference in an unbiased, controlled manner; after a few attempts you'll see that you are guessing.

Edit: I will admit there is a slight difference in rounded edges. Looking at some big blocky text, the roundness of the lettering is apparent, but it's not enough to notice if I weren't switching back and forth between 1080p and 2160p going out of my way to check.
The big difference between 4 ft from a big monitor and 10 ft from a larger TV is that you can easily lean in to a monitor to get a closer look at details.
Leaning in a foot from 4 feet gets you a quarter of the way closer; leaning in a foot from 10 feet only gets you a tenth of the way closer.
Higher resolution also reduces aliasing, just like supersampling: rendering at 1440p and displaying on a 1080p TV looks better than rendering at native resolution.
So if you're sensitive to aliasing, then yes, higher resolution trumps everything. (Rough sketch of the idea below.)
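If it helps to see that concretely, here's a minimal Python/NumPy sketch of supersampling. It's purely illustrative (my own toy renderer, not how any GPU or console does it), and it uses a clean 2x factor instead of the fractional 1440p-to-1080p case so the downscale is a simple block average:

```python
import numpy as np

def render_scene(width, height):
    """Toy 'renderer': a hard-edged diagonal stripe pattern.
    Each pixel is either fully lit (1.0) or not (0.0), so edges alias."""
    ys, xs = np.mgrid[0:height, 0:width]
    return ((xs * 0.37 + ys * 0.61) % 20 < 3).astype(np.float64)

def supersample(width, height, factor=2):
    """Render at factor x the target resolution, then box-filter down.
    Averaging each factor*factor block gives edge pixels in-between
    values, which is the anti-aliasing effect described above."""
    hi = render_scene(width * factor, height * factor)
    return hi.reshape(height, factor, width, factor).mean(axis=(1, 3))

native = render_scene(1920, 1080)         # aliased: pixels are only 0.0 or 1.0
ssaa = supersample(1920, 1080, factor=2)  # edge pixels become shades of grey
print("distinct values, native:", np.unique(native).size)  # 2
print("distinct values, SSAA:  ", np.unique(ssaa).size)    # > 2
```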
The monitor also matters: 1080p on a 1440p monitor looks like crap. Downsampling makes things look better, more natural, with less aliasing; upscaling always makes things look worse.
If it's a 4K TV/monitor, it gets kinda iffy. 1080p simply doubles to 4K, while 1440p smears every pixel over 1.5 pixels. So 1080p on 4K will look 'blocky' (a clean 1:2 upscale), while 1440p looks more blurry. Again, if you can't stand aliasing, the blur from a 1:1.5 upscale can look nicer, since it's basically another smoothing pass.
1080p on a 1440p monitor is a 1:1.333 upscale, the worst of the lot.
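For anyone who wants those ratios spelled out, here's a throwaway Python snippet (purely my own illustration of the arithmetic, nothing to do with how any real TV scaler works):

```python
# Scale ratios between the common resolutions discussed above.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def upscale_ratio(source, target):
    """How many target pixels each source pixel gets stretched over."""
    sw, sh = resolutions[source]
    tw, th = resolutions[target]
    return tw / sw, th / sh

for src, dst in [("1080p", "4K"), ("1440p", "4K"), ("1080p", "1440p")]:
    rx, ry = upscale_ratio(src, dst)
    kind = "clean integer upscale" if rx.is_integer() else "fractional (smeary) upscale"
    print(f"{src} -> {dst}: {rx:.3f} x {ry:.3f}  ({kind})")

# 1080p -> 4K:    2.000 x 2.000  (clean integer upscale)
# 1440p -> 4K:    1.500 x 1.500  (fractional (smeary) upscale)
# 1080p -> 1440p: 1.333 x 1.333  (fractional (smeary) upscale)
```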
Too bad CRT hit a dead end; it was much better tech for making all kinds of different resolutions and refresh rates look good. Modern displays kinda demand to be fed their native resolution, and anything else looks bad.
As for guessing, experiments have found that people can guess 'correctly' up to about 90 pixels per degree (as in, more often right than wrong when asked to pick the better looking picture).
For a 50 inch TV, that means you should be able to guess correctly up to 4.9 ft from the screen for 4K, 7.3 ft for 1440p, and 9.8 ft for 1080p.
For a large 43 inch monitor, it's up to 4.2 ft for 4K, 6.3 ft for 1440p, and 8.4 ft for 1080p. (The little calculation behind those numbers is sketched below.)
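Here's the back-of-envelope version of that calculation in Python, in case anyone wants to plug in their own screen. It assumes a 16:9 panel and uses my own small-angle approximation, so treat the exact figures as rough:

```python
import math

def threshold_distance_ft(diagonal_in, horizontal_px, ppd_limit=90.0):
    """Distance (feet) at which a 16:9 screen reaches ppd_limit pixels per
    degree. Closer than this, picking the sharper image beats a coin flip;
    farther away, the extra resolution stops being visible."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # panel width from diagonal
    pixel_pitch = width_in / horizontal_px           # inches per pixel
    return ppd_limit * pixel_pitch / math.tan(math.radians(1.0)) / 12.0

for diag in (50, 43):
    for name, px in (("4K", 3840), ("1440p", 2560), ("1080p", 1920)):
        print(f'{diag}" {name:>5}: {threshold_distance_ft(diag, px):.1f} ft')

# Prints roughly: 50" -> 4K 4.9 ft, 1440p 7.3 ft, 1080p 9.8 ft
#                 43" -> 4K 4.2 ft, 1440p 6.3 ft, 1080p 8.4 ft
```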
So you're both correct: you can't see any difference from 10 ft, and Pemalite should still be able to tell the difference between 1440p and 4K at 4 ft.
However, if it melts away for you between 3 and 4 ft from a 50 inch, you don't have better than 20/20 vision ;)