Eagle367 said:
SvennoJ said:

It all depends on the size of the screen you want to play on, or rather how much of your fov it fills. VR is the largest, as it tries to fill your entire fov, which is up to 150 degrees per eye. Sitting 8ft away from a 150" screen (projector territory for now, but screens keep getting bigger as well) you have a fov of about 76 degrees. Atm 37 degrees is more common, 8ft away from a 65" tv.
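Here's a quick Python sketch of where those figures come from (assuming the fov is measured across the screen diagonal, which is what reproduces the numbers above):

```python
import math

def diagonal_fov_deg(diagonal_in: float, distance_in: float) -> float:
    """Degrees of fov subtended by a flat screen's diagonal, viewed head-on."""
    return math.degrees(2 * math.atan((diagonal_in / 2) / distance_in))

print(round(diagonal_fov_deg(150, 8 * 12)))  # 150" screen at 8 ft -> ~76 degrees
print(round(diagonal_fov_deg(65, 8 * 12)))   # 65" tv at 8 ft -> ~37 degrees
```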

Recommended sitting distance for 20/20 vision is based on 60 pixels per degree. 8ft from a 65" tv puts you at needing at least 2,220 pixels horizontally, thus higher than 1080p but lower than 4K. However, you can easily see improvements up to double that, 120 pixels per degree, since rows of square pixels are more easily detected than analog pictures. So you might even get some benefit from 8K at that distance and size, and/or still need good AA.
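The arithmetic is simple: required horizontal pixels = fov in degrees × pixels per degree. A minimal sketch using the ~37 degree figure from above:

```python
fov_deg = 37  # 8 ft from a 65" tv, from the calculation above
for ppd in (60, 120):  # 20/20 threshold, and the ~2x limit for spotting pixel rows
    print(f"{ppd} ppd -> {fov_deg * ppd} pixels wide")
# 60 ppd  -> 2220 pixels: above 1080p (1920), below 4K (3840)
# 120 ppd -> 4440 pixels: beyond 4K, within 8K (7680)
```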

For fps, other things come into play. The human eye tracks moving objects to get a clear picture; on a screen, however, objects move in discrete steps, which makes them hard to follow. The solution for this jarring effect so far has been to apply motion blur. It doesn't make anything sharper, and it's not how the human eye works, but it's easier to watch a blurry streak than a stuttering object crossing the screen. To make it like real life, moving objects need to make smaller steps more often, ideally moving only 1 pixel per step. Depending on how fast the object moves and the resolution of the screen, the required fps for a moving object can go way past 1000.
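A hypothetical worked example (my numbers, just to show the scale):

```python
# An object crossing a 4K screen (3840 px wide) in 2 seconds.
screen_width_px = 3840
crossing_time_s = 2.0
speed_px_per_s = screen_width_px / crossing_time_s  # 1920 px per second

# For steps of at most 1 pixel per frame, fps must match the pixel speed.
required_fps = speed_px_per_s
print(required_fps)  # 1920 fps; faster pans or higher resolutions push it higher
```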

Of course there are also limits to what the human eye can track. In a moving car, the road further away looks perfectly sharp, while closer in there's an area where you get flashes of sharp 'texture' (where the eye temporarily tracks or scans the road), and closer still it's all motion blur. Per-object frame rate is something for the future, replacing per-object motion blur so the eye can work naturally. (Actually it's nothing new: the mouse pointer animating independently from a 30 fps video playing underneath is common practice, as is running distant animations at reduced speeds.) Anyway, the bigger the screen, the worse the effects of lower frame rates, or the higher the fps you need to present a stable moving picture. Cinema has always had rules not to exceed certain panning speeds, so viewers can still make some sense of the stuttering presentation.
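A minimal sketch of the per-object idea (hypothetical objects and rates, not any real engine): each object refreshes on its own interval instead of at one global frame rate.

```python
import dataclasses

@dataclasses.dataclass
class Obj:
    name: str
    fps: float           # this object's own refresh rate
    next_update: float = 0.0

def tick(objects, t):
    # Update only the objects whose own interval has elapsed.
    for o in objects:
        if t >= o.next_update:
            o.next_update = t + 1.0 / o.fps
            print(f"t={t:.3f}s update {o.name}")

objects = [Obj("cursor", 240), Obj("video", 30), Obj("distant_flag", 10)]
t = 0.0
while t < 0.05:          # simulate 50 ms
    tick(objects, t)
    t += 1 / 1000        # 1 kHz master clock
```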

So yes, there are diminishing returns when it comes to resolution and fps. Or put the other way around: achieving the last 10% of 'realism' costs more than the first 90%.

What I was hinting at is that there are more ways to do ray tracing, and dedicated hardware can help or hinder innovation. Or rather, there is no simple switch that adds ray tracing to a game just by turning the chip on. Plenty of other things need to be done (which will slow down the rest) to make the best use of the ray tracing cores. But it will help. Software-only ray tracing would severely restrict the resolution to stay feasible (or need a lot of shortcuts, making it far less impressive).
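Some back-of-the-envelope numbers on why resolution is the lever (hypothetical: 1 primary ray per pixel at 30 fps, ignoring bounces and shadow rays, which multiply everything further):

```python
def rays_per_second(width: int, height: int, rays_per_pixel: int, fps: int) -> int:
    # Primary ray count scales linearly with pixel count.
    return width * height * rays_per_pixel * fps

for w, h, label in ((3840, 2160, "4K"), (1920, 1080, "1080p"), (1280, 720, "720p")):
    print(label, f"{rays_per_second(w, h, 1, 30):,} rays/s")
# 4K    -> 248,832,000 rays/s
# 1080p ->  62,208,000 rays/s
# 720p  ->  27,648,000 rays/s
```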

Anyway, as long as I can still easily tell the difference between my window and the tv, we're not there yet :)

Interesting read. I guess I'm just a minimalist and don't care that much about the last 10%. I mean, I prefer Switch over all other consoles for a reason.

If you currently don't care about the difference in IQ between X1X level and Switch, then yep, we're at a point where any improvement in IQ is unnecessary for you.


