Eagle367 said:

I mean, those are all theoretical, on-paper values. But look at the law of diminishing returns. How much more effective is 4k vs 12k vs 30k? Or how much of a difference does one notice between 60fps vs 120 vs 240 vs 480? In practical terms, is there an advantage to going in that direction, or are we gonna go just for the sake of going?

It all depends on the size of the screen you want to play on, or rather how much of your fov it fills up. VR is the extreme case, as it tries to fill your entire fov, which is up to 150 degrees per eye. Sitting 8ft away from a 150" screen (projector range for now, but screens keep getting bigger as well) you have a fov of about 76 degrees. Atm 37 degrees is more common: 8ft away from a 65" tv.
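
To put rough numbers on that, here's a quick sketch of where those figures come from (assuming a flat screen viewed head-on and taking the diagonal as the viewing angle, which is what the numbers above work out to):

```python
import math

def diagonal_fov_degrees(diagonal_inches, distance_inches):
    """Angle subtended by a screen's diagonal when viewed head-on from its centre."""
    half_angle = math.atan((diagonal_inches / 2) / distance_inches)
    return math.degrees(2 * half_angle)

# 8 ft = 96 inches viewing distance
print(diagonal_fov_degrees(150, 96))  # ~76 degrees for a 150" screen
print(diagonal_fov_degrees(65, 96))   # ~37 degrees for a 65" tv
```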

Recommended sitting distance for 20/20 vision is based on 60 pixels per degree. 8ft from a 65" tv puts you at needing at least 2,220 pixels horizontally, thus higher than 1080p but lower than 4K. However, you can easily see improvements at up to double that, 120 pixels per degree, since rows of square pixels are more easily detected than an analog picture. So you might even get some benefit from 8K at that distance and size, and/or still need good AA.
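
Same back-of-the-envelope approach for resolution: this just multiplies the viewing angle from the sketch above by the angular-resolution targets (the 60 and 120 pixels-per-degree figures are the assumptions here), which is how the 2,220 number falls out:

```python
def required_pixels(fov_degrees, pixels_per_degree):
    """Pixel count needed across a viewing angle to hit a given angular resolution."""
    return fov_degrees * pixels_per_degree

fov = 37  # ~37 degrees: 65" tv from 8 ft, per the estimate above

print(required_pixels(fov, 60))   # 2220 px: more than 1080p, less than 4K
print(required_pixels(fov, 120))  # 4440 px: beyond 4K, into 8K territory
```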

For fps other things come into play. The human eye tracks moving objects to get a clear picture; on a screen, however, objects move in discrete steps, which makes them hard to follow. The solution for this jarring effect so far has been to apply motion blur. It doesn't make anything sharper and it's not how the human eye works, but it's easier to watch a blurry streak than a stuttering object crossing the screen. To make it like real life, moving objects need to make smaller steps more often. Ideally they only move 1 pixel per step. Depending on how fast the object moves and the resolution of the screen, the required fps for a moving object can go way past 1000 fps.
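
A hypothetical example of how quickly that escalates (the object speed and screen width here are made-up illustrative numbers):

```python
def fps_for_one_pixel_steps(speed_pixels_per_second):
    """Frame rate at which a moving object advances at most 1 pixel per frame."""
    return speed_pixels_per_second

def pixels_per_frame(speed_pixels_per_second, fps):
    """How far the object jumps between consecutive frames at a given frame rate."""
    return speed_pixels_per_second / fps

# Hypothetical: an object crossing a 3840-pixel-wide screen in 2 seconds
speed = 3840 / 2  # 1920 pixels per second

print(fps_for_one_pixel_steps(speed))  # 1920 fps for perfectly smooth 1-pixel steps
print(pixels_per_frame(speed, 120))    # 16-pixel jumps at 120 fps
print(pixels_per_frame(speed, 60))     # 32-pixel jumps at 60 fps
```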

Of course there are also limits to what the human eye can track. When you're in a moving car, the road further away is perfectly sharp, while closer to you there's first an area where you get flashes of sharp 'texture' (where the eye temporarily tracks or scans the road) and then an area where it's all a motion blur. Per-object frame rate is something for the future, to replace per-object motion blur so the eye can work naturally. (Actually it's nothing new: the mouse pointer animating independently of a 30 fps video playing underneath is common practice, as are distant animations running at reduced speeds.) Anyway, the bigger the screen, the worse the effects of lower frame rates, or the higher the fps you need to present a stable moving picture. For cinema there have always been rules not to exceed certain panning speeds, so the viewer can still make some sense of the stuttering presentation.
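
A rough illustration of why screen size matters for this (the pan speed is made up, the 37/76 degree figures are the viewing-angle estimates from earlier, and this only looks at the angular jump between frames at the viewer's eye):

```python
def angular_jump_per_frame(pan_screen_widths_per_second, screen_fov_degrees, fps):
    """Angular step (in degrees, at the viewer's eye) a panning shot makes per frame."""
    degrees_per_second = pan_screen_widths_per_second * screen_fov_degrees
    return degrees_per_second / fps

# Hypothetical pan: the camera sweeps one full screen width in 4 seconds
pan = 1 / 4  # screen widths per second

print(angular_jump_per_frame(pan, 37, 24))   # ~0.39 deg/frame: 65" tv from 8 ft, at 24 fps
print(angular_jump_per_frame(pan, 76, 24))   # ~0.79 deg/frame: 150" screen, same pan, twice as jarring
print(angular_jump_per_frame(pan, 76, 120))  # ~0.16 deg/frame: higher fps brings it back down
```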

So yes, there are diminishing returns when it comes to resolution and fps. Or put the other way around: achieving the last 10% of 'realism' costs more than the first 90%.

Pemalite said:

DonFerrari said:

Ray tracing on PS5 is done through a dedicated chip, so it shouldn't impact the rest much.

Not a dedicated chip. It only has the one chip, the main SoC.
The Ray Tracing is done on dedicated Ray Tracing cores on that main chip; it's part of the GPU... Which is why Flops is a joke, as Flops doesn't account for the Ray Tracing capabilities.

SvennoJ said:

There is no such thing as a ray tracing chip. There are many ways to do ray tracing, some faster, some with better quality. Some dedicated hardware can help but it's not magic. RTX cards still struggle with ray tracing. It will impact the rest, dedicated chip or not.

Not entirely accurate.
Historically, what the industry has done is build a dedicated DSP/ASIC/FPGA as a separate chip (aka a Ray Processing Unit) that handled Ray Tracing duties. Granted, this was for more professional markets... but the point remains.
Dedicated Ray Tracing chips have existed even as far back as the late 90's/early 2000's.

https://en.wikipedia.org/wiki/Ray-tracing_hardware

What I was hinting at is that there are more ways to do ray tracing, and dedicated hardware can help or it can hinder innovation. Or rather, there is no simple switch that adds ray tracing to a game just by turning the chip on. Plenty of other things need to be done (which will slow down the rest) to make the best use of the ray tracing cores. But it will help. Software-only ray tracing would severely restrict the resolution to make it feasible (or it would need a lot of shortcuts, making it far less impressive).
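
To give a sense of scale for that last point, here's a very rough, hypothetical ray budget. The 50 million rays/s software figure is an assumption purely for illustration, and this only counts one primary ray per pixel, with no bounces or shadow rays:

```python
def rays_per_second(width, height, samples_per_pixel, fps):
    """Primary-ray budget needed for a resolution, sample count and frame rate."""
    return width * height * samples_per_pixel * fps

# Assumption for illustration only: a software (CPU-only) tracer managing ~50 million rays/s
software_budget = 50e6

for w, h in [(3840, 2160), (1920, 1080), (640, 360)]:
    need = rays_per_second(w, h, 1, 60)
    print(f"{w}x{h} @ 60 fps, 1 sample/px: {need / 1e6:.0f}M rays/s "
          f"({need / software_budget:.1f}x the assumed software budget)")
```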

Anyway as long as I can still easily tell the difference between my window and the tv, not there yet :)
