Conina said:
960x540 is only one fourth (1/4) of 1920x1080. With perfect scaling (so if CPU, RAM, bandwidth... aren't bottlenecks) you would only need a 2.3 TFLOP GPU (1/4 of 9.2 TFLOPs), not 4 TFLOPs (same architecture and everything else the same). Since scaling ain't perfect, you can expect factors of 3-5 when you reduce the resolution to 25%. Let's have a look at your own chart:
3.6x - 3.8x the performance by reducing the resolution to 25%, as expected. Yeah, you still suck at math.
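(A quick back-of-envelope sketch of the arithmetic in the quoted post, just for illustration; it assumes the 9.2 TFLOP figure and the 3.6x-3.8x gains quoted from the chart, nothing else:)

```python
# Pixel counts at full and quarter resolution.
full_res    = 1920 * 1080   # 2,073,600 pixels
quarter_res = 960 * 540     #   518,400 pixels

pixel_ratio = quarter_res / full_res   # 0.25, i.e. exactly one fourth
perfect_tflops = 9.2 * pixel_ratio     # 2.3 TFLOPs under perfect scaling

# Scaling isn't perfect in practice, so the observed speed-up from dropping
# to quarter resolution is less than 4x (3.6x-3.8x in the chart above).
for observed_gain in (3.6, 3.8):
    needed = 9.2 / observed_gain
    print(f"gain {observed_gain}x -> ~{needed:.2f} TFLOPs needed at quarter res")

print(f"pixel ratio: {pixel_ratio}, perfect-scaling TFLOPs: {perfect_tflops}")
```

Even with the imperfect 3.6x-3.8x factors, that works out to roughly 2.4-2.6 TFLOPs, still well under 4.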
Yes, if games were native 4K on PS5, they would be 1080p on Series S. I'm pretty sure I told you that already, and anyone who knows anything about game design will also tell you that native 4K is not gonna be a target for next-gen console games, as it's way too expensive. My guess is that 1440p and 1080p will be standard, as that looks good enough on a TV and it frees up a ton of resources that developers can use elsewhere.
So, for argument's sake, let's say a game is 1080p/30fps on PS5. At what kind of resolution do you think it would have to run on Series S to hit that same 30 fps?
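For what it's worth, here's a minimal sketch of that question, assuming performance scales roughly linearly with pixel count (a simplification) and reusing the 4 TFLOP / 9.2 TFLOP figures from this thread; real-world scaling is worse, as the chart above shows:

```python
import math

PS5_TFLOPS      = 9.2          # figure used in the quoted post
SERIES_S_TFLOPS = 4.0
PS5_PIXELS      = 1920 * 1080  # the hypothetical 1080p/30fps target on PS5

# Under (approximately) linear scaling, the Series S can push this many
# pixels at the same frame rate:
target_pixels = PS5_PIXELS * (SERIES_S_TFLOPS / PS5_TFLOPS)

# Convert the pixel budget back into a 16:9 resolution.
height = math.sqrt(target_pixels * 9 / 16)
width  = height * 16 / 9
print(f"~{int(width)}x{int(height)} (~{int(target_pixels):,} pixels)")
# -> roughly 1266x712, i.e. somewhere around 720p before accounting for
#    the imperfect scaling discussed above.
```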









