teigaga said:
Ermm... Ok. Until you care to elaborate in a meaningful response, I'm sticking with the widely acknowledged notion that higher resolutions are a GPU drain, hence PS4 Pro having 4.2 TFLOPS versus 1.8, and rendering Horizon at 2x1080p instead of native 4k. Hence why several games utilising Pro have improved graphics at 1080p whilst having lower graphical settings in 4k (which isn't even native 4k)
Fine. The response is as follows.
What if a game on the PS4 were built entirely around FP16 or double precision? The FLOPS figure you quote wouldn't be an accurate representation, because that isn't the precision the game is actually using.
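As a back-of-the-envelope illustration (a minimal sketch; the shader counts and clocks are the commonly cited PS4 / PS4 Pro figures, while the FP16 and FP64 rate multipliers are my assumptions for GCN-era parts):

```python
# Theoretical throughput depends on which precision the code actually uses.
# Shader counts/clocks are the commonly cited PS4 / PS4 Pro figures;
# the double-rate FP16 and 1/16-rate FP64 multipliers are assumptions.
def tflops(shaders, clock_ghz, ops_per_clock=2):  # 2 ops/clock = one FMA
    return shaders * clock_ghz * ops_per_clock / 1000.0

ps4_fp32 = tflops(1152, 0.800)   # ~1.84 TFLOPS
pro_fp32 = tflops(2304, 0.911)   # ~4.20 TFLOPS
pro_fp16 = pro_fp32 * 2          # assumed double-rate FP16
pro_fp64 = pro_fp32 / 16         # assumed 1/16-rate FP64

print(f"PS4 FP32 {ps4_fp32:.2f}, Pro FP32 {pro_fp32:.2f}, "
      f"Pro FP16 {pro_fp16:.2f}, Pro FP64 {pro_fp64:.2f} TFLOPS")
```

The headline number swings by more than an order of magnitude depending on precision, so "1.8 versus 4.2" only describes one specific case.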
Not to mention that you excluded the Render Output Pipelines, Texture Mapping Units, Geometry units, Caches, various fixed-function units, Bandwidth and Memory amount... They all play a role in running a game at a higher resolution, you know.
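Just to illustrate the bandwidth side of it (a toy calculation; the bytes-per-pixel, overdraw and frame-rate figures are made-up round numbers, not measurements):

```python
# Toy estimate of per-frame framebuffer traffic; it scales with pixel count
# no matter how many FLOPS the shader cores have. All constants are assumed.
def framebuffer_traffic_gbps(width, height, bytes_per_pixel=16, overdraw=3, fps=30):
    return width * height * bytes_per_pixel * overdraw * fps / 1e9

print(f"1080p: ~{framebuffer_traffic_gbps(1920, 1080):.1f} GB/s")
print(f"4K:    ~{framebuffer_traffic_gbps(3840, 2160):.1f} GB/s")  # 4x the pixels
```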
Not to mention that AMD and nVidia engineer their GPUs so that the performance hit from running at a higher resolution tends not to scale linearly, so you cannot just double your FLOPS and expect double the resolution.
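One crude way to picture it: only part of a frame's cost is per-pixel shading; geometry, command submission and other fixed-function work doesn't grow with resolution. A toy model (the 4 ms / 12 ms split is an assumption, purely for illustration):

```python
# Toy frame-time model: fixed (resolution-independent) cost + per-pixel cost.
def frame_time_ms(pixels, fixed_ms=4.0, per_pixel_ms_at_1080p=12.0):
    return fixed_ms + per_pixel_ms_at_1080p * pixels / (1920 * 1080)

print(frame_time_ms(1920 * 1080))   # 16 ms at 1080p
print(frame_time_ms(3840 * 2160))   # 52 ms at 4K: 4x the pixels, but not 4x the time
```

So the relationship between raw FLOPS and the resolution you can hit is not a simple ratio.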
So I shall ask again: can you explain to me how your FLOPS figure is somehow tied to a specific resolution?

www.youtube.com/@Pemalite