HoloDust said:
drkohler said:
Well, the DevUnit has 40 CUs (fully enabled) at 2GHz. The PS5 has 36 CUs (yield!) at 40/36 * 2GHz ≈ 2.22GHz.
You might get the idea they jacked up the clock in a last-ditch effort to get to 10.3TFlops after seeing XSX at 12TFlops. On the other hand, the idea of continuously running the console at the thermal limit probably takes years of planning.
|
Is there any info on TMUs/ROPs? For either console?
|
I think it's safe to assume both will have 64 ROPs. They're not hungry/powerful enough to require more.
TMU count depends on the CU counts of both.
Ex:
Xbox Series X = 52 CUs (x64 shaders per CU) = 3328 shaders = 208 TMUs? (16 shaders per TMU) + 64 ROPs (these are just a fixed number)
While for a PlayStation 5, that would look like:
PlayStation 5 = 36 CUs (x64 shaders per CU) = 2304 shaders = 144 TMUs? (16 shaders per TMU) + 64 ROPs (these are just a fixed number)
^ this is just an educated guess (I could be wrong)
How this relates to "Pixel Rate" or "Texture Rate" then becomes an issue of math, using the clock speeds both GPUs are running at.
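To show the arithmetic behind that guess, here's a quick sketch. It assumes the RDNA-style layout of 64 shaders per CU and 16 shaders per TMU (4 TMUs per CU) — the same assumption as the numbers above, not a confirmed spec:

```python
# Estimate shader and TMU counts from a CU count.
# Assumption: 64 shaders per CU, 16 shaders per TMU (RDNA-style layout).
SHADERS_PER_CU = 64
SHADERS_PER_TMU = 16

def shader_and_tmu_count(cus):
    shaders = cus * SHADERS_PER_CU
    tmus = shaders // SHADERS_PER_TMU
    return shaders, tmus

print(shader_and_tmu_count(52))  # Xbox Series X: (3328, 208)
print(shader_and_tmu_count(36))  # PlayStation 5: (2304, 144)
```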
-------------------
Texture Filter Rate = Core Clock * TMUs.
"In the case of the GTX 980's GM204 chip, that would be 128 TMUs * 1126 = 144128. Note that the 1126 clock speed is measured as MHz, or millions of oscillations per second, so that'd actually be 128 * 1126MHz = 144.1GT/s; in other numbers, 128 * 1126 * (1000/s) = 144.1GT/s."
Xbox Series X: (using 208 TMUs) = 208 x 1825 = 379,600 (379.6 GT/s)
PlayStation 5: (using 144 TMUs) = 144 x 2230 = 321,120 (321.1 GT/s)
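The same math as a one-liner, using the TMU guesses above and the announced clocks in MHz (dividing by 1000 converts MT/s to GT/s):

```python
# Texture filter rate = TMUs * core clock (MHz), converted MT/s -> GT/s.
def texture_rate_gts(tmus, clock_mhz):
    return tmus * clock_mhz / 1000

print(texture_rate_gts(128, 1126))  # GTX 980: ~144.1 GT/s
print(texture_rate_gts(208, 1825))  # Xbox Series X: ~379.6 GT/s
print(texture_rate_gts(144, 2230))  # PlayStation 5: ~321.1 GT/s
```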
Pixel Rate is more or less the same: it's just ROPs x core clock.
So PlayStation 5 should have an advantage in Pixel Rate as I understand it, but a lower Texture Filter Rate.
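Plugging in the assumed 64 ROPs for both consoles shows where that pixel-rate advantage comes from (clocks in MHz, result in GP/s):

```python
# Pixel rate = ROPs * core clock (MHz), converted MP/s -> GP/s.
def pixel_rate_gps(rops, clock_mhz):
    return rops * clock_mhz / 1000

print(pixel_rate_gps(64, 1825))  # Xbox Series X: ~116.8 GP/s
print(pixel_rate_gps(64, 2230))  # PlayStation 5: ~142.7 GP/s
```

With equal ROP counts, the higher PS5 clock decides the pixel rate outright.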
The rest is the TFlops numbers and memory bandwidth.
The PlayStation 5 is a slower GPU part than the Series X.
Last edited by JRPGfan - on 18 March 2020