
Forums - Sony Discussion - Why we cannot compare PS5 and Xbox Series X directly. EDIT: added Resident Evil 3 Remake and DOOM Eternal, which run better on PS4 Pro, as examples; also added an ex-Crytek developer and programmer's testimony

DonFerrari said:

Mark Cerny and Digital Foundry explained it; it isn't a temporary boost. It is the regular use of the GPU and CPU: either of them can be kept at maximum frequency all the time. Sure, he doesn't say both would be at maximum at the same time, and that is where the change in frequency comes in, with one going down (a few %) so the other can go to maximum.

Not only are both of you satisfied, but you should also expect a shift in how games are made once the SSD advantages are leveraged.

Sony specifically mentioned AMD's SmartShift technology.
I have a notebook that leverages similar technology.

Basically, if the demand on the CPU or GPU is lower, then the other can clock up to its maximum, as it has the TDP available.

It cannot maintain both the CPU and GPU at maximum clocks indefinitely; otherwise it's not a "boost mode" at all and Sony shouldn't have even bothered to mention it.
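
As a rough sketch of how a shared power budget like this behaves (a toy model only; the wattage figures and the cubic power/frequency relation are my assumptions, not Sony's or AMD's actual algorithm):

```python
# Toy model of SmartShift-style power sharing between CPU and GPU.
# All wattage numbers are illustrative assumptions, not PS5 specs;
# the 3.5/2.23 GHz caps are the figures Cerny gave.
TOTAL_BUDGET_W = 200.0
CPU_MAX_GHZ, GPU_MAX_GHZ = 3.5, 2.23

def clocks(cpu_demand_w: float, gpu_demand_w: float):
    """Both units run at their caps while combined demand fits the
    budget; past that, clocks are trimmed to pull power back down."""
    total = cpu_demand_w + gpu_demand_w
    if total <= TOTAL_BUDGET_W:
        return CPU_MAX_GHZ, GPU_MAX_GHZ
    # Power rises super-linearly with frequency (~f^3 assumed here),
    # so shedding 10% of power costs only a few percent of clock.
    factor = (TOTAL_BUDGET_W / total) ** (1 / 3)
    return CPU_MAX_GHZ * factor, GPU_MAX_GHZ * factor

print(clocks(60, 120))  # under budget -> (3.5, 2.23), both at max
print(clocks(90, 140))  # over budget -> roughly (3.34, 2.13)
```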



--::{PC Gaming Master Race}::--

HollyGamer said:

OK guys, another example is the Resident Evil 3 remake. This game runs at a slightly higher resolution but worse frame rates on One X, while it runs at almost 60 fps on PS4 Pro at a lower resolution than the One X. Remember, on paper the One X has a 45% advantage in FLOPS (plus the One X has a better memory setup and UHD 4K support). This just shows that in reality both machines have trade-offs: some aspects can run better and some can run worse; each has pluses and minuses. Comparing PS5 and Series X will be even harder, even impossible, as they are far closer than PS4 and Xbox One (41%), and with both able to produce native 4K and run VRS, I believe both will run the same.

All this shows is that RE3 is poorly optimized for Scorpio. 

You can give a dev all the power in the world, it won’t make up for bad development.
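
For context on the raw numbers in dispute (6.0 and 4.2 TFLOPS are the commonly cited specs for these two consoles; the percentage arithmetic is mine):

```python
# Paper FLOPS gap between Xbox One X and PS4 Pro.
one_x_tf, ps4_pro_tf = 6.0, 4.2
gap_pct = (one_x_tf / ps4_pro_tf - 1) * 100
print(f"One X paper advantage: {gap_pct:.1f}%")  # ~42.9%, near the ~45% cited
```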



LudicrousSpeed said:
HollyGamer said:

OK guys, another example is the Resident Evil 3 remake. This game runs at a slightly higher resolution but worse frame rates on One X, while it runs at almost 60 fps on PS4 Pro at a lower resolution than the One X. Remember, on paper the One X has a 45% advantage in FLOPS (plus the One X has a better memory setup and UHD 4K support). This just shows that in reality both machines have trade-offs: some aspects can run better and some can run worse; each has pluses and minuses. Comparing PS5 and Series X will be even harder, even impossible, as they are far closer than PS4 and Xbox One (41%), and with both able to produce native 4K and run VRS, I believe both will run the same.

All this shows is that RE3 is poorly optimized for Scorpio. 

You can give a dev all the power in the world, it won’t make up for bad development.

Not at all. The Xbox One X runs the DirectX API, the easiest and best-known API on PC. These are multiplatform games that are also coming to PC, so it's not rocket science. Another game I forgot to include is Sekiro, which has the same problem as RE3. PS5 and Xbox Series X are just 15% apart in TF numbers, while some other specs have their own pluses and minuses. On top of that, both now run at full 4K and 60 fps with HDMI 2.1 VRR (variable refresh rate). I dare say direct comparison will be even more pointless.
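
To ground that percentage, peak FP32 throughput follows from compute-unit count and clock (the CU counts and clocks below are the publicly announced figures; the formula is the standard GCN/RDNA one):

```python
# Peak FP32 TFLOPS = CUs * 64 ALUs * 2 ops per clock * clock in GHz / 1000
def tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000

xsx = tflops(52, 1.825)  # Xbox Series X: ~12.15 TF
ps5 = tflops(36, 2.23)   # PS5 at its frequency cap: ~10.28 TF
print(f"PS5 sits {(1 - ps5 / xsx) * 100:.0f}% below XSX")  # ~15%
```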



If you think a game running on more powerful hardware at almost half the frame rate is anything other than poor optimization, idk what to tell you 😆



LudicrousSpeed said:
If you think a game running on more powerful hardware at almost half the frame rate is anything other than poor optimization, idk what to tell you 😆

More powerful in TF, yes (and that is also just 42%), but PS4 Pro has Rapid Packed Math while One X does not. Graphics are not just TF; there is more to it than that. Even Digital Foundry said the PS4 Pro lacks bandwidth compared to the One X, and on top of that the One X has 4 GB more RAM. That is the Pro's biggest disadvantage versus the One X. PS5 will not have the same problem, because even though the Series X reaches 560 GB/s, part of its RAM runs slower at 336 GB/s, while the PS5 has a uniform 448 GB/s across its memory setup and the same 16 GB of RAM as the Series X.
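
For reference, if accesses were spread evenly across the Series X's split pool (a deliberate simplification; real games steer hot data into the fast 10 GB), the capacity-weighted figure works out as follows:

```python
# Series X memory: 10 GB @ 560 GB/s plus 6 GB @ 336 GB/s.
fast_gb, fast_bw = 10, 560
slow_gb, slow_bw = 6, 336
# Naive capacity-weighted average, assuming perfectly uniform access.
blended = (fast_gb * fast_bw + slow_gb * slow_bw) / (fast_gb + slow_gb)
print(f"Blended: {blended:.0f} GB/s vs PS5's uniform 448 GB/s")  # 476 GB/s
```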




I read about half of what you said and fail to see any relation to a poorly optimized demo. Keep grasping at straws, though.



Pemalite said:


Basically, if the demand on the CPU or GPU is lower, then the other can clock up to its maximum, as it has the TDP available.

It cannot maintain both the CPU and GPU at maximum clocks indefinitely; otherwise it's not a "boost mode" at all and Sony shouldn't have even bothered to mention it.

Complete nonsense. Again, you didn't listen to what Cerny said. Try again, starting around the 35-minute mark.

The PS5's maximum CPU and GPU clocks are UNKNOWN. The CPU is CAPPED at 3.5 GHz. The GPU is CAPPED at 2.23 GHz. These are the maximum frequencies that guarantee correct operation inside the CPU and GPU under all conditions. We have no idea what power-dissipation limit the cooling system (and power supply) was designed for. At worst, it was designed to just hold the 3.5/2.23 GHz clocks (with rocket noise or not); at best, it was designed to hold 4/2.5 GHz clock levels (probably with rocket noise, as those are some high frequencies). The proof is in the pudding, and we don't have any to eat yet.

When you place your PS5 in the fridge, it WILL run games at the two maximum allowed frequencies indefinitely, as the cooling can handle max power without problems. The ability to shift power from the CPU to the GPU is always there, of course, but it will simply not take place because of the caps.

Now if you are the ranger at the Death Valley ranger station and decide to play a game around noon in the front yard, things are different; there is a thermometer element hidden somewhere. Then all the frequency shifting takes place (incidentally, Cerny didn't say what happens when you really ARE in a Death Valley location, but he also "missed mentioning" critical stuff in other places). Who wins and who loses depends on what the game is doing at any moment in time, obviously. Don't expect to see significant drops, though. Cerny mentions that a 10% drop in power only costs a few % in clock rate, so I'm guessing we won't see "bad" clock rates below the 2.1 GHz point on the GPU.
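
That last estimate is consistent with the usual super-linear power/frequency relationship; here is a quick back-of-the-envelope check (the cubic exponent is a common rule of thumb, not a published PS5 figure):

```python
# If dynamic power scales roughly as f^3 (voltage tracking frequency),
# a 10% power cut trims the clock by only ~3.4%.
gpu_cap_ghz = 2.23
after = gpu_cap_ghz * 0.9 ** (1 / 3)
print(f"{after:.2f} GHz, a {(1 - after / gpu_cap_ghz) * 100:.1f}% drop")
# -> ~2.15 GHz, comfortably above the 2.1 GHz floor guessed above
```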



LudicrousSpeed said:
I read about half of what you said and fail to see any relation to a poorly optimized demo. Keep grasping at straws, though.

Yes, probably, but I see a lot of cases, not just one. And that also means FLOPS do not help optimization at all. And developers can choose to make games run well on both.



drkohler said:
Pemalite said:


Basically, if the demand on the CPU or GPU is lower, then the other can clock up to its maximum, as it has the TDP available.

It cannot maintain both the CPU and GPU at maximum clocks indefinitely; otherwise it's not a "boost mode" at all and Sony shouldn't have even bothered to mention it.

Complete nonsense. Again, you didn't listen to what Cerny said. Try again, starting around the 35-minute mark.

The PS5's maximum CPU and GPU clocks are UNKNOWN. The CPU is CAPPED at 3.5 GHz. The GPU is CAPPED at 2.23 GHz. These are the maximum frequencies that guarantee correct operation inside the CPU and GPU under all conditions. We have no idea what power-dissipation limit the cooling system (and power supply) was designed for. At worst, it was designed to just hold the 3.5/2.23 GHz clocks (with rocket noise or not); at best, it was designed to hold 4/2.5 GHz clock levels (probably with rocket noise, as those are some high frequencies). The proof is in the pudding, and we don't have any to eat yet.

When you place your PS5 in the fridge, it WILL run games at the two maximum allowed frequencies indefinitely, as the cooling can handle max power without problems. The ability to shift power from the CPU to the GPU is always there, of course, but it will simply not take place because of the caps.

Now if you are the ranger at the Death Valley ranger station and decide to play a game around noon in the front yard, things are different; there is a thermometer element hidden somewhere. Then all the frequency shifting takes place (incidentally, Cerny didn't say what happens when you really ARE in a Death Valley location, but he also "missed mentioning" critical stuff in other places). Who wins and who loses depends on what the game is doing at any moment in time, obviously. Don't expect to see significant drops, though. Cerny mentions that a 10% drop in power only costs a few % in clock rate, so I'm guessing we won't see "bad" clock rates below the 2.1 GHz point on the GPU.

You might want to read another analysis at https://wccftech.com/sony-ps5-vs-xbox-series-x-analysis/. It's hard for me to process the article, but it will probably add to a good discussion.



HollyGamer said:
LudicrousSpeed said:
If you think a game running on more powerful hardware at almost half the frame rate is anything other than poor optimization, idk what to tell you 😆

PS5 will not have the same problem, because even though the Series X reaches 560 GB/s, part of its RAM runs slower at 336 GB/s, while the PS5 has a uniform 448 GB/s across its memory setup and the same 16 GB of RAM as the Series X.

Not true. That's one of the points Cerny failed to address. The XSX can use 10 GB of RAM at 560 GB/s. That is the obvious place where textures, frame buffers and all the key stuff get allocated; the compiler/linker will make sure of that, in all games, all the time. The PS5 only has 448 GB/s. Whether 448 GB/s is enough for safe 4K/60 Hz, I'm really not sure. The games will tell, but I think this is a gamble (made simply to use lower-priced RAM chips) that might not pay off in the end. In the same games, the XSX will have more (native) pixels on screen. On the other hand, the PS5 will have the "better" pixels if all the SSD tricks are used.
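
One way to see how much the split pool actually costs (a toy model; the traffic fractions are made-up parameters, not measurements from any game):

```python
# Effective XSX bandwidth when a given share of memory traffic hits
# the fast 10 GB pool and the rest falls into the 336 GB/s pool.
def effective_bw(fast_share: float) -> float:
    # Byte-weighted harmonic mean: time per byte is the weighted sum
    # of the per-pool costs, so bandwidth is its reciprocal.
    return 1 / (fast_share / 560 + (1 - fast_share) / 336)

for share in (1.0, 0.9, 0.8):
    print(f"{share:.0%} fast-pool traffic -> {effective_bw(share):.0f} GB/s")
# 100% -> 560, 90% -> 525, 80% -> ~494; PS5 is a flat 448 GB/s
```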