DonFerrari said:
CrazyGPU said:

This is not wrong, but it's the best-case scenario for the PS5. We know the Xbox's clocks are fixed, while the PS5's clocks will throttle, so the CPU won't be at 3.5 GHz most of the time the GPU is at 2.23 GHz. The GPU also draws much more power than the CPU, so slowing the CPU down doesn't guarantee the GPU can hold its 2.23 GHz clock. My guess is that the PS5 was originally going to land at 9.2 teraflops, as many rumors said, but when Microsoft raised the bar with the 12.15-teraflop Xbox, Sony went with a variable frequency that lets them advertise more than 10 teraflops and make the gap look smaller (see the numbers below). My guess is that the real-world difference will be about 20%. That's a big difference: it's more than an entire PS4 (1.84 teraflops, on the old architecture).

Now, will that 20% be that noticeable? Maybe: 60 fps versus 50 fps, or the same fps with some downgraded shadows, fewer effects or less motion blur here and there, or more drops in dynamic resolution. I don't think it will be a game changer. When we see games running on Xbox One X and PS4 Pro, the difference is greater in every way, yet the PS4 Pro still gives us a good experience.

In my opinion, Microsoft needs to show me more good games and new IPs to make me feel I'm losing something. Otherwise I'll just accept that my machine is 20% slower and play the awesome games that companies like Naughty Dog, Santa Monica, and others have for me.
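
For reference, the teraflop figures in the quote can be sanity-checked with the standard GPU formula (64 shaders per CU and 2 FLOPs per clock are the usual RDNA figures; the 2.0 GHz behind the 9.2 TF rumor is inferred here, not confirmed):

```python
# Back-of-the-envelope FP32 throughput:
# TFLOPS = CUs * 64 shaders per CU * 2 FLOPs per clock * clock (GHz) / 1000
def tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000

ps5_peak = tflops(36, 2.23)    # PS5: 36 CUs at up to 2.23 GHz
ps5_rumor = tflops(36, 2.0)    # the ~9.2 TF rumor implies roughly a 2.0 GHz clock
xsx = tflops(52, 1.825)        # Series X: 52 CUs at a fixed 1.825 GHz

print(f"PS5 peak:  {ps5_peak:.2f} TF")    # 10.28 TF
print(f"PS5 rumor: {ps5_rumor:.2f} TF")   # 9.22 TF
print(f"XSX:       {xsx:.2f} TF")         # 12.15 TF
print(f"Gap at PS5 peak clock: {(xsx / ps5_peak - 1) * 100:.0f}%")  # ~18%
```

So the ~20% figure quoted above is roughly the gap when the PS5 GPU holds its peak clock; any sustained downclocking widens it.
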

So Sony was able to design the entire SmartShift setup, the cooling solution, the decision to control by frequency with fixed power consumption, and everything else in the couple of days since MS revealed its specs? Or do we give them a couple of months, back when the rumours were more trustworthy?

The most you can argue for "reactionary" is that Sony expected MS to go very high on CU count and end up with a big TF number, and chose the cheaper route of pushing frequency higher instead. But that was decided something like two years ago.

My only issue is that they wanted to overclock anything at all and decided to do it with friggin AMD. We all know AMD hasn't been as friendly toward raised clock speeds as Intel, for whom overclocking is kind of their thing.
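
On the SmartShift point: the fixed-power, variable-frequency scheme being debated can be sketched as a toy model. Every number and function here is illustrative, not Sony's actual algorithm; the only grounded idea is a fixed total budget shared between CPU and GPU, with power rising roughly with the cube of frequency:

```python
# Toy model of a fixed total power budget shared between CPU and GPU.
# All values are made up for illustration; this is NOT Sony's implementation.
TOTAL_BUDGET_W = 200.0      # hypothetical SoC power cap
MAX_GPU_CLOCK_GHZ = 2.23    # PS5's advertised GPU ceiling
MAX_GPU_POWER_W = 160.0     # hypothetical GPU draw at that ceiling

def gpu_clock(cpu_draw_w: float) -> float:
    """Clock the GPU gets from whatever power the CPU leaves over.
    Assumes power rises roughly with the cube of frequency, so the
    clock scales with the cube root of the available power."""
    gpu_power = min(MAX_GPU_POWER_W, TOTAL_BUDGET_W - cpu_draw_w)
    return MAX_GPU_CLOCK_GHZ * (gpu_power / MAX_GPU_POWER_W) ** (1 / 3)

print(f"{gpu_clock(40.0):.2f} GHz")  # light CPU load -> 2.23 (full clock)
print(f"{gpu_clock(65.0):.2f} GHz")  # heavy CPU load -> ~2.11 (small dip)
```

The cube-law assumption is the crux of the argument: a small downclock frees a disproportionate amount of power, which is why one side expects the GPU to sit at or near 2.23 GHz most of the time and the other expects meaningful throttling.
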