
Why we cannot compare PS5 and Xbox Series X directly. EDIT: added Resident Evil 3 remake and DOOM Eternal, which run better on PS4 Pro, as examples; also added an ex-Crytek developer and programmer's testimony.

Pemalite said:
HollyGamer said:

More powerful in TF, yes (and only by about 42%), but the PS4 Pro has Rapid Packed Math while the One X does not. Graphics are about more than just TF. Even Digital Foundry said the PS4 Pro lacks bandwidth compared to the One X, and on top of that the One X has 4 GB more RAM. That is the Pro's biggest disadvantage against the One X. The PS5 will not have the same problem: even though the Series X reaches 560 GB/s, part of its RAM runs slower at 336 GB/s, while the PS5 has a unified 448 GB/s across its entire memory setup and the same 16 GB of RAM as the Series X.

Rapid Packed Math cannot be used 100% of the time.
The Xbox One X still has the advantage in pure computational throughput.
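To put rough numbers on the throughput argument above, here is a back-of-envelope sketch using the commonly quoted public specs (4.2 TFLOPS FP32 for the PS4 Pro, 6.0 TFLOPS for the One X); the 30% FP16 share is an assumed figure purely for illustration, since the usable fraction varies per game:

```python
# Back-of-envelope comparison of the two mid-gen consoles' compute throughput.
# Figures are the commonly quoted public specs, not measured numbers.
PS4_PRO_FP32_TFLOPS = 4.2    # 36 CUs @ 911 MHz
XBOX_ONE_X_FP32_TFLOPS = 6.0  # 40 CUs @ 1172 MHz

gap = XBOX_ONE_X_FP32_TFLOPS / PS4_PRO_FP32_TFLOPS - 1
print(f"One X FP32 advantage: {gap:.0%}")  # ~43%

# Rapid Packed Math doubles FP16 throughput on the Pro, but only for the
# fraction of shader work that can actually tolerate half precision.
def effective_tflops(fp32_rate, fp16_fraction, has_rpm):
    fp16_rate = fp32_rate * 2 if has_rpm else fp32_rate
    return fp32_rate * (1 - fp16_fraction) + fp16_rate * fp16_fraction

# If, say, 30% of a frame's math runs at FP16 (an assumed figure):
print(f"Pro effective:   {effective_tflops(4.2, 0.3, True):.2f} TF")
print(f"One X effective: {effective_tflops(6.0, 0.3, False):.2f} TF")
```

Even under that generous assumption, the One X keeps a raw-compute lead, which is the point being made here.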

drkohler said:

Complete nonsense. Again you didn't listen to what Cerny said. Try again, starting around the 35 minute mark.

The PS5's maximum cpu and gpu clocks are UNKNOWN. The cpu is CAPPED at 3.5GHz. The gpu is CAPPED at 2.23GHz. These are the maximum frequencies allowed that guarantee correct operation inside the cpu and gpu under all conditions. We have no idea what power dissipation limit the cooling system (and power supply) was designed for. At worst, it was designed to just hold the 3.5/2.23GHz clocks (with rocket noise or not); at best, it was designed to hold 4/2.5GHz clock levels (probably with rocket noise, those are some high frequencies). The proof is in the pudding, and we don't have any yet to eat.

When you place your PS5 in the fridge, it WILL indefinitely run games at the two maximum allowed frequencies, since the cooling can handle max power without problems. The ability to shift power from the cpu to the gpu is always there, of course, but it simply won't take place because of the caps.

Now if you are the ranger at the Death Valley ranger station and decide to play a game around noon in the front yard, things are different; there is a thermometer element hidden somewhere. Then all the frequency shifting takes place (incidentally, Cerny didn't say what happens when you really ARE in Death Valley conditions, but then he also "missed mentioning" critical stuff in other places). Who wins and who loses depends on what the game is doing at any moment in time, obviously. Don't expect to see significant drops, though. Cerny mentions that a 10% drop in power only costs a few percent in clock rate, so I'm guessing we won't likely see "bad" clock rates below the 2.1GHz point on the gpu.
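The "10% power for a few percent of clock" claim is plausible under the textbook approximation that dynamic power scales roughly with the cube of frequency (frequency times voltage squared, with voltage tracking frequency). This is a generic model, not a measured PS5 curve:

```python
# Rough illustration of why a 10% power cut only costs a few percent of
# clock near the limit. Assumes the textbook P ~ f^3 approximation
# (P ~ f * V^2, with V scaling roughly linearly with f).
target_power = 0.90               # keep 90% of the power budget
freq_ratio = target_power ** (1 / 3)
print(f"Clock retained: {freq_ratio:.1%}")  # ~96.5%, i.e. a ~3.5% drop
```

A ~3.5% frequency drop for a 10% power saving lines up with Cerny's "a few percent" remark.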

No. You didn't listen to what Cerny said.
Try again when he starts talking about Smartshift.

The "Capped" CPU and GPU speeds are their "maximum".

They can and will potentially be lower than their maximum depending on component demands and TDP.

drkohler said:

Not true. That's one of the points Cerny failed to address. The XSX can use 10 GB of RAM at 560 GB/s. That is the obvious place where textures, frame buffers and all the key stuff get allocated; the compiler/linker will make sure of that, all games, all the time. The PS5 only has 448 GB/s. Whether 448 GB/s is enough for safe 4k/60Hz, I'm really not sure. The games will tell, but I think this is a gamble (made simply to use lower-priced RAM chips) that might not pay off in the end. On the same games, the XSX will have more (native) pixels on screen. On the other hand, the PS5 will have the "better" pixels if all the ssd tricks are used.

The next-gen consoles have more "real world bandwidth" than the raw numbers would lead us to believe... There have been a multitude of improvements on this front since the Xbox One and Playstation 4 launched.

Also... It's a bit of a stretch that the Playstation 5 will have "better pixels" when the Xbox Series X has more functional units to improve visual fidelity.
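For reference, a naive capacity-weighted average of the Series X's split memory pools lands between the two consoles' headline numbers. The pool sizes and speeds are Microsoft's published figures; the capacity weighting is a simplification, since real effective bandwidth depends on which pool each client hits at a given moment:

```python
# Naive capacity-weighted average of the Series X's split memory pools.
# Published figures: 10 GB @ 560 GB/s (GPU-optimal), 6 GB @ 336 GB/s.
pools = [(10, 560), (6, 336)]  # (GB, GB/s)

total_gb = sum(size for size, _ in pools)
avg = sum(size * bw for size, bw in pools) / total_gb
print(f"Capacity-weighted average: {avg:.0f} GB/s")  # 476 GB/s

# PS5's unified pool for comparison:
print("PS5 unified: 448 GB/s")
```

So as long as a game keeps its bandwidth-hungry data in the fast 10 GB pool, the XSX holds the edge; the weighted average only matters when allocations spill into the slower pool.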

DonFerrari said:

SmartShift in the PS5 is specifically there to give unused power from the CPU to the GPU; it doesn't seem like it will be the same as what you have in notebooks.

Correct. And that is exactly how it's done in a Notebook.


No one said the PS4 Pro is more powerful than the Xbox One X. Also, I agree Rapid Packed Math cannot be used all of the time, but developers can use it whenever they want.



Conina said:
HollyGamer said:

Not at all. The Xbox One X runs the DirectX API, the easiest and best-known API for PC. These games are multiplatform games coming to PC, so it's not rocket science.

So all games using a DirectX API are well optimized on Xbox and PCs? Is that what you are saying?

No one said that. It's just that DirectX is a well-known API that game developers are more familiar with, and it has been available and used on PC since the '90s. Compared to the low-level GNM/GNMX APIs the PlayStation has, DirectX is simpler when it comes to porting from PC to Xbox or vice versa, and many engines support DirectX.



HollyGamer said:
Conina said:

So all games using a DirectX API are well optimized on Xbox and PCs? Is that what you are saying?

No one said that. It's just that DirectX is a well-known API that game developers are more familiar with, and it has been available and used on PC since the '90s. Compared to the low-level GNM/GNMX APIs the PlayStation has, DirectX is simpler when it comes to porting from PC to Xbox or vice versa, and many engines support DirectX.

So you not only throw Direct3D 12 and Direct3D 11 into one pot (even though they are fundamentally different), but ALL Direct3D APIs since the nineties?



Pemalite said:
drkohler said:

Again you don't understand "maximum" in that context. Both cpu and gpu could run faster than 3.5GHz and 2.23GHz (obviously as the cpu in the XSX runs at 3.66GHz). The capping runs both cpu and gpu BELOW the maximum clock rates possible, in order to maintain the power limit set by the cooling (which I really want to see).

Citation needed.
If you are asserting that the Playstation 5 will operate its CPU and GPU at higher than 3.5GHz and 2.23GHz respectively... then that is a bold assertion and you need to prove it.

Why are you writing such nonsense after I explained that the clocks are CAPPED? Which part of "The capping runs both cpu and gpu BELOW the maximum clock rates possible" are you intentionally ignoring? I thought it was pretty obvious that the cpu could run faster, as the XSX runs it at 3.66GHz (3.8GHz without SMT), so we know with absolute certainty that the cpu clock is capped below the maximum clock rate possible. I really have no clue why you constantly refuse to see this fact.



I can't wait till we finally see at least a tech demo of what Sony's SSD tech will mean for actual games and level design. Fact is the Series X has roughly 15% more GPU power, but is that really going to translate into that much of a difference? The SSD tech in the PS5 is over twice as fast as MS's solution. So if Sony really is that confident in the SSD being the key to the next generation, and developers make full use of it, wouldn't that mean PS5 games could potentially do things that the Series X cannot?

If so, the raw specs aren't going to matter at all. It will be about which console gets used as the base console. If that's gonna be the PS5, not only will that hurt Series X performance in multiplatform games, Sony would also completely derail Microsoft's plans of reaching a wider audience through Game Pass by targeting mainstream PC, current-gen consoles and probably xCloud.




Pemalite, are you MS inclined?



drkohler said:
Pemalite said:

Citation needed.
If you are asserting that the Playstation 5 will operate its CPU and GPU at higher than 3.5GHz and 2.23GHz respectively... then that is a bold assertion and you need to prove it.

Why are you writing such nonsense after I explained that the clocks are CAPPED? Which part of "The capping runs both cpu and gpu BELOW the maximum clock rates possible" are you intentionally ignoring? I thought it was pretty obvious that the cpu could run faster, as the XSX runs it at 3.66GHz (3.8GHz without SMT), so we know with absolute certainty that the cpu clock is capped below the maximum clock rate possible. I really have no clue why you constantly refuse to see this fact.

Are we even certain that both CPU parts are absolutely the same?

A Ryzen 7 3700X and a Ryzen 7 3800X are also specified for different base and boost clock speeds.



The clocks are capped, but consumption is capped before frequency is, which is basically a first.

Mark Cerny, being Mark Cerny, was candidly honest about the performance you can expect from the PS5 and didn't try to sugar-coat it or inflate the numbers with theoretical values. He explained the whole damn thing to the world as if everyone had the capacity to grasp what he said, instead of going all "big numbers are what's important".



Pemalite said:
DonFerrari said:

So I misunderstood you saying the GPU also sends unused power to the CPU (on the PS5 it seemed like only the CPU-to-GPU path was implemented).

Still, when he said 10% less power could be achieved by dropping the frequency only a couple of percent, with minimal performance loss, it remains to be seen how that will work when games release. Because right now it could be damage control, and it may drop more than the 2% (perhaps to the leaked 9.2).

SmartShift works in both directions.

And you are right that 10% less power might only reduce the clockrate by a small percentage... All processors have an "efficiency curve", and once you exceed that curve, a small increase in clockrate can increase power consumption disproportionately.

I.E. Polaris was pretty efficient at an 1120Mhz core clock @ 150W TDP.
But what AMD later did was take that exact GPU and clock it up to 1257Mhz in the RX 580, an increase of 137Mhz or 12.2%.

However, TDP went from 150W to 185W, an increase of 35W or 23.3%... So the GPU was by extension regarded as "less efficient".

AMD would of course still take that same chip, shrink it to 12nm (which is basically a refined 14nm process anyway, fuck marketing bullshittery), and boost clockrates to 1469Mhz, an increase of 31% over the RX 480 and 16.9% over the RX 580. - But TDP increased to 225W, an increase of 50% over the RX 480 and 21.6% over the RX 580...

The fact is though, we don't know exactly what kind of hit to clockrates we are seeing with an accompanying decrease in TDP on the Playstation 5, we aren't privy to that information just yet.
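The Polaris percentages above can be reproduced directly from the public reference-card specs, which makes the shape of the efficiency curve easy to see (power gains outpacing clock gains at every step):

```python
# Clock vs TDP scaling across the Polaris refreshes, using public
# reference-card specs: (core clock MHz, TDP W).
cards = {"RX 480": (1120, 150), "RX 580": (1257, 185), "RX 590": (1469, 225)}

base_clk, base_tdp = cards["RX 480"]
for name, (clk, tdp) in cards.items():
    clk_gain = clk / base_clk - 1
    tdp_gain = tdp / base_tdp - 1
    print(f"{name}: +{clk_gain:.1%} clock for +{tdp_gain:.1%} TDP")
```

Every refresh pays more in power than it gains in clock, which is exactly the "past the efficiency curve" behavior described above.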

drkohler said:

This was one of the points Cerny definitely should NOT have made in his talk, because it confuses people more than necessary. I guess the temptation of inserting a little PR with a buzzword was too high, so he added the SmartShift bit.

Because you know more than Cerny, right?
You are happy to use "Cerny said this" to back up your arguments... but then downplay Cerny's comments when they contradict your own.

drkohler said:

Again you don't understand "maximum" in that context. Both cpu and gpu could run faster than 3.5GHz and 2.23GHz (obviously as the cpu in the XSX runs at 3.66GHz). The capping runs both cpu and gpu BELOW the maximum clock rates possible, in order to maintain the power limit set by the cooling (which I really want to see).

Citation needed.
If you are asserting that the Playstation 5 will operate its CPU and GPU at higher than 3.5GHz and 2.23GHz respectively... then that is a bold assertion and you need to prove it.

drkohler said:

Now if you ran Pong! on the PS5, the power draw of the cpu would be almost nil and SmartShift could in theory send dozens of watts to the gpu. On the other hand, to save power, the gpu would probably be downclocked anyway to cool the entire system. It's Pong!, after all.

That is not what SmartShift does.
It is an allocation within power/thermal limits, not a reduction.

drkohler said:

As the PS5 cooling system is apparently laid out to let both cpu and gpu simultaneously run at capped speeds (NOT maximum speeds) under non-Death-Valley conditions, SmartShift does absolutely nothing (except probably downclocking the cpu to cool the system). When the Tempest hardware and the ssd hardware are in full operation, my guess is downclocking will occur, with a possible SmartShift contribution to hold the gpu at its capped speed.

Again. SmartShift isn't about downclocking. It's all about dynamic allocation.
https://www.amd.com/en/technologies/smartshift

drkohler said:

So when does SmartShift actually kick in? That is the part that was a little too fuzzy in Cerny's talk; he never answered that unasked question. Looking at the various statements he made about the whole power management and what happens when and for how long, there are a few open questions left.

We know exactly when SmartShift kicks in; that particular AMD technology is well documented at this point.

https://www.anandtech.com/show/15624/amd-details-renoir-the-ryzen-mobile-4000-series-7nm-apu-uncovered/4
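The documented behavior boils down to reallocating a fixed power budget between CPU and GPU based on instantaneous demand. Here is a deliberately simplified toy model of that idea; the budget and per-component ceilings are assumed numbers (the real envelopes are unpublished), and this is an illustrative sketch, not AMD's or Sony's actual controller:

```python
# Toy model of SmartShift-style power reallocation: a fixed total budget is
# split between CPU and GPU based on instantaneous demand. Assumed numbers
# throughout - the real PS5 power envelope has not been published.
TOTAL_BUDGET_W = 200
CPU_MAX_W, GPU_MAX_W = 60, 180  # assumed per-component ceilings

def allocate(cpu_demand_w, gpu_demand_w):
    cpu = min(cpu_demand_w, CPU_MAX_W)
    gpu = min(gpu_demand_w, GPU_MAX_W)
    over = cpu + gpu - TOTAL_BUDGET_W
    if over > 0:
        # Budget exceeded: trim the GPU's share (in practice this would map
        # to a small clock reduction rather than a hard wattage cut).
        gpu -= over
    return cpu, gpu

print(allocate(20, 170))  # CPU-light scene: GPU keeps its full demand
print(allocate(60, 180))  # both maxed out: GPU is trimmed to fit the cap
```

The point of the model is the one made above: nothing is "reduced" unless the combined demand would exceed the budget; below that line, shifting is free.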

Then either SmartShift in the PS5 works a little differently, or Mark Cerny just didn't mention that power could also flow from the GPU to the CPU (say some games decide to be much more CPU-hungry than GPU-hungry; but considering the CPUs we had this gen, it is probable that next gen most games will run the GPU at full throttle the whole time and have the CPU downclock to stay within the envelope).

On operating above those speeds: what I understood from Mark Cerny was that both the CPU and GPU could work above the frequencies set in the PS5, but to ensure proper operation they capped both at those frequencies. So yes, 3.5 and 2.23 are limits that won't both be hit at the same time; they are not base frequencies that can be exceeded by pushing the other one down.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

Hynad said:
Pemalite, are you MS inclined?

PC.

But he has a PS4 and an Xbox. I don't remember him being more favorable to one or the other.

And Mark Cerny probably did sugar-coat some points, like the 10% power consumption drop with a 2% frequency drop and minimal performance loss, because if that were the case they would have already made the console 10% more economical without losing any performance =]


