
Forums - Sony Discussion - Why we cannot compare PS5 and Xbox Series X directly. EDIT: added Resident Evil 3 remake and DOOM Eternal, which run better on PS4 Pro, as examples; also added ex-Crytek developer and programmer testimony

HollyGamer said:
LudicrousSpeed said:

All this shows is that RE3 is poorly optimized for Scorpio. 
You can give a dev all the power in the world, it won’t make up for bad development.

Not at all. The Xbox One X runs the DirectX API, the easiest and best-known API for PC. These games are multiplatform games that are also coming to PC, so it's not rocket science.

So all games using the DirectX API are well optimized on Xbox and PC? Is that what you are saying?



d21lewis said:
This thread title just rolls off the tongue!

with 10 roadblocks in the middle =p



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

I'm still struggling a bit with how SSD will revolutionize level design. I get that developers will no longer need to design "staging" areas so the next big section can load, or bother with black loading screens during fast travel.

But what kinds of new designs will we see as a result of this shift, that don't already exist? Or is this not about new designs, but allowing cutting-edge graphics, long draw distances, and seamless open worlds to exist simultaneously?

I am genuinely curious about this, so I appreciate your thoughts!



Veknoid_Outcast said:

I'm still struggling a bit with how SSD will revolutionize level design. I get that developers will no longer need to design "staging" areas so the next big section can load, or bother with black loading screens during fast travel.

But what kinds of new designs will we see as a result of this shift, that don't already exist? Or is this not about new designs, but allowing cutting-edge graphics, long draw distances, and seamless open worlds to exist simultaneously?

I am genuinely curious about this, so I appreciate your thoughts!

Don't forget the removal of corridors/stairs, more enemies on screen, and persistent worlds. But yes, no one has really said how it will revolutionize things yet.
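To put rough numbers on why those corridors and elevator rides exist at all: they mask the time it takes to pull the next area off storage, and that time is just data size divided by throughput. A minimal sketch, with the area size and drive speeds as illustrative assumptions (the 5500 MB/s raw and ~8-9 GB/s typical compressed figures are the ones Cerny quoted):

```python
# Back-of-the-envelope: how long a corridor/elevator has to mask loading
# the next area at different storage speeds. AREA_SIZE_GB is an assumed,
# illustrative figure, not from any real game.

AREA_SIZE_GB = 4.0

drives_mb_per_s = {
    "Base PS4 HDD (~80 MB/s sustained)": 80,
    "SATA SSD (~500 MB/s)": 500,
    "PS5 SSD, raw (5500 MB/s)": 5500,
    "PS5 SSD, compressed (~8500 MB/s typical)": 8500,
}

for name, mbps in drives_mb_per_s.items():
    seconds = AREA_SIZE_GB * 1024 / mbps
    print(f"{name:42s} -> {seconds:6.2f} s for {AREA_SIZE_GB:.0f} GB")
```

At HDD speeds a 4 GB area needs close to a minute of masking, which is exactly what winding corridors and slow elevators buy; at PS5 speeds the same load fits inside a door-opening animation.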



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

Well, we will see whether this is going to change anything; I doubt we will see much difference. If it works, it will only be in PS5 exclusive titles. I wonder how those games will perform if installed on a regular USB 3.0 external drive.



Veknoid_Outcast said:

I'm still struggling a bit with how SSD will revolutionize level design. I get that developers will no longer need to design "staging" areas so the next big section can load, or bother with black loading screens during fast travel.

But what kinds of new designs will we see as a result of this shift, that don't already exist? Or is this not about new designs, but allowing cutting-edge graphics, long draw distances, and seamless open worlds to exist simultaneously?

I am genuinely curious about this, so I appreciate your thoughts!

I could imagine much more transformative worlds.

Remember Death Stranding, when you engage in a battle with a BT and the world transforms into a puddle of tar. Now imagine, instead of a barren wasteland turning into tar with a few ruins here and there, an extremely dense and detailed forest transforming into a large and bustling high-tech city in an instant.

Or imagine a world full of portals, where you can look through them and see a completely different world, with assets completely different from the world you are currently in; and if you actually step through the portal, you are in the other world in an instant, without loading times.


Well, these are certainly not new design concepts, but it is not only about inventing something new; it is also about enabling a much larger group of developers to do these things without needing to build a highly optimized engine and spend enormous amounts of time and money to somehow overcome the obstacles of today's tech.
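The portal example is really a streaming-budget question: can the destination world's working set be pulled in within the length of the transition effect? A hedged sketch, with all sizes and speeds assumed purely for illustration:

```python
# Sketch: does a full asset swap fit inside a portal transition?
# WORKING_SET_GB, THROUGHPUT_GB_S and the budget are all assumptions.

WORKING_SET_GB = 8.0       # assumed GPU-visible assets of the destination world
THROUGHPUT_GB_S = 8.5      # assumed effective (compressed) SSD throughput
TRANSITION_BUDGET_S = 1.0  # assumed length of the portal transition effect

swap_time = WORKING_SET_GB / THROUGHPUT_GB_S
verdict = "fits" if swap_time <= TRANSITION_BUDGET_S else "does not fit"
print(f"Swapping {WORKING_SET_GB:.0f} GB takes {swap_time:.2f} s -> "
      f"{verdict} in a {TRANSITION_BUDGET_S:.1f} s transition")
```

On an HDD at ~80 MB/s the same swap would take well over a minute, which is why this kind of design has not been practical before.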



HollyGamer said:
LudicrousSpeed said:
If you think a game running a more powerful hardware at almost half the frame rate is anything other than poor optimization, idk what to tell you 😆

More powerful in TF, yes (and that's only 42%), but the PS4 Pro has Rapid Packed Math while the One X does not. Graphics are not just about TF; there is more to it than that. Even Digital Foundry said the PS4 Pro lacks bandwidth compared to the One X, and on top of that the One X has 4 GB more RAM. That is the Pro's biggest disadvantage against the One X. The PS5 will not have the same problem, because even though the Series X peaks at 560 GB/s, part of its RAM runs slower, at 336 GB/s, while the PS5 has a unified 448 GB/s across its whole memory setup and the same 16 GB of RAM as the Series X.

Rapid Packed Math cannot be used 100% of the time.
The Xbox One X still has the advantage in pure computational throughput.
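To put numbers on that: FP32 TFLOPS falls straight out of the public specs as CUs × 64 lanes × 2 FLOPs per cycle × clock. A quick sketch with the known CU counts and clocks (the "42%" above comes out to ~43% with these figures):

```python
# FP32 throughput from public specs: CUs * 64 lanes * 2 FLOP/cycle * clock.

def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

pro = tflops(36, 0.911)   # PS4 Pro                  -> ~4.2 TF
onex = tflops(40, 1.172)  # Xbox One X               -> ~6.0 TF
ps5 = tflops(36, 2.23)    # PS5 (at its peak clock)  -> ~10.3 TF
xsx = tflops(52, 1.825)   # Xbox Series X            -> ~12.2 TF

print(f"One X over PS4 Pro: +{onex / pro - 1:.0%}")  # ~+43%
print(f"Series X over PS5:  +{xsx / ps5 - 1:.0%}")   # ~+18%
```

It also shows why the next-gen gap (~18%) is proportionally smaller than the mid-gen one (~43%), assuming the PS5 holds its peak clock.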

drkohler said:

Complete nonsense. Again, you didn't listen to what Cerny said. Try again, starting around the 35-minute mark.

The PS5's maximum CPU and GPU clocks are UNKNOWN. The CPU is CAPPED at 3.5GHz. The GPU is CAPPED at 2.23GHz. These are the maximum frequencies allowed that guarantee correct operation inside the CPU and GPU under all conditions. We have no idea how the cooling system (and power supply) was designed, or for what power dissipation limit. At worst, it was designed to just barely hold the 3.5/2.23GHz clocks (rocket noise or not); at best, it was designed to hold 4/2.5GHz clock levels (probably with rocket noise, those are some high frequencies). The proof is in the pudding, and we don't have any to eat yet.

When you place your PS5 in the fridge, it WILL run games indefinitely at the two maximum allowed frequencies, as the cooling can handle max power without problems. The ability to shift power from the CPU to the GPU is always there, of course, but it will simply not take place because of the caps.

Now if you are the ranger at the Death Valley ranger station and decide to play a game around noon in the front yard, things are different; there is a thermometer element hidden somewhere. Then all the frequency shifting takes place (incidentally, Cerny didn't say what happens when you really ARE in Death Valley conditions; but then he "failed to mention" critical stuff in other places too). Who wins and who loses depends on what the game is doing at any moment in time, obviously. Don't expect to see significant drops, though. Cerny mentions that a 10% drop in power only costs a few percent in clock rate, so I'm guessing we won't see "bad" clock rates below the 2.1GHz point on the GPU.

No. You didn't listen to what Cerny said.
Try again when he starts talking about Smartshift.

The "Capped" CPU and GPU speeds are their "maximum".

They can and will potentially be lower than their maximum depending on component demands and TDP.

drkohler said:

Not true. That's one of the points Cerny failed to address. The XSX can use 10 GB of RAM at 560 GB/s. That is the obvious place where textures, frame buffers and all the key stuff are allocated. The compiler/linker will make sure of that, in all games, all the time. The PS5 only has 448 GB/s. Whether 448 GB/s is enough for safe 4K/60Hz, I'm really not sure. The games will tell, but I think this is a gamble (made simply to use lower-priced RAM chips) that might not pay off in the end. In the same games, the XSX will have more (native) pixels on screen. On the other hand, the PS5 will have the "better" pixels if all the SSD tricks are used.

The next-gen consoles have more "real world bandwidth" than the raw numbers would lead us to believe... There has been a multitude of improvements on this front since the Xbox One and PlayStation 4 launched.

Also... It's a bit of a stretch that the PlayStation 5 will have "better pixels" when the Xbox Series X has more functional units to improve visual fidelity.
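To illustrate why the Series X's headline 560 GB/s is not the whole story: 10 GB sits on the fast 560 GB/s pool and 6 GB on the 336 GB/s pool, so the effective figure depends on how much traffic hits each pool. A rough sketch; the traffic fractions are assumptions, and real interleaving behaviour is more complicated than this time-weighted model:

```python
# Effective XSX bandwidth for a given fraction of traffic served by the
# fast pool, using a simple time-weighted (harmonic) model.

FAST_BW, SLOW_BW = 560.0, 336.0  # GB/s, public XSX figures
PS5_BW = 448.0                   # GB/s, PS5's unified pool

for fast_fraction in (1.0, 0.9, 0.8, 0.7):
    # Time to move one byte is the mix-weighted sum of per-pool times.
    time_per_byte = fast_fraction / FAST_BW + (1 - fast_fraction) / SLOW_BW
    print(f"{fast_fraction:.0%} fast-pool traffic -> ~{1 / time_per_byte:.0f} GB/s "
          f"(PS5: {PS5_BW:.0f} GB/s flat)")
```

Under these assumptions the XSX sits around 470-525 GB/s once a realistic share of traffic lands in the slow pool, which is why the raw 560 vs 448 comparison oversimplifies things.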

DonFerrari said:
Pemalite said:

Sony specifically mentioned AMD's Smart Shift technology.
I have a notebook that leverages similar technology.

Basically, if the demand on the CPU or GPU is lower, then the other can clock up to its maximum, as it has the TDP available.

It cannot maintain both the CPU and GPU at maximum clocks indefinitely, otherwise it's not a "boost mode" at all and Sony shouldn't have even bothered to mention it.

SmartShift in the PS5 is specifically there to give unused power from the CPU to the GPU; it doesn't seem like it will be the same as what you have on a notebook.

Correct. And that is exactly how it's done in a Notebook.
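For anyone trying to picture what "shifting" means in practice, here is a toy model of a shared power budget. It's a sketch of the idea only: the wattages and the nominal split are invented, and this is not Sony's or AMD's actual algorithm.

```python
# Toy model of SmartShift-style power reallocation under one shared budget.
# All wattages are invented for illustration.

TOTAL_BUDGET_W = 200
CPU_SHARE_W, GPU_SHARE_W = 60, 140  # nominal split when both are saturated

def allocate(cpu_demand_w, gpu_demand_w):
    """Give each side its nominal share, then hand unused watts across."""
    cpu = min(cpu_demand_w, CPU_SHARE_W)
    gpu = min(gpu_demand_w, GPU_SHARE_W)
    slack = TOTAL_BUDGET_W - cpu - gpu
    gpu += min(slack, max(gpu_demand_w - gpu, 0))  # CPU slack -> GPU
    slack = TOTAL_BUDGET_W - cpu - gpu
    cpu += min(slack, max(cpu_demand_w - cpu, 0))  # GPU slack -> CPU
    return cpu, gpu

print(allocate(60, 140))  # both saturated: (60, 140), nothing moves
print(allocate(30, 170))  # CPU-light scene: GPU soaks the slack, (30, 170)
print(allocate(90, 100))  # GPU-light scene: CPU gets the spare watts, (90, 100)
```

The point of the exchange above is visible in the last two lines: the slack can flow in either direction, but the total never exceeds the budget, which is why both chips cannot sit at their worst-case draw simultaneously.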





Pemalite said:
DonFerrari said:

SmartShift in the PS5 is specifically there to give unused power from the CPU to the GPU; it doesn't seem like it will be the same as what you have on a notebook.

Correct. And that is exactly how it's done in a Notebook.


So I misunderstood you, then: the GPU also sends unused power to the CPU as well (for the PS5, the indication seemed to be that only the CPU-to-GPU path was implemented).

Still, when he said 10% less power could be achieved by dropping the frequency only a couple of percent, with minimal performance impact, it remains to be seen how that will work out when games release. Right now it could be damage control, and it may drop more than 2% (perhaps to the leaked 9.2 TF).
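As a rough sanity check on Cerny's "couple of percent" claim: dynamic power scales roughly as frequency times voltage squared, and near the top of the curve voltage rises at least linearly with frequency, so power goes roughly with the cube of clock. The cubic exponent is a rule-of-thumb assumption; real silicon is usually even steeper near the limit, which would make the clock cost smaller still.

```python
# If P ~ f^3 (dynamic power with voltage scaling ~ f), how much clock does
# a given power cut cost? The cubic exponent is a rule-of-thumb assumption.

F_MAX_GHZ = 2.23  # PS5 GPU frequency cap

for power_scale in (0.95, 0.90, 0.80):
    f = F_MAX_GHZ * power_scale ** (1 / 3)
    print(f"{1 - power_scale:4.0%} less power -> {f:.2f} GHz "
          f"({1 - f / F_MAX_GHZ:.1%} lower clock)")
```

Under this model a 10% power cut costs about 3.5% of clock, in the same ballpark as Cerny's claim; whether real workloads behave this gently is exactly what remains to be seen.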



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

Pemalite said:

No. You didn't listen to what Cerny said.
Try again when he starts talking about Smartshift.

The "Capped" CPU and GPU speeds are their "maximum".

This was one of the points Cerny should definitely NOT have made in his talk, because it confuses people more than necessary. I guess the temptation of inserting a little PR with a buzzword was too high, so he added the SmartShift bit.

Again, you don't understand "maximum" in that context. Both the CPU and GPU could run faster than 3.5GHz and 2.23GHz (obviously, as the CPU in the XSX runs at 3.66GHz). The capping runs both the CPU and GPU BELOW the maximum clock rates possible, in order to maintain the power limit set by the cooling (which I really want to see).

Now if you ran Pong! on the PS5, the power draw of the CPU would be almost nil and SmartShift could in theory send dozens of watts to the GPU. On the other hand, to save power, the GPU would probably be downclocked anyway so as to cool the entire system. It's Pong!, after all.

As the PS5 cooling system is apparently laid out to let both the CPU and GPU simultaneously run at capped speeds (NOT maximum speeds) under "non-Death Valley conditions", SmartShift does absolutely nothing (except probably downclocking the CPU to cool the system). When the Tempest hardware and the SSD hardware are in full operation, my guess is downclocking will occur, with a possible SmartShift contribution to hold the GPU at capped speed.

So when does SmartShift actually kick in? That is the part that was a little too fuzzy in Cerny's talk. He never answered that unasked question. Looking at the various statements he made about the whole power management (what happens, when, and for how long), there are a few open questions left.



DonFerrari said:

So I misunderstood you, then: the GPU also sends unused power to the CPU as well (for the PS5, the indication seemed to be that only the CPU-to-GPU path was implemented).

Still, when he said 10% less power could be achieved by dropping the frequency only a couple of percent, with minimal performance impact, it remains to be seen how that will work out when games release. Right now it could be damage control, and it may drop more than 2% (perhaps to the leaked 9.2 TF).

SmartShift works in both directions.

And you are right that 10% less power might only reduce the clockrate by a small percentage... All processors have an "efficiency curve", and once you exceed that curve, a small increase in clockrate can increase power consumption disproportionately.

E.g. Polaris (the RX 480) was pretty efficient at its 1120MHz core clock and 150W TDP.
What AMD later did was take that exact GPU and clock it up to 1257MHz for the RX 580, an increase of 137MHz or 12.2%.

However, TDP went from 150W to 185W, an increase of 35W or 23.3%... So the GPU was, by extension, regarded as "less efficient".

AMD of course then took that same chip, shrank it to 12nm (which is basically a refined 14nm process anyway, marketing bullshittery aside) and boosted clockrates to 1469MHz for the RX 590, an increase of 31% over the RX 480 and 16.9% over the RX 580. But TDP increased to 225W, an increase of 50% over the RX 480 and 21.6% over the RX 580...

The fact is, though, we don't know exactly what kind of hit to clockrates we are seeing with an accompanying decrease in TDP on the PlayStation 5; we aren't privy to that information just yet.
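Turning those quoted figures into perf-per-watt makes the efficiency slide explicit, using base clock as a crude stand-in for performance, since all three cards are the same 2304-shader Polaris chip:

```python
# Efficiency trend across the Polaris refreshes, from the clock/TDP
# figures quoted above (reference boards).

cards = {  # base clock (MHz), board power (W)
    "RX 480": (1120, 150),
    "RX 580": (1257, 185),
    "RX 590": (1469, 225),
}

base_clock, base_power = cards["RX 480"]
for name, (clock, power) in cards.items():
    print(f"{name}: {clock / power:.2f} MHz/W "
          f"(clock {clock / base_clock - 1:+5.1%}, power {power / base_power - 1:+5.1%})")
```

Each clock bump costs considerably more in power than it gains in clock (7.47 MHz/W falls to 6.53 MHz/W by the RX 590), which is the "efficiency curve" in action.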

drkohler said:

This was one of the points Cerny should definitely NOT have made in his talk, because it confuses people more than necessary. I guess the temptation of inserting a little pr with a buzzword was too high, so he added the Smartshift bit.

Because you know more than Cerny, right?
You are happy to use "Cerny said this" to back up your arguments... But then you downplay Cerny's comments when they contradict your own.

drkohler said:

Again you don't understand "maximum" in that context. Both cpu and gpu could run faster than 3.5GHz and 2.23GHz (obviously as the cpu in the XSX runs at 3.66GHz). The capping runs both cpu and gpu BELOW the maximum clock rates possible, in order to maintain the power limit set by the cooling (which I really want to see).

Citation needed.
If you are asserting that the PlayStation 5 will operate its CPU and GPU at higher than 3.5GHz and 2.23GHz respectively... Then that is a bold assertion and you need to prove it.

drkohler said:

Now if you ran Pong! on the PS5, the power draw on the cpu would be almost nil and Smartshift could in theory send dozens of Watts to the gpu. On the other hand, to save power, the gpu would probably be downclocked anyways so as to cool the entire system. It's Pong!, after all.

That is not what SmartShift does.
It is a reallocation within the power/thermal limits, not a reduction.

drkohler said:

As the PS5 cooling system is apparently laid out to let both cpu and gpu simultaneously run at capped speeds (NOT maximum speeds) under "non-Dealth Valley conditions", Smartshift does absolutely nothing (except probably downclocking the cpu to cool the system). When the Tempest hardware and the ssd hardware are in full operations, my guess is downclocking will occur with a possible Smartshift contribution to hold the gpu at capped speed.

Again: SmartShift isn't about downclocking. It's all about dynamic allocation.
https://www.amd.com/en/technologies/smartshift

drkohler said:

So when does Smartshift actually kick in? That is the part that was a little too fuzzy in Cerny's talk. He never answered that unasked question. Looking at the various statements he made about the whole power managment and what happens when and how long, there are a few open questions left.

We know exactly when SmartShift kicks in; that particular AMD technology is well documented at this point.

https://www.anandtech.com/show/15624/amd-details-renoir-the-ryzen-mobile-4000-series-7nm-apu-uncovered/4


