Forums - Gaming Discussion - PS5 vs XSeX: Understanding the Gap

DonFerrari said:
CrazyGPU said:

This is not wrong, but it's the best-case scenario for the PS5. We know that the Xbox's clocks are fixed, while the PS5's clocks will throttle, so the CPU won't be at 3.5 GHz most of the time the GPU is at 2.23 GHz. The GPU also uses much more power than the CPU, which means a slowdown of the CPU doesn't guarantee that the GPU will be able to keep its 2.23 GHz clock. My guess is that the PS5 was going to be clocked at 9.2 teraflops, as many rumors said, but when Microsoft raised the bar with the 12.15-teraflop Xbox, they decided to go for a variable frequency that gives the impression of more than 10 teraflops and makes the difference look smaller. My guess is that the real-world difference is going to be 20%. That's a big difference; it's more than an entire PS4 (old architecture, 1.84 teraflops).

Now, will that 20% be noticeable? Maybe. 60 fps versus 50 fps, or the same fps with some downgraded shadows, fewer effects, motion blur here and there, or more drops in variable resolution. I don't think it will be a game changer. When we see games running on Xbox One X and PS4 Pro, the difference is greater in every way, but the PS4 Pro still gives us a good experience.

In my opinion, Microsoft needs to show me more good games and new IPs to make me feel that I'm losing something. Otherwise I'll just accept that my machine is 20% slower and play the awesome games that companies like Naughty Dog, Santa Monica and others have for me.
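For what it's worth, the teraflop figures being argued over here are just arithmetic on CU counts and clocks; a quick sketch using the publicly stated numbers shows the on-paper gap is closer to 18% when the PS5 sits at its peak clock:

```python
# FP32 teraflops = CUs * clock (GHz) * 64 shaders/CU * 2 ops/cycle (FMA) / 1000
def teraflops(cus: int, clock_ghz: float) -> float:
    return cus * clock_ghz * 64 * 2 / 1000

ps5 = teraflops(36, 2.23)    # ~10.28 TF, at the variable peak clock
xsx = teraflops(52, 1.825)   # ~12.15 TF, at a fixed clock

print(f"PS5: {ps5:.2f} TF, XSX: {xsx:.2f} TF")
print(f"XSX advantage: {(xsx / ps5 - 1) * 100:.1f}%")
```

If the PS5's GPU throttles below 2.23 GHz under load, the gap widens toward the 20% figure being guessed at above.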

So Sony was able to build the entire SmartShift setup, the cooling solution, the decision to control by frequency with fixed power consumption, and everything else in a couple of days after MS revealed their specs? Or do we give them a couple of months, back when the rumours became more trustworthy?

The most you can grant "reactionary" is that Sony expected MS to go very high on CU count and end up with a high TF number, and chose the cheaper route of higher frequency instead, but that was decided something like two years ago.

My only issue is that they wanted to overclock anything at all and still went with friggin' AMD. We all know AMD wasn't as friendly towards increasing clock speeds as Intel, for whom overclocking is kind of their thing. 



It was Britain, it is America, tomorrow France and next year, the world... 

Warning: This poster has a very negative opinion of Sony and Nintendo, Idea Factory and companies Tecmo Koei, EA, BioWare, Blizzard, Treyarch, Infinity Ward, Kadokawa and Sega. If you have very positive views of these and a negative view of Microsoft or Bethesda Game Studios, AVOID ENGAGEMENT AT ALL COSTS! 

    DonFerrari said:
    Pemalite said:

    Because of the hardware reveals.
    And unlike console gamers... I have been playing around with SSDs for over a decade.

    I think you meant PS5, not PS4.
    Either way, the Xbox Series X having an SSD that is half the speed of the PS5's is, for all intents and purposes, still stupidly fast.
    The PS5's is just twice as fast.

    Let's not downplay anything here, let's be realistic of what both consoles offer.

    There are some possible edge-case scenarios where the Xbox Series X could potentially close the bandwidth gap by a few percentage points due to compression, but we will need to see the real-world implications of that... Because as I have alluded to before, possibly even in another thread... many data formats are already compressed and thus don't actually gain much from being compressed again. (Sometimes it can have the opposite effect and increase sizes.)
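That point about already-compressed data is easy to demonstrate with a quick zlib sketch; random bytes stand in here for already-compressed assets, since both are effectively incompressible:

```python
import os
import zlib

raw = os.urandom(64 * 1024)            # stand-in for already-compressed data
once = zlib.compress(raw, level=9)     # "compressing" it again...
twice = zlib.compress(once, level=9)   # ...and again

# zlib adds header and framing overhead to data it cannot shrink,
# so each pass on incompressible input slightly GROWS the output
print(len(raw), len(once), len(twice))
```

The console decompression blocks only pay off on data that was stored in their supported formats to begin with; re-squeezing a compressed texture gains nothing.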

    I am not religious. You seem to be getting upset over the specifications of certain machines? Might be a good idea to take a step back?

    Microsoft has similar proprietary technology to the PlayStation 5 on the compression front, and the Xbox Series X also includes a decompression block; that much we do know, it was in the reveal.
    https://news.xbox.com/en-us/2020/03/16/xbox-series-x-tech/

    Microsoft completely removes the burden from the Zen 2 cores.

    No. It is because people are enamored with their particular brand choice and cannot see where they potentially fall short or provide constructive criticism... Or just generally treading on the logical fallacy of hypothesis contrary to fact.

    The 2.23GHz -is- a boost clock. Sony/Cerny specifically mentioned SmartShift. - Unless you are calling Cerny a liar?

    https://www.anandtech.com/show/15624/amd-details-renoir-the-ryzen-mobile-4000-series-7nm-apu-uncovered/4

    And I quote Digital Foundry which quoted Cerny:
    "Rather than look at the actual temperature of the silicon die, we look at the activities that the GPU and CPU are performing and set the frequencies on that basis - which makes everything deterministic and repeatable," Cerny explains in his presentation. "While we're at it, we also use AMD's SmartShift technology and send any unused power from the CPU to the GPU so it can squeeze out a few more pixels."

    https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-specs-and-tech-that-deliver-sonys-next-gen-vision

    It is shifting TDP from one part of the chip to the other to boost clockrates. It's a boost clock.

    If the PlayStation 5 cannot maintain a 2.23GHz GPU clock in conjunction with a 3.5GHz CPU clock whilst pegging the I/O, then by extension... that 2.23GHz GPU clock is not the base clock; it is a boost clock, a best-case scenario.
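The shared-power-budget idea being debated can be sketched as a toy model. To be clear, the wattage figures and the linear clock scaling below are made up purely for illustration; AMD's actual SmartShift behaviour is not public:

```python
# Toy illustration of a fixed total power budget shared by CPU and GPU.
# All numbers are invented; this is not AMD's real SmartShift algorithm.
TOTAL_BUDGET_W = 200.0
FULL_SPEED_GPU_DRAW_W = 160.0   # assumed GPU draw at its maximum clock

def gpu_clock_ghz(cpu_draw_w: float, max_clock: float = 2.23) -> float:
    """Scale the GPU clock down as the CPU eats into the shared budget."""
    gpu_budget = TOTAL_BUDGET_W - cpu_draw_w
    if gpu_budget >= FULL_SPEED_GPU_DRAW_W:
        return max_clock
    # crude linear model: available clock scales with available power
    return max_clock * gpu_budget / FULL_SPEED_GPU_DRAW_W

print(gpu_clock_ghz(40.0))   # light CPU load: GPU holds its peak clock
print(gpu_clock_ghz(60.0))   # heavy CPU load: GPU throttles below peak
```

Whatever the real transfer curve looks like, the structural point stands: with a fixed total budget, both chips cannot draw their worst-case power at once.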

    Pretty sure that is not my exact statement and you are taking it right out of context.

    The Xbox Series X has the CPU, GPU and Memory bandwidth advantages, it is likely to show an advantage more often than not... Just like the Xbox One X compared to the Playstation 4 Pro.

    In simpler titles which won't use 100% of either console's capabilities... those games will look identical. - And that happens every console generation; there are base Xbox One and PlayStation 4 games with visual parity, right down to resolution and framerates.

    Big AAA exclusives are another kettle of fish and could make things interesting... But by and large, if graphics is the most important aspect, the Xbox Series X holds the technical edge due to the sheer number of additional functional units baked into the chip design.

    Brute forcing and using design tricks to get around hardware limitations will continue to exist because the SSD is still a limitation until it actually matches the RAM's bandwidth.

    Remember... We went from optical disks that could be measured in kilobytes per second to mechanical hard drives that could be measured in megabytes per second... With an accompanying reduction in seek times... Did game design change massively? For the most part, not really... And we are seeing a similar jump in storage capability by jumping from mechanical drives measured in megabytes per second to gigabytes per second, with an equally dramatic decrease in seek times.
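To put those storage jumps in perspective, here is rough load-time arithmetic for a 10 GB asset set; the drive speeds are ballpark figures for each era, not exact specs:

```python
# Approximate time to stream 10 GB at each storage generation's raw speed.
ASSET_SET_GB = 10.0
speeds_gb_per_s = {
    "optical (CD, ~150 KB/s)": 150e-6,
    "HDD (~100 MB/s)":         0.1,
    "XSX SSD (~2.4 GB/s raw)": 2.4,
    "PS5 SSD (~5.5 GB/s raw)": 5.5,
}

for name, gbps in speeds_gb_per_s.items():
    print(f"{name}: {ASSET_SET_GB / gbps:,.1f} s")
```

Each jump is an order of magnitude or more, yet as noted above, game design absorbed the previous ones without radical change; the SSD jump shrinks a ~100-second HDD load to a few seconds, which is a quality-of-life win before it is a design revolution.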

    People tend to gravitate towards games that they like... Developers then design games around that, hence why something like Battle Royale happened and then every developer and its pet dog jumped onto the bandwagon to make their own variant.

    Everyone copied Gears of War's "Horde mode" as well at one point.

    That's not to say that SSDs won't provide benefits, far from it.

    I would not be making such a claim just yet.

    The Xbox Series X is a chip with dramatically more functional units... It is only in scenarios where the PS5 has the same number of ROPs, TMUs, geometry units and so forth that it will be faster than the Xbox Series X due to its higher clockrates... And usually those units are tied somewhat to the number of shader groupings.

    The Xbox Series X could have the advantage on the GPU side across the board... The point I am making is that until we get the complete specs set, we just don't know yet.

    We do know the SSD is twice as fast as the Xbox Series X's... Which is what people are clinging to at the moment, as it's the only guaranteed superior metric.

    Precisely, we do need to account for everything. You can have the same Teraflops, but half the performance if the rest of the system isn't up to snuff.

    Only focusing on teraflops or only focusing on the SSD is extremely two-dimensional... And a disservice to the amount of engineering, research and development that Microsoft, Sony and AMD have put into these consumer electronic devices.

    It's more than just initial loading.

    Pretty sure the TMUs and ROPs haven't been revealed yet; don't count the chickens before the eggs have hatched.

    And rumor has it that the Xbox Series X could have 80 ROPs versus the PS5's 64 ROPs...
    https://www.techpowerup.com/gpu-specs/xbox-series-x-gpu.c3482

    In RDNA, AMD groups 1x rasterizer with every 4x render backends; obviously that can change with RDNA 2, but just some food for thought.
    https://www.amd.com/system/files/documents/rdna-whitepaper.pdf

    Which means that we could be looking at 20 rasterizers versus 16.
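That rasterizer arithmetic is straightforward; mirroring the counting used above (one rasterizer per four render backends, treating each rumored ROP as one backend):

```python
# Mirror of the post's counting: one rasterizer per four render backends.
# Note these ROP figures are rumored, not confirmed specs.
def rasterizers(rops: int, rops_per_rasterizer: int = 4) -> int:
    return rops // rops_per_rasterizer

print(rasterizers(80), "vs", rasterizers(64))   # XSX rumor vs PS5 rumor
```

Whether RDNA 2 keeps that grouping is an open question, as the post itself concedes.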

    I think you are looking for the shader processors, they have tried to take every aspect of the GPU into account rather than a pure focus on flops.

    AMD has "claimed" (Salts, grains, kittens and all that) that RDNA 2.0 is 50% more efficient than RDNA 1... Which was the same jump we saw between Vega and RDNA 1.

    Graphics tasks are highly parallel... AMD was struggling with CU scaling because GCN had intrinsic hardware limits; it was an architectural limitation in itself. We need to remember that when AMD debuted GCN, we were working with 32 CUs. AMD then stalled as the company's profits plummeted, and it had to make cutbacks everywhere in order not to go bankrupt, so they kept milking GCN longer than anticipated to keep R&D and engineering costs as low as possible.

    Higher frequency isn't always cheaper.
    The higher in frequency you go, the more voltage you need to dump into the design... And one aspect of chip yields is that not all chips can hit a certain clock frequency at a certain voltage due to leakage and so forth, which means the number of usable chips decreases and the cost per-chip increases.
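The voltage point has a well-known rule of thumb behind it: dynamic power scales roughly with frequency times voltage squared, so a modest clock bump that needs extra voltage gets expensive fast. A toy sketch with illustrative numbers (the baseline clock and voltage below are arbitrary, not any console's real figures):

```python
# Rule of thumb for dynamic power: P ~ C * f * V^2.
# Relative cost of running above an arbitrary baseline; leakage is ignored.
def rel_power(freq_ghz: float, volts: float,
              base_f: float = 1.8, base_v: float = 1.0) -> float:
    return (freq_ghz / base_f) * (volts / base_v) ** 2

# A ~24% clock bump that needs 10% more voltage costs ~50% more dynamic power.
print(rel_power(2.23, 1.1))
```

That quadratic voltage term is why chasing high clocks squeezes yields: only the lowest-leakage dies hit the target frequency inside the power and thermal budget.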

    It's actually a careful balancing act of chip size versus chip frequency. If you get everything lined up right... you can pull off an Nvidia Pascal and drive up clockrates significantly; however, Nvidia still had to spend a ton of transistors to reduce leakage and remove clockrate-limiting bottlenecks from their design, but it paid off.

    Yep, I understand it. But it would be quite asinine to have the chip cost the same as the Xbox's and deliver 20% less, don't you agree?

    And my point was more that, since a clockrate increase isn't so simple to do (even when the Xbox One was reacting to being weaker than the PS4, after both had revealed their specs, the increase was quite small and only on the CPU; the Xbox also had a much bigger box, so the cooling was likely easier to tweak for the CPU clock bump), the whole use of SmartShift, the two controllers for TDP, and everything else wasn't a reaction to MS.

    It was Sony thinking it was the best solution for them on the budget they had.

    SmartShift and the clockrates for the PlayStation 5 were neither last-minute nor reactionary; anyone who claims otherwise is being a bit silly. The Xbox One got away with it because its cooling solution was already over-engineered and the overclock wasn't that significant.

    --::{PC Gaming Master Race}::--

    The specs don't matter, since recent history shows that Xbox has been more hype than substance, thanks to their dwindling emphasis on first-party exclusives over the years. I remember the Xbox One presser like it was yesterday, and the talk of all these exclusives to come, which didn't. Let's just say the Xbox is indeed more powerful, say 15%; it won't matter, as multiplats will be on par across platforms. Even PC development holds back for parity.

    As for exclusives, it comes down to budget, the talent of the team, the ambition of the director and design team, the game engine, maximising efficiency within that engine with constant tweaks, and being as efficient as possible with the architecture of the console being worked on, which is ongoing until the end. Sony has amazingly talented studios who, in my opinion, are the best in the industry, so I expect Sony, along with Nintendo, to continue the trend of great AAA titles. Microsoft, once again, needs to prove a lot.

    Having said all this, I haven't seen anything revolutionary or evolutionary in terms of AI or physics, or any true game changer, from either new console. If next gen is mainly about 4K, 60fps and better load times, then I'll be disappointed. Even ray tracing on consoles isn't that great yet, or that big of a deal. I hope for some real tech demos and presentations that show unique gameplay only possible on next gen.



    DonFerrari said:
    Yep, I understand it. But it would be quite asinine to have the chip cost the same as the Xbox's and deliver 20% less, don't you agree?

    And my point was more that, since a clockrate increase isn't so simple to do (even when the Xbox One was reacting to being weaker than the PS4, after both had revealed their specs, the increase was quite small and only on the CPU; the Xbox also had a much bigger box, so the cooling was likely easier to tweak for the CPU clock bump), the whole use of SmartShift, the two controllers for TDP, and everything else wasn't a reaction to MS.

    It was Sony thinking it was the best solution for them on the budget they had.

    I agree. Form factor could also play a role, though. MS basically went with a mini PC design, which will be hard to fit in most people's TV setups, and maybe Sony didn't want to take that route. But my guess is that the PS5 will be at least $100 cheaper.

    Intrinsic said:
    Jranation said:
    Why is the acronym for Xbox Series X XseX? Sounds a bit rude.

    Don't blame me... tell MS to name their consoles better.

    X360
    XB1
    XB1S
    XB1SSAD
    XB1X
    XSX
    XSS?(lockhart)

    And the shocking thing is that they all have really cool project names. I wonder what they would call the eventual mid-gen refresh. Xbox Series X Two? Xbox Series XX? Xbox Series Xb? Xbox Series X 2024? Gotta wonder who picks these names. 

    Wrong, XB1 was actually the Xbone.

    duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

    http://gamrconnect.vgchartz.com/post.php?id=8808363

    Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

    http://gamrconnect.vgchartz.com/post.php?id=9008994

    AsGryffynn said:
    DonFerrari said:

    So Sony was able to build the entire SmartShift setup, the cooling solution, the decision to control by frequency with fixed power consumption, and everything else in a couple of days after MS revealed their specs? Or do we give them a couple of months, back when the rumours became more trustworthy?

    The most you can grant "reactionary" is that Sony expected MS to go very high on CU count and end up with a high TF number, and chose the cheaper route of higher frequency instead, but that was decided something like two years ago.

    My only issue is that they wanted to overclock anything at all and still went with friggin' AMD. We all know AMD wasn't as friendly towards increasing clock speeds as Intel, for whom overclocking is kind of their thing. 

    The problem with Intel is that although their CPUs are good, their GPUs are shitty, and the APU would be even worse.


    goopy20 said:
    DonFerrari said:

    Yep, I understand it. But it would be quite asinine to have the chip cost the same as the Xbox's and deliver 20% less, don't you agree?

    And my point was more that, since a clockrate increase isn't so simple to do (even when the Xbox One was reacting to being weaker than the PS4, after both had revealed their specs, the increase was quite small and only on the CPU; the Xbox also had a much bigger box, so the cooling was likely easier to tweak for the CPU clock bump), the whole use of SmartShift, the two controllers for TDP, and everything else wasn't a reaction to MS.

    It was Sony thinking it was the best solution for them on the budget they had.

    I agree. Form factor could also play a role, though. MS basically went with a mini PC design, which will be hard to fit in most people's TV setups, and maybe Sony didn't want to take that route. But my guess is that the PS5 will be at least $100 cheaper.

    Nahhh, I'm quite happy with the minitower design of the Xbox; it seems quite sleek and pretty to me. I guess with some creativity most people will be able to fit it in their room.


    I hope that both parties' offerings are not hard for devs to work with.
    Standard solutions for multiplatform devs are very important: the best-looking FIFA, the best-looking CoD, etc.



     "I think people should define the word crap" - Kirby007

    Join the Prediction League http://www.vgchartz.com/predictions

    DonFerrari said:
    AsGryffynn said:

    My only issue is that they wanted to overclock anything at all and still went with friggin' AMD. We all know AMD wasn't as friendly towards increasing clock speeds as Intel, for whom overclocking is kind of their thing. 

    The problem with Intel is that although their CPUs are good, their GPUs are shitty, and the APU would be even worse.

    And if "overclocking was Intel's thing", Intel wouldn't have disabled the ability on most of its CPUs.




    Pemalite said:
    DonFerrari said:

    The problem with Intel is that although their CPUs are good, their GPUs are shitty, and the APU would be even worse.

    And if "overclocking was Intel's thing", Intel wouldn't have disabled the ability on most of its CPUs.

    Thanks for complementing my point.

    We could just as well say that if Sony or MS really wanted the best GPU, they should have gone to Nvidia. That misses, of course, that Nvidia is more expensive and doesn't have a CPU for an APU (though they could surely cook something up with Intel and Nvidia for the best performance if they didn't care about cost).


