
PS5 To Feature Full Discrete GPU Not APU [RUMOUR]

Why? Why drive up costs needlessly?

Last edited by Leynos - on 25 May 2020

Bite my shiny metal cockpit!


I still have not seen any official confirmation that the PS5 is using an APU. Since seeing The Road to PS5 presentation, I have been wondering if they were using a chiplet-based SoC. The examples they showed, and the mention of SmartShift, made it seem like they were hinting towards that.

A chiplet-based SoC with separate CPU, GPU, and I/O chiplets would be very interesting. It could be cheaper than a single monolithic APU due to higher yields, and could also offer better performance per chip.

Looking forward to the PS5 teardown Mark Cerny mentioned.



Stop hate, let others live the life they were given. Everyone has their problems, and no one should have to feel ashamed for the way they were born. Be proud of who you are, encourage others to be proud of themselves. Learn, research, absorb everything around you. Nothing is meaningless, a purpose is placed on everything no matter how you perceive it. Discover how to love, and share that love with everything that you encounter. Help make existence a beautiful thing.

Kevyn B Grams
10/03/2010 

KBG29 on PSN&XBL

KBG29 said:

A chiplet-based SoC with separate CPU, GPU, and I/O chiplets would be very interesting.

Not going to happen. Waaaaay too expensive for a console. Chiplets are a cool thing for stuff like transaction servers that might have hundreds/thousands of processors that do very basic things, like answering questions.



drkohler said:
KBG29 said:

A chiplet-based SoC with separate CPU, GPU, and I/O chiplets would be very interesting.

Not going to happen. Waaaaay too expensive for a console. Chiplets are a cool thing for stuff like transaction servers that might have hundreds/thousands of processors that do very basic things, like answering questions.

Please explain, because that goes against everything I have read about why AMD, Intel, and Nvidia are all moving towards chiplets for consumer and enterprise CPUs and GPUs, and away from monolithic chips.

Cost and yields were the main factors AMD spoke about when they revealed they were separating the I/O from the main CPU die, why they have been investing heavily in Infinity Fabric, and why they are not throwing 16 or 32 cores onto one massive die.

Apple has also been doing the same thing, adding more and more fixed-function chips to aid the CPU and GPU, instead of throwing more silicon into a single monolithic chip.




KBG29 said:
drkohler said:

Not going to happen. Waaaaay too expensive for a console. Chiplets are a cool thing for stuff like transaction servers that might have hundreds/thousands of processors that do very basic things, like answering questions.

Please explain, because that goes against everything I have read about why AMD, Intel, and Nvidia are all moving towards chiplets for consumer and enterprise CPUs and GPUs, and away from monolithic chips.

Cost and yields were the main factors AMD spoke about when they revealed they were separating the I/O from the main CPU die, why they have been investing heavily in Infinity Fabric, and why they are not throwing 16 or 32 cores onto one massive die.

Apple has also been doing the same thing, adding more and more fixed-function chips to aid the CPU and GPU, instead of throwing more silicon into a single monolithic chip.

A Zen 2 8-core CPU chiplet is about 76mm2 and the Radeon 5700 GPU die is about 251mm2, whereas the XBSX APU is about 360mm2. The XBSX APU would be close to 5X the size of the CPU chiplet and about 1.4X as big as the GPU die, though with semi-custom modifications that would differ somewhat. Either way SNY should end up with considerably more usable silicon overall, which could very well keep the price down. It may also be part of the reason why they can crank the GPU clocks so high. If the dies are separated on the same interposer, or completely separate on the mobo, that would also help with heat dissipation. I would also think this should make the PS5 upgrade version(s) down the road simpler and cheaper for SNY.
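As a rough sanity check on the yield side of that argument, here is a back-of-the-envelope comparison using a simple Poisson yield model; the defect density is an assumed illustrative value, not a published foundry figure, and the model ignores salvaging/binning and the fact that a bad chiplet only scraps its own small die rather than the whole APU.

import math

D0 = 0.09  # assumed defects per cm^2 (illustrative, not a published figure)

def poisson_yield(area_mm2, d0=D0):
    # Fraction of dies expected to come out defect-free for a given area.
    return math.exp(-d0 * area_mm2 / 100.0)  # area converted from mm^2 to cm^2

monolithic = poisson_yield(360)   # ~XBSX-sized monolithic APU
cpu_chiplet = poisson_yield(76)   # Zen 2 CPU chiplet
gpu_die = poisson_yield(251)      # Navi 10 class GPU die

print(f"~360 mm^2 monolithic APU yield: {monolithic:.1%}")
print(f"  76 mm^2 CPU chiplet yield:    {cpu_chiplet:.1%}")
print(f" 251 mm^2 GPU die yield:        {gpu_die:.1%}")
# A defective small die only scraps itself, not the whole package,
# which is where the "more usable silicon" argument comes from.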



EricHiggin said:

Either way SNY should end up with considerably more usable silicon overall, which could very well keep the price down. It may also be part of the reason why they can crank the GPU clocks so high. If the dies are separated on the same interposer, or completely separate on the mobo, that would also help with heat dissipation. I would also think this should make the PS5 upgrade version(s) down the road simpler and cheaper for SNY.

Let me explain my "waaay too expensive" bit a little further.

First of all, "expensive" in the technological sense does not only mean $ spent, it also means power required (when chip designers talk, they usually think solely of the second part).

Chiplets come into play when you think in terms of a "construction kit" approach to building a whole series of systems. Depending on what the target is, you add chiplets to your designs until you reach your goal. If you have a goal in mind like "different (amounts of) CPUs interact with different (amounts of) GPUs", then chiplets are the way to go, no question about that.

A console has one single goal, and one single goal only. It is the exact opposite of a "construction kit". It consists of a single set of masks for an SoC or APU, whatever you want to call its core piece. Any set of masks is a very expensive thing to calculate and manufacture, so you want to get away with as few sets as possible.

Now if you propose to make separate dies on an interposer, or even worse, separate on the mobo, the first thing you need is three sets of masks. One for the CPU, one for the I/O (a rather simple set though), and one for the GPU. Notice that in a console, none of those mask sets are off-the-shelf things, as all three dies are custom tailored. So on the $ spent side, we are already worse off with separate dies (we leave out the interposer thingie, which also adds costs). Next is the fact that an I/O chiplet is just that, a thingie that basically has transceivers that can send/receive data to/from the other chiplets. That thingie needs power to work. If you put your chiplets completely separate on the mobo, you can save the interposer, but you need a much more powerful I/O chip - as soon as you go "outside" of a chip, your transceivers require a lot of power. Simply put: more distance = more power required.
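To put illustrative numbers on that last point, here is a rough estimate of the power spent just moving data over links of increasing reach; the bandwidth and the pJ/bit figures are assumptions chosen to show the trend, not measured values for any product.

# Link power is roughly bandwidth * energy per bit. The pJ/bit values
# below are assumed ballpark figures for each class of link.
BANDWIDTH_BITS_PER_S = 448e9 * 8   # assume ~448 GB/s has to cross the link

energy_pj_per_bit = {
    "on-die wiring (monolithic SoC)": 0.1,  # assumption
    "silicon interposer (2.5D)":      0.5,  # assumption
    "package/PCB traces":             2.0,  # assumption
}

for link, pj in energy_pj_per_bit.items():
    watts = BANDWIDTH_BITS_PER_S * pj * 1e-12
    print(f"{link:32s} ~{watts:4.1f} W just to move the data")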

Short summary: a single SoC requires less power than a chiplet design, is easier to "maintain consistency", and all things considered costs less to make (even if there might be a small redundancy loss when you have to toss away an entire chip if either the CPU or, less likely, the GPU part malfunctions).

It will be interesting to see what's in store in a few years if there is another generation of consoles. I think we will likely still see SoCs with lots of coprocessors in them. Unless AMD starts building "coprocessor chiplets", something like I/O combined with Sony's SSD/Tempest coprocessors, and other yet-to-be-invented stuff.



drkohler said:
EricHiggin said:

Either way SNY should end up with considerably more usable silicon overall, which could very well keep the price down. It may also be part of the reason why they can crank the GPU clocks so high. If the dies are separated on the same interposer, or completely separate on the mobo, that would also help with heat dissipation. I would also think this should make the PS5 upgrade version(s) down the road simpler and cheaper for SNY.

Let me explain my "waaay too expensive" bit a little further.

First of all, "expensive" in the technological sense does not only mean $ spent, it also means power required (when chip designers talk, they usually think solely of the second part).

Chiplets come into play when you think in terms of a "construction kit" approach to building a whole series of systems. Depending on what the target is, you add chiplets to your designs until you reach your goal. If you have a goal in mind like "different (amounts of) CPUs interact with different (amounts of) GPUs", then chiplets are the way to go, no question about that.

A console has one single goal, and one single goal only. It is the exact opposite of a "construction kit". It consists of a single set of masks for an SoC or APU, whatever you want to call its core piece. Any set of masks is a very expensive thing to calculate and manufacture, so you want to get away with as few sets as possible.

Now if you propose to make separate dies on an interposer, or even worse, separate on the mobo, the first thing you need is three sets of masks. One for the CPU, one for the I/O (a rather simple set though), and one for the GPU. Notice that in a console, none of those mask sets are off-the-shelf things, as all three dies are custom tailored. So on the $ spent side, we are already worse off with separate dies (we leave out the interposer thingie, which also adds costs). Next is the fact that an I/O chiplet is just that, a thingie that basically has transceivers that can send/receive data to/from the other chiplets. That thingie needs power to work. If you put your chiplets completely separate on the mobo, you can save the interposer, but you need a much more powerful I/O chip - as soon as you go "outside" of a chip, your transceivers require a lot of power. Simply put: more distance = more power required.

Short summary: a single SoC requires less power than a chiplet design, is easier to "maintain consistency", and all things considered costs less to make (even if there might be a small redundancy loss when you have to toss away an entire chip if either the CPU or, less likely, the GPU part malfunctions).

It will be interesting to see what's in store in a few years if there is another generation of consoles. I think we will likely still see SoCs with lots of coprocessors in them. Unless AMD starts building "coprocessor chiplets", something like I/O combined with Sony's SSD/Tempest coprocessors, and other yet-to-be-invented stuff.

I get that going to chiplets wouldn't bring only positives.

While we don't know for certain, it's heavily rumored that the reason the 'next gen' XBs are called "Series" is because they will be a series of hardware: XBSX and 'XBSS', and perhaps more at some point in time. If this was/is the case, then why would MS build an APU for the XBSX, unless the 'XBSS' was just a slightly cut-down version of it? The rumors don't suggest that at all.

Yet on the other hand, as far as we know, the PS5 will be the sole console, at least until a probable Pro version later down the line. Now it could be an APU, but there seem to be signs that it could possibly be chiplet based, which would have its positives but also its negatives.

If this were to be the case, it would be backwards based on what you've said. Even if PS5 is a typical APU after all, the question would remain as to why MS wouldn't use chiplets for a series of consoles, if that's actually the case.

Considering how often MS and SNY create the same general type of networks, services, consoles, etc., if MS is really launching a mid-range and an upper-tier console, then it wouldn't be out of the realm of possibility for SNY to do the same. Especially if one of the brands is using chiplets, and even more so if the design is as future-proof as it can be for later upgrade models.

AMD themselves also sell lower-end APUs as well as lower-end CPUs and GPUs. Why wouldn't AMD just have one or the other for general consumers? CPU + GPU only, or APU only. It's not like AMD can't make worthy APUs, considering what they've done with the Ryzen G series. They made some pretty decent ones for the present-gen consoles, as well as a beast for the XBSX, though we've yet to see that one in action.

Last edited by EricHiggin - on 29 May 2020

EricHiggin said:

Frank mentions that the reason SmartShift is being marketed now is because design features that are already in the APU's core design are being implemented into discrete CPU and GPU solutions now. I can't help but wonder, with the PS4s, XB1s, and XBSX being APUs that we know for certain, which should naturally have SmartShift already, could the fact that SNY is marketing SmartShift for PS5 be a hint that it's actually going to be a discrete CPU and GPU? Basically a semi-custom Ryzen 3700 and Radeon '6700'?

It's an APU. It's been confirmed. This debate is done and dusted, anything else is just conspiracies.

SmartShift is an AMD APU technology.

EricHiggin said:

It's also mentioned that in laptops, the Radeon 5600 was expected to offer GTX 1660 Ti performance that could boost to RTX 2060 levels, but it's closer to RTX 2060 performance that can boost to RTX 2070. CPU-intensive games drop performance to a 1660 Ti level though. With the PS5 GPU basically being a Radeon '6700' (a next-gen RDNA 5700), the performance of that should be really darn impressive.

Impressive? Not really. But fantastic for a cost-sensitive device that sits on the upper mid-range of the PC market.

EricHiggin said:

Frank also points out how much money in general SmartShift can save on hardware design. Since games tend to favor either the CPU or the GPU, instead of beefing up your hardware you can use SmartShift on 'lesser' hardware and still get beefier-hardware-like performance, while also saving money by not having to use beefier hardware that's not being put to full use. This could mean PS5 may perform closer to XBSX than we might assume, while also helping to keep the PS5 price down.

Nah.

SmartShift is basically a technology which allows for the allocation of TDP (aka power + thermals) depending on need.
Having that uncapped and not needing SmartShift is the obvious preferred solution, and it results in superior performance.

The issue is, notebooks and consoles can only have a finite amount of cooling and power consumption due to form factor and where they are positioned (a cabinet underneath a TV, for instance), which limits thermal dissipation.

Sony has opted to put a cap on total system performance and allow the developers to prioritize CPU or GPU performance based on need... Whereas Microsoft's approach is to give us all the performance, all the time.
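A minimal sketch of that budget-shifting idea, purely for illustration; the wattages, floors, and split heuristic below are assumptions, not AMD's actual SmartShift algorithm.

TOTAL_BUDGET_W = 200.0              # assumed shared CPU+GPU power budget
CPU_MIN_W, GPU_MIN_W = 25.0, 80.0   # assumed floors so neither block starves

def split_budget(cpu_util, gpu_util):
    # Give the busier block the larger share of a fixed total budget.
    cpu_w = TOTAL_BUDGET_W * cpu_util / (cpu_util + gpu_util)
    # Clamp so both floors are respected and the total never changes.
    cpu_w = min(max(cpu_w, CPU_MIN_W), TOTAL_BUDGET_W - GPU_MIN_W)
    return cpu_w, TOTAL_BUDGET_W - cpu_w

print(split_budget(cpu_util=0.3, gpu_util=0.95))  # GPU-bound scene -> (48.0, 152.0)
print(split_budget(cpu_util=0.9, gpu_util=0.5))   # CPU-heavy tick  -> (120.0, 80.0)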

Microsoft's console has the CPU edge and the GPU edge, and SmartShift doesn't change that bar in the slightest. Is the difference going to be relevant? Probably not. But technically, without a doubt, the Xbox Series X has the overall performance edge, by about 20% in GPU compute scenarios.
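For reference, that rough 20% figure falls straight out of the publicly stated GPU specs (52 CUs at a fixed 1.825 GHz for XSX versus 36 CUs at up to 2.23 GHz for PS5):

def fp32_tflops(cus, clock_ghz):
    # Peak FP32 = CUs * 64 shaders * 2 ops per clock (FMA) * clock.
    return cus * 64 * 2 * clock_ghz / 1000.0

xsx = fp32_tflops(52, 1.825)   # fixed clock
ps5 = fp32_tflops(36, 2.23)    # variable, "up to" clock
print(f"XSX {xsx:.2f} TF vs PS5 {ps5:.2f} TF -> {xsx / ps5 - 1:.0%} more peak compute")
# -> roughly 12.15 TF vs 10.28 TF, about an 18% gap at peak clocks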


EricHiggin said:

44:00 - 47:00 is where he is specifically asked about PS5. He makes a point about the current consoles having monolithic APUs and laptops having discrete CPUs and GPUs, and says that because of that, PS5 may have a custom implementation of SmartShift for an APU architecture. Though he also says you can assume that based on how the consoles have been architected up until now. So does "up until now" mean that PS5 no longer follows that same monolithic APU design? He already mentioned SmartShift is in APUs as part of their core design, so why would SNY market that if PS5 had an APU?

This is just a random YouTube video; they aren't gospel, they just conform to certain confirmation biases.

EricHiggin said:

I wonder if SNY has access to this level of AMD tech and tool sets? (Not Epyc, but a Ryzen version anyway.)

Epyc is a variant of Zen, which is the same as Ryzen.

AMD is open to building and supporting any semi-custom design requests if you have the wallet for it.

KBG29 said:
I still have not seen any official confirmation that the PS5 is using an APU. Since seeing The Road to PS5 presentation, I have been wondering if they were using a chiplet-based SoC. The examples they showed, and the mention of SmartShift, made it seem like they were hinting towards that.

A chiplet-based SoC with separate CPU, GPU, and I/O chiplets would be very interesting. It could be cheaper than a single monolithic APU due to higher yields, and could also offer better performance per chip.

Looking forward to the PS5 teardown Mark Cerny mentioned.

Sony has gone with a conservative chip size this time around and invested in clockspeeds to increase performance. It's going to be an APU.

KBG29 said:

Please explain, because that goes against everything I have read about why AMD, Intel, and Nvidia are all moving towards chiplets for consumer and enterprise CPUs and GPUs, and away from monolithic chips.


If that were the case, AMD's recent APUs would use a chiplet design, but they don't.
The Xbox Series X would have also used a chiplet design, but it doesn't.

There is certainly a move to chiplets and/or chip stacking, but for APUs, we are a long way from that.



--::{PC Gaming Master Race}::--

Pemalite said:

Sony has gone with a conservative chip size this time around and invested in clockspeeds to increase performance. It's going to be an APU.

The PS5 APU size hasn't been revealed yet, and to be honest I wouldn't be surprised if it's around the same size as the XSX one, as the I/O complex could be pretty massive judging by its capabilities, and it includes SRAM.

@topic: that picture screams APU to me