
John Kodera: "PS4 Is Entering The Final Stages Of Its Life-Cycle."

Pemalite said:
Bofferbrauer2 said:

As if you weren't aware that GPUs get clocked down to fit within a console's tight power consumption limits. Don't expect the PS5 to come with fewer than 56 CUs but higher clock speeds instead.

Uh. What? It's a balancing act. So yes, I am aware.
Increasing the size of the chip by adding more CUs directly increases power consumption, as every single transistor you add requires energy.

There is a reason why Nvidia, with Pascal, decided not to blow out transistor counts and instead focused on driving up the clock rates of its chips.
It was what offered the best performance per unit of power for a given chip size.

At 7nm (I hate using that term, because it's not a real 7nm process) you can use the extra TDP headroom to drive up clock rates. (Provided you have the appropriate layout, transistor types, etc.)
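To make that trade-off concrete, here is a rough back-of-the-envelope sketch. The CU counts, clocks and voltages below are purely hypothetical, and the power model is just the usual dynamic-power approximation (size × frequency × voltage²), not a measurement of any real chip.

```python
# Rough sketch of the "more CUs vs higher clocks" trade-off.
# Assumptions (illustrative only, not real silicon):
#   - each GCN-style CU has 64 shaders doing 2 FLOPs per cycle (FMA)
#   - dynamic power scales roughly with transistor count * frequency * voltage^2
#   - voltage has to climb as frequency is pushed up the curve

def tflops(cus, mhz):
    """Theoretical FP32 throughput of a GCN-style GPU."""
    return cus * 64 * 2 * mhz * 1e6 / 1e12

def relative_power(cus, mhz, millivolts):
    """Unitless dynamic-power estimate: size * frequency * voltage^2."""
    return cus * mhz * millivolts ** 2

wide_and_slow   = (64, 1200, 950)   # many CUs at a modest clock/voltage (hypothetical)
narrow_and_fast = (44, 1745, 1150)  # fewer CUs pushed up the V/f curve (hypothetical)

baseline = relative_power(*wide_and_slow)
for label, cfg in (("wide/slow", wide_and_slow), ("narrow/fast", narrow_and_fast)):
    cus, mhz, mv = cfg
    print(f"{label:11s}: {cus} CUs @ {mhz} MHz -> {tflops(cus, mhz):4.1f} TFLOPS, "
          f"relative power {relative_power(*cfg) / baseline:.2f}x")
```

Under these made-up numbers the two configurations land on essentially the same theoretical throughput, but the narrow-and-fast one burns roughly half again as much dynamic power, which is the balancing act being described.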

 

Bofferbrauer2 said:

That won't work out unless Navi's base clock is above 1800MHz, which I very much doubt.

Why not?

There are Pascal chips that boost to 2GHz at 16nm. And that is before overclocking.

Even Pascal's base clock is only around 1400MHz, far from the 1800 I was talking about. It's a big step up from Maxwell's 1000MHz, but not as huge as you make it out to be.

They can clock faster, but the TDP goes up accordingly. This includes the 2GHz models; just check their actual power consumption. Hint: it's north of 300W in the case of a 1080 (non-Ti).

And in case you didn't get it yet, GCN was never meant for such high clock speeds. Vega is mostly clocked too high for its own good in order to limit the distance between themselves and Nvidia; it's the only way they could do it due to the hard 64 CU limit with Vega. As a result, the power consumption explodes. A Vega at 1200MHz consumes much less than it does at 1400MHz, where most consumer cards are clocked. The Vega in the Ryzen APU is clocked less aggressively and, as a result, consumes much less power.

Bofferbrauer2 said:

To get enough distance between themselves and the One X (40 CUs @ 1172MHz), 56 CUs will already have to come in at about 1500MHz. That's already close to the maximum clock rate for Vega, and at 14nm it is definitely too much power consumption and heat for a console. At 7nm this should be much more feasible, but it will still draw a lot of power.

The Xbox One X is using an older, Polaris-derived part.
As someone who owns both a Polaris GPU and the Xbox One X, I can assure you they are both inefficient, slow, mid-range hardware.

I mean shit... Neither has draw-stream rasterization, primitive shaders or rapid packed math... Graphics Core Next in the consoles is simply slow, old and inefficient, and the Xbox One X is no exception. It is not overly difficult to make big performance gains.
Heck, AMD hasn't even enabled draw-stream rasterization in its drivers, and with Vega it relegated primitive shaders to something developers have to opt into via an API; those are efficiency gains going to waste.

Fact is... 64 CUs are enough for next gen, given ample clock rate and architectural refinement.
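For reference, the raw numbers being argued over work out as follows. The Xbox One X row uses its published specs; the 56 CU and 64 CU rows are the hypothetical configurations from this exchange, plugged into the standard GCN throughput formula.

```python
# GCN FP32 throughput = CUs * 64 shaders * 2 FLOPs/cycle * clock.
# The One X row uses its published specs; the other rows are the hypothetical
# next-gen configurations being debated in this thread.

def gcn_tflops(cus, mhz):
    return cus * 64 * 2 * mhz * 1e6 / 1e12

configs = [
    ("Xbox One X (actual)",              40, 1172),
    ("56 CUs @ 1500 MHz (hypothetical)", 56, 1500),
    ("64 CUs @ 1800 MHz (hypothetical)", 64, 1800),
]

for name, cus, mhz in configs:
    print(f"{name:34s} {gcn_tflops(cus, mhz):5.1f} TFLOPS")
# -> roughly 6.0, 10.8 and 14.7 TFLOPS respectively
```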

If Vega were much faster than Polaris at the same clock speed, I would agree. But a Polaris simulated at the same clock speed is only marginally slower than Vega (though less power-hungry). There's a big reason why Vega is considered so disappointing.

Bofferbrauer2 said:

@bolded: Those are part of the Compute Units (unless you meant CPU Cache too)

False.
I suggest you look at this layout.


Oh ffs, that's a block diagram, not a layout!

And even then: you can see the L2 cache, so where do you think the L1 caches are? Yep, that's right: in the Compute Units. And the Graphics Pipeline in that diagram is just the front-end of a Compute Unit. What's marked NCU are the cores of each Compute Unit.



DonFerrari said:
Pemalite said:

Not really.
You could have 64 CUs @ 300MHz.
Or you could have 32 CUs @ 600MHz.

They would hypothetically have the same output.
...But that also ignores things like memory buses, caches, various fixed-function pipelines and so on.

Isn't it the case that more CUs would give you more connections and therefore more bandwidth?

Not always, because the CUs themselves aren't tied to the actual ROPs.
Vega 56 and Vega 64 have the same number of ROPs despite having different CU counts... The only reason Vega 56, despite having fewer CUs, has lower bandwidth is its lower memory clock.
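A small sketch of that decoupling, using the published reference specs for the two Vega 10 cards (both have 64 ROPs and a 2048-bit HBM2 bus; only the CU count, core boost clock and memory data rate differ):

```python
# Shader throughput tracks CUs * clock, pixel fillrate tracks ROPs * clock,
# and bandwidth tracks bus width * memory data rate -- three separate knobs.
# Figures below are the published reference specs for the two Vega 10 cards.

cards = {
    #            CUs  ROPs  core MHz  bus bits  mem Gbps (per pin)
    "Vega 56": ( 56,  64,   1471,     2048,     1.6),
    "Vega 64": ( 64,  64,   1546,     2048,     1.89),
}

for name, (cus, rops, core_mhz, bus_bits, gbps) in cards.items():
    tflops    = cus * 64 * 2 * core_mhz * 1e6 / 1e12
    gpix_s    = rops * core_mhz * 1e6 / 1e9
    bandwidth = bus_bits * gbps / 8          # GB/s
    print(f"{name}: {tflops:4.1f} TFLOPS, {gpix_s:5.1f} Gpix/s, {bandwidth:5.1f} GB/s")
```

The CU count only moves the first column; the ROP throughput is nearly identical, and the bandwidth gap comes entirely from the memory clock, which is the point being made above.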

Bofferbrauer2 said:

Even Pascal's base clock is only around 1400MHz, far from the 1800 I was talking about. It's a big step up from Maxwell's 1000MHz, but not as huge as you make it out to be.

Please use the quoting system properly. Otherwise I will not bother to reply in future.

And 1400MHz isn't a big step up from 1000MHz? 40% isn't a big step up? Seriously? And that is in conjunction with more transistors as well.

Some 1080 Tis have a base clock of 1632MHz, but they spend the bulk of their time at a much higher clock rate.
That is a minimum 63% increase over the 980 Ti.

Bofferbrauer2 said:
They can clock faster, but the TDP goes up accordingly. This includes the 2GHz models; just check their actual power consumption. Hint: it's north of 300W in the case of a 1080 (non-Ti).

Not always. You see, transistors have an efficiency curve; once you hit the right voltage at the right frequency, you get an optimal amount of performance per watt. It's a very simple concept.

The 1080 Ti has a TDP of 250W; the 980 Ti has a TDP of 250W.
Actual power consumption hasn't blown out like you imply, either.
https://www.anandtech.com/show/11180/the-nvidia-geforce-gtx-1080-ti-review/16

Clocking that much higher within the same 250W envelope is not insignificant.
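A toy illustration of that efficiency curve: performance scales roughly linearly with clock, but dynamic power scales with frequency × voltage², and voltage has to climb as the clock is pushed, so performance per watt peaks somewhere in the middle. All of the voltage/frequency points and constants below are invented purely to show the shape of the curve.

```python
# Performance ~ clock; power ~ static + k * f * V^2; voltage rises with clock.
# Every number here is made up purely to show the shape of the curve.

vf_points = [  # (MHz, volts) -- hypothetical operating points
    (600, 0.78), (800, 0.80), (1000, 0.85), (1200, 0.95), (1400, 1.10), (1600, 1.25),
]

STATIC_W = 30.0  # assumed leakage / always-on power
K = 0.10         # assumed dynamic-power scaling constant

for mhz, volts in vf_points:
    power = STATIC_W + K * mhz * volts ** 2
    print(f"{mhz:4d} MHz @ {volts:.2f} V -> {power:5.1f} W, {mhz / power:4.2f} perf/W")
# perf/W rises to a peak around the middle of the curve, then falls off
# as the voltage needed for higher clocks eats the gains.
```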

Bofferbrauer2 said:
And in case you didn't get it yet, GCN was never meant for such high clock speeds.

Graphics Core Next is an extremely modular design.
AMD took GCN and tuned it to clock higher with Vega; AMD actually spent the bulk of its extra transistor budget over Fiji on achieving that, so it was designed for high clock speeds.
But don't take my word for it: https://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-64-and-56-review/2

Bofferbrauer2 said:
Vega is mostly clocked too high for its own good in order to limit the distance between themselves and Nvidia; it's the only way they could do it due to the hard 64 CU limit with Vega. As a result, the power consumption explodes. A Vega at 1200MHz consumes much less than it does at 1400MHz, where most consumer cards are clocked. The Vega in the Ryzen APU is clocked less aggressively and, as a result, consumes much less power.

I have already touched on why Vega is inefficient earlier in this thread. It's not just down to the clock rates.
The entire Graphics Core Next architecture is inefficient regardless of clock or product segment.

Fact of the matter is... GPUs like Pascal are doing tile-based rasterization; GCN isn't.
Again, don't take my word for it:
https://www.anandtech.com/show/10536/nvidia-maxwell-tile-rasterization-analysis
https://forum.beyond3d.com/threads/amd-vega-hardware-reviews.60246/page-59#post-1997699
https://forum.beyond3d.com/threads/amd-vega-hardware-reviews.60246/page-45#post-1995903
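For readers unfamiliar with the term, here is a toy sketch of what tile-based (binned) rasterization means: the screen is divided into small tiles, triangles are binned to the tiles they touch, and the GPU then shades one tile at a time so the framebuffer working set can stay in on-chip cache instead of spilling to DRAM. This is a conceptual illustration only, not how Pascal or Maxwell actually implement it.

```python
# Toy illustration of tile-based ("binned") rasterization: assign each triangle
# to the screen tiles its bounding box overlaps, then process one tile at a time.
# Conceptual sketch only -- real hardware binning is far more sophisticated.

from collections import defaultdict

TILE = 32  # tile size in pixels (illustrative)

def bin_triangle(bins, tri_id, verts):
    xs = [x for x, _ in verts]
    ys = [y for _, y in verts]
    for ty in range(int(min(ys)) // TILE, int(max(ys)) // TILE + 1):
        for tx in range(int(min(xs)) // TILE, int(max(xs)) // TILE + 1):
            bins[(tx, ty)].append(tri_id)

bins = defaultdict(list)
bin_triangle(bins, 0, [(10, 10), (60, 12), (20, 70)])        # spans several tiles
bin_triangle(bins, 1, [(100, 100), (110, 105), (104, 120)])  # fits inside one tile

for tile, tris in sorted(bins.items()):
    print(f"tile {tile}: triangles {tris}")
```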


Bofferbrauer2 said:
Oh ffs, that's a block diagram, not a layout!

Correct. But the purpose of the block diagram is to show which units are paired with what in a graphical layout.

Bofferbrauer2 said:

And even then: you can see the L2 cache, so where do you think the L1 caches are? Yep, that's right: in the Compute Units. And the Graphics Pipeline in that diagram is just the front-end of a Compute Unit. What's marked NCU are the cores of each Compute Unit.

There is more to it than just the L1 caches.
The rest is just a rehash of crap I already know.



--::{PC Gaming Master Race}::--

Pemalite said:
DonFerrari said:

Isn't it the case that more CUs would give you more connections and therefore more bandwidth?

Not always, because the CUs themselves aren't tied to the actual ROPs.
Vega 56 and Vega 64 have the same number of ROPs despite having different CU counts... The only reason Vega 56, despite having fewer CUs, has lower bandwidth is its lower memory clock.


Understood. But let's say you increase the number of CUs; it then also makes sense to increase the bandwidth to feed them, right? Or would you say a design would choose more CUs just to cut down the clock rate and power consumption while keeping the same overall processing capacity?



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

2021 is my bet. A CPU increase is mandatory.

Last edited by KingofTrolls - on 27 May 2018

Pemalite said: 

(Remember, consoles cannot afford high-end hardware!)

To a degree, I guess. We are not talking about Nintendo here, and the Xbox One also shows that power matters. I think next gen will be a 180-degree turn in console philosophy, but we shall see.



KingofTrolls said:

Pemalite said: 

(Remember, consoles cannot afford high-end hardware!)

To a degree, I guess. We are not talking about Nintendo here, and the Xbox One also shows that power matters. I think next gen will be a 180-degree turn in console philosophy, but we shall see.

I doubt Sony will go all-in on an expensive but extremely capable PS5. Remember what happened with the PS3. I would love to believe times have changed and people would embrace a console that delivers an unprecedented power-to-price ratio. But after Sony tried giving us an $800 PS3 for $500, and seeing it nearly destroy their brand, I just don't have faith in consumers.

The PS3 only started selling well after Sony stripped nearly all of the value out of the system. The PS4 shot out of the gate with next to no games and extremely limited additional value. You can also look at the PS Vita and the Xbox One for other examples where power and/or functionality could not justify a slight bump in price.

Sadly, it appears no amount of power and functionality can add enough value for people to justify even the slightest uptick in console pricing. Consumers just like to be ripped off; the less value you give them per dollar, the happier they are and the faster your device flies off the shelves.

Ultimately, I would love it if next gen could start with multiple tiers and form factors. Seeing a $400 1080p entry-level PS5 home console, a $600 PS5 Phone, and an $800 PS5 4K model would be awesome. Unfortunately, instead of seeing it as an opportunity to hop in where you are comfortable, I believe people would just complain about the high-end unit costing so much and, in turn, not purchase any version of the system.

I think the only way Sony can offer more power is through a later enhanced edition like we saw with the PS4 Pro. Get the PS5 out at $400 and let people go crazy building a nice baseline. Then, once it is established, they can offer a high-end model and a cheaper baseline model at the first major fabrication shrink. After seeing how successful the PS4 Pro and Xbox One X have been, they should go all-in on the PS5 Pro and truly deliver for their core fan base. I believe that is the safest course of action.



Stop hate, let others live the life they were given. Everyone has their problems, and no one should have to feel ashamed for the way they were born. Be proud of who you are, encourage others to be proud of themselves. Learn, research, absorb everything around you. Nothing is meaningless, a purpose is placed on everything no matter how you perceive it. Discover how to love, and share that love with everything that you encounter. Help make existence a beautiful thing.

Kevyn B Grams
10/03/2010 

KBG29 on PSN&XBL

DonFerrari said:

Understood. But let's say you increase the number of CUs; it then also makes sense to increase the bandwidth to feed them, right? Or would you say a design would choose more CUs just to cut down the clock rate and power consumption while keeping the same overall processing capacity?

Depends on the workload and how memory-bandwidth-starved the hardware was to begin with.
The Fury X has more bandwidth than Vega 64, for example (512GB/s vs 483GB/s), but Vega 64 beats it every day of the week.

Sometimes a significant cut in bandwidth has a negligible performance hit.

With that in mind, more bandwidth is generally better.

Microsoft/Sony/AMD will have the data available to work out the bandwidth needs of their hardware anyway, so they will choose whatever offers the best price/performance.
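The bandwidth figures quoted here fall straight out of bus width × memory data rate; a quick sketch using the published specs of the two cards (GB/s is decimal):

```python
# Memory bandwidth = bus width * data rate per pin / 8. HBM/HBM2 are double
# data rate, so the per-pin rate is twice the memory clock.

def hbm_bandwidth_gbs(bus_bits, mem_clock_mhz):
    return bus_bits * mem_clock_mhz * 1e6 * 2 / 8 / 1e9  # bytes/s -> GB/s

print(f"Fury X  (4096-bit HBM  @ 500 MHz): {hbm_bandwidth_gbs(4096, 500):.1f} GB/s")
print(f"Vega 64 (2048-bit HBM2 @ 945 MHz): {hbm_bandwidth_gbs(2048, 945):.1f} GB/s")
# Despite the Fury X's higher bandwidth, Vega 64 is the faster card,
# which is why raw bandwidth alone doesn't decide performance.
```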

KingofTrolls said:

To a degree, I guess. We are not talking about Nintendo here, and the Xbox One also shows that power matters. I think next gen will be a 180-degree turn in console philosophy, but we shall see.

The Xbox One X isn't even using remotely high-end hardware.



--::{PC Gaming Master Race}::--

KBG29 said:

1. I doubt Sony will go all-in on an expensive but extremely capable PS5. Remember what happened with the PS3.

2. Ultimately, I would love it if next gen could start with multiple tiers and form factors. Seeing a $400 1080p entry-level PS5 home console, a $600 PS5 Phone, and an $800 PS5 4K model would be awesome.

The PS3 was another story. Multiplatform-wise, and games-wise in general... it looked worse than the 360 despite a huuuuge price difference. Early PS3 games were PS2 games with bumped-up graphics; now compare that to Gears of War, the online system, etc. That's the point. Also, the PS2 alone was a huge competitor.

The audience, in the early days, saw Cell and Blu-ray as a bad investment. An early PS3 user felt no benefit from his hard disk, because 360 games were instant-play. Simple as that.

2. No, no, no. A PS5 Phone is a thing nobody wants, and nobody wants to feel like a second-class consumer when opening his or her brand-new console. It's hard for me to explain; just look at Tata's cars in India. They offered an awesome price/quality ratio, but were mostly viewed as "cars for poor people". That killed the sales.

Console gamers are not PC gamers. Any kind of segregation is very dangerous and rightfully creates a backlash. I've seen your posts here from time to time, and I can say you are a minority enthusiast, that's all.

Last edited by KingofTrolls - on 28 May 2018

Pemalite said:

The Xbox One X isn't even using remotely high-end hardware.

True, but I was talking about the OG Xbox One. Nobody wants to deal with inferior hardware for 5+ years. Yes, the mid-gen refreshes muddied this a little, but the point generally stands. Better hardware with proven, clearly visible advantages should justify a $50-100 higher price tag over the competitor.

Edit: and one more thing, the console business model extends to royalties, services, etc. A potential loss of sales in the early days is not as dramatic as it used to be.

Last edited by KingofTrolls - on 28 May 2018

KBG29 said:

I doubt Sony will go all-in on an expensive but extremely capable PS5. Remember what happened with the PS3. I would love to believe times have changed and people would embrace a console that delivers an unprecedented power-to-price ratio. But after Sony tried giving us an $800 PS3 for $500, and seeing it nearly destroy their brand, I just don't have faith in consumers.

It's not the consumers' fault; the corporation has to convince people that their product is worth buying.