(Update) Rumor: PlayStation 5 will be using Navi 9 (more powerful than Navi 10); in a new update, Jason Schreier said Sony is aiming for more than 10.7 teraflops


How accurate is this rumor compared to reality?

Naah: 26 (35.62%)
It's 90% close: 14 (19.18%)
It's 80% close: 8 (10.96%)
It's 70% close: 5 (6.85%)
It's 50% close: 13 (17.81%)
It's 30% close: 7 (9.59%)
Total: 73
Bofferbrauer2 said:

I know that fully well.

That being said, AMD was forced to pay upfront for a specific number of wafers from GlobalFoundries in 2015, whether they'd actually need them or not (and to even pay extra if they produced chips at other foundries). Thus, AMD sold their chips to Sony and Microsoft at the lowest possible price to ensure they wouldn't have to pay for unused wafers. However, since GF doesn't have a 7nm process, the deal got eased a lot and the Ryzen/Epyc chips already fulfill it, so there's no need to go as low on price as they did this gen.

That's not how that happened...

AMD had contracted a certain number of wafers with GF; that's normal with chip contracts for everyone. The issue was that TSMC moved onto a newer process earlier than GF did, and AMD and its customers needed to be on that new process. So even though AMD and their customers had shifted onto a new process, AMD was now in breach of contract with GF since they needed far fewer wafers from them.

They were never selling their chips to Sony and MS at some bargain-bin price. I really don't know where you heard that.



BraLoD said:
Mummelmann said:

Pretty much, 24GB is overkill anyway, any game pushing those kinds of assets will require much more overall oomph than a 500$ mainstream device can pull off any time soon.

The rumor says 20 GB for games plus 4 GB of cheaper RAM for the OS.

But even supposing it had 24 GB for games alone:

Is 24 GB overkill for the current gen? Of course it is.

Will it be for next gen? I don't think so.

If it releases in 2020, it'll have to stay relevant until 2026/2027; that's where it'll make the biggest difference.

As soon as next gen releases, game development itself will climb another notch and all games will be asking more of their systems than they do now. It's always like this: as soon as the PS5 is out, 4K everything will become the baseline for game development. Even though we've had 4K for years, the real push starts when the new consoles arrive. Look at the PC spec bumps whenever a new generation starts to roll out; you've probably already noticed it.

So we'll have that big next-gen bump in game development itself, plus at least 6 years of evolution for the consoles to keep up with, and 16 GB of RAM suddenly becomes a bottleneck that consoles would like to avoid.

IMO there is no chance the PS5 releases with less than 20 GB of RAM (for games). Zero.

16 gigs is more than enough for 4K. Anything else just adds to the overall cost of the console.



OdinHades said:
Mummelmann said:

Pretty much, 24GB is overkill anyway, any game pushing those kinds of assets will require much more overall oomph than a 500$ mainstream device can pull off any time soon.

I don't think so. On PC, video cards with 8 GB can already be a limiting factor at full HD. Those new consoles will want to push 4K textures, and we're talking about shared memory here. So I think 24 GB is the minimum for a console to be somewhat relevant in the next 6 years or so. 32 GB would be better. Anything below that will be a serious bottleneck in the future. Not today and not tomorrow, but soon enough.

16 gigs is more than enough for 4K, even on PC.



Screenshot said:
OdinHades said:

I don't think so. On PC, video cards with 8 GB can already be a limiting factor at full HD. Those new consoles will want to push 4K textures, and we're talking about shared memory here. So I think 24 GB is the minimum for a console to be somewhat relevant in the next 6 years or so. 32 GB would be better. Anything below that will be a serious bottleneck in the future. Not today and not tomorrow, but soon enough.

16 gigs is more than enough for 4K, even on PC.

16 GB system memory, yes. But if we're talking about shared memory that would be something like 8 GB system memory + 8 GB VRAM. That wouldn't be enough for gaming in 4K for the next 6 years or so.



Official member of VGC's Nintendo family, approved by the one and only RolStoppable. I feel honored.

Intrinsic said:
Bofferbrauer2 said:

I know that fully well.

That being said, AMD was forced to pay upfront for a specific number of wafers from GlobalFoundries in 2015, whether they'd actually need them or not (and to even pay extra if they produced chips at other foundries). Thus, AMD sold their chips to Sony and Microsoft at the lowest possible price to ensure they wouldn't have to pay for unused wafers. However, since GF doesn't have a 7nm process, the deal got eased a lot and the Ryzen/Epyc chips already fulfill it, so there's no need to go as low on price as they did this gen.

That's not how that happened...

AMD had contracted a certain number of wafers with GF; that's normal with chip contracts for everyone. The issue was that TSMC moved onto a newer process earlier than GF did, and AMD and its customers needed to be on that new process. So even though AMD and their customers had shifted onto a new process, AMD was now in breach of contract with GF since they needed far fewer wafers from them.

They were never selling their chips to Sony and MS at some bargain-bin price. I really don't know where you heard that.

AMD was forced to buy a specific number of wafers from GF per year; the reasons for that are irrelevant. They had to pay even if they didn't need, or couldn't sell, the chips from any additional wafers.

And yeah, they got bargain prices. AMD got around $100 for the OG PS4 chip (the article's writer must think AMD produces them out of thin air at no cost), and that's also roughly what the chip cost to produce. So no, AMD did not gain much from it, but it was good enough to stay afloat.

Besides, NVidia said themselves they weren't interested because the margins were way too small. Like the article details, the margin was about $10 per PS4, which isn't very much and barely worth the work. The X1 was readily available and in surplus, so they had nothing to lose from the Nintendo deal, as they needed to do zero work for it. But they stated themselves they won't do custom chips anymore, which also limits any upgrade to the Switch to an X2 unless they change their mind on it.



OdinHades said:
Screenshot said:

16 gigs is more than enough for 4K, even on PC.

16 GB system memory, yes. But if we're talking about shared memory that would be something like 8 GB system memory + 8 GB VRAM. That wouldn't be enough for gaming in 4K for the next 6 years or so.

Errrrr... no. Just no. What in the hell are you doing with 8 GB of RAM for the CPU? CPU code just doesn't require that much RAM. The reason the GPU ends up taking as much RAM as it does is textures and stuff like that. Basically images.

Bofferbrauer2 said:

AMD was forced to buy a specific number of wafers from GF per year; the reasons for that are irrelevant. They had to pay even if they didn't need, or couldn't sell, the chips from any additional wafers.

And yeah, they got bargain prices. AMD got around $100 for the OG PS4 chip (the article's writer must think AMD produces them out of thin air at no cost), and that's also roughly what the chip cost to produce. So no, AMD did not gain much from it, but it was good enough to stay afloat.

Besides, NVidia said themselves they weren't interested because the margins were way too small. Like the article details, the margin was about $10 per PS4, which isn't very much and barely worth the work. The X1 was readily available and in surplus, so they had nothing to lose from the Nintendo deal, as they needed to do zero work for it. But they stated themselves they won't do custom chips anymore, which also limits any upgrade to the Switch to an X2 unless they change their mind on it.

What are you saying bruh? Like do you understand how chip pricing works at all??? 

AMD getting $100 for each chip they give to Sony isn't them selling at bargain prices at all. That's them selling at "bulk/OEM" pricing, which is totally normal when a company puts in orders in the region of millions.

Take the 3600G for instance. Say AMD sells that at retail for $220; it pans out like this... the actual cost of making each of those chips (what AMD pays to the foundry) is like $30-$40. Then AMD adds their markup to account for things like yields, profit, packaging, shipping, etc. At this point the chip comes to around $170. Then they put the MSRP sticker price at $220 so the retailers make their own cut too.

If that chip was going into a console, first off the console manufacturer will pay a sizeable sum to "customize" their chip. This reduces how much AMD spends on R&D for that chip, and nothing stops them from carrying elements of that chip's design over into their general product line. Then AMD isn't worrying about costs like packaging, shipping and marketing, and there isn't a retailer cut either. AMD also isn't worrying about yields, as that's something Sony/MS absorbs.

So selling each chip for $100 still means they're making a good amount of money.

I don't even get how any of this is relevant... are you saying that AMD is somehow not going to be selling chips at those prices anymore because they are doing well now? Well, if that's what you're saying then you're just wrong. There's a reason why even Apple only puts AMD GPUs in their computers. And Nvidia just doesn't make sense for the kind of hardware that works for consoles. Not only are they resistant to dropping prices, they also just don't make APUs (that aren't ARM-based). So Sony/MS using them would mean they'd have to build a discrete CPU/GPU system.
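To make the retail vs. console pricing argument above concrete, here is a minimal sketch in Python using only the illustrative figures from this post (a $30-40 foundry cost, a ~$170 chip price behind a $220 MSRP, and a $100 semi-custom price); none of these are confirmed AMD numbers.

```python
# Rough sketch of the pricing breakdown described above.
# All figures are the illustrative numbers from the post, not real AMD data.

FOUNDRY_COST = 35          # ~$30-40 per die paid to the foundry
RETAIL_CHIP_PRICE = 170    # AMD's price after markup for yields, packaging, shipping, profit
RETAIL_MSRP = 220          # sticker price; the difference is the retail channel's cut
CONSOLE_CHIP_PRICE = 100   # reported per-chip price for the OG PS4 APU

retail_gross = RETAIL_CHIP_PRICE - FOUNDRY_COST    # has to absorb yields, packaging, marketing
console_gross = CONSOLE_CHIP_PRICE - FOUNDRY_COST  # customer pays for customization and absorbs yields

print(f"Retail gross per chip:  ${retail_gross}")   # ~$135, but loaded with extra costs
print(f"Console gross per chip: ${console_gross}")  # ~$65, with far fewer costs to cover
```

The only point of the sketch is that a $100 semi-custom price can still leave a healthy margin once the costs the console maker absorbs are taken out.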



Intrinsic said:
OdinHades said:

16 GB system memory, yes. But if we're talking about shared memory that would be something like 8 GB system memory + 8 GB VRAM. That wouldn't be enough for gaming in 4K for the next 6 years or so.

Errrrr... no. Just no. What in the hell are you doing with 8 GB of RAM for the CPU? CPU code just doesn't require that much RAM. The reason the GPU ends up taking as much RAM as it does is textures and stuff like that. Basically images.

Bofferbrauer2 said:

AMD was forced to buy a specific number of wafers from GF per year; the reasons for that are irrelevant. They had to pay even if they didn't need, or couldn't sell, the chips from any additional wafers.

And yeah, they got bargain prices. AMD got around $100 for the OG PS4 chip (the article's writer must think AMD produces them out of thin air at no cost), and that's also roughly what the chip cost to produce. So no, AMD did not gain much from it, but it was good enough to stay afloat.

Besides, NVidia said themselves they weren't interested because the margins were way too small. Like the article details, the margin was about $10 per PS4, which isn't very much and barely worth the work. The X1 was readily available and in surplus, so they had nothing to lose from the Nintendo deal, as they needed to do zero work for it. But they stated themselves they won't do custom chips anymore, which also limits any upgrade to the Switch to an X2 unless they change their mind on it.

What are you saying bruh? Like do you understand how chip pricing works at all??? 

AMD getting $100 for each chip they give to Sony isn't them selling at bargain prices at all. That's them selling at "bulk/OEM" pricing, which is totally normal when a company puts in orders in the region of millions.

Take the 3600G for instance. Say AMD sells that at retail for $220; it pans out like this... the actual cost of making each of those chips (what AMD pays to the foundry) is like $30-$40. Then AMD adds their markup to account for things like yields, profit, packaging, shipping, etc. At this point the chip comes to around $170. Then they put the MSRP sticker price at $220 so the retailers make their own cut too.

If that chip was going into a console, first off the console manufacturer will pay a sizeable sum to "customize" their chip. This reduces how much AMD spends on R&D for that chip, and nothing stops them from carrying elements of that chip's design over into their general product line. Then AMD isn't worrying about costs like packaging, shipping and marketing, and there isn't a retailer cut either. AMD also isn't worrying about yields, as that's something Sony/MS absorbs.

So selling each chip for $100 still means they're making a good amount of money.

I don't even get how any of this is relevant... are you saying that AMD is somehow not going to be selling chips at those prices anymore because they are doing well now? Well, if that's what you're saying then you're just wrong. There's a reason why even Apple only puts AMD GPUs in their computers. And Nvidia just doesn't make sense for the kind of hardware that works for consoles. Not only are they resistant to dropping prices, they also just don't make APUs (that aren't ARM-based). So Sony/MS using them would mean they'd have to build a discrete CPU/GPU system.

Not to forget, on PC the 8 GB "for the CPU" has a lot to do with the OS.

So if a console comes with, say, 16 GB for CPU+GPU (maybe 4 GB for the CPU and 12 GB for the GPU) plus 4 GB for the OS, it will have all that it needs.
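As a quick illustration of the split described here, the arithmetic spelled out; the 4/12/4 GB figures are this post's hypothetical, not a leaked spec.

```python
# Hypothetical next-gen memory budget from the post above (not a confirmed spec).
os_reserved_gb = 4                          # cheaper RAM set aside for the OS
game_budget_gb = 16                         # unified pool available to games
cpu_side_gb = 4                             # game logic, audio, scripts, etc.
gpu_side_gb = game_budget_gb - cpu_side_gb  # 12 GB for textures, buffers, render targets

print(f"OS: {os_reserved_gb} GB | CPU-side: {cpu_side_gb} GB | GPU-side: {gpu_side_gb} GB")
```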



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

OdinHades said:
Screenshot said:

16 gigs is more than enough for 4K, even on PC.

16 GB system memory, yes. But if we're talking about shared memory that would be something like 8 GB system memory + 8 GB VRAM. That wouldn't be enough for gaming in 4K for the next 6 years or so.

My point was never what is required for 4K gaming; it was about mainstream-priced hardware being able to pull it off. Proper 4K gaming on mainstream devices, with decent frame rates and effects, is still a long way off. 20 GB of GDDR6 will be expensive as hell in and of itself, especially given that the final build will most likely need to be ready within 9-12 months, as it's unlikely that Sony will wait all that long before releasing a new PS.

And even if, by some miracle, one managed to pack a mainstream $500 device with a heap of GDDR6 memory, shared or otherwise, the true bottlenecks would be the rest of the build, where they'd inevitably need to save a ton with cheaper solutions, especially if rumors of BC are to be believed. An RTX 2080 with 8 GB of memory currently sits at around $750-850 on its own; it can do 4K at good frame rates in most games (65-80 fps and above is good in my opinion). Getting performance at nearly that level in a mainstream box for $500 within a year or so is quite simply not happening.

Consoles aren't future-proof; that's the whole point of all this. Any console released today will be hopelessly outdated long before it's replaced. Seeing the state of streamed 4K content on TV right now, one can imagine the time it will take before games with rendered assets reach an acceptable point on any mainstream device. Proper, stable 4K gaming has only been possible on high-end PCs for a couple of years as it is. The 1080 Ti struggled to creep past the 60 fps mark in most titles.

Again, my point was never what will be required in the future, rather that what will be required in the future will not be met by upcoming consoles, not even close, if they want any sort of approachable price point. As far as limits on video memory go, it's hard to find ways to strain a modern high-end GPU with 11 GB of memory or more; even my aging 980 Ti with only 6 GB is still doing okay, albeit at 1440p and not 4K. And, one last time: if high-end GPUs right now are getting on nicely with their allotted memory, there's no way any affordable machine in 2020 or so will release with similar or better specs.



End of 2016 hardware sales:

Wii U: 15 million. PS4: 54 million. One: 30 million. 3DS: 64.8 million. PSVita: 15.2 million.

Mummelmann said:
OdinHades said:

16 GB system memory, yes. But if we're talking about shared memory that would be something like 8 GB system memory + 8 GB VRAM. That wouldn't be enough for gaming in 4K for the next 6 years or so.

My point was never what is required for 4K gaming; it was about mainstream-priced hardware being able to pull it off. Proper 4K gaming on mainstream devices, with decent frame rates and effects, is still a long way off. 20 GB of GDDR6 will be expensive as hell in and of itself, especially given that the final build will most likely need to be ready within 9-12 months, as it's unlikely that Sony will wait all that long before releasing a new PS.

And even if, by some miracle, one managed to pack a mainstream $500 device with a heap of GDDR6 memory, shared or otherwise, the true bottlenecks would be the rest of the build, where they'd inevitably need to save a ton with cheaper solutions, especially if rumors of BC are to be believed. An RTX 2080 with 8 GB of memory currently sits at around $750-850 on its own; it can do 4K at good frame rates in most games (65-80 fps and above is good in my opinion). Getting performance at nearly that level in a mainstream box for $500 within a year or so is quite simply not happening.

Consoles aren't future-proof; that's the whole point of all this. Any console released today will be hopelessly outdated long before it's replaced. Seeing the state of streamed 4K content on TV right now, one can imagine the time it will take before games with rendered assets reach an acceptable point on any mainstream device. Proper, stable 4K gaming has only been possible on high-end PCs for a couple of years as it is. The 1080 Ti struggled to creep past the 60 fps mark in most titles.

Again, my point was never what will be required in the future, rather that what will be required in the future will not be met by upcoming consoles, not even close, if they want any sort of approachable price point. As far as limits on video memory go, it's hard to find ways to strain a modern high-end GPU with 11 GB of memory or more; even my aging 980 Ti with only 6 GB is still doing okay, albeit at 1440p and not 4K. And, one last time: if high-end GPUs right now are getting on nicely with their allotted memory, there's no way any affordable machine in 2020 or so will release with similar or better specs.

Except 60 fps is hardly something console gaming requires or cares about in most genres.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Intrinsic said:

Bofferbrauer2 said:

AMD was forced to buy a specific number of wafers from GF per year; the reasons for that are irrelevant. They had to pay even if they didn't need, or couldn't sell, the chips from any additional wafers.

And yeah, they got bargain prices. AMD got around $100 for the OG PS4 chip (the article's writer must think AMD produces them out of thin air at no cost), and that's also roughly what the chip cost to produce. So no, AMD did not gain much from it, but it was good enough to stay afloat.

Besides, NVidia said themselves they weren't interested because the margins were way too small. Like the article details, the margin was about $10 per PS4, which isn't very much and barely worth the work. The X1 was readily available and in surplus, so they had nothing to lose from the Nintendo deal, as they needed to do zero work for it. But they stated themselves they won't do custom chips anymore, which also limits any upgrade to the Switch to an X2 unless they change their mind on it.

What are you saying bruh? Like do you understand how chip pricing works at all??? 

AMD getting $100 for each chip they give to Sony isn't them selling at bargain prices at all. That's them selling at "bulk/OEM" pricing, which is totally normal when a company puts in orders in the region of millions.

Take the 3600G for instance. Say AMD sells that at retail for $220; it pans out like this... the actual cost of making each of those chips (what AMD pays to the foundry) is like $30-$40. Then AMD adds their markup to account for things like yields, profit, packaging, shipping, etc. At this point the chip comes to around $170. Then they put the MSRP sticker price at $220 so the retailers make their own cut too.

If that chip was going into a console, first off the console manufacturer will pay a sizeable sum to "customize" their chip. This reduces how much AMD spends on R&D for that chip, and nothing stops them from carrying elements of that chip's design over into their general product line. Then AMD isn't worrying about costs like packaging, shipping and marketing, and there isn't a retailer cut either. AMD also isn't worrying about yields, as that's something Sony/MS absorbs.

So selling each chip for $100 still means they're making a good amount of money.

I don't even get how any of this is relevant... are you saying that AMD is somehow not going to be selling chips at those prices anymore because they are doing well now? Well, if that's what you're saying then you're just wrong. There's a reason why even Apple only puts AMD GPUs in their computers. And Nvidia just doesn't make sense for the kind of hardware that works for consoles. Not only are they resistant to dropping prices, they also just don't make APUs (that aren't ARM-based). So Sony/MS using them would mean they'd have to build a discrete CPU/GPU system.

@bolded: We don't even know if that's a real chip (and at 20 CUs, I really doubt it, especially considering it would be totally bandwidth-starved even with DDR4-4000). But I digress.

The actual cost depends on how much AMD has to pay per wafer, divided by how many chips on that wafer are salvageable for the purpose. So let's say a wafer costs $1000 (I'm just making up a price here): 20 such chips would fit on it, but only 10 would be fully functional; the others would have to be sold as 3400Gs or binned entirely due to defects. In that case AMD would have to charge at least $100 for the 3600G just to cover the costs, and use the 3400G for profit.

However, on a console that kind of binning isn't possible, which is why the PS4 has two deactivated CUs to improve the yield rate.
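A minimal worked version of that made-up wafer example; the salvage-yield figure is an extra assumption added purely for illustration.

```python
# Worked version of the hypothetical wafer numbers above; none of these are real figures.

wafer_cost = 1000        # made-up cost per wafer
dies_per_wafer = 20      # candidate chips that physically fit on the wafer
fully_working = 10       # dies with every CU functional (sellable as the full part)

# If only fully-working dies count, each one must carry the whole wafer cost:
cost_per_full_die = wafer_cost / fully_working              # $100

# Console approach: ship with 2 CUs disabled so slightly defective dies still qualify.
# The 16 here is purely an assumed salvage yield, to show the effect on cost.
usable_with_salvage = 16
cost_per_salvaged_die = wafer_cost / usable_with_salvage    # $62.50

print(cost_per_full_die, cost_per_salvaged_die)
```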

@italic: Those costs are not always covered up front; I remember that the cost of some chips was actually worked into the yearly contracts instead of being paid as a sum early on. And considering AMD didn't seem to have gotten any lump sum (if they did, it doesn't show up in the financial reports at least), I do think they have to cover those expenses with the chip sales.

@underlined: Well, no, I'm not saying that they won't do it anymore, but rather that they are not obliged to do so anymore to have any sizable income at all.

At the time the PS4/XBO came out, AMD's CPUs were doing very badly and were the laughingstock of the industry. They had just released Hawaii, but had many problems keeping up with NVidia's updated Kepler (GeForce 700 series), so earnings were breaking away left and right and they could only really compete on price. As a result their profit margin plummeted, and it is still awfully low for the sector (it's under 50% while Intel and NVidia are close to or above 70%; at the time it even dropped below 30%, which is bad in any sector). All this meant AMD was desperate for some stable income, which left Sony and Microsoft holding all the cards during the price negotiations. But that won't be the case this time, and AMD will squeeze some profit out of the chips.

Also, as a side note, you put the cost at $30-40. Tell me how that works if about half of the sales came from console chips (which was true in 2016) yet the profit margin was only 24%. Do you think AMD sold all their other chips below production cost? And how could that be, considering most chips sell for much more than the one in the PS4? Or do you think their R&D expenses were large enough to cover half their costs before wages and taxes? I'm just saying your price is off; it may well be below $100 by then, but I don't think it's anywhere close to the numbers you're putting there, more like $60-80. Don't forget that 350mm² isn't exactly a small chip (a 10-core Skylake-X is only 322mm², for instance) and that such a big chip normally sells at considerably higher prices, for the reasons detailed above.
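To spell out that margin argument, here is a quick back-of-envelope check using only the numbers in this post (roughly half of revenue from console chips, a ~24% blended margin, a ~$100 console chip price); the revenue-weighted blending is a simplification for illustration, not AMD's actual accounting.

```python
# Back-of-envelope check of the margin argument above, using the post's own numbers.

console_revenue_share = 0.5    # roughly half of 2016 revenue from semi-custom console chips
blended_gross_margin = 0.24    # the overall margin figure quoted in the post
console_price = 100            # reported per-chip price for the PS4 APU

def implied_other_margin(console_cost):
    """Gross margin the *rest* of the business would need for the blend to come out at 24%."""
    console_margin = (console_price - console_cost) / console_price
    # blended = share * console_margin + (1 - share) * other_margin
    return (blended_gross_margin - console_revenue_share * console_margin) / (1 - console_revenue_share)

print(implied_other_margin(35))   # ~ -0.17: the rest of AMD would have to sell below cost (implausible)
print(implied_other_margin(70))   # ~  0.18: plausible, consistent with a $60-80 per-chip cost
```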

Your Apple example is a bit special: they use AMD for its OpenGL and OpenCL capabilities, where NVidia is weaker and generally has been. AMD being cheaper than NVidia is only icing on the cake. But that's going to change soon anyway, considering that Apple wants to design all their chips in-house and is migrating everything to ARM.