
Sources: AMD Has Created Navi FOR Sony's PlayStation 5.

Pemalite said:
Trumpstyle said:

Hi, sorry for the late response.

What I meant in my previous post is that you can have 8x 1.5GB RAM chips (or 4x 3GB chips) on a 256-bit bus for full bandwidth performance without any penalty. That gets you 12GB of RAM, or even 24GB with clamshell mode. This was not possible with GDDR5 and is not similar to the 7950/7970 or 970 situation.

The point you're missing is that 12 Gigabit chips aren't in the pipeline... They have been defined by JEDEC as a standard, and that's pretty much it; they might not ever exist.
Samsung for instance -is- mass producing 16 Gigabit chips.

https://www.anandtech.com/show/12338/samsung-starts-mass-production-of-gddr6-memory

If something isn't being manufactured, then it's likely not going to be implemented in GPUs or consoles. It's that simple.

Hynix, for instance, is going even lower capacity than Samsung for its initial run and leveraging 8 Gigabit chips.
https://www.anandtech.com/show/12345/sk-hynix-lists-gddr6-memory-as-available-now

Now, to convert bits into bytes you divide by 8.
Thus...
12Gb/8 = 1.5GB.
16Gb/8 = 2GB.
8Gb/8 = 1GB.
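
If it helps, here's the same arithmetic as a trivial Python sketch (nothing here beyond the conversions above):

```python
# Convert chip densities from gigabits (Gb) to gigabytes (GB): divide by 8.
for density_gb in (8, 12, 16):
    print(f"{density_gb}Gb chip = {density_gb / 8:g}GB")
# 8Gb chip = 1GB
# 12Gb chip = 1.5GB
# 16Gb chip = 2GB
```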

Trumpstyle said:

So 12, 16 or 24GB of RAM is possible for next-gen consoles on a 256-bit bus. I also recommend you don't read wccftech, fudzilla and tweaktown; they just make stuff up.

The GeForce 970 situation was well documented throughout the entire PC tech sphere, so in this instance wccftech is reliable.
But hell. Here is Anand's take on the same issue.

https://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation

BTW, 12Gb chips aren't even particularly useful or practical for ECC configurations, as those typically use a 72-bit bus to add ECC capability to a 64-bit word, which is achieved simply by adding 8 bits to those 64. Thinking about using just one 12Gb chip to provide all the necessary bits, one can easily see it's not the right size: 72/64 is a 9/8 ratio, and 12Gb can't be divided exactly by 9 to assign 8 parts to data and one part to parity.
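
To restate that divisibility argument as a quick check (just a sketch of the 72/64 arithmetic above; the 18Gb case is my hypothetical, not a real part):

```python
# ECC widens a 64-bit word to 72 bits (64 data + 8 parity): a 9/8 ratio.
# For a single chip to supply 8 parts data plus 1 part parity, its
# density must divide evenly by 9.
print(72 / 64)       # 1.125 -> the 9/8 overhead ratio
print(12 % 9 == 0)   # False -> a 12Gb chip can't split 8:1 into whole parts
print(18 % 9 == 0)   # True  -> a hypothetical 18Gb chip could
```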



Stwike him, Centuwion. Stwike him vewy wuffly! (Pontius Pilate, "Life of Brian")
A fart without stink is like a sky without stars.
TGS, Third Grade Shooter: brand new genre invented by Kevin Butler exclusively for Natal WiiToo Kinect. PEW! PEW-PEW-PEW! 
 



I think this whole memory talk is being overcomplicated. The way I see it is this:

GDDR5 currently has a peak chip capacity of 1GB. So if you want 8GB/12GB in whatever application, you need 8/12 chips on the board respectively. It's possible to have them arranged in a clamshell format (a chip on either side of the board), which literally means you can double the chip density of a same-sized board, but this adds complication and cost to the overall design, plus efficiency issues for the chips' performance. Not enough to break things, but enough that it's ideally avoided if it can be.

GDDR6 is all of the above but with a higher chip capacity of 2GB/chip and higher per-pin speeds allowing higher bandwidth per chip, and that's basically it.

So theoretically, if say the PS5 is being built with GDDR6, using the same 8-chip array we see on the PS4 Slim/PS4 Pro boards to get 8GB of GDDR5, we would instead have 16GB of GDDR6. A 12-chip array like the XB1X's would give 24GB of GDDR6. Bandwidths will all be significantly higher too, and I think it's as simple as that.

And this is going with the most straightforward GDDR implementation, as opposed to any complex clamshell nonsense, which in truth can double capacity; the sketch below puts numbers on it.
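
Here's that capacity walkthrough as a small Python sketch (chip counts mirror the PS4 Slim/Pro and XB1X boards mentioned above; the function name is mine, for illustration):

```python
# Total capacity = chips on board x GB per chip, doubled in clamshell mode.
def total_capacity_gb(chips, gb_per_chip=2, clamshell=False):
    return chips * gb_per_chip * (2 if clamshell else 1)

print(total_capacity_gb(8))                  # PS4-style 8-chip board   -> 16GB GDDR6
print(total_capacity_gb(12))                 # XB1X-style 12-chip board -> 24GB GDDR6
print(total_capacity_gb(8, clamshell=True))  # 8 chips clamshelled      -> 32GB GDDR6
```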

I don't know why some would think it's unlikely, then, to have as much as 24GB of GDDR6 in the PS5/XB2.



I don't care what they bring but it better be backwards compatible with my PS4 library! Make it happen Sony



Alby_da_Wolf said:
Pemalite said:

The point you're missing is that 12 Gigabit chips aren't in the pipeline... They have been defined by JEDEC as a standard, and that's pretty much it; they might not ever exist.
Samsung for instance -is- mass producing 16 Gigabit chips.

https://www.anandtech.com/show/12338/samsung-starts-mass-production-of-gddr6-memory

If something isn't being manufactured, then it's likely not going to be implemented in GPUs or consoles. It's that simple.

Hynix, for instance, is going even lower capacity than Samsung for its initial run and leveraging 8 Gigabit chips.
https://www.anandtech.com/show/12345/sk-hynix-lists-gddr6-memory-as-available-now

Now, to convert bits into bytes you divide by 8.
Thus...
12Gb/8 = 1.5GB.
16Gb/8 = 2GB.
8Gb/8 = 1GB.

The GeForce 970 situation was well documented throughout the entire PC tech sphere, so in this instance wccftech is reliable.
But hell. Here is Anand's take on the same issue.

https://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation

BTW, 12Gb chips aren't even particularly useful or practical for ECC configurations, as those typically use a 72-bit bus to add ECC capability to a 64-bit word, which is achieved simply by adding 8 bits to those 64. Thinking about using just one 12Gb chip to provide all the necessary bits, one can easily see it's not the right size: 72/64 is a 9/8 ratio, and 12Gb can't be divided exactly by 9 to assign 8 parts to data and one part to parity.

Okay, you guys are very funny dudes. Very nice trolling; I should have caught it earlier. 1.5GB/3GB RAM chips are already widely used in the smartphone market and are now coming to the discrete GPU market. Pemalite giving me a link to AnandTech's guess-estimate of the GDDR5X specifications gave him away.

https://www.micron.com/products/dram/lpdram/lpdram-part-catalog#/density%5B%5D=12Gb&density%5B%5D=24Gb

And very soon for the GPU market, so enjoy :)



6x master league achiever in starcraft2

Beaten Sigrun on God of war mode

Beaten DOOM ultra-nightmare with NO endless ammo-rune, 2x super shotgun and no decoys on ps4 pro.

1-0 against Grubby in Wc3 frozen throne ladder!!

Trumpstyle said:

Okay, you guys are very funny dudes. Very nice trolling; I should have caught it earlier. 1.5GB/3GB RAM chips are already widely used in the smartphone market and are now coming to the discrete GPU market. Pemalite giving me a link to AnandTech's guess-estimate of the GDDR5X specifications gave him away.

https://www.micron.com/products/dram/lpdram/lpdram-part-catalog#/density%5B%5D=12Gb&density%5B%5D=24Gb

And very soon for the GPU market, so enjoy :)

Nope. You are still wrong. I won't even point out the L in LPDDR, or the fact that it is an entirely separate technology from GDDR6.

Alby_da_Wolf said:

BTW, 12Gb chips aren't even particularly useful or practical for ECC configurations, as those typically use a 72-bit bus to add ECC capability to a 64-bit word, which is achieved simply by adding 8 bits to those 64. Thinking about using just one 12Gb chip to provide all the necessary bits, one can easily see it's not the right size: 72/64 is a 9/8 ratio, and 12Gb can't be divided exactly by 9 to assign 8 parts to data and one part to parity.

There are ways around that problem, but I won't get into it here as the discussion would quickly go off topic.




www.youtube.com/@Pemalite

Trumpstyle said:
Alby_da_Wolf said:

BTW, 12Gb chips aren't even particularly useful or practical for ECC configurations, as those typically use a 72-bit bus to add ECC capability to a 64-bit word, which is achieved simply by adding 8 bits to those 64. Thinking about using just one 12Gb chip to provide all the necessary bits, one can easily see it's not the right size: 72/64 is a 9/8 ratio, and 12Gb can't be divided exactly by 9 to assign 8 parts to data and one part to parity.

Okay, you guys are very funny dudes. Very nice trolling; I should have caught it earlier. 1.5GB/3GB RAM chips are already widely used in the smartphone market and are now coming to the discrete GPU market. Pemalite giving me a link to AnandTech's guess-estimate of the GDDR5X specifications gave him away.

https://www.micron.com/products/dram/lpdram/lpdram-part-catalog#/density%5B%5D=12Gb&density%5B%5D=24Gb

And very soon for the GPU market, so enjoy :)

I don't really get what you are saying, but I can assure you...

LPDDR3/4 is very, very different from GDDR5/6.

And whoever goes with 1.5GB chips would be digging themselves into a hole, because they would be the only ones using them, which would make them more expensive than the 2GB chips the rest of the industry is using; that's economies of scale for you right there.

But this is all moot; next gen will at the very worst use 2GB GDDR6 chips, totalling either 16GB or 24GB depending on whether they go with 8 or 12 chips on the board. It wouldn't even surprise me if they used two separate pools of RAM: 20GB of GDDR6 (ten 2GB chips) for games/applications and 4-8GB of LPDDR4 for the OS. A quick back-of-the-envelope version is below.
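
Purely hypothetical numbers, just re-adding the split-pool figures from this post:

```python
# Hypothetical split-pool layout: GDDR6 for games, LPDDR4 for the OS.
game_pool_gb = 10 * 2  # ten 2GB GDDR6 chips -> 20GB
os_pool_gb = 8         # 4-8GB of LPDDR4; take the upper end
print(f"{game_pool_gb}GB GDDR6 + {os_pool_gb}GB LPDDR4 = "
      f"{game_pool_gb + os_pool_gb}GB total")
```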



Pemalite said:

[...]

There are ways around that problem, but I won't get into it here as the discussion would quickly go off topic.

I know; my point was just that 12Gb chips don't offer a particularly efficient alternative to current ECC implementations. Agreed on not diverting the discussion further.



Stwike him, Centuwion. Stwike him vewy wuffly! (Pontius Pilate, "Life of Brian")
A fart without stink is like a sky without stars.
TGS, Third Grade Shooter: brand new genre invented by Kevin Butler exclusively for Natal WiiToo Kinect. PEW! PEW-PEW-PEW! 
 


CGI-Quality said:

Some of you are expecting 24GB of G6? That's 16GB more RAM over the Pro. Not happenin'.

I think you are mistaken. Right now it costs Sony around $6.50 per 1GB GDDR5 chip, meaning around $52 for the 8GB we find in the PS4 today. In 2013, the RAM cost Sony around $90.

Even if each GDDR6 module is more expensive (as it should be), at worst it won't go from costing Sony around $6.50 per GDDR5 chip to $12 per GDDR6 module; that would make it even more expensive in 2020 than GDDR5 was in 2013. I am willing to hazard a guess that GDDR6 will cost Sony no more than $10/module in 2020. So getting 24GB, which would mean spending as much as $120 on memory alone, isn't too far-fetched.

It could all be significantly lower if the cost of GDDR6 is around $7-$8 in 2020, or if Sony/MS are able to negotiate better deals on their orders considering the quantities they would be going for.
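
For what it's worth, the arithmetic behind those figures (all prices are this post's estimates, not sourced quotes; the function name is mine):

```python
# Memory cost = capacity in GB x estimated price per GB.
def memory_cost(capacity_gb, price_per_gb):
    return capacity_gb * price_per_gb

print(memory_cost(8, 6.50))   # 8GB GDDR5 today at $6.50/GB              -> $52
print(memory_cost(24, 5.00))  # 24GB GDDR6 at $10 per 2GB module ($5/GB) -> $120
```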



CGI-Quality said:

I want you to think about that realistically. You expect a 16GB increase in RAM (with a RAM type that hasn't even been released on PC GPUs). Pemalite and I have tried to get some of you to understand how this works. The only way you're getting anything near that is if they use older memory (like G5X).

I get your enthusiasm, but I'd temper my expectations.

By next year, I expect the majority of GPUs on the market to be using GDDR6. And that is still a full year before the PS5/XB2 is expected to release.

And I am not even saying I expect the PS5/XB2 to have 24GB of RAM, but I am saying it's feasible. GDDR6 is not HBM; it's just the next iteration of GDDR memory, the same way DDR4 followed DDR3. And the way these things go, when GDDR6 hits the market proper, which by all indications should be next year, the shift from GDDR5 will be quick.

And you keep citing a 16GB increase from this or that as if you are unaware of the usual jumps across generations. Don't forget we went from 512MB to 8GB. Hell, even the XB1 line went from 8GB to 12GB after 3 years on the market.

I think I have made it clear in my previous posts that I expect anything from 16GB-24GB next gen, because I know it's possible and not some silly pipe dream like expecting as much as 32GB. I even talked about 16GB-20GB of GDDR6 coupled with a 4-6GB LPDDR4 pool as a possible alternative if GDDR6 ends up being too costly.

Guess we can just wait and see, though. I just don't think 24GB of GDDR6 in 2020 is impossible, unless of course we don't end up with GDDR6 in over 80% of PC GPUs next year.



CGI-Quality said:

G6 doesn't have to be HBM (not sure why you made the connection, anyway). Next, yeah, I do keep saying that a 16GB increase is unprecedented (512MB - 8GB is still not a 16GB jump). You, like a few others, continue to ignore costs (no matter how you spin it, you have). The market is not there for such increases, especially with this new technology. Those machines are already well underway. We don't even have 16GB consumer-grade GPUs. We're going to suddenly get a 24GB console in a matter of two years? Doubtful.

So, as "feasible" as it could be (no one said impossible), I don't expect it. 

I made the connection to HBM because of just how much more expensive and complicated HBM is compared to GDDR5.

And I actually did not ignore costs... I can't say with any certainty how much more GDDR6 will cost over GDDR5, but neither can you. What I do know is that as of 2017, GDDR5 was estimated to cost OEMs around $6.50/GB. I also know that the cost of the PS4's GDDR5 RAM was estimated at around $88, which in 2013 meant Sony was spending $5.50 per 512MB chip.

That means Sony was spending $11/GB in 2013 and as little as $6.50/GB now. So the question is really how much more costly GDDR6 will be in 2020 compared to GDDR5X. I strongly doubt GDDR6 will cost more than $18 per 2GB module in 2020. And let's not forget there are different grades of GDDR6 with regard to how much bandwidth they allow per pin, which means that not all 24GB of GDDR6 is the same...
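
Spelling that per-chip math out (the $88 total and $6.50/GB figure are the estimates cited in this post; the variable names are mine):

```python
# PS4 launch (2013): ~$88 total for 16 x 512MB GDDR5 chips.
per_chip = 88.0 / 16         # $5.50 per 512MB chip
per_gb_2013 = per_chip / 0.5 # $11.00 per GB in 2013
per_gb_2017 = 6.50           # estimated OEM price per GB as of 2017
print(per_chip, per_gb_2013, per_gb_2017)  # 5.5 11.0 6.5
```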

And while those consoles are well underway as far as R&D goes, I am sure you know that multiple iterations of the hardware are designed during R&D.