
Mark Cerny Explains How the PS4's 8 GB GDDR5 RAM and Bus Work and Why They Were Chosen

Scoobes said:
Bizarre to see someone so honest about their experiences. Really shows how Sony employees have learnt from the mistakes of PS3.


I think part of why he is so honest is that he's not really a Sony employee.



Intel Core i7 3770K [3.5GHz]|MSI Big Bang Z77 Mpower|Corsair Vengeance DDR3-1866 2 x 4GB|MSI GeForce GTX 560 ti Twin Frozr 2|OCZ Vertex 4 128GB|Corsair HX750|Cooler Master CM 690II Advanced|

ethomaz said:
platformmaster918 said:
I kinda wish they could have gotten that bandwidth though haha. I guess the Xbox's RAM will be tougher to use, if he could've had all that and still decided against it, even with dat bandwidth, for developer friendliness' sake.

eDRAM gives more bandwidth than eSRAM, but there are cons too.

I think if MS had the option to change to GDDR5 without delaying the Xbone, then they should have done that, but in the end they designed around DDR3 due to the 4GB limitation of GDDR5... Sony really got lucky on this one.

I expect the XB1 couldn't even use GDDR5 even if you dropped it in. Reading the fine print, the PS4's processor is ever so slightly faster, so the additional throughput would be irrelevant for a processor which wasn't designed to use it.

The unified memory pool will definitely make the PS4 attractive to devs, but it means this generation has some very odd choices. GDDR5 is not quite as good as GDDR3 for OS and CPU use, so if you really want to push the envelope of the CPU for enemy AI or something, your best platform choice is the Wii U, which has both GDDR3 memory and CPU cores with significant caches.

If you just want to mash X to compile, the PS4 is the no-brainer choice. Sony really did a great job of streamlining development, which I'm sure is why they focus so much on indie stuff: their console is without question the best for indie developers.

I imagine the XB1 will be the graphical powerhouse of the generation between the eSRAM and cloud computing. I just wonder, with graphical diminishing returns, whether that means anything this gen.



Egann said:

I expect the XB1 couldn't even use GDDR5 even if you dropped it in. Reading the fine print, the PS4's processor is ever so slightly faster, so the additional throughput would be irrelevant for a processor which wasn't designed to use it.

The unified memory pool will definitely make the PS4 attractive to devs, but it means this generation has some very odd choices. GDDR5 is not quite as good as GDDR3 for OS and CPU use, so if you really want to push the envelope of the CPU for enemy AI or something, your best platform choice is the Wii U, which has both GDDR3 memory and CPU cores with significant caches.

If you just want to mash X to compile, the PS4 is the no-brainer choice. Sony really did a great job of streamlining development, which I'm sure is why they focus so much on indie stuff: their console is without question the best for indie developers.

I imagine the XB1 will be the graphical powerhouse of the generation between the eSRAM and cloud computing. I just wonder, with graphical diminishing returns, whether that means anything this gen.

You're right that you can't just "drop" GDDR5 memory into the Xbox One; the memory controller (which is part of the CPU these days) needs to support it, hence the entire APU would need to be respun in order to accommodate GDDR5 memory.

As for the PS4 CPU being "slightly faster" - well, DDR3 has roughly 20% lower latency than GDDR5 (10ns vs 12ns), so that's going to be a slight (and possibly the only?) advantage the Xbox One has over the PS4, which should benefit the Xbox's CPU performance, especially when there is a cache miss.

In the end though, it won't matter; the PS4's and Xbox One's CPUs are incredibly slow to begin with. The advantage the PS4 will have is that it can offload some CPU tasks onto the GPU.

Also, I think you meant DDR3, not GDDR3; there are some differences between the technologies.
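
To put those latency numbers in perspective, here's a quick back-of-envelope sketch in Python. The ~1.6 GHz Jaguar clock is my own assumption (not a confirmed spec), and the 10ns/12ns figures are just the ones quoted above:

# Rough sketch: convert the quoted memory latencies into CPU cycles,
# assuming a ~1.6 GHz Jaguar core clock (an assumption, not a spec).
CPU_CLOCK_GHZ = 1.6

def ns_to_cycles(ns, clock_ghz=CPU_CLOCK_GHZ):
    # Cycles the core spends waiting during the given number of nanoseconds.
    return ns * clock_ghz

ddr3_cycles = ns_to_cycles(10)    # ~16 cycles
gddr5_cycles = ns_to_cycles(12)   # ~19 cycles
print(f"DDR3:  ~{ddr3_cycles:.0f} cycles stalled per access")
print(f"GDDR5: ~{gddr5_cycles:.0f} cycles stalled per access")
print(f"Difference: ~{gddr5_cycles - ddr3_cycles:.0f} cycles, paid only on a cache miss")

So even taking those figures at face value, the gap is a handful of cycles, and only on accesses that actually miss the caches.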



--::{PC Gaming Master Race}::--

Pemalite said:

Also, I think you meant DDR3, not GDDR3; there are some differences between the technologies.

It's not like it's easy to find descriptions of these technologies for the laity. I know what a NOT gate is, but beyond that all I can do is read Wikipedia and make SWAGs.



I really enjoyed watching this a couple of days ago. The part where he talks about how devs wanted a unified RAM pool, and the decisions they made in retrospect, was particularly interesting. Also the devs' insistence on no exotic hardware - "if there's hardware out there that can do real-time ray tracing, we don't want it" - does anyone remember those old rumors about Intel pitching Larrabee to console manufacturers?

Seems like devs really wanted straightforward hardware that is easy to work with after all the pain that was the 6th-to-7th-gen jump, and Sony delivered - I can see why they praise the PS4 so much.




Egann said:

I expect the XB1 couldn't even use GDDR5 even if you dropped it in. Reading the fine print, the PS4's processor is ever so slightly faster, so the additional throughput would be irrelevant for a processor which wasn't designed to use it.

The unified memory pool will definitely make the PS4 attractive to devs, but it means this generation has some very odd choices. GDDR5 is not quite as good as GDDR3 for OS and CPU use, so if you really want to push the envelope of the CPU for enemy AI or something, your best platform choice is the Wii U, which has both GDDR3 memory and CPU cores with significant caches.

If you just want to mash X to compile, the PS4 is the no-brainer choice. Sony really did a great job of streamlining development, which I'm sure is why they focus so much on indie stuff: their console is without question the best for indie developers.

I imagine the XB1 will be the graphical powerhouse of the generation between the eSRAM and cloud computing. I just wonder, with graphical diminishing returns, whether that means anything this gen.

You're mixing a few things up here.

GDDR5 and GDDR3 are the same type of RAM... the kind used in graphics cards... GDDR5 is an evolution of GDDR3, so everything GDDR5 does, it does better than GDDR3.

So I think you're talking about DDR3... the same memory used by the Xbone... and the "better for OS and CPU use" part is related to latency.

But I need to explain something important to you guys...

It's not the memory type that defines the latency... GDDR5 and DDR3 are basically the same... what makes them have different latencies is the memory controller implementation... and the memory controllers on video cards are not optimized for low latency.

Another surprise here... the memory controller for DDR3 in AMD APUs has higher latency than the memory controller for GDDR5 in nVIDIA cards... yeah, the GDDR5 in the GTX 680 has lower latency than the DDR3 used with the A10-5800K.

So DDR3 vs GDDR5 will not change the latencies in the Xbone or PS4... what matters is the memory controller implemented in the APU, and in this case the memory controllers are so similar that I don't expect a difference bigger than 2ns in latency between the two consoles.

And a hint... AMD will offer the option to use GDDR5 in their next-gen APUs for the desktop market.
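
A quick illustration of that point, with made-up but typical-looking timings (not the actual PS4/Xbone parts): the absolute latency in nanoseconds is roughly the CAS latency in cycles divided by the command clock, so a GDDR5 chip with a higher cycle count at a much higher clock can land in the same ballpark as, or even below, a DDR3 chip.

# Sketch: DRAM latency in ns is roughly CAS cycles / command clock.
# The timings below are illustrative assumptions, not the consoles' specs.

def cas_latency_ns(cas_cycles, command_clock_mhz):
    # Convert a CAS latency given in clock cycles into nanoseconds.
    return cas_cycles / command_clock_mhz * 1000.0

# Hypothetical DDR3-1600 part: CL11 at an 800 MHz command clock.
ddr3_ns = cas_latency_ns(11, 800)
# Hypothetical 5.5 Gbps GDDR5 part: CL15 at a ~1375 MHz command clock
# (GDDR5 moves 4 bits per pin per command-clock cycle, so 5500 / 4).
gddr5_ns = cas_latency_ns(15, 5500 / 4)

print(f"DDR3:  ~{ddr3_ns:.1f} ns")   # ~13.8 ns
print(f"GDDR5: ~{gddr5_ns:.1f} ns")  # ~10.9 ns

The controller and the rest of the memory subsystem add their own delays on top of that, which is where most of the real-world difference comes from.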



A question: does that mean that the PS4 is easier to develop for than the One?



Pemalite said:

You're right that you can't just "drop" GDDR5 memory into the Xbox One; the memory controller (which is part of the CPU these days) needs to support it, hence the entire APU would need to be respun in order to accommodate GDDR5 memory.

As for the PS4 CPU being "slightly faster" - well, DDR3 has roughly 20% lower latency than GDDR5 (10ns vs 12ns), so that's going to be a slight (and possibly the only?) advantage the Xbox One has over the PS4, which should benefit the Xbox's CPU performance, especially when there is a cache miss.

In the end though, it won't matter; the PS4's and Xbox One's CPUs are incredibly slow to begin with. The advantage the PS4 will have is that it can offload some CPU tasks onto the GPU.

Also, I think you meant DDR3, not GDDR3; there are some differences between the technologies.

That latency comparison isn't accurate.

Like I explained before, the latency is defined by the memory controller, and I already gave an example where GDDR5 has lower latency than DDR3 (GTX 680 vs A10-5800K).

You don't know the latency of the GDDR5 in the PS4 or the latency of the DDR3 in the Xbone... so we can't make any assumptions here.

And 10ns vs 12ns theoretical (in practice the latency is ~40ns) won't give you a noticeable difference in CPU tasks.
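
To make that concrete, here's the same kind of back-of-envelope sketch (the ~40ns practical figure and the ~2ns controller difference are just the rough numbers from this thread, not measured values):

# Sketch: how much a ~2ns memory-controller difference matters if a real
# cache miss costs on the order of 40ns end to end. Rough figures only.
PRACTICAL_MISS_NS = 40.0     # assumed end-to-end cost of a cache miss
CONTROLLER_DELTA_NS = 2.0    # assumed worst-case controller difference

fraction = CONTROLLER_DELTA_NS / PRACTICAL_MISS_NS
print(f"~{fraction:.0%} of a single cache miss")   # ~5%

And since the vast majority of loads hit in cache and never pay the full miss cost, the impact on overall CPU performance would be far smaller than that ~5%.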



I think he did a good job with the PS4. I'm just glad the guy that made the PS3 isn't coming back lol.



I just saw the video, by mistake, on YouTube, but within the first 2 minutes of watching I was hooked, then watched the whole thing hehehe :)

So thumbs up for this topic ;)