DDR3 is still slower
I don't see how GDDR5 RAM is the new Cell.
SCE isn't using it as a marketing bullet point because most consumers don't know and won't care about the differences.
SCE spent an assload of money developing the CBE and tried to use that as a marketing point to techies. They didn't invent GDDR5 memory or its use by video cards.
GDDR5 is what's used in top-end VGA cards for the performance boost. The GDDR5 in the PS4 is shared between the CPU and GPU; in other words, it's used by the GPU.
We don't feel the need to defend the use of cheaper DDR3 memory in lower-end VGA cards, so why are we doing it now?
If the rumour is true that MS was forced to downclock it compared to what was initially planned, the slower RAM shouldn't hurt it anymore.
| drkohler said: That link "example" has had me puzzled ever since I saw it. Unfortunately it leads to people falsely "adding up numbers", as you correctly mention. 1. They couldn't figure out where the four DMA controllers go, so they put them idle at the bottom without showing what they connect to. The display/scan box should be drawn connected to the esram, of course, not placed in limbo (and the video encoder likely sits there, too). 2. The read command of the gpu is shown as 4G/s. But where is it reading from? Either from esram or from dram, so these bandwidth numbers are already allocated for at all times... 3. The gpu memory controller shows 0.5G/s write + 5.5G/s "read from dram". But where is it reading from? Either from esram or dram, so these bandwidth numbers are already allocated for at all times... 4. The gpu memory controller shows 102G/s r/w to esram and 42G/s r/w to dram. If that were true, there would be two physically separated (256bit) data and (34bit) address busses, in addition to the physically separated 37G/s data/address bus to the gpu and another 25G/s data/address bus to the Northbridge. That is one hell of a bus layout. Congrats to the hardware engineers (or more likely, this diagram is nonsense). All in all, this diagram is more puzzling than revealing. |
I think that diagram is guesswork based on what information the guy has about Durango.
One of the questions I have is about the Data Move engines...
Do the Data Move engines have a direct bus connection to all memory (eSRAM and main RAM)? If the point is to free the CPU/GPU from these tasks, then they need access to both eSRAM and main RAM so they can move data without asking the CPU... but how does that work? How can they move data without knowing what the CPU or GPU is working on or doing?
That's one example...
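For what it's worth, the headline numbers in that diagram do line up with simple bus math. A rough sketch below; the bus widths and clocks are the commonly reported Durango figures, i.e. assumptions on my part, not anything the diagram itself confirms (the diagram's 42G/s to dram would then be just the GPU's slice of the ~68G/s total):

```python
# Back-of-the-envelope check of the diagram's headline bandwidth numbers.
# Bus widths and clocks are the commonly reported Durango figures (assumptions).

def bus_bandwidth_gbs(bus_width_bits, transfers_per_sec):
    """Peak bandwidth in GB/s = bus width in bytes * transfer rate."""
    return (bus_width_bits / 8) * transfers_per_sec / 1e9

dram = bus_bandwidth_gbs(256, 2133e6)    # DDR3-2133 on a 256-bit bus
esram = bus_bandwidth_gbs(1024, 800e6)   # eSRAM: 1024-bit bus at the 800 MHz GPU clock

print(f"DDR3 main RAM: {dram:.1f} GB/s")   # ~68.3 GB/s
print(f"eSRAM:         {esram:.1f} GB/s")  # ~102.4 GB/s
```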
| Scoobes said: Seriously dude, just give up. You're obviously wrong and this thread is going in circles. Your OP is simply a case of confirmation bias. |
I am not sure what you think my OP is stating. I'm just saying that Xbox One would not be much more powerful with GDDR5 RAM, due to its specific APU design and lower GPU power compared to the PS4.
I am not sure why everyone thinks I'm saying DDR3 > GDDR5 or X1 > PS4... I'm clearly not saying that.
I really wasn't expecting such a shitstorm; I just wanted to end the posts saying "If Xbox One used GDDR5 RAM it would be much more powerful". It really wouldn't. 2133 MHz DDR3 + eSRAM is plenty for the 33% lower-spec'd GPU in the X1.
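For context, the "33% lower" figure comes from the CU counts: 12 CUs vs 18 CUs, with both chips rumoured at the time to run at 800 MHz (an assumption; final clocks may differ). A quick sketch using the standard GCN throughput of 128 FLOPs per CU per clock:

```python
# Rough compute comparison from CU counts.
# GCN: 64 shader lanes per CU, 2 FLOPs per lane per clock (fused multiply-add).
# The 800 MHz clock for both chips is the figure circulating at the time (assumption).

FLOPS_PER_CU_PER_CLOCK = 64 * 2

def gpu_tflops(cus, clock_ghz):
    return cus * FLOPS_PER_CU_PER_CLOCK * clock_ghz / 1000

print(f"X1:  {gpu_tflops(12, 0.8):.2f} TFLOPS")        # ~1.23 TFLOPS
print(f"PS4: {gpu_tflops(18, 0.8):.2f} TFLOPS")        # ~1.84 TFLOPS
print(f"X1 has {(1 - 12 / 18) * 100:.0f}% fewer CUs")  # 33%
```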
Maybe a better thread than this is that the XB1 doesn't need to use 3GB of memory for the UI/OS, regardless of whether it's DDR3 or GDDR5.
| greenmedic88 said: Maybe a better thread than this is that the XB1 doesn't need to use 3GB of memory for the UI/OS, regardless of whether it's DDR3 or GDDR5. |
Why doesn't it need to use 3GB for UI and OS? I like multitasking, preloaded games, insta resume, fast switching, pinning apps...


| petalpusher said: There's no latency advantage with DDR3, it's a myth. In fact GDDR5 has better latency; it's just a better memory chip that can handle a higher multiplier. The main difference between the two types of memory is that with DDR3 you multiply the base clock by 2, and with GDDR5 by 4. GDDR5's base clock is also higher. It's simple math: XB1 = 1066 x 2 x 256 / 8 = 68 GB/s; PS4 = 1375 x 4 x 256 / 8 = 176 GB/s |
Actually there is a latency advantage with DDR3, however it's only about 20%, i.e. ~10ns for DDR3 vs ~12ns for GDDR5. Pretty tiny.
The irony, though, is that Microsoft went with DDR3 because it's cheaper to manufacture, yet its price is going to increase from here on out as the PC transitions over to DDR4.
That's in stark contrast to GDDR5, which isn't yet used in the low-end cards that actually sell in high volumes; once that happens, economies of scale should kick in and prices should drop further, that is, until the PC transitions over to GDDR6.
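To put numbers on both points, here's a small sketch of the bandwidth formula from the quote above, plus the rough latency figures (the ~10ns/~12ns values are the approximate numbers cited here, not datasheet values):

```python
# Peak bandwidth per the quoted formula: base clock (MHz) * multiplier * bus width (bits) / 8.
# The multiplier is 2 for DDR3 and 4 for GDDR5.

def peak_bandwidth_gbs(base_clock_mhz, multiplier, bus_width_bits):
    return base_clock_mhz * multiplier * bus_width_bits / 8 / 1000

print(f"XB1 DDR3:  {peak_bandwidth_gbs(1066, 2, 256):.0f} GB/s")  # ~68 GB/s
print(f"PS4 GDDR5: {peak_bandwidth_gbs(1375, 4, 256):.0f} GB/s")  # 176 GB/s

# The latency gap is tiny in absolute terms (rough figures cited above):
ddr3_ns, gddr5_ns = 10, 12
print(f"Latency difference: {gddr5_ns - ddr3_ns} ns (~{(gddr5_ns / ddr3_ns - 1) * 100:.0f}%)")
```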

| disolitude said: I am not sure what you think my OP is stating. I'm just saying that Xbox One would not be much more powerful with GDDR5 RAM, due to its specific APU design and lower GPU power compared to the PS4. I am not sure why everyone thinks I'm saying DDR3 > GDDR5 or X1 > PS4... I'm clearly not saying that. I really wasn't expecting such a shitstorm; I just wanted to end the posts saying "If Xbox One used GDDR5 RAM it would be much more powerful". It really wouldn't. 2133 MHz DDR3 + eSRAM is plenty for the 33% lower-spec'd GPU in the X1. |
The APU design used by MS is debatable, but even on a low-end GPU, using GDDR5 to increase the bandwidth almost doubles the GPU performance... AMD's words... not mine.
| Pemalite said: Actually there is a latency advantage with DDR3, however it's only about 20%, i.e. ~10ns for DDR3 vs ~12ns for GDDR5. Pretty tiny. |
I think MS went with DDR3 because they needed 8GB of RAM, and using 32 chips of GDDR5 was out of the question one or two years ago... Sony wanted 4GB of RAM, so 16 chips of GDDR5 was fine (quick math sketched below).
Both created their projects/designs with these points in mind:
PS4: 4GB RAM
Xbone: 8GB RAM
What happened? The industry released denser GDDR5 chips (512MB each) and Sony turned the tables without changing a single point of the original project/design.
And MS? After a year or more spent creating alternatives to work around the low bandwidth of DDR3 (eSRAM on chip, Data Move engines, etc.), they couldn't spend another year changing the project to GDDR5 and delay the Xbone by at least half a year... so they kept the original project/design, with all the issues/trouble it brings.
Sony got lucky, but luck is on the side of champions too.
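A quick back-of-the-envelope on those chip counts, assuming the 2Gbit (256MB) GDDR5 parts available when the designs were locked and the 4Gbit (512MB) parts that arrived later (the timeline is my assumption):

```python
# Chips needed to hit a capacity target at a given per-chip density.
# 256MB = the 2Gbit GDDR5 parts available early on; 512MB = the denser 4Gbit parts.

def chips_needed(total_gb, chip_mb):
    return total_gb * 1024 // chip_mb

print(chips_needed(4, 256))  # 16 chips: Sony's original 4GB plan
print(chips_needed(8, 256))  # 32 chips: why 8GB of GDDR5 looked impractical
print(chips_needed(8, 512))  # 16 chips: the denser parts that let Sony jump to 8GB
```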