BlueFalcon said:

I posted this in another thread, but I thought it's worthy of consideration in this thread too.

DDR3 of Xbox 720 vs. GDDR5 of PS4 - the impact on the GPU performance.

 

If you take the rumored level of performance of the Xbox 720, estimated at around HD7770 GHz Edition level in floating point, recall that this GPU normally has 4500MHz-effective GDDR5 over a 128-bit bus feeding it. Replace that GDDR5 with DDR3-2133 over the same 128-bit bus and your GPU's memory bandwidth falls from 72GB/sec to just 34GB/sec! Guess what happens? Your GPU's performance will fall 40-50% with this reduction in memory bandwidth.
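For anyone who wants to check the arithmetic, peak bandwidth is just effective transfer rate times bus width. A quick sketch (the 4500 and 2133 figures are the effective data rates quoted above):

```python
def peak_bandwidth_gb_s(transfer_rate_mt_s, bus_width_bits):
    """Peak bandwidth: transfers/sec x bytes moved per transfer."""
    return transfer_rate_mt_s * (bus_width_bits / 8) / 1000  # GB/s

gddr5 = peak_bandwidth_gb_s(4500, 128)  # GDDR5 at 4500 MT/s -> 72.0 GB/s
ddr3 = peak_bandwidth_gb_s(2133, 128)   # DDR3-2133 -> ~34.1 GB/s
print(gddr5, ddr3)
```

Same bus width, same GPU, less than half the bandwidth just from the memory type.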

This likely explains why MS is going to use eSRAM/eDRAM for the GPU: the 8GB of DDR3 is going to be shared with the GPU (i.e., the GPU is going to be severely memory-bandwidth bottlenecked).

In simplest terms, you can take a 1.2Tflop GPU and compare it to a 1.84Tflop GPU on the same GPU architecture, but if the first GPU's memory bandwidth just got neutered by DDR3, its performance will drop like a rock! Effectively, you are no longer comparing a 1.2Tflop GPU to a 1.84Tflop one, because the former can no longer work at full capacity; it's memory-bandwidth bottlenecked.

Example for an Nvidia GPU -- the GT 640's floating-point performance is fixed, but look at the dramatic effect of swapping DDR3 for GDDR5 on actual gaming performance.

GT 640 DDR3 = 46 VP

GT 640 GDDR5 = 68 VP (+48% faster) just from swapping DDR3 for GDDR5.

http://alienbabeltech.com/abt/viewtopic.php?p=41174

Example for an AMD GPU:

3DMark11

HD6670 DDR3 = 1594 marks

HD6670 GDDR5 = 2479 marks (+55%)

http://www.overclockers.com/forums/showthread.php?t=710062

Memory bandwidth for a GPU is like vital performance-enhancing nutrients for a sports athlete.

 

On paper, things are looking MUCH worse for the Xbox 720's GPU. Not only is the GPU's floating-point throughput rumored to be 50% less than that of PS4, but the memory subsystem that feeds the GPU is dramatically inferior to PS4's rumored 4GB GDDR5 setup. The performance difference between an HD7770 GHz Edition with DDR3 and an 85% HD7870 with GDDR5 is going to be more than double in graphical capability, not 50%, because memory bandwidth is required to keep the GPU working at full capacity (specifically, the memory bandwidth feeds the GPU's ROPs).

The Render Output Unit, often abbreviated "ROP" and sometimes called (perhaps more properly) the Raster Operations Pipeline, handles one of the final steps in the rendering process on modern 3D accelerator boards. If you neuter a GPU's memory bandwidth, you neuter the ROPs and thus the final stage of the graphical rendering process. It's like putting Toyota Prius tires on a 700hp rear-wheel-drive supercar. 32MB of eSRAM is not going to be the answer to this problem because it's not large enough.
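To see why the ROPs are where the bandwidth pressure shows up, here's a back-of-the-envelope sketch. The numbers are hypothetical (16 ROPs at 800MHz writing 4-byte RGBA8 pixels, with alpha blending forcing a read-modify-write), not confirmed specs for either console:

```python
# Rough color-write bandwidth the ROPs demand at full rate.
rops = 16              # hypothetical ROP count
clock_hz = 800e6       # hypothetical core clock
bytes_per_pixel = 4    # RGBA8 color target
blend_factor = 2       # blending = read + write per pixel

bandwidth_gb_s = rops * clock_hz * bytes_per_pixel * blend_factor / 1e9
print(bandwidth_gb_s)  # 102.4 GB/s, far beyond what DDR3-2133 on 128-bit supplies
```

Even this crude estimate lands well above 34GB/sec, which is why starved ROPs simply stall instead of running at full rate.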

 

In practice, the Xbox 720's DDR3 memory bandwidth seems to be estimated at just 68GB/sec. The 1.76Tflop HD7850 has 154GB/sec:

http://www.gpureview.com/show_cards.php?card1=678&card2=677
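That 68GB/sec figure only works out if Durango's DDR3 sits on a wider bus than the 128-bit case discussed earlier. A quick check, assuming the 256-bit bus the leaks suggested:

```python
# Likely derivation of the 68 GB/s estimate (256-bit bus is an assumption
# from the Durango leaks, not a confirmed spec).
transfer_rate_mt_s = 2133          # DDR3-2133 effective data rate
bus_width_bytes = 256 / 8          # 256-bit bus -> 32 bytes per transfer
bandwidth = transfer_rate_mt_s * bus_width_bytes / 1000
print(bandwidth)  # ~68.3 GB/s
```

So even doubling the bus width to 256-bit, DDR3-2133 ends up at less than half of the HD7850's 154GB/sec.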

If Sony retains GDDR5 for its GPU, Xbox 720's GPU is going to be significantly slower. Also, if we look at the diagram of rumored Xbox 720 specs, DDR3 traffic has to pass through the NorthBridge before reaching the GPU, which introduces additional latency. If PS4 has GDDR5 and at least some of it is dedicated to the GPU, not only will the memory subsystem feeding the GPU be miles faster than Xbox 720's, but the GPU will be able to access it much more quickly (much like on a real modern graphics card, where the GPU communicates with GDDR5 directly on the same PCB).

Now take the Xbox 720's GPU with its rumored 50% lower GPU power, and consider that it also appears crippled by shared DDR3 memory; it stands to reason that PS4's GPU with its dedicated GDDR5 will mop the floor with it. There is a reason all high-end GPUs on the PC use dedicated GDDR5... so far I am not impressed with these Xbox 720 specs.

@ SuperChunk, 

In your PS4/Orbis chart, the GPU is rumored to be an 800MHz, 18-Compute-Unit HD7970M part (i.e., a 20%-downclocked, 10% Compute-Unit cut-down HD7870 desktop part). You have it at 850MHz with 20 Compute Units, which are the actual specs of the full-fledged HD7970M.

Yeah, so I don't know why NeoGAF people think it will be HD7770 level; in fact, even if they expect mid/low range, they should at least assume a retail HD8000 part.

I don't know about Sony, but Durango will clearly use HD8000.

And the most WTF thing is that some people on NeoGAF said Durango will have some magic assist hardware to make their "rumored HD7770" on par with, or even a little better than, PS4's "rumored 85% HD7870/7970M" lol