There is a real problem with misinformation; take this post as an example:
| Hynad said:
| Core Clock Frequency
| Xbox 360 - 500 MHz
| PS3 - 500 MHz
| Triangle Setup
| Xbox 360 - 500 Million Triangles/sec
| PS3 - 250 Million Triangles/sec
| Vertex Shader Processing (Vertex ALU x Clock / 4)
| Xbox 360 - 6.0 Billion Vertices/sec (using all 48 Unified Pipelines)
| Xbox 360 - 2.0 Billion Vertices/sec (using 16 of the 48 Unified Pipelines)
| Xbox 360 - 1.5 Billion Vertices/sec (using 12 of the 48 Unified Pipelines)
| Xbox 360 - 1.0 Billion Vertices/sec (using 8 of the 48 Unified Pipelines)
| PS3 - 1.0 Billion Vertices/sec
| Filtered Texture Fetch
| Xbox 360 - 8.0 Billion Texels/sec
| PS3 - 12.0 Billion Texels/sec
| Vertex Texture Fetch
| Xbox 360 - 8.0 Billion Texels/sec
| PS3 - 4.0 Billion Texels/sec
| Pixel Shader Processing with 16 Filtered Texels Per Cycle (Pixel ALU x Clock)
| Xbox 360 - 24.0 Billion Pixels/sec (using all 48 Unified Pipelines)
| Xbox 360 - 20.0 Billion Pixels/sec (using 40 of the 48 Unified Pipelines)
| Xbox 360 - 18.0 Billion Pixels/sec (using 36 of the 48 Unified Pipelines)
| Xbox 360 - 16.0 Billion Pixels/sec (using 32 of the 48 Unified Pipelines)
| PS3 - 16.0 Billion Pixels/sec
| Pixel Shader Processing without Textures (Pixel ALU x Clock)
| Xbox 360 - 24.0 Billion Pixels/sec (using all 48 Unified Pipelines)
| Xbox 360 - 20.0 Billion Pixels/sec (using 40 of the 48 Unified Pipelines)
| Xbox 360 - 18.0 Billion Pixels/sec (using 36 of the 48 Unified Pipelines)
| Xbox 360 - 16.0 Billion Pixels/sec (using 32 of the 48 Unified Pipelines)
| PS3 - 24.0 Billion Pixels/sec
| Multisampled Fill Rate
| Xbox 360 - 16.0 Billion Samples/sec (8 ROPS x 4 Samples x 500MHz)
| PS3 - 8.0 Billion Samples/sec (8 ROPS x 2 Samples x 500MHz)
| Pixel Fill Rate with 4x Multisampled Anti-Aliasing
| Xbox 360 - 4.0 Billion Pixels/sec (8 ROPS x 4 Samples x 500MHz / 4)
| PS3 - 2.0 Billion Pixels/sec (8 ROPS x 2 Samples x 500MHz / 4)
| Pixel Fill Rate without Anti-Aliasing
| Xbox 360 - 4.0 Billion Pixels/sec (8 ROPS x 500MHz)
| PS3 - 4.0 Billion Pixels/sec (8 ROPS x 500MHz)
| Frame Buffer Bandwidth
| Xbox 360 - 256.0 GB/sec (dedicated for frame buffer rendering)
| PS3 - 20.8 GB/sec (shared with other graphics data: textures and vertices)
| PS3 - 10.8 GB/sec (with 10.0 GB/sec subtracted for textures and vertices)
| PS3 - 8.4 GB/sec (with 12.4 GB/sec subtracted for textures and vertices)
| Texture/Vertex Memory Bandwidth
| Xbox 360 - 22.4 GB/sec (shared with CPU)
| Xbox 360 - 14.4 GB/sec (with 8.0 GB/sec subtracted for CPU)
| Xbox 360 - 12.4 GB/sec (with 10.0 GB/sec subtracted for CPU)
| PS3 - 20.8 GB/sec (shared with frame buffer)
| PS3 - 10.8 GB/sec (with 10.0 GB/sec subtracted for frame buffer)
| PS3 - 8.4 GB/sec (with 12.4 GB/sec subtracted for frame buffer)
| Shader Model
| Xbox 360 - Shader Model 3.0+ / Unified Shader Architecture
| PS3 - Shader Model 3.0 / Discrete Shader Architecture
| Should I say more, MikeB?
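For reference, the quoted figures all fall out of simple throughput formulas (ALU count x clock, or ROPs x samples x clock). A quick Python sketch of that arithmetic, using the numbers exactly as posted, including the disputed 500 MHz RSX clock:

```python
# Throughput arithmetic behind the quoted figures (numbers as posted).
clock = 500e6  # Hz, the clock the quoted post assumes for BOTH GPUs

# Vertex rate: Vertex ALU count x clock / 4
xbox_vertices = 48 * clock / 4   # 6.0e9 vertices/sec (all 48 unified pipes)
ps3_vertices = 8 * clock / 4     # 1.0e9 vertices/sec (8 vertex pipes)

# Pixel shader rate: pixel ALU count x clock
xbox_pixels = 48 * clock         # 24.0e9 pixels/sec (all 48 unified pipes)

# Multisampled fill rate: ROPs x samples per ROP x clock
xbox_samples = 8 * 4 * clock     # 16.0e9 samples/sec
ps3_samples = 8 * 2 * clock      # 8.0e9 samples/sec

print(xbox_vertices, ps3_vertices, xbox_pixels, xbox_samples, ps3_samples)
```

So the post is at least internally consistent; the problem is the inputs, as shown next.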
FALSE: the RSX is clocked at 550 MHz (so all the subsequent math is flawed).
ALMOST TRUE: ...but you can't use all 48 pipelines at the same time, so the theoretical maximum is much lower in practice.
FALSE AND GETTING OLD: that's just the bandwidth of the 10 MB eDRAM, not the frame buffer bandwidth (which is bounded by the memory bandwidth figures quoted above).
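Since the clock error propagates into every derived figure, the simplest correction is to rescale the PS3 numbers by 550/500. The Xenos really is clocked at 500 MHz, so the Xbox 360 figures are unaffected by this particular mistake. A rough sketch:

```python
# Rescale the quoted PS3 figures from the assumed 500 MHz
# to the actual 550 MHz RSX clock; Xenos numbers are unchanged.
scale = 550.0 / 500.0  # = 1.1

ps3_vertices = 1.0e9 * scale  # ~1.1 billion vertices/sec
ps3_pixels = 24.0e9 * scale   # ~26.4 billion pixels/sec (shader-only, no textures)
ps3_fill = 4.0e9 * scale      # ~4.4 billion pixels/sec (no AA)

print(ps3_vertices, ps3_pixels, ps3_fill)
```

Note this only fixes the clock; the fill-rate and bandwidth caveats above still apply.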
As a rule of thumb: ATI GPUs have more raw power, while NVIDIA GPUs are better engineered.
Some references:
http://en.wikipedia.org/wiki/RSX_%27Reality_Synthesizer%27
http://wiki.ps2dev.org/ps3:rsx
http://www.extremetech.com/article2/0,1697,2053309,00.asp