
@ Mifely,

you're well read and your arguments sound cool and everything, but I think you simply draw the wrong conclusions when it comes to CPU vs GPU. In the PC world it's nearly always the GPU that is the bottleneck in gaming, since people tend to pair a comparatively strong Intel CPU, for example a Core 2 Duo E8400, with a GeForce 8800GT.

Same phenomenon with consoles. The three cores in the X360 aren't even fully used (because multi-threaded code is tricky to write), same with the PS3, and still they have no trouble feeding their respective GPUs. And the reason why games today look nearly identical on the X360 and PS3 is that the GPUs are nearly identical in performance: the ATI GPU in the X360 performs about on par with a Radeon X1900XT*, and the PS3's is a version of the GeForce 7800GTX - so it's the GPU that is the bottleneck, not the CPU. (X1900XT ≈ 7800GTX)

* Note: I know that the X360 GPU has architectural similarities to ATI's R600, which is why its shader power is comparatively a bit stronger than the X1900XT's and 7800GTX's, but it's no big deal.

And yeah, Moore's law isn't dead, at least not with GPUs. The 8800GTX already nearly doubled the performance of the 7800GTX. And just read the benchmarks for the GeForce GTX 280 and Radeon HD 4870 in a couple of weeks and you'll see it's actually faster than "double every 18 months" in the long run.
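Quick back-of-envelope sketch of that "double every 18 months" claim, assuming rough launch dates of mid-2005 for the 7800GTX and late 2006 for the 8800GTX (the dates are my assumption, not something Mifely said):

```python
# Rough check of the "double every 18 months" pace between two GPU launches.
# Launch-gap figure below is an approximate assumption (7800GTX ~June 2005,
# 8800GTX ~November 2006), not a precise benchmark interval.
months_between = 17      # approximate months between the two launches
doubling_period = 18     # months per doubling, per the classic phrasing

# If performance doubles every `doubling_period` months, the expected
# speedup after `months_between` months is 2 ** (months_between / doubling_period).
expected_factor = 2 ** (months_between / doubling_period)
print(f"Expected speedup after {months_between} months: {expected_factor:.2f}x")
# -> roughly 1.92x, so an actual ~2x jump in that window slightly beats
#    the 18-month doubling pace.
```

Same formula works the other way: plug in whatever speedup the GTX 280 / HD 4870 reviews show and solve for the implied doubling period.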