
Woah there souixan, the GC didn't use DDR3, it used 1T-SRAM or something like that. The Wii is the one using GDDR3 RAM on a 128-bit bus (in a few words: the same as the PS3 and 360).

Also, nobody knows the ACTUAL clock rates of the Wii's CPU, GPU and RAM yet. And we still don't know the actual architecture of the GPU, which could be the same as the GC's, but could also be closer to the 360's. Mario Galaxy is a good example of graphical achievement on the Wii: it uses many shader effects that I'm certain were not possible on the GC... and it still runs flawlessly at 60 frames per second. So in short, the Wii is well above any console of the past generation.

And no, the Wii doesn't make GC games perform any better.

EDIT

@scorptile: The GameCube's CPU was 485 MHz and the original Xbox's was 733 MHz, but the GC's was arguably more capable because it was a 64-bit design while the Xbox's was 32-bit (note that the PS2's CPU was 128-bit). BUT the Xbox's GPU was faster and more advanced, so it was more powerful than the GC's GPU. And the PS2's GPU was the least advanced and least powerful of the three.

The Wii's CPU is rumored to run at 729 MHz because that number is printed on the CPU's heatspreader... but my GC's CPU doesn't have its clock speed printed on its heatspreader, and neither does the PS3's Cell BE, so the 729 MHz figure is just a random assumption. Personally, I think running a 90nm processor at that speed would be a waste. Of course it MUST be slower than the 360's and PS3's, but think of notebook computers: they usually run at around 1.6~1.8 GHz. So I'd put my money on the Wii's CPU running at over 1 GHz, but under 1.5 GHz.