ListerOfSmeg said:
Have the Wii U specs even been confirmed? In the only breakdown I saw, they looked over half the chip, didn't know what those areas were for, and just ran with what they did know.
I am looking for an article now that talked about the GC chip having an area dedicated to shaders, etc. that most developers never used. I did find this article, though, that compared the GC CPU to the Xbox CPU:
"Also, this whole processor thing is quite twisted considering Xbox and GameCube are two TOTALLY DIFFERENT architectures (a 32/64-bit hybrid, native PowerPC, compared to 32-bit Wintel). GameCube, with this architecture, has a significantly shorter pipeline than Xbox's PIII setup (4-7 stages versus up to 14), meaning it can process information more than twice as fast per clock cycle. In fact, this GCN CPU (an IBM PowerPC 750e chip) at 400 MHz is often said to be as fast as a 700 MHz machine. So, performance-wise, GCN could be an 849 MHz machine compared to Xbox's 733 MHz."
Now, if the GC at 400 MHz was equal to 700 MHz, then the Wii U at 1.2 GHz is around 2.1 GHz relative to a non-PPC chip.
Here is the link: http://www.purevideogames.net/blog/?p=479
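For what it's worth, the per-clock scaling argument above is just a ratio applied to a clock speed. A minimal sketch (assuming the article's unverified 700-to-400 per-clock claim, and the GC's actual 485 MHz clock):

```python
def effective_clock(actual_mhz, per_clock_factor):
    """Scale a clock speed by an assumed per-clock performance factor."""
    return actual_mhz * per_clock_factor

# The article's claim: a 400 MHz Gekko performs like a 700 MHz x86 chip,
# i.e. a per-clock factor of 700/400 = 1.75 (an assumption, not a measurement).
factor = 700 / 400

# GameCube's Gekko actually runs at 485 MHz -> the article's "849 MHz" figure
print(effective_clock(485, factor))   # 848.75

# Wii U at roughly 1200 MHz -> the "around 2.1 GHz" figure in the post
print(effective_clock(1200, factor))  # 2100.0
```

Of course, as noted below, a single per-clock multiplier like this glosses over everything else in the architecture, so it only reproduces the post's arithmetic, not real-world performance.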

Sadly, I am unable to find the article I originally looked for. It explained how the GC had a separate area for shader support that few developers actually used. Since the Wii U chip traces its lineage all the way back to the GC, I was curious whether that area must still be there. I remember that in the one breakdown of the Wii U I saw, they didn't know what some areas did and so dismissed them.

CPU architectures have moved on a long way since then, and x86 CPUs are now at least equal to PowerPC in efficiency.

As you mentioned already, clock speed counts for very little compared to the overall architecture.