haxxiy said:
timmah said:
Kynes said:
timmah said:
Those of you who keep making sensationalized statements based only on the clock speed number don't understand that this CPU is a totally different architecture than what you're familiar with. The Wii U CPU is a RISC processor; the other consoles use a CISC processor. The difference is that RISC uses significantly less complex instructions, meaning there is a lot less processing overhead when crunching numbers. RISC processors are also designed to do more with each clock cycle (better efficiency). This provides a huge boost per clock cycle in the amount of computation that can be done, especially for gaming tasks such as physics and AI. This is why, though the CPU has a lower raw clock speed than the 360's, it could theoretically be a bit faster at the tasks it's asked to do when code is optimized for the architecture. Coupled with the fact that the GPU is much faster and can take on some additional tasks, plus there is a separate DSP, and the fast interconnects referenced by z101, this gives it a significant leg up over the current gen in future *optimized* games (which none of these ports were).

Wow, great argument. People like you are the reason I quit posting, years ago, on this site full of people who don't have a clue how to debate and disagree in a civil manner. I'm wondering why I came back.

You came back to post misleading information, perhaps. The architecture of the Wii U CPU is over a decade old, and it loses out to just about any modern architecture on efficiency. The fact that it runs at such extremely low clocks makes the "woo, GHz doesn't matter anymore" line false in this case, except perhaps against the iPads and iPhones out there. It loses to the Cell and the Xenon on most significant measurements.

Wii U CPU: 8,400 MIPS and ~13 GFLOPS. Xenon: 19,200 MIPS and ~110 GFLOPS. The PPE on the Cell alone (excluding the seven SPEs) does 10,200 MIPS and ~26 GFLOPS.
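For what it's worth, those figures can also be normalized per clock and per core. This is only a rough back-of-the-envelope sketch, assuming the 1.24 GHz Espresso clock reported by the person who did the hack, the well-known 3.2 GHz clock of the Xenon and the Cell PPE, and three cores each for Espresso and Xenon:

```python
# Per-clock, per-core normalization of the quoted throughput figures.
# Assumed clocks: Espresso 1.24 GHz (from the hack), Xenon and the
# Cell PPE 3.2 GHz. Assumed core counts: 3, 3, and 1 respectively.
cpus = {
    #            MIPS    GFLOPS  MHz    cores
    "Espresso": (8400,   13,     1240,  3),
    "Xenon":    (19200,  110,    3200,  3),
    "Cell PPE": (10200,  26,     3200,  1),
}

for name, (mips, gflops, mhz, cores) in cpus.items():
    ipc = mips / mhz / cores              # instructions per clock per core
    fpc = gflops * 1000 / mhz / cores     # FLOPs per clock per core
    print(f"{name}: {ipc:.2f} instructions/clock/core, "
          f"{fpc:.1f} FLOPs/clock/core")
```

On these numbers, Espresso actually edges out the Xenon per clock and per core on general instruction throughput (~2.26 vs. ~2.00) while falling well behind on floating point (~3.5 vs. ~11.5 FLOPs/clock/core), which lines up with the tweets quoted below: good IPC, weak SIMD.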

Also, like I said, the only good thing about it all is the general-purpose aspect of current GPUs... but as the Wii U games are clearly showing, that doesn't cover everything, does it?

Take a look at the later tweets by the person who actually did the hack (I had seen these earlier, but took them from another post in this thread)...

"It's worth noting that Espresso is *not* comparable clock per clock to a Xenon or a Cell. Think P4 vs. P3-derived Core series."

"The Espresso is an out of order design with a much shorter pipeline. It should win big on IPC on most code, but it has weak SIMD."

"And I'm sure it's not an "idle" clock speed. 1.24G is exactly in line with what we expected for a 750-based design."

"So yes, the Wii U CPU is nothing to write home about, but don't compare it clock per clock with a 360 and claim it's much worse. It isn't."

So, different, but not as big a deal as you're making it out to be.