
Wii U hacked and CPU clock speed revealed, apparently.

I don't think it's fair to say that what we see now is the best we'll get, especially since 1st party offerings are ported Wii games, and 3rd party offerings are hastily ported 360 games.

Remember, during WiiU development, Nintendo and many 3rd parties made comments about how Nintendo built the WiiU to fit their requests. Gearbox was especially positive on the WiiU during development. And the Trine 2 devs have said that they would have to take stuff out of the WiiU version to get it to run on the 360 or PS3. There is more here than meets the eye, for certain.




Soleron said:

timmah said:
Those of you that keep making sensationalized statements based only on the clock speed number don't understand that this CPU is a totally different architecture than what you're familiar with.

No. It's a very similar architecture to Gamecube and Wii. It's also used in servers we have performance data for. We are VERY familiar with it.

The WiiU CPU is a RISC processor, the other consoles use a CISC processor.

Nope they're both RISC.

The difference is that a RISC uses significantly less complex instructions, meaning there is a lot less processing overhead when crunching numbers. RISC processors are also designed to do more with each clock cycle (better efficiency). This provides a huge boost per clock cycle in the amount of computations that can be done, especially for gaming tasks such as physics and AI.

This was true in the 1980s. Now it is not, because instruction complexity isn't much of a factor in efficiency any more. The overhead for x86 vs ARM, estimated by Intel, is 10% of die area. The difference between the Wii U and 360 dies is a LOT more than 10%. Try 300%. And even then a 10% difference can easily be swamped by better or worse design choices by each CPU.

This is why, though the CPU has less raw clock speed than the 360, it could theoretically be a bit faster at the tasks it's asked to do when code is optimized for the architecture.

Yes. But the clock speed disparity is so huge that the Wii U CPU cannot be faster no matter how much better you might care to speculate it is per clock.

Coupled with the fact that the GPU is much faster and can take on some additional tasks, plus there is a separate DSP, and the fast interconnects referenced by z101, this gives it a significant leg up over the current gen

Yes. The GPU is faster. The Wii U has different strengths and weaknesses to 360/PS3 for sure.

in future *optimized* games (which none of these ports were).

No. What we see now is what we get. Optimisation cannot produce the kind of leap you are imagining, and third parties will not be putting in the effort to do so because the easy route is just to stick to PS3/360 for another year and then jump to PS4/720.

They told us the Cell would eventually be optimised for and be amazingly better. The Cell was actually suited to optimisation as well, being something brand new (unlike Wii U). We didn't see any leap of that kind, especially not versus the competition.




Ah, for some reason I thought the 360 was using a CISC processor; I remember that the original Xbox did. Either way, it does appear from the limited info I've seen that we end up with roughly 2x the overall raw power of the current crop of systems, which is about where I expected things to be. I've also read that Nintendo worked very hard on optimizing the architecture of the system, so I'm very interested to see what can be done with it (I think it will surprise some naysayers in the end).

On the 'what you see is what you get' comment, I disagree. That's never been the case with early ports on any new system, and we've always seen significant improvements in graphical quality over the life of a console. This is past history, not just a guess; the PS2 and PS3 are both excellent examples. I just don't buy that, for the first time in gaming history, a console's potential was somehow fully realized on launch day. We'll just have to wait and see.

Oh, and thank you for actually putting a reasoned argument forward, unlike some :).



Crono141 said:
I don't think it's fair to say that what we see now is the best we'll get, especially since 1st party offerings are ported Wii games, and 3rd party offerings are hastily ported 360 games.

Remember, during WiiU development, Nintendo and many 3rd parties made comments about how Nintendo built the WiiU to fit their requests. Gearbox was especially positive on the WiiU during development. And the Trine 2 devs have said that they would have to take stuff out of the WiiU version to get it to run on the 360 or PS3. There is more here than meets the eye, for certain.


This! Well said.

Also, the early tech demos (the Japanese garden demo and the Zelda demo, each rendered separately on two screens) could certainly not have been done on a 'weak' piece of hardware.

And Trine 2 is beautiful; the fact that it was done by a low-budget studio is pretty amazing.



timmah said:
Kynes said:
timmah said:
Those of you that keep making sensationalized statements based only on the clock speed number don't understand that this CPU is a totally different architecture than what you're familiar with. The WiiU CPU is a RISC processor, the other consoles use a CISC processor. The difference is that a RISC uses significantly less complex instructions, meaning there is a lot less processing overhead when crunching numbers. RISC processors are also designed to do more with each clock cycle (better efficiency). This provides a huge boost per clock cycle in the amount of computations that can be done, especially for gaming tasks such as physics and AI. This is why, though the CPU has less raw clock speed than the 360, it could theoretically be a bit faster at the tasks it's asked to do when code is optimized for the architecture. Coupled with the fact that the GPU is much faster and can take on some additional tasks, plus there is a separate DSP, and the fast interconnects referenced by z101, this gives it a significant leg up over the current gen in future *optimized* games (which none of these ports were)

 

Wow, great argument. People like you are the reason I quit posting, years ago, on this site full of people who don't have a clue how to debate & disagree in a civil manner. I'm wondering why I came back.

You came back to post misleading information, perhaps. The architecture of the Wii U CPU is over a decade old. It loses out to just about any modern architecture on efficiency. The fact that it runs at such extremely low clocks makes the "woo, GHz doesn't matter anymore" argument false in this case, except perhaps against the iPads and iPhones out there. It loses to the Cell and the Xenon on most significant measurements.

Wii U CPU: 8,400 MIPS and ~13 GFLOPS. Xenon: 19,200 MIPS and ~110 GFLOPS. The PPE on the Cell alone (excluding the seven SPEs) does 10,200 MIPS and ~26 GFLOPS.

Also, like I said, the only good thing about it all is the general-purpose aspect of current GPUs... but as the Wii U games are clearly showing, that doesn't cover everything, does it?
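For context on where figures like those come from: peak throughput is basically cores x clock x FLOPs per cycle, and published numbers vary depending on what gets counted (vector units, the scalar FPU, dot-product tricks, and so on). Here's a back-of-the-envelope sketch in Python, where the per-cycle throughputs are assumptions for illustration rather than confirmed specs:

# Rough peak-throughput sketch: cores x clock (GHz) x FLOPs per cycle.
# The FLOPs-per-cycle values below are assumptions for illustration,
# not confirmed specs; quoted figures differ depending on what is counted.

def peak_gflops(cores, clock_ghz, flops_per_cycle):
    return cores * clock_ghz * flops_per_cycle

# Wii U "Espresso": 3 cores @ ~1.24 GHz, assuming 750-style paired singles
# (2-wide FMA = 4 FLOPs/cycle per core)
print(peak_gflops(3, 1.24, 4))   # ~15 GFLOPS peak

# Xenon: 3 cores @ 3.2 GHz, assuming VMX128 (4-wide FMA = 8 FLOPs/cycle per core)
print(peak_gflops(3, 3.2, 8))    # ~77 GFLOPS peak from the vector units alone

However you count it, the gap in peak vector throughput is several times over, which is the same story the ~13 vs ~110 GFLOPS figures above are telling.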



http://www.neogaf.com/forum/showthread.php?t=502059&page=2 (Post 89)

Just a quick update from the team that made Trine 2 since they have been very supportive of the Wii U. He talks about making the graphics more vivid for the next patch.

We are happy to see Trine 2: Director's Cut up there. :) The sales have been good and we're loving the user comments. :)


I agree. We originally wanted to do a -20% discount in North America too - just like we are doing in Europe right now - but it wasn't possible just yet.

We're thinking of doing the -20% for the Holidays though... nothing certain yet but we think it makes sense, although I'm not sure if there's enough time since the launch to not anger anybody... Maybe we'll time it with the update that we plan to bring in mid-to-late December (featuring Voice Chat, Wii U Pro Controller support, minor fixes, new languages like BR-PT, and additional vividness to the graphics so it will look even more awesome). We also plan to bring the demo in December.

Let's see how it goes... Looks like everybody's happy with the game so we have good reason to be too. :)

- Joel, Frozenbyte team



haxxiy said:
timmah said:
Kynes said:
timmah said:
Those of you that keep making sensationalized statements based only on the clock speed number don't understand that this CPU is a totally different architecture than what you're familiar with. The WiiU CPU is a RISC processor, the other consoles use a CISC processor. The difference is that a RISC uses significantly less complex instructions, meaning there is a lot less processing overhead when crunching numbers. RISC processors are also designed to do more with each clock cycle (better efficiency). This provides a huge boost per clock cycle in the amount of computations that can be done, especially for gaming tasks such as physics and AI. This is why, though the CPU has less raw clock speed than the 360, it could theoretically be a bit faster at the tasks it's asked to do when code is optimized for the architecture. Coupled with the fact that the GPU is much faster and can take on some additional tasks, plus there is a separate DSP, and the fast interconnects referenced by z101, this gives it a significant leg up over the current gen in future *optimized* games (which none of these ports were)

 

Wow, great argument. People like you are the reason I quit posting, years ago, on this site full of people who don't have a clue how to debate & disagree in a civil manner. I'm wondering why I came back.

You came back to post misleading information, perhaps. The architecture of the Wii U CPU is over a decade old. It loses out to just about any modern architecture on efficiency. The fact that it runs at such extremely low clocks makes the "woo, GHz doesn't matter anymore" argument false in this case, except perhaps against the iPads and iPhones out there. It loses to the Cell and the Xenon on most significant measurements.

Wii U CPU: 8,400 MIPS and ~13 GFLOPS. Xenon: 19,200 MIPS and ~110 GFLOPS. The PPE on the Cell alone (excluding the seven SPEs) does 10,200 MIPS and ~26 GFLOPS.

Also, like I said, the only good thing about it all is the general-purpose aspect of current GPUs... but as the Wii U games are clearly showing, that doesn't cover everything, does it?

From what I understand, IBM doesn't even make the 750 anymore, but has opted to do a 'custom' processor line instead. This is a custom chip in the PowerPC family (they said it was 'similar' to Broadway, not the same; anything in the PowerPC family would be 'similar' since it's the same architecture. Everything in Intel's lineup could be called 'similar', but an i3 beats the pants off a Core 2 clock for clock). It's certainly not just an overclock of 10-year-old tech; it is something custom built for Nintendo. Even the hacker said you couldn't compare this clock for clock with the 360's processor. Since we don't have the real specs, anything else is just opinion and conjecture.



timmah said:

Also, the early tech demos (the Japanese garden demo and the Zelda demo, each rendered separately on two screens) could certainly not have been done on a 'weak' piece of hardware.

And Trine 2 is beautiful; the fact that it was done by a low-budget studio is pretty amazing.

You may notice from the latest DF face-offs that where the WiiU is choking and heavily dropping in performance is in scenes with a lot of characters - so a lot of AI, which is run on the CPU and will stay on the CPU for a long time. Though the CPU is not that great, the GPU is decent, with some 2-3 times the shader performance of the PS360 (though there is still uncertainty whether it is based on Turks (480:24:8), Redwood (400:20:8) or, as some suggest, Redwood LE (320:16:8)), so games like Trine 2, which are quite GPU heavy, will look better on it (as they'll be using those shaders to their full potential).
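To put those candidate configs in raw numbers, here's a quick sketch. It assumes, purely for illustration, a ~550 MHz Wii U GPU clock and 2 FLOPs per ALU per cycle (FMA); real-world results depend on much more than peak ALU throughput:

# Peak shader throughput: ALUs x 2 FLOPs (FMA) x clock in GHz.
# The ~550 MHz Wii U GPU clock is an assumption here; Xenos is the 360's GPU
# at its known 500 MHz, included for comparison.
candidates = {
    "Turks-like (480 ALUs)":      (480, 0.55),
    "Redwood-like (400 ALUs)":    (400, 0.55),
    "Redwood LE-like (320 ALUs)": (320, 0.55),
    "Xenos / 360 (240 ALUs)":     (240, 0.50),
}

for name, (alus, clock_ghz) in candidates.items():
    print(f"{name}: ~{alus * 2 * clock_ghz:.0f} GFLOPS peak")

On paper that works out to roughly 1.5-2.2x Xenos depending on the config; a '2-3x' figure presumably also credits the newer architecture with getting more useful work out of each FLOP.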



haxxiy said:
timmah said:
Kynes said:
timmah said:
Those of you that keep making sensationalized statements based only on the clock speed number don't understand that this CPU is a totally different architecture than what you're familiar with. The WiiU CPU is a RISC processor, the other consoles use a CISC processor. The difference is that a RISC uses significantly less complex instructions, meaning there is a lot less processing overhead when crunching numbers. RISC processors are also designed to do more with each clock cycle (better efficiency). This provides a huge boost per clock cycle in the amount of computations that can be done, especially for gaming tasks such as physics and AI. This is why, though the CPU has less raw clock speed than the 360, it could theoretically be a bit faster at the tasks it's asked to do when code is optimized for the architecture. Coupled with the fact that the GPU is much faster and can take on some additional tasks, plus there is a separate DSP, and the fast interconnects referenced by z101, this gives it a significant leg up over the current gen in future *optimized* games (which none of these ports were)

 

Wow, great argument. People like you are the reason I quit posting, years ago, on this site full of people who don't have a clue how to debate & disagree in a civil manner. I'm wondering why I came back.

You came back to post misleading information, perhaps. The architecture of the Wii U CPU is over a decade old. It loses out to just about any modern architecture on efficiency. The fact that it runs at such extremely low clocks makes the "woo, GHz doesn't matter anymore" argument false in this case, except perhaps against the iPads and iPhones out there. It loses to the Cell and the Xenon on most significant measurements.

Wii U CPU: 8,400 MIPS and ~13 GFLOPS. Xenon: 19,200 MIPS and ~110 GFLOPS. The PPE on the Cell alone (excluding the seven SPEs) does 10,200 MIPS and ~26 GFLOPS.

Also, like I said, the only good thing about it all is the general-purpose aspect of current GPUs... but as the Wii U games are clearly showing, that doesn't cover everything, does it?

Take a look at the later tweets by the person who actually did the hack (had seen these earlier but took them from another post in this thread)...

"It's worth noting that Espresso is *not* comparable clock per clock to a Xenon or a Cell. Think P4 vs. P3-derived Core series."

"The Espresso is an out of order design with a much shorter pipeline. It should win big on IPC on most code, but it has weak SIMD."

"And I'm sure it's not an "idle" clock speed. 1.24G is exactly in line with what we expected for a 750-based design."

"So yes, the Wii U CPU is nothing to write home about, but don't compare it clock per clock with a 360 and claim it's much worse. It isn't."

So, different, but not as big a deal as you're making it out to be.
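To put that 'P4 vs. P3-derived Core series' analogy in rough numbers: effective throughput on general-purpose code is roughly clock x IPC, and an out-of-order core with a short pipeline tends to sustain a much higher IPC on branchy game code than a high-clocked in-order design. The IPC values in this toy sketch are hypothetical, purely for illustration:

# Toy clock-x-IPC comparison; the IPC figures below are hypothetical.
def effective_mips(clock_mhz, ipc, cores=3):
    return clock_mhz * ipc * cores

# Lower clock, but (hypothetically) higher sustained IPC on branchy code...
espresso_like = effective_mips(1240, ipc=1.6)

# ...versus a higher clock with (hypothetically) lower sustained IPC.
xenon_like = effective_mips(3200, ipc=0.6)

print(int(espresso_like), int(xenon_like))  # a similar ballpark on this kind of code

On SIMD-heavy code the picture flips, since Espresso's paired singles are far narrower than Xenon's VMX128 units, which is exactly the 'weak SIMD' caveat in the tweets.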



I wish that guy hadn't made that comment. I mean, he's not wrong, but it leaves all this hope for certain people that the Wii U CPU can perform much better than the 360 CPU on future games.



HoloDust said:
timmah said:

Also, the early tech demos (the Japanese garden demo and the Zelda demo, each rendered separately on two screens) could certainly not have been done on a 'weak' piece of hardware.

And Trine 2 is beautiful; the fact that it was done by a low-budget studio is pretty amazing.

You may notice from the latest DF face-offs that where the WiiU is choking and heavily dropping in performance is in scenes with a lot of characters - so a lot of AI, which is run on the CPU and will stay on the CPU for a long time. Though the CPU is not that great, the GPU is decent, with some 2-3 times the shader performance of the PS360 (though there is still uncertainty whether it is based on Turks (480:24:8), Redwood (400:20:8) or, as some suggest, Redwood LE (320:16:8)), so games like Trine 2, which are quite GPU heavy, will look better on it (as they'll be using those shaders to their full potential).

My specific experience with BLOPS2 was that it had a couple of minor issues during one heavy action scene, but seemed to handle equally crazy scenes later in that same level just fine. What was strange was that the worst framerate issue I saw was during a cutscene when absolutely nothing was going on, so I chalk some of this up to poor optimization. I still think it's just too early to pass judgment. If the same thing had held true for the PS3 based on some early games, you never would have seen amazing-looking titles like the Uncharted games, for example.