
Wii U hacked and CPU clock speed revealed, apparently.

Soleron said:
I wish that guy hadn't made that comment. I mean, he's not wrong, but it leaves all this hope for certain people that the Wii U CPU can perform much better than the 360 CPU on future games.

I'm not under any illusion that it's going to blow the 360 out of the water from a CPU standpoint, it won't. I think it might be marginally better if optimized for properly (just my opinion), but that's not even the point. I'm just not buying into the line that it's somehow waaaaaay slower than the 360's CPU and is going to keep every major 3rd party away. That's why I mentioned 'sensationalized' posts in one of my earlier comments.



haxxiy said:
timmah said:
Kynes said:
timmah said:
Those of you who keep making sensationalized statements based only on the clock speed number don't understand that this CPU is a totally different architecture than what you're familiar with. The Wii U CPU is a RISC processor, while the other consoles use a CISC processor. The difference is that RISC uses significantly less complex instructions, meaning there is a lot less processing overhead when crunching numbers. RISC processors are also designed to do more with each clock cycle (better efficiency). This provides a huge boost per clock cycle in the amount of computation that can be done, especially for gaming tasks such as physics and AI. This is why, though the CPU has a lower raw clock speed than the 360's, it could theoretically be a bit faster at the tasks it's asked to do once code is optimized for the architecture. Coupled with the fact that the GPU is much faster and can take on some additional tasks, plus the separate DSP and the fast interconnects referenced by z101, this gives it a significant leg up over the current gen in future *optimized* games (which none of these ports were).


Wow, great argument. People like you are the reason I quit posting here years ago; this site is full of people who don't have a clue how to debate and disagree in a civil manner. I'm wondering why I came back.

You came back to post misleading information, perhaps. The architecture of the Wii U CPU is over a decade old, and it loses out to just about any modern architecture on efficiency. The fact that it runs at such extremely low clocks makes the "woo, GHz doesn't matter anymore" argument false in this case, except perhaps against the iPads and iPhones out there. It loses to the Cell and the Xenon on most significant measurements.

Wii U CPU: 8,400 MIPS and ~13 GFLOPS. Xenon: 19,200 MIPS and ~110 GFLOPS. The PPE on the Cell alone (excluding the seven SPEs) does 10,200 MIPS and ~26 GFLOPS. (See the sketch after this post for how peak numbers like these are typically derived.)

Also, like I said, the only good thing about it all is the general-purpose aspect of current GPUs... but as the Wii U games are clearly showing, that doesn't cover everything, does it?


Now now, I can agree that the Wii U CPU is slower, but saying that the design is not as efficient as modern-day stuff is actually not true :P That's like saying the P4 was more efficient than the P3, which would be a load of shit apart from RAM handling.
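The peak figures haxxiy quotes are typically derived as cores × clock × FLOPs per cycle per core. A minimal sketch in Python; the per-cycle values below are assumptions based on commonly cited specs for these chips, not figures from this thread:

```python
# Hedged back-of-envelope: peak GFLOPS ~= cores * clock (GHz) * FLOPs/cycle/core.
# All per-cycle values below are assumptions, not confirmed specs.
def peak_gflops(cores, clock_ghz, flops_per_cycle):
    return cores * clock_ghz * flops_per_cycle

# Wii U "Espresso" (assumed): 3 cores @ ~1.24 GHz, paired-singles FMA ~4 FLOPs/cycle
print(peak_gflops(3, 1.24, 4))   # ~14.9, in the ballpark of the ~13 quoted above

# Xenon (assumed): 3 cores @ 3.2 GHz, VMX128 ~12 FLOPs/cycle counting dot products
print(peak_gflops(3, 3.2, 12))   # ~115, close to the ~110 quoted above

# Cell PPE alone (assumed): 1 core @ 3.2 GHz, VMX FMA ~8 FLOPs/cycle
print(peak_gflops(1, 3.2, 8))    # 25.6, matching the ~26 quoted above
```

Note these are theoretical peaks; sustained throughput depends on memory behavior and instruction mix, which is part of why raw MIPS/GFLOPS comparisons only go so far.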



Soleron said:
curl-6 said:
Ah, the Megahertz Myth. I remember this old fallacy from back in the 90s, and I can't believe people STILL think clock speed = power.

Each Wii U core takes up no more than 1/3 of the area of a 360 core. It is therefore not faster per-clock, unless a miracle happened at IBM. Performance = clock speed * performance per clock (approximately, yes it's a tautology).

To use an example from the 90s, would you rather have a 600MHz Pentium III or a 1.6GHz Pentium 4? The Pentium III was faster per-clock, but the difference is so big that you'd obviously get the P4.

and then we come back to why Core i CPUs are a blessing, ah~~~ so good.....
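Soleron's formula above is easy to sanity-check numerically. A minimal sketch of the P3-vs-P4 trade-off, with the IPC values picked purely for illustration:

```python
# Performance ~= clock speed * performance per clock (IPC).
# The IPC figures below are illustrative assumptions, not measured numbers.
def relative_performance(clock_mhz, ipc):
    return clock_mhz * ipc

p3 = relative_performance(600, 1.0)   # Pentium III: better per-clock (baseline)
p4 = relative_performance(1600, 0.7)  # Pentium 4: assume ~30% worse per-clock

print(p4 / p3)  # ~1.87: even with much worse IPC, the clock gap wins
```

The same arithmetic cuts the other way for the Wii U: at roughly a third of the 360's clock, its per-clock advantage would have to be enormous to close the gap.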



Soleron said:
I wish that guy hadn't made that comment. I mean, he's not wrong, but it leaves all this hope for certain people that the Wii U CPU can perform much better than the 360 CPU on future games.


Not on a CPU level, at least. It all depends on how closely all the parts in the Wii U are integrated; if the latency between the CPU, GPU, and memory is really, really low, then we'd have a good machine either way, just not in a brute-force manner. It's kinda like the GC all over again.
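That latency point can be made concrete: stalls waiting on memory eat directly into effective throughput. A hedged sketch, with every number below assumed purely for illustration:

```python
# Effective IPC once memory stalls are counted.
# Miss rate and miss penalty are illustrative assumptions, not Wii U measurements.
def effective_ipc(base_ipc, miss_rate, miss_penalty_cycles):
    # Average cycles per instruction = ideal CPI + stall cycles per instruction.
    cpi = 1.0 / base_ipc + miss_rate * miss_penalty_cycles
    return 1.0 / cpi

# Same hypothetical core, two memory subsystems:
print(effective_ipc(2.0, 0.01, 200))  # slow, distant memory: 0.4 IPC
print(effective_ipc(2.0, 0.01, 50))   # tightly integrated, low-latency: 1.0 IPC
```

Under these assumptions, cutting the miss penalty from 200 to 50 cycles lifts delivered IPC from 0.4 to 1.0, which is the GameCube-style argument for tight integration over raw clocks.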



dahuman said:
Soleron said:
curl-6 said:
Ah, the Megahertz Myth. I remember this old fallacy from back in the 90s, and I can't believe people STILL think clock speed = power.

Each Wii U core takes up no more than 1/3 of the area of a 360 core. It is therefore not faster per-clock, unless a miracle happened at IBM. Performance = clock speed * performance per clock (approximately, yes it's a tautology).

To use an example from the 90s, would you rather have a 600MHz Pentium III or a 1.6GHz Pentium 4? The Pentium III was faster per-clock, but the difference is so big that you'd obviously get the P4.

and then we come back to why Core i CPUs are a blessing, ah~~~ so good.....

Positive emotions and Intel do not go together.



Soleron said:
dahuman said:
Soleron said:
curl-6 said:
Ah, the Megahertz Myth. I remember this old fallacy from back in the 90s, and I can't believe people STILL think clock speed = power.

Each Wii U core takes up no more than 1/3 of the area of a 360 core. It is therefore not faster per-clock, unless a miracle happened at IBM. Performance = clock speed * performance per clock (approximately, yes it's a tautology).

To use an example from the 90s, would you rather have a 600MHz Pentium III or a 1.6GHz Pentium 4? The Pentium III was faster per-clock, but the difference is so big that you'd obviously get the P4.

and then we come back to why Core i CPUs are a blessing, ah~~~ so good.....

Positive emotions and Intel do not go together.


Not on a price level, no, but it feels good to see the faster CPUs tear through everything.



DieAppleDie said:
In DBZ terms, that would be Vegeta when he reached Earth for the first time.


More like 3 Yamchas, you know, since it's triple-core and stuff.



My phone has 1.6 GHz, what the fuck, Nintendo...



Soleron said:
curl-6 said:
Ah, the Megahertz Myth. I remember this old fallacy from back in the 90s, and I can't believe people STILL think clock speed = power.

Each Wii U core takes up no more than 1/3 of the area of a 360 core. It is therefore not faster per-clock, unless a miracle happened at IBM. Performance = clock speed * performance per clock (approximately, yes it's a tautology).

To use an example from the 90s, would you rather have a 600MHz Pentium III or a 1.6GHz Pentium 4? The Pentium III was faster per-clock, but the difference is so big that you'd obviously get the P4.

Size is also not necessarily a cast-iron indicator of power; there are other factors (latency, GPGPU). This hacker fellow already said that comparing it to the 360 CPU and surmising it's much weaker is a mistake.



curl-6 said:
Soleron said:
curl-6 said:
Ah, the Megahertz Myth. I remember this old fallacy from back in the 90s, and I can't believe people STILL think clock speed = power.

Each Wii U core takes up no more than 1/3 of the area of a 360 core. It is therefore not faster per-clock, unless a miracle happened at IBM. Performance = clock speed * performance per clock (approximately, yes it's a tautology).

To use an example from the 90s, would you rather have a 600MHz Pentium III or a 1.6GHz Pentium 4? The Pentium III was faster per-clock, but the difference is so big that you'd obviously get the P4.

Size is also not necessarily a cast-iron indicator of power; there are other factors (latency, GPGPU). This hacker fellow already said that comparing it to the 360 CPU and surmising it's much weaker is a mistake.

OK.

I'm only gonna say this once.

Fuck GPGPU.

Pretend it does not exist. There are a large number of reasons for this.

Apart from that, YES, the CPU is not the only factor in the system. But all the info we have (die size, clock speed, approximate architecture, resulting games) indicates that it is the major bottleneck to performance. That is the only point I am making here.

He said that assuming it's much weaker (as in, lol, 1/3 of the clock = 1/3 of the performance) is a mistake. He hasn't said whether it's stronger or weaker. I think the evidence still says it's weaker, and that is a problem.
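Soleron's bottleneck point is essentially Amdahl's law: if a fixed chunk of each frame is CPU-bound and the CPU can't keep up, no amount of GPU (or GPGPU) muscle lifts the cap. A minimal sketch, with the workload split assumed purely for illustration:

```python
# Amdahl's-law sketch: overall speedup when only the GPU-bound fraction of a
# frame gets faster. The 60/40 split below is an illustrative assumption.
def overall_speedup(gpu_fraction, gpu_speedup):
    cpu_fraction = 1.0 - gpu_fraction
    return 1.0 / (cpu_fraction + gpu_fraction / gpu_speedup)

# Even a 3x faster GPU yields only ~1.67x if 40% of the frame is CPU-bound:
print(overall_speedup(0.6, 3.0))
```

Which is why "the GPU can pick up the slack" only goes so far if the CPU really is the major bottleneck.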