
Wii U hacked and CPU clock speed revealed, apparently.

Soleron said:
curl-6 said:
Soleron said:
curl-6 said:
Ah, the Megahertz Myth. I remember this old fallacy from back in the 90s, and I can't believe people STILL think clock speed = power.

Each Wii U core takes up no more than 1/3 of the area of a 360 core. It is therefore not faster per-clock, unless a miracle happened at IBM. Performance = clock speed * performance per clock (approximately, yes it's a tautology).

To use an example from the 90s, would you rather have a 600MHz Pentium III or a 1.6GHz Pentium 4? The Pentium III was faster per-clock, but the difference is so big that you'd obviously get the P4.
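
To put rough numbers on that trade-off (the per-clock figures here are illustrative assumptions, not measurements; the P4's per-cycle throughput is taken as ~60% of the PIII's):

\[ \text{performance} \approx f_{\text{clock}} \times \text{IPC} \]
\[ \text{PIII: } 600\,\text{MHz} \times 1.0 = 600 \qquad \text{P4: } 1600\,\text{MHz} \times 0.6 = 960 \]

Even granting the PIII a big per-clock edge, the P4 comes out roughly 1.6x ahead, which is the whole point: neither number means anything on its own.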

Size is also not necessarily a cast-iron indicator of power; there are other factors (latency, GPGPU). This hacker fellow has already said that comparing it to the 360 CPU and surmising it's much weaker is a mistake.

OK.

I'm only gonna say this once.

Fuck GPGPU.

Pretend it does not exist.

Apart from that, YES, the CPU is not the only factor in the system. But all the info we have (die size, clock speed, approximate architecture, resulting games) indicates that it is the major bottleneck to performance. That is the only point I am making here.

He said that assuming it's much weaker (as in, lol 1/3 of the clock 1/3 of the performance) is a mistake. He hasn't said whether it's stronger or weaker. I think the evidence still means it's weaker and that is a problem.

Why? Because devs won't code to take advantage of it? I'm confused.

As for resulting games, I don't think launch games can be fairly used as a good measuring stick of a system's capabilities.

I don't doubt that the 360 CPU has more grunt (though whether this can be worked around by clever developers, like the Gamecube/Wii's lack of programmable pixel/vertex shaders, is another matter), but this doesn't overly bother me; the Wii offered the best gaming experiences for me this generation, and it's what, 1/5 as strong as the Wii U's CPU based on clock speed and core count alone? (Not to mention cache size.) The ability to run games on the level of Assassin's Creed 3 is enough for me; I'm more interested in the GamePad.
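
(For what it's worth, that back-of-envelope figure works out if the reported clocks are right — Broadway at 729 MHz with one core, Espresso at ~1.24 GHz with three, per-clock behaviour and cache ignored:

\[ \frac{3 \times 1243\,\text{MHz}}{1 \times 729\,\text{MHz}} \approx 5.1 \]

i.e. roughly 5x the Wii's CPU on clock and core count alone.)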



curl-6 said:
...

Why? Because devs won't code to take advantage of it? I'm confused.

See below.

As for resulting games, I don't think launch games can be fairly used as a good measuring stick of a system's capabilities.

I think third parties are pretty done with the Wii U. And the best-looking Nintendo games have had lower sales; a Nintendo under financial stress will not be able to justify putting a huge AAA team on a 3m-selling Zelda for 5 years if the same team could be split into four, making 20m-selling Kart, Sports, Fitness and 2D Mario all at once.

I don't doubt that the 360 CPU has more grunt (though whether this can be worked around by clever developers, like the Gamecube/Wii's lack of programmable pixel/vertex shaders, is another matter), but this doesn't overly bother me; the Wii offered the best gaming experiences for me this generation, and it's what, 1/5 as strong as the Wii U's CPU based on clock speed and core count alone? (Not to mention cache size.) The ability to run games on the level of Assassin's Creed 3 is enough for me; I'm more interested in the GamePad.

I agree. I was happy with the Gamecube honestly.

Let's look at why GPGPU doesn't exist on the PC:

- Tools and APIs are proprietary to Nvidia. The only GPU physics library worth anything is Nvidia's PhysX.
- I don't see anyone working to change this with new tools or engines. There are some paid-for tech demos that didn't amount to much.
- Running code on the GPU is very, very hard because of its restrictive memory-access patterns; right now it takes experts with PhDs in CS. It's incredibly expensive to hire that talent and give them the time to optimise the code. Even where there are potential savings, the same time and money usually buys a better result on the CPU. (See the sketch after this list for what "optimising memory access" actually means.)
- GPUs consume a lot of power at full load. It is more energy efficient to run CPU code right now unless the code is very good.
- Splitting the GPU between GPGPU work and rendering work is very hard and would need AMD's full support.
- AMD as a company is near bankrupt and has no time or ability to help with this.
- The code you get is very specific to that GPU. It's even less portable than Cell code was to an Xbox CPU; devs would practically need to reimplement it three times for three consoles. Unless the Wii U wins a dominating market share, that proposition is poor, and the other two consoles will have a more balanced CPU:GPU power ratio, so it makes even less sense on them anyway.
- Budgets are especially constrained this gen. Games are increasingly limited by budget rather than hardware. Spending money specifically to move calculations onto the GPU has to be a lower priority than merely HAVING those features in any form.
- The parts of game code that are suited to running on a GPU (physics) are low priority and frankly optional features compared to getting the game even running.
- It's been around for five years and none of the above has changed even a little bit.
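
To make the memory-access point above concrete, here's a minimal CUDA sketch (kernel and type names like step_aos and ParticleAoS are made up for this post, not taken from any real engine or library) of a particle-position update. The structure-of-arrays version lets each warp's loads coalesce; the naive array-of-structures version issues strided loads and can be several times slower on the same hardware. Getting from one layout to the other is exactly the kind of expert rework described above:

// gpgpu_layout_sketch.cu -- illustrative only; names are hypothetical.
#include <cuda_runtime.h>
#include <stdio.h>

// Array-of-structures: adjacent threads touch addresses 24 bytes apart,
// so each warp's loads are strided and poorly coalesced.
struct ParticleAoS { float x, y, z, vx, vy, vz; };

__global__ void step_aos(ParticleAoS *p, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        p[i].x += p[i].vx * dt;
        p[i].y += p[i].vy * dt;
        p[i].z += p[i].vz * dt;
    }
}

// Structure-of-arrays: adjacent threads touch adjacent floats, so each
// warp's loads coalesce into a handful of wide memory transactions.
__global__ void step_soa(float *x, float *y, float *z,
                         const float *vx, const float *vy, const float *vz,
                         int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        x[i] += vx[i] * dt;
        y[i] += vy[i] * dt;
        z[i] += vz[i] * dt;
    }
}

int main(void) {
    const int n = 1 << 20;          // ~1M particles
    const float dt = 1.0f / 60.0f;  // one 60fps physics tick

    // Buffers are allocated but deliberately left uninitialised: this
    // skeleton is about access patterns, not about the values computed.
    ParticleAoS *aos;
    float *x, *y, *z, *vx, *vy, *vz;
    cudaMalloc(&aos, n * sizeof(ParticleAoS));
    cudaMalloc(&x,  n * sizeof(float)); cudaMalloc(&y,  n * sizeof(float));
    cudaMalloc(&z,  n * sizeof(float)); cudaMalloc(&vx, n * sizeof(float));
    cudaMalloc(&vy, n * sizeof(float)); cudaMalloc(&vz, n * sizeof(float));

    int threads = 256, blocks = (n + threads - 1) / threads;
    step_aos<<<blocks, threads>>>(aos, n, dt);                  // strided
    step_soa<<<blocks, threads>>>(x, y, z, vx, vy, vz, n, dt);  // coalesced
    cudaDeviceSynchronize();
    printf("last CUDA error: %s\n", cudaGetErrorString(cudaGetLastError()));

    cudaFree(aos);
    cudaFree(x); cudaFree(y); cudaFree(z);
    cudaFree(vx); cudaFree(vy); cudaFree(vz);
    return 0;
}

Even this toy case needs its data layout rewritten to run well; real game physics (broad-phase collision, constraint solving) is far branchier and messier, which is why the talent costs what it does.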



Soleron said:
...

The PS3 was an absolute bitch to program for as well, but devs eventually got it to work. We'll definitely see a lot of multiplats that aren't well done on Wii U, but I don't think we have to give up hope completely just yet.

And while I largely agree about Nintendo, they still have teams like Retro that target high-end graphical fidelity. (And as a side note, if you're implying Skyward Sword is one of the best-looking Wii games, I strongly disagree; from a technical perspective it didn't really make good use of the Wii's graphical hardware.)