Soleron said:
Viper1 said:
ethomaz said:
Is the Wii U's CPU really bad, is it Jim Sterling hate again, or lazy developers???

...

 

* To illustrate how this works, I'm going to use a series of figures purely for demonstration purposes. They in no way accurately depict the real figures, but they can give you a sense of how to handle the workload.

Say the PS3 and X360 can handle 6 operations per clock cycle. At 3.2 GHz, that's 19.2 billion operations per second.
The game engine is designed around that clock speed.

Now say the Wii U can handle 10 operations per clock cycle but is clocked much slower, at just 2 GHz. That's still 20 billion operations per second.
But the game engine as designed expects a much faster clock that isn't there, so it can only operate at 62.5% of its design capacity. Now they have to rework the game engine to take advantage of the higher per-clock throughput, which is what they promised to be working on but apparently couldn't manage in time.
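The arithmetic in that hypothetical can be sketched out directly. Note the ops-per-cycle and clock figures are the made-up numbers from the post above, not real hardware specs:

```python
# Hypothetical throughput figures from the post (NOT real hardware specs).

def ops_per_second(ops_per_cycle, clock_hz):
    """Peak throughput: operations per cycle times cycles per second."""
    return ops_per_cycle * clock_hz

ps3_x360 = ops_per_second(6, 3.2e9)   # 19.2 billion ops/s
wii_u    = ops_per_second(10, 2.0e9)  # 20.0 billion ops/s

# An engine tuned around a 3.2 GHz clock, run unmodified on a 2 GHz part,
# only reaches the clock ratio of its designed capacity:
clock_ratio = 2.0e9 / 3.2e9           # 0.625 -- the 62.5% in the post

print(ps3_x360, wii_u, clock_ratio)
```

The point being made: peak throughput can be comparable, yet an engine that assumes a high clock still underperforms until it is restructured to do more work per cycle.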

No, that doesn't make sense. Game code responds well to either greater IPC or greater clock speed.

The problem is that the Wii U is NOT capable of '10' operations per cycle. It's worse than the 360, because it has a third of the transistor count. Imagine it's capable of '4'.

If perfectly optimised for each, it'd still be better on the 360.
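Plugging the counter-argument's figure into the same arithmetic (the '4' is the post's made-up number, and the 2 GHz clock is carried over from the earlier hypothetical):

```python
# Counter-argument: if the Wii U CPU manages FEWER ops per cycle than the
# 360 ('4' in the post, a made-up figure), no engine rework closes the gap.
x360_peak  = 6 * 3.2e9   # 19.2 billion ops/s
wii_u_peak = 4 * 2.0e9   #  8.0 billion ops/s (assumed 2 GHz clock)

# Even with perfect optimisation for each chip, the 360 comes out ahead.
assert wii_u_peak < x360_peak
```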

I think the only thing better is the OoOE part, but like I mentioned either here or in other threads, overall it won't be faster than the Xenon or Cell, since I'm also aware of that die size. I'm leaning more toward a dual or tri core at maybe 1-1.6 GHz tops, which might not be super bad depending on the kind of RISC design on that thing. I'm thinking they just want it to execute code and aren't even worried about the physics calculations on that thing LOL, prolly just like "meh, they can offload some to the GPU, done."