mine said:

Not common sense. Common sense says that compilers do the hard work for the programmer. The most important thing a developer can do is study the effects of their data structures, data and program flow on the CPU caches, and act accordingly.

So: since the Wii U has MORE L2 cache per core than the XOne and PS4 cores, code runs more efficiently on the Wii U cores. But the HD twins more than make up for it with more cores and - of course - a better GPU.

BTW: take a look at the PS4 developer presentations. The small L2 cache hurts BIG time when the CPU AND GPU are both accessing the GDDR...

Or the other way round: the Wii U's big L2 cache, big eDRAM and balanced CPU core / GPU enabled it to deliver more than most people expected from such a configuration...

On top of that, even in multithreaded game engines you are going to have one main thread with a heavier workload than the other threads. The Wii U CPU has an asymmetric L2 cache: 2 MB for one of the cores, and 512 KB for each of the other two. That means its design has been optimized to run video games.