curl-6 said:
zeldaring said:

What? They have an article saying the Wii U specs leaked and it's 352 GFLOPS, and it was never updated or retracted.

The Wii U is not 10 years of evolution in terms of GPU; it uses an ATI/AMD TeraScale solution not all that different from the R600 from 2007.

And it was using 64-bit DDR3, while the 360 had 128-bit GDDR3 plus those 10 MB of eDRAM.

As for the PS5, it wasn't because of the tools, it was because of its design. Go read the article.

Source? DF's piece on the Wii U's spec leak in 2013 never says it's 352 Gflops: https://www.eurogamer.net/df-hardware-wii-u-graphics-power-finally-revealed

Also, reporting a leak is not confirmation that it's true.

If you read the PS5 vs Xbox Series article, two of the cited advantages were a lower-level API and a more efficient GPU compiler; in other words, dev tools.

https://www.eurogamer.net/digitalfoundry-2024-df-weekly-if-xbox-series-x-is-more-powerful-how-does-ps5-compete-so-closely

Again, who said anything about Wii U being "10 years of evolution?" Who are you even arguing with? You're ranting and raving about 10-year-old arguments in a thread which isn't even about that topic. Take a breath and chill out.

From the article.

Chipworks' shot is still being analysed, but the core fundamentals are now seemingly beyond doubt. The Wii U GPU core features 320 stream processors married up with 16 texture mapping units and featuring 8 ROPs 

That's basically a GPU that has 352 GFLOPS; it's not rocket science to figure out. Not to mention a way more modern architecture. It would put it around Switch power.
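For what it's worth, the arithmetic behind that 352 GFLOPS figure is straightforward, assuming the commonly cited 550 MHz GPU clock and the usual convention of counting a fused multiply-add as two floating-point ops per stream processor per cycle:

```python
# Sketch of the GFLOPS arithmetic, not an official spec.
# Assumptions: 550 MHz clock (the commonly cited Wii U GPU clock)
# and 2 FLOPs per stream processor per cycle (one fused multiply-add).
stream_processors = 320
flops_per_cycle = 2      # FMA counts as two floating-point ops
clock_hz = 550e6

gflops = stream_processors * flops_per_cycle * clock_hz / 1e9
print(gflops)  # 352.0
```

So 320 stream processors at 550 MHz lands exactly on the 352 GFLOPS number, under those assumptions.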

From the PS5 article:

More than one key triple-A developer tells us that the PlayStation GPU compiler is significantly more efficient than the Microsoft alternative, meaning that there's better utilisation of the graphics hardware.

That seems like system design, not dev tools.

Last edited by zeldaring - on 20 July 2024