D2K said: This is clickbait. The reason it is not coming to the Wii U has nothing to do with "technical demands." The fact that he claims it is "pushing the limits" of the XB1 and PS4 is not a good thing. For one, it shows that the twins are going to top out in power in less than a few years if developers (and others have stated this too) are already hitting walls. In this case, though, it's just EA trolling Nintendo as usual. There is no reason to even ask if an EA game is coming to the Wii U.
Why anyone at this point would make a story about why an EA game is not coming to the Wii U shows a real lack of substance in their life. They must need attention really badly. I doubt a single Wii U dev kit has seen the light of day at EA over the past year. The Wii U hasn't been in the cards for EA since shortly after its launch because of the Origin deal going bad. There is no reason to believe they even spent the time putting the game on a Wii U dev kit, much less confirmed that the Wii U cannot 'handle' it.
The Wii U has a tri-core PowerPC 750 with 3MB of fast eDRAM, a dual-core ARM processor, and a dedicated sound processor. That's 3 CPUs, 6 cores overall. It has a custom AMD Radeon HD 6670-class GPU with 32MB of fast eDRAM with bandwidth clocked as high as 1TB per second, which is much faster than the eDRAM and eSRAM in the XB1 and faster than the GDDR5 RAM in the PS4. And of course the Wii U has 2GB of DDR3 RAM (dev kits have 3GB). There isn't a single thing in this game that the Wii U could not handle, and possibly do better, because of the low latency and smaller memory footprint of the RISC architecture in the Wii U, which allows it to do a lot more with a lot less in terms of raw numbers. Out of the 6 available cores in the Wii U, the only game confirmed to use more than one is Super Mario 3D World. Nano Assault NEO only uses one core, whereas Resogun for the PS4 uses 50% of the PS4's CPU.
What's sad about this article is that the developers are actually under the delusion that the game using "only" 50% of the CPU and "needing" the GDDR5 unified memory is a good thing, and are pounding their chests about it. Whereas Shin'en Multimedia has stated the opposite about Nano Assault NEO: they used the LEAST amount of resources possible on the Wii U.
These games look pretty much the same in terms of graphics. They are both launch titles. Nano Assault NEO is obviously one year older, which makes it even more impressive. The fact that Shin'en Multimedia has literally a handful of people working for it and can easily make quality 8th-gen games while huge corporations like Electronic Arts cannot is proof positive that it is not about what the Wii U "cannot" do; it's about what developers fed up with Nintendo's BS are actively CHOOSING not to do.
Sometimes I wonder if these people even have a brain in their heads. Nintendo does a lot of stupid things, but when people make ridiculous statements like this, it really grinds my gears. When a studio like Slightly Mad Studios comes out and says that the Wii U runs the most graphically-intensive part of their upcoming Project C.A.R.S. game with no problems at all, it gets swept under the rug. http://nintendoenthusiast.com/interview/slightly-mad-interview-andy-tudor-project-cars-wii-u/ Keep in mind that he also noted the build was UNOPTIMIZED and STILL handled it, no sweat. However, when ANYONE says that the Wii U is 'underpowered' in any way, it is front-page news trending worldwide. I didn't realize there were so many insecure people in the industry. Nintendo has done a lot of shady things over the years. They are not anyone's sweetheart right now, and largely that is due to their unabashed arrogance in their business practices, but for people to keep up this myth of the Wii U being only 'marginally' more powerful than the 7th-gen consoles is just embarrassing. When talking about the Wii U's power level compared to that of the XB1 and PS4, it basically becomes an IBM vs. Intel argument. People on both sides of the fence will always believe the side they have chosen is the dominant one, and neither side will admit that both sides are basically giving you the same thing, just in different ways.
The most popular hypothesis is that Latte is a 16:320:8 part @ 550 MHz. Fortunately, we can see how such a part runs games on the PC. You know, the PC, that inefficient beast that's held back by Windows, thick APIs, DirectX draw-call bottlenecks that break the back of even fast CPUs, and all that stuff. Here is an HD 5550, a VLIW5 GPU with a 16:320:8 configuration running @ 550 MHz:
http://www.techpowerup.com/reviews/HIS/Radeon_HD_5550/7.html
And it blows past the 360 without any problems. It's not even close. And that's despite being on the PC!
Now let's scale things back a bit. This is the Llano A3500M w/ Radeon 6620G, a 20:400:8-configuration GPU, but it runs @ 444 MHz, meaning it has virtually the same number of GFLOPS and TMU ops as the HD 5550, only with about 20% lower triangle setup and fillrate *and* it's crippled by a 128-bit DDR3-1333 memory pool *and* it's linked to a slower CPU than the above benchmark (so more likely to suffer from Windows/DX bottlenecks). No super-fast pool of eDRAM for this poor boy!
http://www.anandtech.com/show/4444/amd-llano-notebook-review-a-series-fusion-apu-a8-3500m/11
http://www.anandtech.com/show/4444/amd-llano-notebook-review-a-series-fusion-apu-a8-3500m/12
And it *still* comfortably exceeds the 360 in terms of the performance it delivers. Now let's look again at the Wii U. Does it blow past the 360? Does it even comfortably exceed the 360? No; in fact, most ports are worse. In fact, a developer has confirmed the Wii U GPU to be a 176 GFLOPS VLIW5 part, which, thanks to the newer architecture, makes it only slightly better than the 360's 240 GFLOPS GPU, but of course Nintendo fanboys will ignore the facts and pull specs out of their ass.
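The back-of-the-envelope math behind those GFLOPS figures is simple: for AMD's VLIW parts, peak single-precision throughput is shader count × 2 FLOPs per clock (multiply-add) × clock speed. A quick sketch of the parts discussed above (the 160-shader Wii U configuration is the claimed developer leak, not an official spec):

```python
# Peak single-precision GFLOPS: shaders * 2 FLOPs per clock (MAD) * clock in GHz
def peak_gflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1000.0

# Parts discussed above. The Wii U entry reflects the leaked 160-shader
# claim, not an official Nintendo spec.
parts = {
    "Radeon HD 5550 (320 SP @ 550 MHz)": peak_gflops(320, 550),  # 352.0
    "Radeon 6620G   (400 SP @ 444 MHz)": peak_gflops(400, 444),  # 355.2
    "Wii U 'Latte'  (160 SP @ 550 MHz)": peak_gflops(160, 550),  # 176.0
    "Xbox 360 Xenos (240 SP @ 500 MHz)": peak_gflops(240, 500),  # 240.0
}

for name, gf in parts.items():
    print(f"{name}: {gf:.1f} GFLOPS")
```

Note how close the HD 5550 and the 6620G land (352 vs. 355.2 GFLOPS), which is why the two benchmarks above make a fair pair of comparison points.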