
Wii U vs PS4 vs Xbox One FULL SPECS (January 24, 2014)

I posted this in another forum, and though it's not a perfect comparison, it's something to think about:

"The Wii U has never been about power, but efficiency. The Wii U pulls off its HD graphics at a meager 35 watts. Not only does the system temperature remain cool at full load, but for some of us who are more keen on utility costs, this IS a factor! Much like Japanese cars, the power-per-unit-cost is absolutely phenomenal. While my car could blow the doors off of a Civic, there is NO denying that the Civic gets better horsepower-per-liter than my car, hands down. The Wii U is this EXACT principal in action. The performance-per-watt of the Wii U is absolutely phenomenal. Looking at Super Mario 3D World, and looking at Killzone: Shadow Fall may be a real stupid comparison graphically, but all I can say is that the PS4 operates around 150 watts, but I don't see 4 times the graphical power. Though this comparison isn't direct, it's interesting to note that 720p is 921,600 pixels, while 1080p is 2,073,600 pixels: 2.25 times the pixels. Each frame at 1080p costs 2.25 times more power to render than each frame at 720p, HOWEVER, Killzone: Shadow Fall hovers around 30-40 FPS (we'll use 35 to average it out), while Super Mario 3D World stays a rock solid 60 FPS. Doing some basic math for 1 minute of rendering: KZ:SF has rendered 72,576,000 pixels; SM3DW has rendered 55,296,000 pixels. As you can see, the rendering power of the Wii U just by resolution and framerate is NOT that far behind the PS4, and at a FRACTION of the energy cost.

Going by the FLOP numbers against the PS4, we could say that the PS4's performance-per-watt is better (supposing the Wii U really is only in the low-GFLOPS range), but we have already seen how such numbers can be spun in ways that turn out to mean nothing for actual in-game graphics performance: the PS3 was marketed with roughly the same ~2 TFLOP figure as the much more modern and "more powerful" PS4. The PS4 clearly pulls off better graphics than the PS3, even though on paper the two would be "the exact same power."

Having explained that, I see the Wii U being an overall winner in efficiency, but not in power. And I have to say, as kind of a "green" guy when it comes to energy consumption, I give the Wii U props for delivering a full HD experience at the energy cost of a light bulb. How great is it to be playing Super Mario 3D World knowing that it's only costing you the same as flipping on the light in your room? ^_^"

Again, there are other factors in rendering, especially considering how complex modern hardware is, BUT it's still something to consider in terms of raw throughput...
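A quick sanity check of the pixels-per-second arithmetic from the quoted post, as a minimal Python sketch; the 35 FPS average and the 150 W / 35 W figures are the post's own rough estimates, not measurements:

```python
# Back-of-envelope pixel-throughput check from the quoted post.
# Frame rates and wattages are the post's own rough figures, not measurements.

def pixels_per_second(width, height, fps):
    """Pixels rendered per second at a given resolution and frame rate."""
    return width * height * fps

kzsf  = pixels_per_second(1920, 1080, 35)   # Killzone: Shadow Fall, ~30-40 FPS averaged to 35
sm3dw = pixels_per_second(1280, 720, 60)    # Super Mario 3D World, locked 60 FPS

print(f"KZ:SF : {kzsf:,} pixels/s")         # 72,576,000
print(f"SM3DW : {sm3dw:,} pixels/s")        # 55,296,000
print(f"ratio : {kzsf / sm3dw:.2f}x")       # ~1.31x

# Naive throughput-per-watt using the post's wattage figures (150 W vs 35 W).
print(f"KZ:SF per watt : {kzsf / 150:,.0f} pixels/s/W")
print(f"SM3DW per watt : {sm3dw / 35:,.0f} pixels/s/W")
```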



Check out my entertainment gaming channel!
^^/

I got in contact with a homebrew programmer and asked him if he has proof that there is a dual-core ARM Cortex-A8 in the Wii U. He replied that he got the entire OS from a hacker who cracked it and ripped it from the internal flash storage, and he said that the entire OS is written in pure ARM code.

I estimate the Wii U's CPU performance at 15 GFLOPS, which is a bare minimum, not accounting for the fact that the CPU is heavily modified to implement multi-core support and increase its efficiency/performance while not breaking backward compatibility with the Wii. The homebrew programmer I contacted said that it has nothing to do with POWER7; it has to do with POWER6, from which it got the features needed for multi-core support and other things.
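For what it's worth, a ~15 GFLOPS figure is roughly what a naive peak-FLOPS formula gives if you plug in the commonly reported Espresso specs; the clock and FLOPs-per-cycle figures below are my assumptions, not something the homebrew programmer stated:

```python
# Rough peak-GFLOPS estimate: cores * clock * FLOPs-per-cycle-per-core.
# Assumed figures (commonly reported for the Wii U CPU, not from this thread):
cores = 3
clock_ghz = 1.2435
flops_per_cycle = 4  # 2-wide paired singles * 2 ops per fused multiply-add

peak_gflops = cores * clock_ghz * flops_per_cycle
print(f"Estimated peak: {peak_gflops:.1f} GFLOPS")  # ~14.9 GFLOPS
```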

According to the die shot, the GPU has at least 256 shaders/SPUs, and it is likely 320 shaders.

It is not 160 shaders, because then there would be fewer SRAM caches in the 8 blocks where the shaders are. The 160-shader figure is just a myth and misinformation spread by extreme/radical Sony/Microsoft fans who intentionally spread FUD. I have two computer engineers to back me up, as well as AMD's documents on their GPU design.

One user at IGN spotted that both the Xbox One and the Wii U have a second layer that you can barely see on the die shots from Chipworks; we can only see the first layer because the second layer is below it, which would point to stacked chips/silicon.



How much more powerful are the PS4 and Xbone compared to the PS3 and X360?



sidmeiernintifan said:
How much more powerful are the PS4 and Xbone compared to the PS3 and X360?

8-9x



Does that mean 8 or 9 times more powerful?

If a game like The Last of Us or GTA V is possible on a PS3, I doubt the power of a console 8.5 times more powerful will be fully used in a game.



eyeofcore said:
One user at IGN spotted that both the Xbox One and the Wii U have a second layer that you can barely see on the die shots from Chipworks; we can only see the first layer because the second layer is below it, which would point to stacked chips/silicon.

No.. please... just.....not this.

This "IGN user" is none other than überidiot misterxteam (who also runs under other names on his blog. You can find that out by comparing the identical atrocious spelling and grammar errors several users share on that blog. But that is not the problem here, though I must admit it is very interesting to read through that blog because it almost mirrors the inherent mechanics of a religius cult, made visible in a completely different field.

The problem with this "second layer" (occasionally he pretends he can even "see three layers") is that misterxmedia doesn't have the slightest clue how chips are made. He also doesn't understand the difference between 3D-stacked silicon and separate stacked chips. On top of that, he doesn't even understand the difference between an x-ray photograph and a normal photograph: almost all of his "x-ray research" is actually him looking at plain photographs and making false assumptions due to a complete lack of knowledge.

I can assure you there are NO "hidden layers" inside the Xbox One APU or the Wii U GPU. To all readers of that blog I can only say: enjoy the laughs you get out of that misguided guy's ideas. Maybe some of you will actually read a little about chip manufacturing (in books or on serious websites) and figure out just how hilariously bad misterxy's guesses are.



eyeofcore said:
I got in contact with a homebrew programmer and asked him if he has proof that there is a dual-core ARM Cortex-A8 in the Wii U. He replied that he got the entire OS from a hacker who cracked it and ripped it from the internal flash storage, and he said that the entire OS is written in pure ARM code.

I estimate the Wii U's CPU performance at 15 GFLOPS, which is a bare minimum, not accounting for the fact that the CPU is heavily modified to implement multi-core support and increase its efficiency/performance while not breaking backward compatibility with the Wii. The homebrew programmer I contacted said that it has nothing to do with POWER7; it has to do with POWER6, from which it got the features needed for multi-core support and other things.

According to the die shot, the GPU has at least 256 shaders/SPUs, and it is likely 320 shaders.

It is not 160 shaders, because then there would be fewer SRAM caches in the 8 blocks where the shaders are. The 160-shader figure is just a myth and misinformation spread by extreme/radical Sony/Microsoft fans who intentionally spread FUD. I have two computer engineers to back me up, as well as AMD's documents on their GPU design.

One user at IGN spotted that both the Xbox One and the Wii U have a second layer that you can barely see on the die shots from Chipworks; we can only see the first layer because the second layer is below it, which would point to stacked chips/silicon.

It's been confirmed to be 160 shaders by many sources: the Project CARS devs confirmed it's 160 shaders, Black Forest Games as well, and NeoGAF, and the posters who confirmed 160 shaders are the biggest Nintendo fans.



sidmeiernintifan said:
Does that mean 8 or 9 times more powerful?

If a game like The Last of Us or GTA V is possible on a PS3, I doubt the power of a console 8.5 times more powerful will be fully used in a game.


The PS4 has roughly 10 times the power of the PS3.

 

By the end of the gen it'll feel like it's not enough, though.
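For what it's worth, the "8-9x" and "roughly 10 times" figures are in the same ballpark as the commonly quoted GPU FLOP numbers. A rough check, where both TFLOP figures are commonly cited estimates assumed here rather than numbers from this thread:

```python
# Rough GPU FLOP ratio between generations.
# Commonly quoted single-precision figures (assumptions, not from the thread):
ps4_tflops = 1.84   # PS4 GPU
ps3_tflops = 0.192  # PS3 RSX programmable shader throughput

print(f"PS4 / PS3 ~ {ps4_tflops / ps3_tflops:.1f}x")  # ~9.6x
```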



The ESRAM figure for the Xbox One needs updating. I'm pretty sure it's 109 GB/s as standard due to the GPU clock increase from 800 MHz to 853 MHz, but with a theoretical peak of 192 GB/s?

http://www.eurogamer.net/articles/digitalfoundry-the-complete-xbox-one-interview
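For reference, the 109 GB/s "standard" figure falls out of simple bandwidth arithmetic if you assume a 128-byte-per-cycle ESRAM interface running at the GPU clock; the per-cycle width is my assumption, while the 800 MHz to 853 MHz upclock is discussed in the Digital Foundry interview linked above:

```python
# Bandwidth = bytes transferred per cycle * clock rate.
bytes_per_cycle = 128        # assumed ESRAM interface width per direction
original_clock_hz = 800e6    # pre-upclock GPU/ESRAM clock
final_clock_hz = 853e6       # post-upclock

def bandwidth_gbs(clock_hz, width_bytes):
    return clock_hz * width_bytes / 1e9

print(f"At 800 MHz: {bandwidth_gbs(original_clock_hz, bytes_per_cycle):.1f} GB/s")  # 102.4
print(f"At 853 MHz: {bandwidth_gbs(final_clock_hz, bytes_per_cycle):.1f} GB/s")     # ~109.2
```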



drake4 said:
It's been confirmed to be 160 shaders by many sources: the Project CARS devs confirmed it's 160 shaders, Black Forest Games as well, and NeoGAF, and the posters who confirmed 160 shaders are the biggest Nintendo fans.


No. Are you telling me that the Wii U die shot from Chipworks is fake? Are you going against a legitimate source that provided a die shot of the Wii U's GPU? In each of the 8 blocks where the GPU shaders are, you can see 32 SRAM cells, which means 256 SRAM cells in total. For reference, the VLIW5 GPU in AMD's Bobcat APUs had 16 SRAM cells per block with 20 stream processing units (shaders), while the Wii U has 32 SRAM cells per block, which would mean 40 stream processing units (shaders) per block, or 320 in total. Are you really going to go against AMD's official documents on their GPU architecture?
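Taking the Bobcat comparison above at face value, the 320-shader estimate is just proportional scaling; here is a sketch of that arithmetic using only the figures claimed in this post, none of them independently verified:

```python
# Proportional scaling of shader count from SRAM cells per block,
# using only the figures claimed in the post above (not verified).
bobcat_sram_per_block = 16
bobcat_shaders_per_block = 20
wiiu_sram_per_block = 32
wiiu_blocks = 8

shaders_per_block = wiiu_sram_per_block * bobcat_shaders_per_block / bobcat_sram_per_block
total_shaders = shaders_per_block * wiiu_blocks
print(f"{shaders_per_block:.0f} shaders/block -> {total_shaders:.0f} total")  # 40 -> 320
```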

EDIT:

Do you have sources showing that the developers you cite are confirming the 160-shader theory (which is misinformation spread by trolls and extreme Sony or Microsoft fans)? Why would those developers break NDA? Please explain that to me. Keep believing in that misinformation, a myth with no foundation and no evidence. It has already been debunked...