
Wii U vs PS4 vs Xbox One FULL SPECS (January 24, 2014)

People on this and other websites should not take this information as factual for the Wii U, since there is not much information about the hardware inside it or its performance. We can only speculate, but I am going to share my findings and my research on the Wii U:

Wii U CPU:
- Tri-core IBM 45nm PowerPC 750CL/G3/Power7 hybrid
- L2 cache: Core 0: 512KB, Core 1: 2MB, Core 2: 512KB
- Clocked at 1.24GHz
- 4-stage pipeline; not a Xenon/Cell-style CPU-GPU hybrid
- Produced at IBM's advanced CMOS fabrication facility
- eDRAM used as L2 cache (Power7-style memory implementation)

*The Wii CPU core was 20% slower than an Xbox 360 core. Since the Wii U CPU is modified/enhanced and clocked 65-70 percent higher, two Wii U cores should be roughly on par with three Xbox 360 cores, and if all three cores are used then the Wii U CPU is about one third stronger/faster than the Xbox 360 processor and faster than the PlayStation 3 processor. http://gbatemp.net/threads/retroarch-a-new-multi-system-emulator.333126/page-7#post-4365165
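
As a rough back-of-the-envelope sketch, the scaling argument above works out like this (the 20% and 65-70% figures are the speculative claims above, not confirmed specs):

```python
# Back-of-the-envelope check of the core-scaling claim (all figures are speculation).
wii_core_vs_xenon = 0.80   # Wii core assumed ~20% slower than one Xbox 360 (Xenon) core
clock_uplift      = 1.70   # Wii U cores assumed clocked ~65-70% higher than the Wii's

espresso_vs_xenon = wii_core_vs_xenon * clock_uplift   # ~1.36 Xenon cores per Wii U core
print(f"1 Wii U core  ~= {espresso_vs_xenon:.2f} Xenon cores")
print(f"2 Wii U cores ~= {2 * espresso_vs_xenon:.2f} Xenon cores (roughly the 360's three)")
print(f"3 Wii U cores ~= {3 * espresso_vs_xenon:.2f} Xenon cores (~1/3 more than the 360)")
```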

- Dual-core ARM Cortex-A8 for background OS tasks, clocked at 1GHz with 64KB of L1 cache per core and 1MB of SRAM as L2 cache; an evolution of the "Starlet" chip
- Single ARM9 "Starlet" core for backward compatibility, rumored to run at a higher clock

Wii U Memory:
- 2GB DDR3: 1GB for the OS, 1GB for games
- eDRAM: 32MB VRAM + 4MB for the GamePad + 3MB of CPU L2 cache
- Clocked at 550MHz
- The eDRAM acts as a "unified pool of memory" for the CPU and GPU, practically eliminating latency between them

Wii U GPU:
- VLIW5/VLIW4 Radeon HD 5000/6000 series, 40nm
- DirectX 11/OpenGL 4.3/OpenCL 1.2/Shader Model 5.0 feature set
- Supports GPGPU compute/offload
- Customized, using a custom Nintendo API codenamed GX2 (the GameCube's API was GX)
- Clocked at 550MHz
- Produced on TSMC's advanced 40nm CMOS process
- Uses 36MB of eDRAM as VRAM
- 4MB of that eDRAM is allocated to the streaming feed for the GamePad
- The GPU is customized, and judging by the modifications to the GPUs in their previous consoles, we can presume Nintendo won't waste any mm^2 on unneeded features and has tailored it to fit their own needs.

*Eyefinity has been present on AMD cards since the Radeon HD 5000 series, so it is at least a Radeon HD 5000 with TeraScale 2.

*If it were a Radeon HD 4000 series part with 320 SPUs, it would be the 55nm Radeon HD 4670; but since the Wii U uses Eyefinity and the GPU is 40nm, it being a Radeon HD 4000 series part is invalid.

*The Radeon HD 6000 series was released in Q3 2010, the final Wii U silicon was finished in Q2 2012, and the Wii U was released in Q4 2012. Looking at the gap between E3 2011 and the final Wii U silicon in Q2 2012, and given that the Radeon HD 6000 evolved from the Radeon HD 5000, which evolved from the Radeon HD 4000, I presume switching to a newer but similar GPU and architecture was not a problem, and all of these GPUs were produced at TSMC's fabs.

*The Wii U shown at E3 2011 and its development kits had a Radeon HD 4850; it is rumored that the Wii U's newer development kit replaced the 4850 with a modified/customized/cut-down Radeon HD 6850.

*The Radeon HD 6850 delivers 1.5 TFLOPS at a clock of 775MHz, with 1GB of GDDR5 VRAM and about 130GB/s of bandwidth.


*The GPU in the Wii U is clocked roughly 30% lower, at 550MHz, and if it has 1/3 of the SPUs then it has 0.352 TFLOPS. It also has 36MB of eDRAM with 70-130GB/s of bandwidth, and 2-player co-op, as for example in Black Ops 2's zombie mode, appears to use Eyefinity(?) to stream two different in-game images/views.
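
If you are wondering where the 0.352 TFLOPS figure comes from, here is the usual VLIW rule of thumb (2 FLOPs per stream processor per clock); the 320 SPU count is just the assumption of one third of an HD 6850, not a confirmed spec:

```python
# Sketch of the FLOPS estimate above (the SPU count is an assumption, not a confirmed spec).
def gflops(stream_processors, clock_mhz):
    # VLIW Radeons do 2 FLOPs (one multiply-add) per stream processor per clock
    return stream_processors * 2 * clock_mhz / 1000.0

print(gflops(960, 775))       # ~1488 GFLOPS, the HD 6850's ~1.5 TFLOPS
print(gflops(960 // 3, 550))  # ~352 GFLOPS for the assumed 320 SPUs at 550 MHz
```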

*90nm 16MB eDRAM can do 130GB/s of bandwidth, so the 45nm 32MB eDRAM in the Wii U should do the same. Also, since the CPU's cache and the GPU both use eDRAM, latency is much, much lower and AI can be offloaded to the GPU.

Wii U Notes:
- Can't use DirectX 11 because of the OS; only OpenGL 4.3 and Nintendo's GX2 API, which is ahead of DX11
- Easier to program for, without the multiple bottlenecks that cause issues on Xbox 360 and PlayStation 3
- Efficient hardware with no bandwidth bottlenecks
- Launch games and ports use 2 cores, and only a couple of ports/games use the 3rd core to a decent degree
- The Wii U CPU performs far more operations per cycle/MHz than the Xbox 360/PlayStation 3 CPUs; it is unknown whether it is faster overall
- The Wii's CPU core was slower
- A minor Power7 memory-related implementation allows shaving off 10 or more percent of texture size without loss of quality (special compression or something)
- Wii U power consumption at full system load is 40 watts (the highest load possible in its current state)
- The Wii U's PSU/power brick is rated at 75 watts with 90% efficiency, so it can deliver 67.5 watts
- The Wii U's flash storage, RAM, USB ports, motherboard, fan and small secondary chips consume around 5 to 10 watts in total when fully stressed
- The Wii U SoC's (CPU and GPU) estimated maximum power consumption is 30 to 35 watts
- Supports 4K displays, with native 4K output via HDMI 1.4b (possibly 2D 4K games?)

*It may in fact have the best performance per watt in the world among 45/40nm chips/SoCs.

In case you are wondering why some games run worse on Wii U compared to 7th-generation consoles, I have a simple explanation: the Wii U hardware is noticeably different from the Xbox 360's or PlayStation 3's, because their processors are not pure CPUs; they can handle GPU-related tasks well, whereas the Wii U's chip is primarily a CPU. Another reason is that developers don't spend the resources and time on porting their games to Wii U, so the ports lack proper optimization and adaptation of their game engines to the Wii U's hardware. Even though some ports perform worse than on 7th-generation consoles, in most cases they run at a higher resolution and/or at native 720p/HD.

Most if not all 3rd-party launch games were made on older Wii U development kits that had 20 to 40% lower clocks than the final development kit that Nintendo released near launch, so developers did not have much time to adapt their games to the final devkit, and as a result those games ran poorly. Games like Darksiders 2, Warriors Orochi 3 Hyper, Batman: Arkham City and others were using only two cores, while the third was not used at all or was barely used to aid the CPU side of the game's performance. Since most ports are from the Xbox 360 and/or PlayStation 3 versions of the games, there are sure to be incompatibilities: the Xbox 360 and PlayStation 3 processors do CPU and also GPU tasks, plus the GPUs, RAM/memory, latency and other factors are different from the Wii U, so optimization and adaptation is needed, though cheap ports as always tend to be a train wreck. Don't you agree?

The Wii U may not be a "leap" like the Xbox One or PlayStation 4, but it is a small leap over the Xbox 360 and PlayStation 3 even when looking at it very roughly, and when taking into account all the implementations, features and "tricks" the gap is even bigger. The Wii U has more embedded RAM than the Xbox One, which has 32MB of eSRAM while the Wii U has 36MB of superior eDRAM for the GPU. Also, eDRAM blows away the GDDR5 in the PlayStation 4 in terms of bandwidth, if I am correct? 130GB/s on Wii U versus 80GB/s on PlayStation 4?

We need to take into consideration that the Wii U's CPU, Espresso, has a certain implementation from the Power7 architecture that allows the use of eDRAM. We know that the CPU in the Wii U has a total of 3MB of eDRAM as L2 cache, and it could also use the main eDRAM pool as an L3 cache and maintain a connection with the GPU, thus creating HSA/hUMA-like capabilities and greatly reducing the latency of CPU-GPU communication and data transfer.

The Wii U's GPU was a Radeon HD 4000 series part in the very first alpha development kit; that was a Radeon HD 4850, which was 55nm, not the 40nm RV740. By the time of the Wii U's first unveiling, the Radeon HD 6000 series had been on the scene for nearly a year (now almost three years), so Nintendo could easily have switched to the Radeon HD 6000 series, since it is basically an evolution of the Radeon HD 5000, which is a refinement of the Radeon HD 4000 series. This is further supported by the use of Eyefinity features for the Wii U's GamePad, which was demonstrated by the Unity demo on Wii U and by Call of Duty: Black Ops 2 in co-op zombie mode. Also, the Wii U can stream up to two images to GamePads, although this hasn't been used yet and maybe it will be used in the near future.

Also, people seem to forget about the power consumption of dedicated GPUs versus ones embedded on the motherboard, which naturally consume less. The Wii U's GPU uses eDRAM that consumes 1 to 2 watts at most, compared to GDDR5 that consumes around 9 or more watts per 2GB. The GPU in the Wii U is embedded, so it does not use PCI-E or an additional PCB and chips, and it has the eDRAM embedded in it, so the eDRAM's consumption could be negated and power consumption radically reduced.

Let's take, for example, the Wii U's GPU die size and the Radeon HD 6970's die size, and assume that the Wii U GPU is a VLIW4-based Radeon HD 6000 series GPU and not VLIW5. The Radeon HD 6970 consumes 250 watts maximum, has a die size of 389mm^2 and 2GB of GDDR5, and is clocked at 880MHz. Let's reduce it to 320 SPUs, which would use about 80mm^2, so consumption drops to roughly 83 watts; then we remove the 2GB of GDDR5 and it goes down to 70-73 watts. Now let's lower its clock from 880MHz to 550MHz, roughly 35% lower; since lowering the clock also allows a lower voltage, power drops much faster than linearly, so GPU consumption goes down from 70-73 watts to roughly 14-15 watts even without being embedded, and we could easily shave off a couple more watts, so it would most likely go down to 10 watts.
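
My step-by-step numbers above are admittedly loose, so here is one way to sketch the same kind of estimate with conventional first-order scaling (power roughly proportional to active shader units, and to frequency times voltage squared). The memory/board overhead and the voltage figures below are my own assumptions, not known Wii U values:

```python
# Rough sanity check of the embedded-GPU power estimate (all inputs are assumptions).
hd6970_board_watts    = 250.0   # Radeon HD 6970 maximum board power
gddr5_and_board_watts = 30.0    # assumed share for 2GB GDDR5, VRMs, fan, PCI-E losses
core_watts = hd6970_board_watts - gddr5_and_board_watts   # ~220 W for the GPU core alone

spu_ratio   = 320 / 1536        # cut the HD 6970's 1536 SPUs down to an assumed 320
clock_ratio = 550 / 880         # 880 MHz -> 550 MHz
volt_ratio  = 0.95 / 1.17       # assumed voltage drop at the lower clock

estimate = core_watts * spu_ratio * clock_ratio * volt_ratio**2
print(f"Embedded Wii U GPU estimate: ~{estimate:.0f} W")   # lands in the 10-20 W ballpark
```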

Since the maximum power consumption of the Wii U is 40 watts, and from my calculation the GPU consumes a mere 10 watts or so, the Wii U's CPU could consume 15 to 20 watts and the rest of the system around 10 to 15 watts, depending on how much the CPU actually consumes, since that is an unknown factor. I am not counting any possible customizations of the Wii U's GPU; these are all rough estimations and we don't know the whole picture. The Wii U is a beast considering the process node it was built on, and probably the most efficient machine on that node in the world.

We can't really compare the Wii U's GPU to any off-the-shelf/standard dedicated GPU, since some features found in regular dedicated GPUs are not needed when embedded. So with some minor modifications I can see the Wii U having 384 SPUs, which at 550MHz would be about 420 GFLOPS, rather than the 320 SPUs a standard dedicated GPU of that die size would have. If Nintendo did drastic modifications, they could even reach 500 SPUs and get close to 600 GFLOPS. One of the homebrew hackers counted 600 shaders, so I am wondering if Nintendo is maybe using a technology from ATI that AMD has never used, which they call "high density" and which is going to be used in AMD's Steamroller cores; from the information AMD has released, "high density" can increase the density of the chip by 30%, reduce its size by 30% and reduce power consumption.
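
For reference, using the same SPUs x 2 FLOPs x clock rule of thumb as earlier, the shader counts mentioned here work out as follows (note that 500 SPUs at 550MHz gives 550 GFLOPS; reaching ~600 GFLOPS would need more SPUs or a 600MHz clock):

```python
# GFLOPS for the speculative shader counts discussed above, all at the assumed 550 MHz clock.
for spus in (320, 384, 500):
    print(f"{spus} SPUs @ 550 MHz -> {spus * 2 * 550 / 1000:.0f} GFLOPS")
```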

Regarding the Wii U's DDR3 RAM, its bandwidth has a theoretical maximum of 51.2GB/s, since it has four 512MB chips and not one large 2GB chip, so anyone thinking that its maximum bandwidth is a mere 12.8GB/s is tech illiterate or a sucker for the Beyond3D and NeoGAF forums, which tech-savvy people have fled or avoid.
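
For what it's worth, the general formula is just effective transfer rate times total bus width, so whether the right figure is 12.8 or 51.2GB/s comes down to how wide the combined interface to those four chips actually is (the DDR3-1600 speed used here is an assumption):

```python
# Bandwidth = effective transfer rate x total bus width (DDR3-1600 assumed).
def ddr3_bandwidth_gb_s(transfer_rate_mt_s, bus_width_bits):
    return transfer_rate_mt_s * (bus_width_bits / 8) / 1000.0

print(ddr3_bandwidth_gb_s(1600, 64))    # 12.8 GB/s if the four chips share a 64-bit bus
print(ddr3_bandwidth_gb_s(1600, 256))   # 51.2 GB/s if each chip contributes a 64-bit channel
```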

The Wii U's power brick/PSU is rated at 75 watts and has an efficiency of 90%, so it can deliver at most about 68 watts without seriously degrading, and since the Wii U consumes 40 watts there are about 28 watts available, though I would only dare to bump power consumption to 60 watts, or 20 additional watts, if I could increase performance.
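
Spelled out, the headroom arithmetic is simply (the 40-watt full-load figure is the measured draw quoted above):

```python
# PSU headroom check using the figures quoted in this post.
psu_watts, efficiency, full_load = 75.0, 0.90, 40.0
deliverable = psu_watts * efficiency        # ~67.5 W the brick can actually supply
print(f"Deliverable: {deliverable:.1f} W, headroom: {deliverable - full_load:.1f} W")
```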

*I won't link to where I got all of this information, though I can assure you I did a lot of research and digging on the internet, collecting bits and pieces and then putting them together into one picture, thus: "EUREKA!!"




You should not assess the WiiU hardware by its supposed specs (remember a lot of those numbers are assumptions).

Read the Digital Foundry articles or the latest Watch Dogs comments.

The games. Only the games.



http://www.youtube.com/watch?v=w9VWRB07yqc&feature=youtu.be



http://forums.anandtech.com/showthread.php?p=35555156#post35555156



There seems to be more evidence now that the wii u gpu is at the lower end of the gflops estimates, at about 176 gflops. This is based on the fact that the gpu is using a power-hungry 40nm or 45nm process but is consuming a very low amount of power. So it may actually be 160 streaming units etc. Possibly something like the mobility radeon hd 6400.

Clearly this still represents a large architecture improvement over 360 and PS3, plus you have the high-speed video memory. The fact that many later games are still struggling on wii u to fully match 360 and PS3 frame rates also adds to this.

It's sad to say but I don't think we have gone low enough yet with regard to the true wii u spec.

Also probably worth mentioning that the wii u cpu design is 32-bit, not the 64-bit design of the xbox one and ps4 jaguar cores.

I would have liked to have seen 360 and PS3 specs added to get the full picture. Still, many games perform better on 360 and PS3 compared to wii u.



eyeofcore said:
http://forums.anandtech.com/showthread.php?p=35555156#post35555156 <--- you could join the discussion here :)


Now that is some serious concentrated nonsense.



bonzobanana said:
There seems to be more evidence now that the wii u gpu is at the lower end of the gflops estimates, at about 176 gflops. This is based on the fact that the gpu is using a power-hungry 40nm or 45nm process but is consuming a very low amount of power. So it may actually be 160 streaming units etc. Possibly something like the mobility radeon hd 6400.

Clearly this still represents a large architecture improvement over 360 and PS3, plus you have the high-speed video memory. The fact that many later games are still struggling on wii u to fully match 360 and PS3 frame rates also adds to this.

It's sad to say but I don't think we have gone low enough yet with regard to the true wii u spec.

Also probably worth mentioning that the wii u cpu design is 32-bit, not the 64-bit design of the xbox one and ps4 jaguar cores.

I would have liked to have seen 360 and PS3 specs added to get the full picture. Still, many games perform better on 360 and PS3 compared to wii u.

1) Yeah, I need to add a range for GPU flops as it's still too much of a grey area.

2) This is for current gen, not last. Adding PS360 would be pointless.

3) Any game that runs better on PS360 vs Wii U was solely due to lack of effort on the Wii U port. Games made with good dev time for Wii U always show better on Wii U. Going forward, games like Watch Dogs, COD Ghosts, etc. will continue to prove this.



superchunk said:
bonzobanana said:
There seems to be more evidence now that the wii u gpu is at the lower end of the gflops estimates, at about 176 gflops. This is based on the fact that the gpu is using a power-hungry 40nm or 45nm process but is consuming a very low amount of power. So it may actually be 160 streaming units etc. Possibly something like the mobility radeon hd 6400.

Clearly this still represents a large architecture improvement over 360 and PS3, plus you have the high-speed video memory. The fact that many later games are still struggling on wii u to fully match 360 and PS3 frame rates also adds to this.

It's sad to say but I don't think we have gone low enough yet with regard to the true wii u spec.

Also probably worth mentioning that the wii u cpu design is 32-bit, not the 64-bit design of the xbox one and ps4 jaguar cores.

I would have liked to have seen 360 and PS3 specs added to get the full picture. Still, many games perform better on 360 and PS3 compared to wii u.

1) Yeah, I need to add a range for GPU flops as it's still too much of a grey area.

2) This is for current gen, not last. Adding PS360 would be pointless.

3) Any game that runs better on PS360 vs Wii U was solely due to lack of effort on the Wii U port. Games made with good dev time for Wii U always show better on Wii U. Going forward, games like Watch Dogs, COD Ghosts, etc. will continue to prove this.


At this point in time I'm not convinced by that. I'm a wii u owner myself, and directly comparing wii u titles to 360 and PS3 has shown many wii u weaknesses. I think it's generally acknowledged that the wii u has less cpu processing power than 360 and PS3. Many recent games still show weaknesses in the wii u version.

It's a nice console but I really think the wii u is Nintendo's entry into the current gen rather than next gen. I disagree about adding 360 and PS3 being pointless, because your specification doesn't show how close the wii u is to existing models. In many ways the ps3 and 360 have just as much right to claim next-gen performance as the wii u. Many ps3 and 360 games run with better frame rates, more graphical detail, and higher resolutions. Many cpu-intensive features are missing from wii u games. The Eurogamer face-offs and Lens of Truth show wii u performance for what it is.



Looking through, I see that some specs in the second post haven't been updated. The wii u has at least 35MB of built-in memory for the gpu, since in addition to the 32MB it has to have the 2MB frame buffer and 1MB texture cache for wii compatibility mode.

If you are stating 176-352 gflops, you should also state 160-320 stream processors.

Is the wii u gpu 550MHz or 600MHz? I thought the ps3 was 550MHz and the wii u gpu 600MHz. I haven't checked though.



bonzobanana said:
There seems to be more evidence now that the wii u gpu is at the lower end of the gflops estimates, at about 176 gflops. This is based on the fact that the gpu is using a power-hungry 40nm or 45nm process but is consuming a very low amount of power. So it may actually be 160 streaming units etc. Possibly something like the mobility radeon hd 6400.

Clearly this still represents a large architecture improvement over 360 and PS3, plus you have the high-speed video memory. The fact that many later games are still struggling on wii u to fully match 360 and PS3 frame rates also adds to this.

It's sad to say but I don't think we have gone low enough yet with regard to the true wii u spec.

Also probably worth mentioning that the wii u cpu design is 32-bit, not the 64-bit design of the xbox one and ps4 jaguar cores.

I would have liked to have seen 360 and PS3 specs added to get the full picture. Still, many games perform better on 360 and PS3 compared to wii u.

There is no evidence at all that the Wii U's GPU is 176 GFLOPS. You are basing your assumptions about the Wii U's performance on the cheap ports it gets, and those cheap ports are mostly ports of the Xbox 360 versions of those games, so your premise that it has just 176 GFLOPS and just 160 SPUs and is basically a Radeon HD 6400 is a total fallacy and a failure on your part.

The fact is that you are judging a console by looking at cheap ports; you would have said the same thing about the PlayStation 3 in its first year, when it got tons of cheap ports.

32-bit versus 64-bit does not have much impact on performance; it is mainly about the amount of RAM and address space. If all code were written as 64-bit then you could have worse performance than with 32-bit code, because some code performs best in 32-bit, and 64-bit would just add overhead and chug more resources.
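
To illustrate the address-space point with numbers (this is just the textbook difference, nothing Wii U specific):

```python
# 32-bit pointers can address 2^32 bytes; 64-bit pointers can address 2^64 bytes.
print(2**32 / 2**30)   # 4.0 GiB addressable with 32-bit pointers
print(2**64 / 2**50)   # 16384.0 PiB (16 EiB) addressable with 64-bit pointers
```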

Those many games are cheap ports of the Xbox 360 versions of those games. Also, Deus Ex: Human Revolution Director's Cut has better visuals and a rock-solid framerate, Batman: Arkham Origins on Wii U has much better framerate stability and better lighting and shadows, and Need For Speed: Most Wanted has some PC textures and much better lighting and framerate. On top of that, none of those cheap ports have screen tearing!

Read this: http://www.ign.com/boards/threads/official-wii-u-lobby-specs-graphics-power-thread.452775697/page-188#post-483141827 dealwithit