
Wii U vs PS4 vs Xbox One FULL SPECS (January 24, 2014)

HoloDust said:
Well, if it's any consolation, that GPU looks to me very much like HD 5550 (pretty much what I thought it would be similar to after first measurements, that or 5570) - Youtube 5550 and some games, and you'll see it's not that bad (well, it is, but not THAT bad) - it can play Skyrim on medium with some AA @720p at 30+ frames for example (and that's on PC, with all its baggage), so I'm sure Zelda will look quite nice.

That's an excellent observation, HoloDust!

HD5550 has a VP rating of 27 and a power consumption of 39W on the 40nm node.

The power consumption of the Wii U in games is roughly 35W, and that's for the entire console, not just the GPU. I'll give it the benefit of the doubt and assume developers are not yet pushing the Wii U's CPU and GPU to 90%+ capacity, so this figure doesn't tell us the actual load power draw once developers utilize the GPU/CPU more effectively.

http://www.eurogamer.net/articles/digitalfoundry-wii-u-is-the-green-console

Assuming the specs outlined in the article/found by NeoGAF are at least in the ballpark, imo the graphics card is weaker than the HD5550 because it is based on the R700 series, not the HD5000/Redwood core.

Redwood HD5550 (DX11) = 550MHz GPU clock (352 GFLOPs), 320 SPs, 16 TMUs, 8 ROPs, 28.8 GB/sec memory bandwidth over 128-bit bus = 27 VP

RV730 HD4650 (DX10.1) = 600MHz GPU clock (384 GFLOPs), 320 SPs, 32 TMUs, 8 ROPs, 16 GB/sec memory bandwidth over 128-bit bus = 17.8 VP

R700 Wii U's rumored GPU (DX10.1) = 550MHz GPU clock (352 GFLOPs), 320 SPs, 16 TMUs, 8 ROPs, 12.8 GB/sec memory bandwidth over 64-bit bus

^ That means the R700 in Wii U is actually slower than the HD5550. You can also see that even with just 320 SPs, these GPUs still need memory bandwidth ==> the 4650 is much slower than the HD5550 even with a higher GPU clock and 2x the TMUs, partly because it has just 16 GB/sec of memory bandwidth. The other reason is that the HD4000 series has worse performance per clock/IPC than the HD5000 series. This can be observed by comparing the HD4870 vs. the HD5770: despite the HD4870's near 50% memory bandwidth advantage over the HD5770 (http://www.gpureview.com/show_cards.php?card1=564&card2=615), the two cards are actually similar in performance.
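For anyone who wants to sanity-check the theoretical numbers above, here's a quick back-of-the-envelope in Python. The effective memory data rates are my assumptions, inferred from the bandwidth figures quoted:

# VLIW5 Radeons do 2 FLOPs (one multiply-add) per SP per clock.
def gflops(sps, clock_mhz):
    return sps * 2 * clock_mhz / 1000.0

# bandwidth = (bus width in bytes) x (effective transfer rate)
def bandwidth_gb_s(bus_bits, data_rate_mt_s):
    return (bus_bits / 8) * data_rate_mt_s / 1000.0

print(gflops(320, 550))           # 352.0 -> HD5550 / rumored Wii U GPU
print(gflops(320, 600))           # 384.0 -> HD4650
print(bandwidth_gb_s(128, 1800))  # 28.8  -> HD5550, assuming 1800 MT/s effective memory
print(bandwidth_gb_s(64, 1600))   # 12.8  -> rumored Wii U, assuming DDR3-1600 on 64-bit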

If we assume that some "secret sauce" & the eDRAM have allowed the R700 in Wii U to land between the HD4650 and HD5550, we would get a VP rating of 22.4 (the average of the HD4650 and HD5550).

The GPU in Xbox 360 was claimed by ATI themselves to be similar to the X1800XT 512MB, which has a VP rating of 16.7. That means a full-blown HD5550 is roughly 62% faster than the GPU in the Xbox 360. The HD5550 has 28.8 GB/sec of memory bandwidth, but Wii U's only has 12.8 GB/sec, and I am inclined to believe that 32MB of eDRAM cannot make up for losing more than half of the memory bandwidth. For that reason I'd put the R700 in Wii U at 22.4 VP instead of the HD5550's 27 VP.
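Spelling out that VP math (a rough sketch; the 22.4 is just the midpoint of the two cards, not a measured number):

hd5550_vp, hd4650_vp, xenos_vp = 27.0, 17.8, 16.7

wiiu_vp = (hd5550_vp + hd4650_vp) / 2
print(wiiu_vp)                   # 22.4 -> midpoint estimate for the Wii U GPU
print(hd5550_vp / xenos_vp - 1)  # ~0.62 -> HD5550 is ~62% faster than Xenos
print(wiiu_vp / xenos_vp - 1)    # ~0.34 -> the 22.4 estimate is only ~34% faster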

The rumored specs for Durango include a GPU about as powerful as an HD7770 (94 VP). For Orbis, it's ~HD7850 (141 VP).

http://alienbabeltech.com/abt/viewtopic.php?p=41174

That means PS4 and Xbox 720 could easily end up 4.2-6.3x faster. I would consider that an entirely different league.

Going from 1280x720 to 1920x1080 is a 2.25x increase in pixels, and the GPU in the Wii U appears to be barely 50% more powerful than Xbox 360's (if that). That means Wii U is going to struggle rendering next generation games at native 1080P at 30 fps. The other downside is lack of DX11 which means no next generation effects in any of Wii U's games. Having said that, most people buy Nintendo's consoles for their 1st party games, not graphics.
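Where those multipliers come from, using my 22.4 VP estimate from above:

wiiu_vp = 22.4
print(94 / wiiu_vp)                  # ~4.2 -> Durango (~HD7770) vs Wii U
print(141 / wiiu_vp)                 # ~6.3 -> Orbis (~HD7850) vs Wii U
print((1920 * 1080) / (1280 * 720))  # 2.25 -> pixel count ratio, 720p to 1080p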

================================

The facepalm moment for me is that Nintendo could have just gone with a $130 65W Trinity A10-5700 and ended up with a faster CPU and GPU.

http://www.techpowerup.com/reviews/AMD/FM2_APU_Review/

I am flabbergasted as to what Nintendo was thinking with their CPU and GPU selection. How in the world is the console $350 with such anemic CPU/GPU components? Even if we assume the controller accounts for $175 of that, it's not as if Nintendo would have had to pay the $130 retail price for the A10-5700. If Nintendo had waited 6 more months to launch the console, they could have fit an even faster A10-6700 into the same 65W power envelope as the A10-5700, and that would also have given them time to build up 3rd party support and a stronger game line-up at launch.




BlueFalcon said:

The other downside is lack of DX11 which means no next generation effects in any of Wii U's games.

Good analysis.

I just want to add that the Wii U's GPU has features compatible with DX11... of course the Wii U doesn't use DX11, because that's a Microsoft proprietary API and Nintendo uses another API (I forgot the name)... but the GPU has features similar or compatible with DX11.

At least that's what Nintendo said.



Héctor Martín (@marcan42) says the register names come from an AMD R600 GPU.

"Oh, and for those who claim it's not a Radeon-like design: http://www.marcansoft.com/paste/Kq0OLb0X.txt … . R6xx. Register names match AMD ones too."



ethomaz said:

...but the GPU has features similar or compatible with DX11.

At least that's what Nintendo said.

Even if true, keep in mind that DX11 features are now heavily focused on very GPU-intensive effects:

1) DirectCompute for accelerating graphical effects such as ambient occlusion (AO/HDAO) (Sleeping Dogs, Hitman Absolution), contact hardening shadows (Bioshock Infinite, Dirt Showdown), and post-processing effects/Alpha Coverage AA (Far Cry 3, Sleeping Dogs).

If you look at the games that use compute shaders, the HD7850 can barely hit 30 fps at 1080P in those DX11 titles.

http://www.xbitlabs.com/images/graphics/his-iceq-x2-7970-7950-7850/10_hit-a.png

http://www.xbitlabs.com/images/graphics/his-iceq-x2-7970-7950-7850/07_sleepd.png

IMO that means Wii U's GPU won't be able to take advantage of this either, since it's just too slow. And the R700 architecture was never designed for GPGPU compute. That's what GCN is for.

2) Tessellation

The geometry engines responsible for Tessellation in GCN HD7000 parts are up to 4x faster than in HD6000 series. 

http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/12

HD6000 series has dual-geometry engines which are 2-3x faster than in HD5000 series.

http://www.anandtech.com/show/4061/amds-radeon-hd-6970-radeon-hd-6950/6

The R700 is one generation below the HD5000 series... That means the R700 in Wii U will have 8-12x slower tessellation performance than the GCN parts, making this feature most likely worthless.
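Compounding those per-generation gains (a rough sketch, and note the R700 sits below even the HD5000 baseline):

gcn_vs_hd6000 = 4.0            # up to 4x (AnandTech HD7970 review)
hd6000_vs_hd5000 = (2.0, 3.0)  # 2-3x (AnandTech HD6970 review)

print(gcn_vs_hd6000 * hd6000_vs_hd5000[0])  # 8.0  -> low end of the 8-12x range
print(gcn_vs_hd6000 * hd6000_vs_hd5000[1])  # 12.0 -> high end of the 8-12x range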

If Wii U's GPU is R600, not R700, then it's a complete dog, as traditional anti-aliasing in that architecture is broken for MSAA above 1600x1000 pixels. At 1080P, an R600 would incur a 2x performance hit with MSAA compared to the R700 series. If true, that would mean no MSAA/SMAA on Wii U at 1080P.



@BlueFalcon

I don't know... @Marcan is saying the registers in the GPU show it's R600-based and not R700.

http://www.marcansoft.com/paste/Kq0OLb0X.txt



ethomaz said:

@BlueFalcon

I don't know... @Marcan is saying the registers in the GPU show it's R600-based and not R700.

http://www.marcansoft.com/paste/Kq0OLb0X.txt

That would be even worse (see the last paragraph in my previous post). R600 is essentially a second revision of the R500 in Xbox 360's GPU. The Xbox 360 had to rely on MLAA/FXAA for most of its anti-aliasing work, since MSAA performance is horrendous on the R500-R600 generations.

I still don't understand why Nintendo didn't just go with the A10-5700 APU: 65W TDP, and a faster GPU and CPU than what they have. It would have required no R&D!! They should hire external consultants to design the successor to the Wii U. They are brilliant at making 1st party games but have no clue about hardware. $350 for a console with shading power maybe 50% faster than Xbox 360's in the year 2013 is mind-blowingly bad. I can't explain it.



BlueFalcon said:
--- snip of a very good write up -----

Very good analysis and probably right in line with what it actually is.

However, one item I disagree with is your opinion on whether the WiiU GPU can do effects similar to those of DX11.

1) Only MS will use DX.
2) This means the others use other SDKs that can and do mirror the DX effects and technologies.
3) We know WiiU has support from Unity and UE4, which both use DX11-equivalent features/technologies.

I also think that it won't be necessary for WiiU to match a 1080p-native game at 1080p. If it jumps down to 720p30fps, that may suffice to allow any other game to be ported to WiiU. Of course this would depend on the game and the end user (many people do not notice these things). But these engines would allow that type of scaling to fit the hardware.

Also note that if a game is already 720p30fps on PS4/NBox, then it likely simply could not be scaled down to WiiU, but I don't think we'll see very many of those games on those two systems, if any at all.



 

superchunk said:
However, one item I disagree with is your opinion on whether the WiiU GPU can do effects similar to those of DX11.

I am just afraid the GPU inside Wii U isn't fast enough to utilize GPGPU functions, it has no dedicated compute units/shaders like GCN does, and tessellation performance will be rock bottom. How can they recreate similar DX11 effects with such anemic GPU horsepower? Even if the GPU is 50% faster than Xbox 360's, that's nowhere near enough, because the performance hit from DX9 to DX11 features is at least 2x, which would leave it effectively slower than an Xbox 360 running DX9 effects.

superchunk said:
 If it jumps down to 720p30fps, it may suffice for allowing any other game to be ported to WiiU. Of course this would depend on the game and the end user (many people do not notice these things). But these engines would allow that type of scaling to fit the hardware.

How does one justify paying $300-350 for a console that still requires you to buy a $60-70 mechanical hard drive, in the context of PS4/Xbox 720, if its total processing power is 4-6x less and it doesn't even have a strong 1st party line-up yet? Once gamers see the graphics of PS4/Xbox 720 exclusives, the Wii U would need a price cut to $199, with the $60-70 mechanical HDD effectively pushing the total system cost to $260-270... Even that would be way too expensive if the Xbox 720 is $350. Nintendo is in serious trouble. They bet all their marbles on the tablet controller :(

The 7-inch Kindle Fire costs $159 at retail and has a 2-point multi-touch 1024x600 screen, 8.5-hour battery life, a dual-core 1.2GHz CPU, 1GB of memory and 8GB of storage. How is Nintendo selling the Basic Wii U for $299 when the controller is inferior in every way to the Kindle Fire, and the rest of the console would get owned by a low-end A10 APU? Nintendo's pricing makes no sense.



BlueFalcon said:

HoloDust said:
Well, if it's any consolation, that GPU looks to me very much like HD 5550 (pretty much what I thought it would be similar to after first measurements, that or 5570) - Youtube 5550 and some games, and you'll see it's not that bad (well, it is, but not THAT bad) - it can play Skyrim on medium with some AA @720p at 30+ frames for example (and that's on PC, with all its baggage), so I'm sure Zelda will look quite nice.

That's an excellent observation, HoloDust!

Why, thanks :)

HD5550 has a VP rating of 27 and a power consumption of 39W on the 40nm node.

I think the HD5550 GDDR5 has a rating of 27 VP; the HD5550 DDR3 is at 21.8 VP. Also, I think 39W is the TDP of the desktop part. If you look at the Mobility 5750, which is Redwood Pro (400:20:8), its TDP is 25W (I am, as always, baffled by these significant differences between desktop and mobile parts).

Redwood HD5550 (DX11) = 550MHz GPU clock (352 GFLOPs), 320 SPs, 16 TMUs, 8 ROPs, 28.8 GB/sec memory bandwidth over 128-bit bus = 27 VP

RV730 HD4650 (DX10.1) = 600MHz GPU clock (384 GFLOPs), 320 SPs, 32 TMUs, 8 ROPs, 16 GB/sec memory bandwidth over 128-bit bus = 17.8 VP

R700 Wii U's rumored GPU (DX10.1) = 550MHz GPU clock (352 GFLOPs), 320 SPs, 16 TMUs, 8 ROPs, 12.8 GB/sec memory bandwidth over 64-bit bus

Well, I think the 5550 DDR2 could be a good approximation for the rumoured specs so far: it is a 320:16:8 part @550MHz, with DDR2 over a 128-bit bus, for a total bandwidth of 12.8 GB/s (just like the suggested Wii U setup over a 64-bit bus). That card's rating is 16.5 VP.
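A quick check that those two setups really do land on the same bandwidth (the effective data rates are my assumptions):

def bandwidth_gb_s(bus_bits, data_rate_mt_s):
    return (bus_bits / 8) * data_rate_mt_s / 1000.0

print(bandwidth_gb_s(128, 800))   # 12.8 -> HD5550 DDR2 (800 MT/s effective, 128-bit)
print(bandwidth_gb_s(64, 1600))   # 12.8 -> rumored Wii U (DDR3-1600 effective, 64-bit)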

^ That means the R700 in Wii U is actually slower than the HD5550. You can also see that even with just 320 SPs, these GPUs still need memory bandwidth ==> the 4650 is much slower than the HD5550 even with a higher GPU clock and 2x the TMUs, partly because it has just 16 GB/sec of memory bandwidth. The other reason is that the HD4000 series has worse performance per clock/IPC than the HD5000 series. This can be observed by comparing the HD4870 vs. the HD5770: despite the HD4870's near 50% memory bandwidth advantage over the HD5770 (http://www.gpureview.com/show_cards.php?card1=564&card2=615), the two cards are actually similar in performance.

If we assume that some "secret sauce" & the eDRAM have allowed the R700 in Wii U to land between the HD4650 and HD5550, we would get a VP rating of 22.4 (the average of the HD4650 and HD5550).

I honestly have no idea if those extra parts can make up for the lack of memory bandwidth and boost it to the level of the HD5550 DDR3 over a 128-bit bus (or more). But if they can, we might be back to the 21.8 VP of the HD5550 DDR3.

The GPU in Xbox 360 was claimed by ATI themselves to be similar to the X1800XT 512MB, which has a VP rating of 16.7. That means a full-blown HD5550 is roughly 62% faster than the GPU in the Xbox 360. The HD5550 has 28.8 GB/sec of memory bandwidth, but Wii U's only has 12.8 GB/sec, and I am inclined to believe that 32MB of eDRAM cannot make up for losing more than half of the memory bandwidth. For that reason I'd put the R700 in Wii U at 22.4 VP instead of the HD5550's 27 VP.

Well, some time ago I took your advice and tried to compare architectures, clocks and configs when I was making those comparison tables. Based on similarity to the HD2xxx series (the first unified-shader PC cards), I came to a guesstimate that Xenos is somewhere around 14.8 VP.

 

The other downside is lack of DX11 which means no next generation effects in any of Wii U's games.

To be honest, the first time I started thinking it is based on Redwood was after some developers claimed it is a DX11-capable GPU. I might be completely wrong though, or not up to speed with the latest.

================================

The facepalm moment for me is that Nintendo could have just gone with a $130 65W Trinity A10-5700 and ended up with a faster CPU and GPU.

Backward compatibility, maybe? But then again, Dolphin proved to be more than capable of emulating the Wii, so I see no reason why Nintendo could not make a near-perfect x86 Wii emulator on their own.





BlueFalcon said:

 

superchunk said:
However, one item I disagree with is your opinion on whether the WiiU GPU can do effects similar to those of DX11.

I am just afraid the GPU inside Wii U isn't fast enough to utilize GPGPU functions, it has no dedicated compute units/shaders like GCN does, and tessellation performance will be rock bottom. How can they recreate similar DX11 effects with such anemic GPU horsepower? Even if the GPU is 50% faster than Xbox 360's, that's nowhere near enough, because the performance hit from DX9 to DX11 features is at least 2x, which would leave it effectively slower than an Xbox 360 running DX9 effects.

superchunk said:
 If it jumps down to 720p30fps, it may suffice for allowing any other game to be ported to WiiU. Of course this would depend on the game and the end user (many people do not notice these things). But these engines would allow that type of scaling to fit the hardware.

How does one justify paying $300-350 for a console that still requires you to buy a $60-70 mechanical hard drive, in the context of PS4/Xbox 720? Once gamers see the graphics of PS4/Xbox 720 exclusives, the Wii U would need a price cut to $199, with the same $60-70 mechanical HDD effectively pushing the total system cost to $260-270... Even that would be way too expensive compared to a $350 Xbox 720! Nintendo is in serious trouble :(

1) It is a GPGPU, as stated by Nintendo directly. It has engines with features similar to DX11... and not actually using DX means you won't see a performance hit like DX9 to DX11. It's not the same SDK, so you can't judge it the same.

2) The value of a console is what matters. I didn't buy a WiiU because I thought it would be identical to its competitors in power. Granted, I hoped it would be close enough to get similar support (and still do), but that wouldn't have changed my buying decision. I wanted Nintendo games. It's as simple as that. $350 is a great price for Nintendo next-gen games, plus many others, plus MiiVerse, plus an awesome browser, all backed up by an equally awesome tablet experience.

The HDD is not a necessity for me yet and likely won't be unless I decide to go all-digital on disc games, which I likely won't, based on how I see pricing and BC for digital and Wii games respectively right now. Furthermore, the X720 and PS4 are unlikely to be less than $399 a year from now. By then, WiiU could very well be reduced to $299 or less.

Nintendo is in trouble now due to lack of continuous content, nothing else.