
Forums - Nintendo - "Wii U GPU is several generations ahead of current gen" Shin'en

farlaff said:
Conegamer said:
Well, we sorta already knew there was more power under the hood than we've seen. But will anyone actually use it?

This. Especially from Western developers. 




Seeing how the Unity engine currently running on the Wii U uses DX10.1-level features (thanks to its modern GPU), indies will probably be the first third-party developers able to differentiate the Wii U from the PS360 from a visual point of view. By comparison, the PS360 are stuck using DX8.1-level features. An example is Shadow of the Eternals: it looks awesome, and I think it's only running the DX10.1 equivalent of CryEngine 3.




I assume they mean GPU generation. Which is the only way it would make sense.

And before you say "hurr durr they only make games for Nintendo so they don't know" STOP RIGHT THERE.

Shin'en are tech monsters. They can run laps around some of the big time devs in terms of code. They have roots in PC demoscene, where it's all about maxing tech out.

These guys know what they're talking about. Nano Assault was nothing. Wait on their next game that isn't a launch title.



http://gamrconnect.vgchartz.com/profile/92109/nintendopie/ Nintendopie  Was obviously right and I was obviously wrong. I will forever be a lesser being than them. (6/16/13)

AnthonyW86 said:
HoloDust said:
AnthonyW86 said:
oni-link said:

I read that it is a heavily modified version of the Evergreen series.  Seeing how the specs (from the die shots) are so close to that line of GPUs, it's the closest one (without the special sauce) to compare the GPU to.  The Wii U is even made in the same plant as the Evergreen!!!

for reference:

http://www.amd.com/US/PRODUCTS/DESKTOP/GRAPHICS/ATI-RADEON-HD-5000/HD-5550/Pages/hd-5550-overview.aspx#2

Note that the Wii U's GPU only has about half the bandwidth, though.

The 5550 actually comes in 3 flavors when it comes to memory: GDDR5 (57.6 GB/s), DDR3 (28.8 GB/s) and DDR2 (12.8 GB/s). Aggregate VP ratings for these cards are 27, 21.8 and 16.5, so they are quite sensitive to memory bandwidth.

The Wii U, although it has DDR3, has the bandwidth of the DDR2 version, but I'm thinking that the eDRAM might put it somewhere between the DDR2 and DDR3 versions (theoretically, I guess, maybe even on par with the DDR3 version).
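The bandwidth figures above fall out of simple arithmetic: bus width in bytes times the effective transfer rate. A quick sketch, using the commonly reported (unofficial) bus widths and memory clocks as assumptions:

```python
def bandwidth_gb_s(bus_width_bits, mega_transfers_per_s):
    """Peak bandwidth = (bus width in bytes) * (effective transfer rate)."""
    return bus_width_bits / 8 * mega_transfers_per_s / 1000  # GB/s

# HD 5550 variants, all on a 128-bit bus (clocks are the usual retail specs):
print(bandwidth_gb_s(128, 3600))  # GDDR5 @ 3.6 GT/s -> 57.6 GB/s
print(bandwidth_gb_s(128, 1800))  # DDR3  @ 1.8 GT/s -> 28.8 GB/s
print(bandwidth_gb_s(128,  800))  # DDR2  @ 0.8 GT/s -> 12.8 GB/s

# Wii U main memory, reportedly 64-bit DDR3-1600:
print(bandwidth_gb_s(64, 1600))   # -> 12.8 GB/s, i.e. the DDR2 card's figure
```

Which is why the Wii U lands on the DDR2 version's number despite using DDR3: the narrower 64-bit bus halves the throughput.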

For comparison, XOne's GPU equivalent (7770) has a VP rating of 96, and PS4's (7850) stands at 141.

This is all without customizations each platform holder incorporated in their solution, so differences are most likely even bigger.

For reference, 360 is rated at around 12.

Thanks for saving me the time to explain, but yes, you are right. And that's my point in bringing it up: the difference with the other next-gen consoles will be massive. Especially since you're talking about 2 GB; say they want to use 1 GB for the GPU, that's a lot of memory to fill with such a small bandwidth. Even the 360 and PS3 have about double.

It will also depend on the game engine. For example, in raw computing power an HD 7850 is about 50% faster than an HD 7770, but it can be anywhere from 50% up to almost 100% faster depending on the game.

On a side note, in the benchmarks I have encountered where an HD 7850 manages about 30 fps, an HD 5550 only does about 3-4 fps... and that's the DDR3 version.


Wow, that's an amazing difference!!! I was actually feeling mighty confident about the customized Evergreen GPU inside the Wii U until that.  I wonder if the special sauce will help increase its performance even a wee bit?  It was never a question that the PS4/XBone will be more powerful!!   The point most Nintendo developers, in this instance Shin'en, are making is that the Wii U is not underpowered compared to the current-gen systems.  In fact, when a developer takes the time to learn the system's capabilities, the Wii U is able to do things a bit beyond what the PS360 may be capable of (DX10.1-11.0 features vs. DX8.1 features, etc.).



HoloDust said:
green_sky said:

The jump doesn't seem all that big if it's running a Radeon 4650. Originally it was rumoured to be running a Radeon 4850, which would have been a significant jump ahead of the 7800 GT in the PS3.

I still fondly remember those days when a lot of us were hoping they would go with a 4850 or equivalent (5750, for example)... if they went with that (and GDDR5, of course), the Wii U would be some 1.6x weaker than XOne and around 2.35x weaker than PS4. So, though still the weakest of the 3, it would definitely be, spec-wise, next gen, which would mean much easier porting to it and better 3rd-party support.
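Those 1.6x and 2.35x figures are consistent with the aggregate VP ratings quoted earlier in the thread (96 for the 7770/XOne-class part, 141 for the 7850/PS4-class part), if you pencil in roughly 60 for a hypothetical 5750-class Wii U. That ~60 is my assumption for illustration, not a published spec:

```python
# VP ratings from the thread; the Wii U value is a hypothetical 5750-class guess.
xone_vp, ps4_vp, hypothetical_wiiu_vp = 96, 141, 60

print(round(xone_vp / hypothetical_wiiu_vp, 2))  # -> 1.6
print(round(ps4_vp / hypothetical_wiiu_vp, 2))   # -> 2.35
```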


The die shots already debunked any notion that it was a Radeon 4xxx series!!! Even before the die shots, the 4xxx series was more power-hungry than what Nintendo planned for the system, so it made no sense to use the 4xxx as the basis for Latte.  Even the comparison to Evergreen isn't accurate, as the GPU is highly customized. So even if Evergreen was the basis, it wouldn't matter, since the chip is heavily modified. 



They should've just used a modified AMD 7800M GPU. It's made for laptops, so power consumption would be low, and that would've put the system very close to the Xbox One and able to easily run any PS4 game. It might have to be at 720p instead of 1080p, but most people can't even tell the difference sitting on a couch 6-7 feet away.



green_sky said:

Yeah, I knew what he was hinting at. The jump doesn't seem all that big if it's running a Radeon 4650. Originally it was rumoured to be running a Radeon 4850, which would have been a significant jump ahead of the 7800 GT in the PS3. Either way, it doesn't matter much. It was a conscious decision by Nintendo not to make something really powerful; it saves development budget and keeps costs low. http://www.tomshardware.com/reviews/gaming-graphics-card-review,3107-7.html


It's none of those GPUs. It's a 100% fully custom design and thus unlike any of them.



I bet this thread will get 200+ posts



oni-link said:
HoloDust said:
green_sky said:

The jump doesn't seem all that big if it's running a Radeon 4650. Originally it was rumoured to be running a Radeon 4850, which would have been a significant jump ahead of the 7800 GT in the PS3.

I still fondly remember those days when a lot of us were hoping they would go with a 4850 or equivalent (5750, for example)... if they went with that (and GDDR5, of course), the Wii U would be some 1.6x weaker than XOne and around 2.35x weaker than PS4. So, though still the weakest of the 3, it would definitely be, spec-wise, next gen, which would mean much easier porting to it and better 3rd-party support.


The die shots already debunked any notion that it was a Radeon 4xxx series!!! Even before the die shots, the 4xxx series was more power-hungry than what Nintendo planned for the system, so it made no sense to use the 4xxx as the basis for Latte.  Even the comparison to Evergreen isn't accurate, as the GPU is highly customized. So even if Evergreen was the basis, it wouldn't matter, since the chip is heavily modified. 

I was actually talking about days way, way back before any of the specs were known - you know, when some of us were hoping they would actually make a decently powered system (that's where the 4850 comes in) ;)

As for Evergreen, I think the jury is still out on that. I was always leaning toward it after the first measurements, but I think there's a chance it might be an RV730 (4650) shrunk to 40nm. One way or the other, you can squeeze only so much out of them, no matter how much they're customized - unfortunately, they are still old, pretty inefficient VLIW5 architectures (the 360 too uses an older version of that), which, if you look at benchmarks (or the aggregate benchmark numbers I posted already), don't look too good compared to the PS4/XOne equivalent cards.

Anyway, both the 5550 and 4650 are faster than what's inside the PS360, so if a 320-shader GPU is indeed inside the Wii U, there really shouldn't be any discussion about it.



Is there any point in making a "from the ground up" GPU in this day and age anyway? Aside from meeting ridiculous power-consumption constraints, it just seems to me like a massive waste of time and money, and Nintendo is probably paying out the rear end for this GPU because it's so heavily customized.

Why not just use a more off-the-shelf part that AMD has already done most of the grunt work on, tweak it specifically for games and cut power consumption here and there and call it a day?

Many of AMD's notebook GPUs would've given Nintendo close to Xbox One performance at a still-reasonable power draw (40 watts or so), and probably would've even been cheaper per unit than going for a completely custom part.

And third parties would've liked it more too, because to them a PC GPU architecture is like putting a fish in water.

I think this is going to be one of the bitter lessons Nintendo learns from this generation. No one cares about the difference in power consumption in a console unless you're making something the size of a house, and a custom solution in the long term just isn't worth the time, money, or headaches from disgruntled third parties.



HoloDust said:
oni-link said:
HoloDust said:
green_sky said:

The jump doesn't seem all that big if it's running a Radeon 4650. Originally it was rumoured to be running a Radeon 4850, which would have been a significant jump ahead of the 7800 GT in the PS3.

I still fondly remember those days when a lot of us were hoping they would go with a 4850 or equivalent (5750, for example)... if they went with that (and GDDR5, of course), the Wii U would be some 1.6x weaker than XOne and around 2.35x weaker than PS4. So, though still the weakest of the 3, it would definitely be, spec-wise, next gen, which would mean much easier porting to it and better 3rd-party support.


The die shots already debunked any notion that it was a Radeon 4xxx series!!! Even before the die shots, the 4xxx series was more power-hungry than what Nintendo planned for the system, so it made no sense to use the 4xxx as the basis for Latte.  Even the comparison to Evergreen isn't accurate, as the GPU is highly customized. So even if Evergreen was the basis, it wouldn't matter, since the chip is heavily modified. 

I was actually talking about days way, way back before any of the specs were known - you know, when some of us were hoping they would actually make a decently powered system (that's where the 4850 comes in) ;)

As for Evergreen, I think the jury is still out on that. I was always leaning toward it after the first measurements, but I think there's a chance it might be an RV730 (4650) shrunk to 40nm. One way or the other, you can squeeze only so much out of them, no matter how much they're customized - unfortunately, they are still old, pretty inefficient VLIW5 architectures (the 360 too uses an older version of that), which, if you look at benchmarks (or the aggregate benchmark numbers I posted already), don't look too good compared to the PS4/XOne equivalent cards.

Anyway, both the 5550 and 4650 are faster than what's inside the PS360, so if a 320-shader GPU is indeed inside the Wii U, there really shouldn't be any discussion about it.


LOL... I remember the rumors/speculations so well!!! Anyway, people (me included) really shouldn't say RV7xx or Evergreen, as those are only bases for the actual chip. Though the more that becomes known about the system, the more likely it is based on the 5xxx series (HDMI 1.4, 40nm process, 320 SPUs, DX11 capability, etc.): http://www.amd.com/US/PRODUCTS/DESKTOP/GRAPHICS/ATI-RADEON-HD-5000/HD-5550/Pages/hd-5550-overview.aspx#2  Nintendo really should have upped the system's RAM to at least 3GB and had people buy their own HDMI cable, etc.! Though I was just happy that it was more than the rumored 756MB total RAM (same source that did the RV7xx rumor, btw).