
Wii U GPU Die Image! Chipworks is AWESOME!

Seems like the GPU is an exclusive design made specifically for the Wii U, so comparing it to other Radeon processors does the chip a disservice. I'm impressed that Nintendo went fully customized once again. Freakin' awesome IMHO.



Aielyn said:
Here's a question - what part of the system handles the Gamepad's screen?

I ask for two reasons - one, I'd think there'd at least be some part of the GPU designed specifically for the graphics of the Gamepad, due to issues like latency. Two, the Gamepad has a resolution matching that of the Wii... could it actually be using the same parts of the chip that handle the Wii Mode of the Wii U?

Anyway, just a thought.


Well, I know that there are 2 wireless connections:

1: Wireless b/g/n for Internet connection

2: Wireless N for 2 GamePad controllers to stream 2 HD images (1 per controller), but I'm not sure if it stays a 1080p stream per controller or if it gets reduced to a 720p stream per controller when using 2 GamePad controllers at once.



Kaizar said:
Aielyn said:
Here's a question - what part of the system handles the Gamepad's screen?

I ask for two reasons - one, I'd think there'd at least be some part of the GPU designed specifically for the graphics of the Gamepad, due to issues like latency. Two, the Gamepad has a resolution matching that of the Wii... could it actually be using the same parts of the chip that handle the Wii Mode of the Wii U?

Anyway, just a thought.


Well, I know that there are 2 wireless connections:

1: Wireless b/g/n for Internet connection

2: Wireless N for 2 GamePad controllers to stream 2 HD images (1 per controller), but I'm not sure if it stays a 1080p stream per controller or if it gets reduced to a 720p stream per controller when using 2 GamePad controllers at once.

Isn't the GamePad screen resolution 854 x 480? Why would they stream a higher resolution than the screen can display?



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!

I can't believe I read all those comments - interesting read though.

Anyways, will anyone be able to tell the performance of the Wii U GPU just by looking at it, or are we wasting our time? I would have thought it would be like open heart surgery: you just get a Wii U, cut it open, and see what makes it tick. Can't someone simply hook it up to a machine or something and run some tests?



Nintendo Network ID: DaRevren

I love My Wii U, and the potential it brings to gaming.

DaRev said:
I can't believe I read all those comments - interesting read though.

Anyways, will anyone be able to tell the performance of the Wii U GPU just by looking at it, or are we wasting our time? I would have thought it would be like open heart surgery: you just get a Wii U, cut it open, and see what makes it tick. Can't someone simply hook it up to a machine or something and run some tests?

Just by looking at it, even experts can't get a proper read of its real-time performance. It's just that customized, and hard to relate to other known hardware well enough to make a good guess. More time is needed; this hasn't even been out for 2 days yet, and we only know what about half of the blocks on the die are. And the CPU die pic is on its way as well.

On hooking it up to test its capabilities:

The best indicator for that would probably be to have a Wii U dev kit; that, or the tech needed to read such things is expensive or not readily available.



NNID: crazy_man

3DS FC: 3969 4633 0700 

 My Pokemon Trading Shop (Hidden Power Breeding)

timmah said:
ninjablade said:
timmah said:
ninjablade said:
timmah said:

LOL, you didn't address any of the obvious issues that were pointed out regarding your flamebait sig. The WiiU CPU is out-of-order (OoOE) while the 360 CPU is not, the WiiU CPU has more and faster cache, and it is highly specialized. Just based on what is assumed, 353 GFLOPS with added fixed functions and more modern tech (such as tessellation) is not 'on par' with 240 GFLOPS on outdated tech with DX9 instructions and no modern enhancements. Even so, about half of the blocks on the GPU are not identified, and there's speculation that some of them could be asymmetric shaders that would push the 353 GFLOPS guess higher. You're assuming that the 50% of blocks that are unidentified add absolutely nothing to the real-world power, so what are they, decoration? It's pretty obvious that some of the unidentified parts of a 'custom' GPU would be the 'custom' parts, designed to enhance performance/efficiency in some way (fixed functions, asymmetrical shaders, whatever). 2 GB of RAM is not 'on par' with 512 MB (and your throughput assumption is just a guess based on single-channel RAM; it could be double that with dual-channel, so it's just another piece of speculation). Your sig is such obvious flamebait.

Look, nobody thinks that the WiiU is as powerful as the PS4/Nextbox will be, but it is simply not 'on par' with or weaker than the PS360 either, simply due to the modern architecture and DX11 feature set - irrespective of FLOPS (which are higher in your sig anyway!). It's obvious that the WiiU's GPU is very efficient, something the PS3 and 360 certainly are not. If we go with the 'guess' that exists now, 353 highly efficient GFLOPS (based on only 50% of the GPU blocks) with fixed functions & a better instruction set >>>>>>>> 240 GFLOPS on old, outdated, inefficient tech. We'll just need to wait for games built specifically for the architecture to see this.

I changed my sig to be more accurate. Still, the overall consensus on beyond3d is that it's on par with current gen, and I'm not gonna trust your Nintendo-biased analysis. I would love to see you post your theory on beyond3d, and if people agree with you I will be happy to eat crow. Still, I find it funny that every single CPU-intensive game was inferior on WiiU compared to current gen, not to mention many games had bandwidth issues, and no graphical upgrades whatsoever sure doesn't scream new tech to me. If a mod has a problem with my sig, they can message me.

Some rushed CPU-intensive ports programmed & optimized for older architecture had issues, while a GPU-intensive game programmed for the newer architecture (Trine 2) was able to pull off graphical effects not possible on the older consoles, at a higher resolution, with better results in every category. Different architecture: not equal to, not on par, and not fully understood yet. I already said the PS4/Nextbox will be quite a bit more powerful, so not sure why you think I'm being Nintendo biased on that. You're clearly more biased against Nintendo than just about anybody I've seen here, so why the crusade against Nintendo? You pretty much have an orgasm any time something suggests the WiiU is 'weak' or 'underpowered'. It's pretty pathetic IMO.


Trine 2 played into the WiiU GPU's strengths; the CPU was hardly being used. I just don't see how BLOPS2 was running sub-HD with a worse framerate than the 360/PS3 versions unless the system is bottlenecked, and many tech experts have confirmed this to me. And I'm sure your tech knowledge can be compared to the mods on beyond3d, who have actually developed games.

The 360/PS3 are very heavily bottlenecked; this is widely known. It has taken a long time for devs to get around the bottlenecks in the 360/PS3 (just look at the early games on those consoles, especially ports between the systems). As for bottlenecks on the WiiU, we don't really know if or what bottlenecks exist. Nintendo has been very adamant that it focused heavily on memory latency as well as architectural efficiency, and I remember a couple of developers praising the memory architecture, so it could be more of an optimization issue than anything else.

In the case of BLOPS2, again, they had very little time to optimize the game for the hardware given the dev kit release date, and there are many reasons why a quick port (quick by necessity, not laziness) generally performs poorly compared to the lead console, even on a console with more raw power. Add to this the fact that those games were optimized for consoles with a higher raw clock speed, and it makes sense that CPU-intensive scenes would have issues without proper time to re-code the CPU tasks. Keep in mind that the raw clock speed of the PS4/Nextbox cores is rumored to be lower than last gen as well (with more cores), and you'll realize that re-coding would be necessary to utilize those processors correctly (meaning an algorithm designed to run as a single thread at 3.2GHz would run like shit on a single 1.6GHz Nextbox/PS4 core if not re-coded properly). Again, the WiiU is weaker than the PS4/Nextbox by some margin yet to be known, but somewhat stronger and much more efficient than the current systems.

Also, you still don't realize the potential positive effect of tessellation + DX11-class instructions (less processing cost for better visual results) vs DX9-era instructions. Newer architecture can do more with less brute force. You have to look at the whole picture, and at results in games DESIGNED FOR THE NEW ARCHITECTURE, to get an accurate representation; this will take time.


I can help shed more light on this subject, though I posted this earlier in the thread:

http://wiiu-gamer.net/2012/11/06/wii-u-amplifies-code-speed-and-avoids-ram-bottleneck/
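As a rough sanity check on the single- vs dual-channel throughput point a few posts up: peak bandwidth is just the transfer rate times the bus width, so the "it could be double" scenario is easy to put numbers on. A quick Python sketch assuming DDR3-1600-class memory; the bus widths are illustrative scenarios only, not confirmed Wii U specs:

# Peak memory bandwidth in GB/s = mega-transfers per second * bus width in bytes.
# DDR3-1600 performs 1600 MT/s; the two bus widths below are only the
# single- vs double-width scenarios discussed above, not confirmed hardware.
def peak_bandwidth_gbs(mega_transfers, bus_bits):
    return mega_transfers * 1e6 * (bus_bits / 8) / 1e9

print(peak_bandwidth_gbs(1600, 64))   # ~12.8 GB/s
print(peak_bandwidth_gbs(1600, 128))  # ~25.6 GB/s (double-width scenario)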



I would bet money the DF article is correct. I'm sure they contacted developers in the industry and told them they had pics of the GPU: spill the beans, you won't get in trouble under your NDA because we figured it out ourselves. It would make them look stupid if they were wrong.
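For what it's worth, every GFLOPS figure being argued over in this thread comes from the same simple formula: ALU count x 2 ops per clock x clock speed. A rough back-of-envelope sketch in Python, treating the Wii U ALU counts purely as competing guesses (the 320-ALU case matches the ~353 number quoted above, and a lower 160-ALU case is included only for comparison), not confirmed specs:

# Peak shader throughput: ALUs * 2 ops per clock (multiply-add) * clock in GHz.
# The Wii U ALU counts below are speculation from this thread, NOT confirmed specs;
# the ~550 MHz clock is the commonly reported figure.
def gflops(alus, clock_ghz, ops_per_clock=2):
    return alus * ops_per_clock * clock_ghz

print(gflops(320, 0.55))  # higher guess: 320 ALUs @ ~550 MHz -> ~352 GFLOPS
print(gflops(160, 0.55))  # lower guess:  160 ALUs @ ~550 MHz -> ~176 GFLOPS
print(gflops(240, 0.50))  # Xbox 360 Xenos: 240 ALUs @ 500 MHz -> 240 GFLOPS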



zarx said:
Kaizar said:
Aielyn said:
Here's a question - what part of the system handles the Gamepad's screen?

I ask for two reasons - one, I'd think there'd at least be some part of the GPU designed specifically for the graphics of the Gamepad, due to issues like latency. Two, the Gamepad has a resolution matching that of the Wii... could it actually be using the same parts of the chip that handle the Wii Mode of the Wii U?

Anyway, just a thought.


Well, I know that there are 2 wireless connections:

1: Wireless b/g/n for Internet connection

2: Wireless N for 2 GamePad controllers to stream 2 HD images (1 per controller), but I'm not sure if it stays a 1080p stream per controller or if it gets reduced to a 720p stream per controller when using 2 GamePad controllers at once.

Isn't the GamePad screen resolution 854 x 480? Why would they stream a higher resolution than the screen can display?

Yes, I believe the screen might be 480p, but for some reason Nintendo is streaming true HD to 2 controllers.

And Nintendo said that you would notice the difference if they streamed lower-quality images to the GamePad, unless it were displaying 2D sprites only, and even then the stream would have to be no less than 720p for the difference not to show.

But that's all I know about the subject.
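To put rough numbers on why the stream resolution matters: uncompressed video bandwidth scales directly with pixel count, so a 720p or 1080p stream carries far more data than the panel's reported 854 x 480 even before compression. A toy Python calculation (the real GamePad stream is compressed, so these are only upper bounds, not what the console actually sends):

# Raw, uncompressed video bandwidth in Mbit/s = width * height * bits per pixel * fps.
# The actual GamePad stream is compressed, so real-world rates are far lower;
# this only illustrates how the resolution scales the amount of data to move.
def raw_mbps(width, height, fps=60, bits_per_pixel=24):
    return width * height * bits_per_pixel * fps / 1e6

print(raw_mbps(854, 480))    # ~590 Mbit/s at the reported panel resolution
print(raw_mbps(1280, 720))   # ~1,327 Mbit/s at 720p
print(raw_mbps(1920, 1080))  # ~2,986 Mbit/s at 1080p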



Kaizar said:
Aielyn said:
Here's a question - what part of the system handles the Gamepad's screen?

I ask for two reasons - one, I'd think there'd at least be some part of the GPU designed specifically for the graphics of the Gamepad, due to issues like latency. Two, the Gamepad has a resolution matching that of the Wii... could it actually be using the same parts of the chip that handle the Wii Mode of the Wii U?

Anyway, just a thought.


Well, I know that there are 2 wireless connections:

1: Wireless b/g/n for Internet connection

2: Wireless N for 2 GamePad controllers to stream 2 HD images (1 per controller), but I'm not sure if it stays a 1080p stream per controller or if it gets reduced to a 720p stream per controller when using 2 GamePad controllers at once.

I'm not so much interested in the transmission of the data, as the creation of it. Does the same part of the GPU that handles creation of the normal screen graphics also handle the Gamepad's graphics, or are they segregated in some way?

My thinking is that figuring out how the GamePad's graphics are generated might help work out what's going on with the GPU as a whole.



_crazy_man_ said:
DaRev said:
I can't believe I read all those comments - interesting read though.

Anyways, will anyone be able to tell the performance of the Wii U GPU just by looking at it, or are we wasting our time? I would have thought it would be like open heart surgery: you just get a Wii U, cut it open, and see what makes it tick. Can't someone simply hook it up to a machine or something and run some tests?

Just by looking at it, even experts can't get a proper read of its real-time performance. It's just that customized, and hard to relate to other known hardware well enough to make a good guess. More time is needed; this hasn't even been out for 2 days yet, and we only know what about half of the blocks on the die are. And the CPU die pic is on its way as well.

On hooking it up to test its capabilities:

The best indicator for that would probably be to have a Wii U dev kit; that, or the tech needed to read such things is expensive or not readily available.

Hence the reason why the Wii U should still be able to hold its own against the PS4/720 when they come out. When everything is said and done, it'll be about the games and how those games are suited to the system. This won't be another Wii vs PS360 gap, I'm sure of that. (I'm getting the 720 as my second system of choice after my Wii U.) One thing is for sure: this coming E3 will be telling, as we will see what dedicated Wii U games can do against the much-hyped PS4/720.