Forums - Nintendo Discussion - Wii U GPU Die Image! Chipworks is AWESOME!

Digital Foundry article...

http://www.eurogamer.net/articles/df-hardware-wii-u-graphics-power-finally-revealed

320:16:8 - Similar to HD 4650/4670.

The best part "crucial information simply wasn't available in Nintendo's papers, with developers essentially left to their own devices to figure out the performance level of the hardware."

Now we know why third parties don't like Nintendo lol. 



JEMC said:
Aielyn said:
I actually wonder if Nintendo hasn't done something to hold the power usage down at this point in time. As I understand it, Nintendo themselves said the system would use 45 W at peak.

http://www.officialnintendomagazine.co.uk/41850/wii-u-specs-update-memory-and-storage-space-revealed/

So why are they all using about 33 W? Did Nintendo just do extra efficiency modifications in the last few months? Or are the launch titles specifically not using all of the available power? Or is it just that they haven't tested the appropriate games?

I don't know.

Let's assume a case where the game/console runs at 35W. With a power brick efficiency of 70%, that means it draws 50W from the wall; add another 20% margin for safety (because efficiency degrades over time, etc.) and you end up with 60W. With the power brick rated at 75W, where are the other 15W?

Who knows, maybe they'll do what happened with the PSP and the 3DS and release a firmware update that raises the clock speed of the CPU or the GPU (or both).

I'm fairly certain that Nintendo wouldn't have been describing power draw from the wall, specifically because of that efficiency issue - if they were to say "the system draws 60 W from the wall", then after a year, it would draw more, and their claim would no longer be accurate. But the power draw from the system is easily measured.

The other issue is that Nintendo said it could require up to 75 W, but would typically use only 45 W. Based on 70% efficiency, 45 W becomes only 31.5 W, which is below the number we've seen for most games. It's more likely that the 45 W number is the system power draw, not the total power draw at the wall, in my opinion. But then, I'm no expert.
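To put the arithmetic in one place, here's a minimal sketch of the conversion we're both doing. The 70% efficiency figure is an assumption from this thread, not a confirmed spec:

# Sanity-checking the wall-draw vs. system-draw arithmetic in this thread.
# All figures are assumptions taken from the posts above, not confirmed specs.
BRICK_EFFICIENCY = 0.70   # assumed power brick efficiency
NINTENDO_TYPICAL = 45.0   # W, Nintendo's stated "typical" figure
MEASURED_WALL = 33.0      # W, roughly what launch games measure at the wall

# If the 45 W figure describes draw at the wall, the console itself only gets:
print(NINTENDO_TYPICAL * BRICK_EFFICIENCY)   # 31.5 W
# If the 45 W figure describes the console's own draw, the wall would see:
print(NINTENDO_TYPICAL / BRICK_EFFICIENCY)   # ~64.3 W, still under the brick's rating
# And the ~33 W measured at the wall would mean the console itself uses:
print(MEASURED_WALL * BRICK_EFFICIENCY)      # ~23.1 W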



@marcan42 (the guy who revealed the GPU clock):

- The NeoGAF folks could've just asked me and I would've told them about the 32MB MEM1 and 2MB MEM0/EFB without die shots :P

- 32MB 1T-SRAM MEM1, 2MB 1T-SRAM MEM0/EFB, 1MB SRAM ETB. I bet the I/O block with the tank is SATA and the 7x next to it USB.

- I still think the entire Wii GX is in there too. It's not a ridiculous amount of die area - about 2x the size of the EFB 1T block.

https://twitter.com/marcan42 



First... no need to put a 9-megapixel image in the OP...

Good job AMD...



Proudest Platinums - BF: Bad Company, Killzone 2 , Battlefield 3 and GTA4

ethomaz said:

Digital Foundry article...

http://www.eurogamer.net/articles/df-hardware-wii-u-graphics-power-finally-revealed

320:16:8 - Similar to HD 4650/4670.

The best part "crucial information simply wasn't available in Nintendo's papers, with developers essentially left to their own devices to figure out the performance level of the hardware."

Now we know why third parties don't like Nintendo lol. 

And why most third-party games to date aren't making the hardware sing (Trine 2 being pretty much the only exception so far).

Seriously, what is so hard about giving your development partners an adequate level of information about your system? If you want third parties to make quality software for your system, why hold back on giving them the info they need? Nintendo baffles me sometimes.



Bet with Liquidlaser: I say PS5 and Xbox Series X will sell more than 56 million combined by the end of 2023.

ethomaz said:

Digital Foundry article...

http://www.eurogamer.net/articles/df-hardware-wii-u-graphics-power-finally-revealed

320:16:8 - Similar to HD 4650/4670.

I love how they call it similar to the 4650, a 320:32:8 card built on 55nm... instead of comparing it to the 5550, which has a 320:16:8 GPU built on 40nm, just like the Wii U is believed (at the moment) to have, and at the exact same clock (550MHz).



Aielyn said:
JEMC said:

I don't know.

Let's assume a case where the game/console runs at 35W. With a power brick efficiency of 70%, that means it draws 50W from the wall; add another 20% margin for safety (because efficiency degrades over time, etc.) and you end up with 60W. With the power brick rated at 75W, where are the other 15W?

Who knows, maybe they'll do what happened with the PSP and the 3DS and release a firmware update that raises the clock speed of the CPU or the GPU (or both).

I'm fairly certain that Nintendo wouldn't have been describing power draw from the wall, specifically because of that efficiency issue - if they were to say "the system draws 60 W from the wall", then after a year, it would draw more, and their claim would no longer be accurate. But the power draw from the system is easily measured.

The other issue is that Nintendo said it could require up to 75 W, but would typically use only 45 W. Based on 70% efficiency, 45 W becomes only 31.5 W, which is below the number we've seen for most games. It's more likely that the 45 W number is the system power draw, not the total power draw at the wall, in my opinion. But then, I'm no expert.

I think I didn't put it right, or you didn't read it the way I intended. Let's do it the other way around.

Let's start with the power brick, which is rated at 75W. From those 75W, take 20% off as a safety margin and you get 60W. A 70% efficiency reduces those 60W to 42W that the console can use to run games, and you end up with about 10W more than what it is drawing right now.

And just to be clear, I'm not saying that Nintendo is reserving some power. It may also be that they chose to play it safer, increasing that 20% safety margin to 25% and using a power brick with an efficiency of only 60%, resulting in roughly 33W to run games, which is what we're seeing right now.
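In numbers (a minimal sketch; the safety margins and efficiencies are my guesses, not anything Nintendo has published):

# Working down from the brick's 75 W rating under two sets of assumptions.
BRICK_RATING = 75.0  # W, the Wii U power brick's rating

def usable_power(rating, safety_margin, efficiency):
    # Power left for the console after a safety margin and conversion losses.
    return rating * (1.0 - safety_margin) * efficiency

# Scenario 1: 20% safety margin, 70% efficient brick
print(usable_power(BRICK_RATING, 0.20, 0.70))  # 42.0 W -> ~10 W of headroom over 33 W

# Scenario 2: a more conservative 25% margin and a 60% efficient brick
print(usable_power(BRICK_RATING, 0.25, 0.60))  # 33.75 W -> about what games draw now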



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

JEMC said:
I think I didn't put it right, or you didn't read it the way I intended. Let's do it the other way around.

Let's start with the power brick, which is rated at 75W. From those 75W, take 20% off as a safety margin and you get 60W. A 70% efficiency reduces those 60W to 42W that the console can use to run games, and you end up with about 10W more than what it is drawing right now.

And just to be clear, I'm not saying that Nintendo is reserving some power. It may also be that they chose to play it safer, increasing that 20% safety margin to 25% and using a power brick with an efficiency of only 60%, resulting in roughly 33W to run games, which is what we're seeing right now.

I fully understood what you were saying. I just don't think Nintendo's own words would refer to the from-the-wall power draw, because that would imply that the typical power draw is lower than what we've seen on pretty much all games so far. My issue isn't with the available power from the brick, but with what Nintendo said and how it correlates with what we've measured. Either the Wii U is typically drawing more power than stated, or it's drawing a lot less at peak than it was supposed to.



As I am not allowed to express my opinion about these specs, I'll just ask:

Is this good or bad?



runqvist said:
As I am not allowed to express my opinion about these specs, I'll just ask:

Is this good or bad?


It's more powerful than the current systems. Beyond that, the devil is in the customization details, and in whether third parties are willing to port from the PS4/720.