
Forums - Nintendo Discussion - NeoGAF: Wii U GPU 160 ALUs, 8 TMUs and 8 ROPs

dahuman said:

or like this!

oh wait that one got canned, nevermind. XD


or like this:



oni-link said:
dahuman said:

*snip*

Yes I have that, played co-op with my brother lol.



oni-link said:
eyeofcore said:
fatslob-:O said:
eyeofcore said:
oni-link said:
At this point I think no real Nintendo fan will care what specs the Wii U has!!! I for one don't care if it has the specs of a 15-year-old Sega Dreamcast (I'm so tired of the BS about specs, taking a rumor as fact) as long as it produces games as beautiful as SM3DW, X, Rayman Legends, Bayo2, etc.!!! People shouldn't give a shit about which console has the biggest D!ck as long as it satisfies their needs, no? Besides, PC outshines any of the "big" 3's consoles any day... so do Steam Machines!!!


I agree with you; I am here just refuting the obvious rumor and fallacy...

Fatslob pushes a rumor as fact, yet the die shot, the mm² available for the GPU, and other things say otherwise... He can't handle it.

Specs don't matter. The Wii ruled for several years straight, stomping the Xbox 360 and PlayStation 3 in hardware and software sales, and the DS proved the PSP, and the 3DS proved the Vita, are no match for attractive design and games. For three years in a row, the weakest console won in hardware and software sales.

Hopefully the Wii U will continue the trend.

Die shots don't matter when a console draws less than 35 watts. Deal with it!

This rumor is partly backed up by the fact that it draws less than 35 watts. Deal with it once again.

BTW, what fallacy is there when you yourself can't back up your point with sources? (Uh oh, it really sounds like you're reaching for that damage control LOL.)


The die shot matters; it is evidence, and you ignore it... It is 35 watts at the highest...

I quote myself...

"Hmmm... Care to tell me something that I did not know? So you deny that process/lithography quality can be improved, or that higher-quality silicon can be used?

Wow... Your forceful premises and guesses are constant. I follow AMD and I follow their stock, yet it is not my primary focus compared to finding a job in this recession. Recent leaks point out that Kaveri/Steamroller will intentionally have lower floating-point yet higher integer performance, so floating-point calculations will be offloaded to the GPU... This will mean worse performance in older games :/

So the Radeon HD 6570M is 104mm², so they can squeeze it under the 100.71mm² available, to a die size of 93.6mm². Thanks for the information.

The Chipworks guy said that Latte's GPU is heavily customized and that he could not tell what kind of AMD GPU it is, so it is supposedly an extremely customized design, by the way. You crushed your own theory, and that guy's theory, that it is 160-200 shaders LMAO."

"Nope... It is 30 watts, at worst sometimes 35 watts, and the power supply is 75 watts at 90% efficiency, so it can deliver 67.5 watts at most, and degradation of the power supply unit is negligible if it is not stressed fully enough to shorten its life span.

How can we be certain that Nintendo did not lock away resources? Just in case the system/OS is unstable...

You need to deal with this... You are looking at PC GPUs, not mobile/embedded ones at all. Also, does that TDP figure include the GDDR3/GDDR5 memory and the board itself? If so, then remove the GDDR3 (I know the 1GB 65nm part uses 20 watts), or if it is GDDR5 (2GB at 46nm is 7.5-8 watts, so 512MB is 1.75-2 watts), and the board itself could take a couple more watts. So look at the Radeon HD 6570M: 30-watt rated TDP. If it carries 1GB of GDDR3 at 45nm, remove that and power consumption goes down to 16 watts; lower the clocks from 600 to 550 and it goes down to 13.6; remove the board, which could use 1-2 watts, and it is 11.6-12.6; then add the 2-3 watts that the 32MB of eDRAM uses and we are around 12.6-15.6.

I might be wrong, and you may say I am FOS, yet I might be right, or maybe neither of us knows s*** when it comes to what goes into the TDP calculation of a GPU, since GDDR3/GDDR5 consumes a noticeable number of watts and adds to the heat of the GPU."
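For what it's worth, the arithmetic in the quoted estimate can be laid out as a quick sketch. Every number here (PSU rating, efficiency, the memory and clock-scaling figures) is the poster's own claim, not a measured value:

```python
# Back-of-envelope Wii U power budget using the figures claimed in the
# post above. These are forum estimates, not measurements.

PSU_RATED_W = 75.0       # claimed power-supply rating
PSU_EFFICIENCY = 0.90    # claimed efficiency
usable_w = PSU_RATED_W * PSU_EFFICIENCY  # max the PSU can actually deliver

# Starting from a Radeon HD 6570M's 30 W TDP and stripping parts out,
# step by step as the post does (intermediate figures are the post's):
card_tdp_w = 30.0
gpu_after_mem_w = 16.0       # minus the claimed draw of 1 GB GDDR3
gpu_downclocked_w = 13.6     # after scaling clocks 600 MHz -> 550 MHz
board_w = (1.0, 2.0)         # guessed board overhead to subtract
edram_w = (2.0, 3.0)         # guessed cost of 32 MB eDRAM to add back

latte_low_w = gpu_downclocked_w - board_w[1] + edram_w[0]
latte_high_w = gpu_downclocked_w - board_w[0] + edram_w[1]

print(f"Usable PSU output: {usable_w:.1f} W")
print(f"Estimated GPU power: {latte_low_w:.1f}-{latte_high_w:.1f} W")
```

With these inputs the range works out to roughly 13.6-15.6 W, in the same ballpark as the 12.6-15.6 W the post arrives at.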

The only person doing damage control is you, here, right now, this instant! You are FOS.


Eyeofcore, it doesn't matter; the fact that he took bgassasin's word/rumor (which is obviously flawed, hence Shin'en's rebuttal) as fact shows it's a pointless endeavor debating with him!!!  Look at how current ports like SC:BL, AC:IV, Trine 2, and NFS:MW run, along with statements from Havok http://thewiiu.com/topic/3685-havok-talks-engine-use-on-wii-u-says-youll-see-things-you-wont-see-elsewhere/, Frozenbyte http://www.nowgamer.com/news/1999044/wii_u_is_a_truly_powerful_console_more_powerful_than_360_and_ps3_trine_2_dev.html, Unity http://www.cinemablend.com/games/Wii-U-GPGPU-Squashes-Xbox-360-PS3-Capable-DirectX-11-Equivalent-Graphics-47126.html, and now recently Shin'en regarding the rumor https://twitter.com/ShinenGames.  Again, I am tired of all this made-up shit (which is obviously a defamation tactic used by anti-Nintendo fanboys) about the Wii U specs.  They can say it is as powerful as the N64 (which Ninjablade said once) for all I care... I don't care as long as I am playing games that look this good:

I'm not on a quest to shit on the Wii U. That's the PS360 gamers' job. (Why don't you read my other posts in this thread before you start assuming I have some hidden agenda?)

If you're willing to take eyeofcore's delusion over AnandTech's serious analysis, go for it.



Wii U W101 vs Knack:

I actually prefer the latter!!!



inFamous (PS4) vs Bayo2 (Wii U)



fatslob-:O said:
oni-link said:
*snip*


Your comments, and your stating a rumor as fact, suggest otherwise...

You love forcing a predetermined mindset onto other people, pushing your opinion, trying to brainwash others into thinking like you, GenericBot? Go starve.

As I said before (yet you don't learn), that was months ago, early in my research, and you keep showing your desperation... Everyone can see it!



So, fellow Nintendo fans... why do we even need to argue about unconfirmed specs again?  Can we let the specs go and get back to enjoying games like this:

 



fatslob-:O said:

Die shots don't matter when a console draws less than 35 watts. Deal with it! 

This rumor is partly backed up by the fact that it draws less than 35 watts. Deal with it once again.

To be fair, 35W =/= 160 shaders. I'm not denying 160, and I'm not saying it's more than that, but the GPU could definitely be very low-powered and still have a higher shader count. Taking everything into consideration, Latte runs at roughly 18-ish watts. As I stated before, the Mobility Radeon HD 5650 is a 400-shader graphics card, clocked at 650MHz on a 40nm process, and it runs at 15-19 watts depending on load. It is a mobile part, but it was originally based on the PC versions, so a similar idea could easily have applied to Latte. That 15-19 watts covers the entire graphics card; the GPU itself will run about 3-5 watts less than that. So in the end, the Wii U could very well have been a 400-shader part and still stayed within the 33-36W it runs at. Though, it's obviously not a 400-shader part. 
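As a quick sanity check on the comparison above, taking the poster's Mobility Radeon HD 5650 figures at face value:

```python
# Feasibility check from the paragraph above: does a 400-shader mobile GPU
# fit inside Latte's claimed power share? All figures are the poster's
# estimates, not measurements.

CARD_W = (15.0, 19.0)    # whole HD 5650 card, load-dependent (per the post)
NON_GPU_W = (3.0, 5.0)   # claimed non-GPU share of that draw
LATTE_SHARE_W = 18.0     # poster's rough figure for Latte's power share

gpu_only_low = CARD_W[0] - NON_GPU_W[1]    # best case
gpu_only_high = CARD_W[1] - NON_GPU_W[0]   # worst case

# The paragraph's point: even the worst case sits below the ~18 W
# attributed to Latte, so wattage alone does not cap the shader count at 160.
assert gpu_only_high < LATTE_SHARE_W
print(f"GPU-only draw: {gpu_only_low:.0f}-{gpu_only_high:.0f} W")
```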

On die size: I think there is obviously more to this GPU than meets the eye. No, I'm not being a "delusional fanboy" making it more than it is (it's nothing next to the 5-billion-transistor X1 silicon), but the size shouldn't be completely ignored; it's simply too much logic for a conventional 160-shader GPU.

If we take measurements in terms of transistors: Latte at 40nm is 156.21mm² (the size of the GPU die), and 40nm AMD GPUs run about 6.00-6.25 million transistors per mm² (I averaged several 40nm AMD GPUs whose die sizes and transistor counts were available; a good Google search does wonders). That means Latte has roughly 717-756 million transistors without the eDRAM. For comparison, Xenos is 232 million transistors without eDRAM. The eDRAM in Latte is around 220 million transistors if it is indeed 40nm Renesas eDRAM, so in total Latte is close to 1 billion transistors. That gives Latte about 3x more transistors than Xenos, likely owing to the more modern architecture and other fixed-function silicon Nintendo decided to include. The BC hardware is likely a very small part of the GPU, so I'm pretty positive it doesn't take up much room at all, a few mm² at most.

I don't know what other people say, but a 160-shader part made up of ~750 million transistors is not something to scoff at; most conventional 160-shader parts are much smaller than this. The HD 6450 has 160 shaders at 40nm but is only 370 million transistors. And to compare with a GPU with more shaders, the 400-shader HD 5670 at 40nm is 627 million transistors.
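The transistor estimate above can be reproduced as a short calculation. The density range and the eDRAM transistor count are the poster's assumptions; only the die area comes from the Chipworks measurement:

```python
# Transistor-count estimate for Latte from the figures in the post above.
# Density range and eDRAM count are the poster's assumptions.

DIE_AREA_MM2 = 156.21            # measured Latte die area
DENSITY = (6.00e6, 6.25e6)       # transistors per mm2, averaged 40 nm AMD GPUs
EDRAM_TRANSISTORS = 220e6        # assumed for 32 MB Renesas eDRAM
XENOS_LOGIC_TRANSISTORS = 232e6  # Xbox 360 Xenos, excluding eDRAM

# Total die transistors at each density, then subtract the eDRAM block
# to get the logic-only figure the post quotes (717-756 million).
total = [DIE_AREA_MM2 * d for d in DENSITY]
logic = [t - EDRAM_TRANSISTORS for t in total]

print(f"Total: {total[0]/1e6:.0f}-{total[1]/1e6:.0f} M transistors")
print(f"Logic: {logic[0]/1e6:.0f}-{logic[1]/1e6:.0f} M (post quotes 717-756 M)")
print(f"vs Xenos logic: about {logic[0]/XENOS_LOGIC_TRANSISTORS:.1f}x")
```

The low end of the logic range divided by Xenos's 232 million is just over 3x, which is where the post's "about 3x more transistors" comes from.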



forethought14 said:
*snip*

let it go man...let it go!!! It's all about the games!!!



oni-link said:

Wii U W101 vs Knack:

I actually prefer the latter!!!

Me too.

Or did you mix that up? ;)