
NeoGAF: Wii U GPU 160 ALUs, 8 TMUs and 8 ROPs

snowdog said:

Espresso isn't a scaled-up Gekko or Broadway. There are similarities, but both of those are single-core CPUs that can't clock over 1GHz. It's a common misconception that Espresso is '3 x Broadways duct-taped together'.

All we know for certain is that both the CPU and GPU have been customised up the wazoo.

People who say Espresso is three overclocked Broadways miss one very important factor: Espresso has 12 times as much L2 cache as Broadway.




Just started reading this page. As far as the power draw is concerned, it's quite meaningless if, as I've suspected for some time now, Latte is using an evolved TEV Unit. Wsippel has speculated that the odd 1:1 ratio of TMUs to ROPs could be explained by the 8 TMUs not being TMUs at all but 8 TEV Units instead, which could be interesting. I personally don't believe that to be the case, however.

I'd say it's more likely that the fixed-function logic is in the ALUs instead. That would explain why the ALU space is too large for a 160:8:8 GPU.

Either way, if the Wii U is using an evolved TEV Unit (and it would have to be an evolved one to avoid the problem the Wii had, where a nonstandard rendering pipeline made ports next to impossible) plus fixed functions, it would explain the low power draw.

We know that Nintendo were working closely with the likes of Epic, Crytek and Unity, amongst others, when designing the Wii U, so you can be sure they've fixed the problem Hollywood had with porting games from the PS3 and 360.

If they've managed to sort that out and are using fixed functions, then it will go some way towards mitigating the difference in power between the Wii U and the other 8th generation consoles. Fixed functions generate bugger all heat and are a great deal quicker to use than traditional programmable shaders.
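For anyone wondering what a TEV stage actually does: it's the GameCube/Wii's fixed-function "shader" hardware, and each stage blends up to four inputs using (roughly) the standard GX combine formula, out = (d + lerp(a, b, c) + bias) * scale. Here's a minimal C sketch of that idea; the function names and the two-stage example are invented for illustration, not taken from any SDK.

```c
#include <stdio.h>

/* Hypothetical sketch of a single GameCube/Wii-style TEV combine stage.
   Real hardware chains up to 16 such stages; all names here are invented. */
typedef struct { float r, g, b; } Color;

/* Standard-style combine: out = (d + lerp(a, b, c) + bias) * scale */
static Color tev_stage(Color a, Color b, Color c, Color d,
                       float bias, float scale)
{
    Color out = {
        (d.r + (1.0f - c.r) * a.r + c.r * b.r + bias) * scale,
        (d.g + (1.0f - c.g) * a.g + c.g * b.g + bias) * scale,
        (d.b + (1.0f - c.b) * a.b + c.b * b.b + bias) * scale,
    };
    return out;
}

int main(void)
{
    Color tex0 = {0.8f, 0.2f, 0.2f};   /* stand-in for a texture sample */
    Color tex1 = {0.1f, 0.1f, 0.9f};   /* stand-in for a second texture */
    Color half = {0.5f, 0.5f, 0.5f};   /* 50/50 blend factor            */
    Color zero = {0.0f, 0.0f, 0.0f};

    /* Stage 1 blends the two textures; stage 2 darkens the result.
       Effects are built by layering stages like this rather than by
       running arbitrary per-pixel programs as on a programmable GPU. */
    Color s1 = tev_stage(tex0, tex1, half, zero, 0.0f, 1.0f);
    Color s2 = tev_stage(zero, zero, zero, s1, 0.0f, 0.5f);

    printf("final pixel: %.2f %.2f %.2f\n", s2.r, s2.g, s2.b);
    return 0;
}
```

Because every stage is just this one blend shape in silicon, it costs very little power, which is the crux of snowdog's argument above.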



snowdog said:
Just started reading this page. As far as the power draw is concerned, it's quite meaningless if, as I've suspected for some time now, Latte is using an evolved TEV Unit. [...]

There's no way the Wii U is using a TEV unit; those created shaders by combining textures in layers, and the kind of shaders we're seeing in Wii U games go against this method. It's possible the Wii U is using a fixed-function unit with more modern shader techniques, but very unlikely, as programmable shaders have long since become standard.
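To make the contrast concrete: a programmable shader evaluates arbitrary per-pixel math, not just layered blends. Here's a hedged C sketch of the sort of per-fragment lighting arithmetic (dot products, normalisation) that falls out naturally on a programmable pipeline but has to be faked on a layer-combiner with lookup textures and stage tricks; all names are illustrative.

```c
#include <math.h>
#include <stdio.h>

/* Illustrative only: per-pixel Lambertian lighting, the kind of free-form
   arithmetic a programmable pixel shader runs for every fragment. */
typedef struct { float x, y, z; } Vec3;

static Vec3 normalize3(Vec3 v)
{
    float len = sqrtf(v.x * v.x + v.y * v.y + v.z * v.z);
    Vec3 out = {v.x / len, v.y / len, v.z / len};
    return out;
}

static float lambert(Vec3 normal, Vec3 light_dir)
{
    Vec3 n = normalize3(normal);
    Vec3 l = normalize3(light_dir);
    float d = n.x * l.x + n.y * l.y + n.z * l.z;
    return d > 0.0f ? d : 0.0f;   /* clamp light on backfacing surfaces */
}

int main(void)
{
    Vec3 n = {0.0f, 1.0f, 0.2f};
    Vec3 l = {0.5f, 1.0f, 0.0f};
    printf("diffuse intensity: %.3f\n", lambert(n, l));
    return 0;
}
```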



That's why I said an evolution of the TEV Unit. It's the only reason I can think of for why the ALUs are too large for a 160 ALU chip. There's extra logic in there, and it has to do something. It would also explain why the power draw is so low and why a tiny 8-bit co-processor is needed to run Wii games.



Wyrdness said:
fatslob-:O said:

How exactly am I doing damage control by giving an analysis? Talk about insecurity.

Newsflash: these newer consoles are almost PCs. I guess you aren't keeping up too much, then?

How exactly is "more with less" a cop-out when you were clearly trying to gain more credibility with that statement to gain some ground?

I didn't exactly say that I didn't "need" concrete evidence! I said that it wouldn't "matter" much, because official developers will NOT give ANY official specs. Better to go off what we have than to wait, hence why I would rather look for something "consistent".

Power usage correlates with power whether you like it or not!

"You've made a claim based on one thing, power usage, to declare something as fact. It's not me who needs credibility, it's you yourself and your anonymous source."

Hey guys, I have some unknown 1-watt GPU that can destroy the 300-watt beast known as the R9 290X. /sarcasm

So I guess I can't make a claim based on power usage, eh?


Look at your own posts: you're basing assumptions on first-year games. And lol, no, these consoles may use a different architecture than before, but they're far from PCs in both power and how they operate; go to PC-specific sites and they'll crucify you for the comparison. Your "more with less" is a cop-out because it's a means of dodging other people's stances, like when someone brought up Shin'en or when someone brings up a much better-looking game like X. It's a little get-out-of-jail card to hide behind.

Again, I don't need credibility; I never made any hardware claims, and your failure to understand this concept further highlights the flaws in your approach to any argument. You made the claim and tried to pass it off as concrete, not anyone else. Oh, and your earlier post on concrete evidence:

"I couldn't care less about GAF. It doesn't matter if I have concrete evidence. I look for what's the most consistent. The performance so far says something very different from the possible specs! Why are you getting so defensive?"

Your sarcastic comment to gain ground further highlights how on tilt you are. I didn't make claims on power and so on, so your R9 290X nonsense is as pointless as the anonymous sources running rampant on GAF. You, however, did make claims and used first-year games to try to back them up; that's more damage control than speculation, as everyone is aware we won't see properly optimised performance for a few years, though chances are you'll have disappeared by then, given how speculators are.

Then why are all three new consoles using off-the-shelf PC graphics cards? We all know that AMD doesn't have enough money to keep developing special GPU architectures, LOL. Something tells me that you don't go to PC-specific sites! I go there to check out news and reviews.

You're the one bringing out the jail card here, LOL. I obviously acknowledged that the Wii U can obtain better graphics; why don't you go back a few pages and realise that? That "I game on PC" statement made you look mad suspect, son, LMAO. You still don't get what I mean by that statement!

No, I merely supported that sentiment. Know the difference between "likely" and "concrete". I vouched for the former.

There's no excuse for the Wii U! The Wii U became very close to a PC, and a PC depends on its GPU to do the work in games, just as the Wii U is mostly dependent on its GPU to do the same. The fact that it had to be specially optimised to take advantage of the newer architecture just goes to show you. A card that cost $100 at LAUNCH, such as the HD 5670, is supposed to demolish last-gen consoles in terms of performance, but apparently developers are having issues showing a clear advantage in games like COD and Batman. Especially in a game like Call of Duty!

The one who's scared here is you. I can come to terms with the Wii U being weaker than the PS360 in some ways, but for you that's a different story. As for the rest, you're copping out extremely hard, LOL. Damage controlling about those ports, LOLOL. You just showed your true intent here. The Wii U shouldn't even have an issue if it was truly more powerful! Your damage controlling sounds like something from an apologist. The PS4 and X1 don't have issues outdoing their last-gen counterparts, so why should the Wii U have any issues doing it? (You'll probably offer more damage control on this one.)



curl-6 said:
snowdog said:
Espresso isn't a scaled-up Gekko or Broadway. [...]

People who say Espresso is three overclocked Broadways miss one very important factor: Espresso has 12 times as much L2 cache as Broadway.

What? It's just cache! Cache is a small, fast memory that reduces performance hits by storing frequently accessed data so that accessing it is faster. Cache isn't some sort of computing unit, y'know.
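That description is easy to show in code: a cache just maps addresses onto a small number of fast slots and answers hit-or-miss. Here's a toy direct-mapped cache lookup in C; the sizes and names are invented for illustration and bear no relation to Espresso's actual cache design.

```c
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* Toy direct-mapped cache: 64 lines of 32 bytes (2KB total). Purely
   illustrative; not modelled on any real CPU. */
#define LINE_BYTES 32
#define NUM_LINES  64

typedef struct {
    bool     valid;
    uint32_t tag;
} CacheLine;

static CacheLine cache[NUM_LINES];

/* Returns true on a hit; on a miss, installs the line (as real hardware
   would after fetching the data from the next level of memory). */
static bool cache_access(uint32_t addr)
{
    uint32_t block = addr / LINE_BYTES;
    uint32_t index = block % NUM_LINES;   /* which slot this address maps to */
    uint32_t tag   = block / NUM_LINES;   /* identifies whose data sits there */

    if (cache[index].valid && cache[index].tag == tag)
        return true;                      /* hit: the fast path */

    cache[index].valid = true;            /* miss: fetch and install */
    cache[index].tag   = tag;
    return false;
}

int main(void)
{
    /* Touch the same addresses repeatedly: misses first, hits afterwards.
       A bigger cache simply has more slots, so hot data is less likely to
       be evicted -- that's the whole trick. */
    uint32_t addrs[] = {0x1000, 0x2020, 0x1000, 0x2020, 0x1004};
    for (int i = 0; i < 5; i++)
        printf("0x%04x -> %s\n", (unsigned)addrs[i],
               cache_access(addrs[i]) ? "hit" : "miss");
    return 0;
}
```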



fatslob-:O said:
curl-6 said:
People who say Espresso is three overclocked Broadways miss one very important factor: Espresso has 12 times as much L2 cache as Broadway.

What? It's just cache! Cache is a small, fast memory that reduces performance hits by storing frequently accessed data so that accessing it is faster. Cache isn't some sort of computing unit, y'know.

Cache is an important factor in chip efficiency and hence computing performance. 12x as much cache is a huge advantage.
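A rough way to see this on any PC: time passes over a growing working set and watch the cost per access jump once the data no longer fits in a cache level. A minimal C sketch follows; the buffer sizes, pass counts and stride are arbitrary choices rather than tuned to any particular chip, and hardware prefetching of the linear stride will soften the cliff on modern CPUs.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Stride through working sets of growing size; once a set outgrows a cache
   level, the time per access rises sharply. All numbers are illustrative. */
int main(void)
{
    const size_t stride = 16;      /* 16 ints = 64 bytes, a common line size */
    volatile long sink = 0;        /* stops the compiler deleting the loop   */

    for (size_t kb = 32; kb <= 16384; kb *= 2) {
        size_t n = kb * 1024 / sizeof(int);
        size_t accesses_per_pass = n / stride;
        /* Scale the pass count so every size does ~4M accesses in total. */
        size_t passes = (4u << 20) / accesses_per_pass + 1;

        int *buf = calloc(n, sizeof(int));
        if (!buf) return 1;

        clock_t t0 = clock();
        for (size_t p = 0; p < passes; p++)
            for (size_t i = 0; i < n; i += stride)
                sink += buf[i];
        double secs = (double)(clock() - t0) / CLOCKS_PER_SEC;

        printf("%6zu KB: %.2f ns per access\n",
               kb, secs * 1e9 / (double)(passes * accesses_per_pass));
        free(buf);
    }
    (void)sink;
    return 0;
}
```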



I will probably get my first ban for this but... shut the f*ck up. I'm sick and tired of this spec talk here, there, and everywhere. As long as the games don't live up to top-end PCs, you really don't have anything to talk about.

...P.S. The know-how is at an all-time low right now.




snowdog said:
That's why I said an evolution of the TEV Unit. It's the only reason I can think of for why the ALUs are too large for a 160 ALU chip. [...]

Some units may be disabled, y'know. GPU manufacturers do this because of poor yields on certain parts. The HD 7970 and HD 7950 use the SAME die, but the HD 7950 is weaker (1792 stream processors enabled versus the 7970's 2048) because of photolithographic defects. They can't have billions of transistors all placed and printed perfectly, so they essentially disable the units where the defects are and sell the part at a lower price to recoup manufacturing costs.



curl-6 said:
fatslob-:O said:
What? It's just cache! [...]

Cache is an important factor in chip efficiency and hence computing performance. 12x as much cache is a huge advantage.

Yep, agreed, cache does matter, but computing performance is determined by many factors, such as the execution units, the branch predictors, and other things like the decode units. Cache essentially helps a chip sustain something closer to its theoretical performance rather than raising that peak, but that's not a bad thing.
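One of those other factors is easy to demonstrate: branch predictors reward predictable data. Here's a hedged C sketch of the classic sorted-versus-unsorted comparison; the array size, pass count and threshold are arbitrary, actual timings vary by CPU, and an optimising compiler may convert the branch into a conditional move and hide the effect.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 20)

static int cmp_int(const void *a, const void *b)
{
    return (*(const int *)a > *(const int *)b) -
           (*(const int *)a < *(const int *)b);
}

/* Sum only the large elements: the branch is taken ~50% at random for
   unsorted data (the predictor struggles) but becomes perfectly
   predictable once the data is sorted, so the same work usually runs
   noticeably faster on the second call. */
static double timed_sum(const int *data, long long *out)
{
    long long sum = 0;
    clock_t t0 = clock();
    for (int pass = 0; pass < 50; pass++)
        for (int i = 0; i < N; i++)
            if (data[i] >= 128)
                sum += data[i];
    *out = sum;
    return (double)(clock() - t0) / CLOCKS_PER_SEC;
}

int main(void)
{
    int *data = malloc(N * sizeof(int));
    if (!data) return 1;
    for (int i = 0; i < N; i++)
        data[i] = rand() % 256;

    long long sum;
    printf("unsorted: %.3f s\n", timed_sum(data, &sum));
    qsort(data, N, sizeof(int), cmp_int);
    printf("sorted:   %.3f s (same sum: %lld)\n", timed_sum(data, &sum), sum);

    free(data);
    return 0;
}
```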