
NeoGAF: Wii U GPU 160 ALUs, 8 TMUs and 8 ROPs
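For context, the figures in the thread title work out to the following theoretical throughput, assuming the widely reported (but not confirmed in this thread) 550 MHz core clock:

```python
# Rough theoretical throughput for the rumoured Wii U GPU spec.
# The 550 MHz clock is a widely reported figure, not confirmed here.
ALUS, TMUS, ROPS = 160, 8, 8
CLOCK_GHZ = 0.550

# Each ALU can issue one multiply-add (2 FLOPs) per cycle.
gflops = ALUS * 2 * CLOCK_GHZ   # ≈ 176 GFLOPS
gtexels = TMUS * CLOCK_GHZ      # ≈ 4.4 Gtexels/s
gpixels = ROPS * CLOCK_GHZ      # ≈ 4.4 Gpixels/s

print(f"{gflops:.0f} GFLOPS, {gtexels:.1f} GT/s, {gpixels:.1f} GP/s")
```

These are peak numbers only; real-world performance depends heavily on memory bandwidth and architecture, which is much of what the thread below argues about.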

Wyrdness said:
fatslob-:O said:

Sorry, but power draw matters, LOL. You won't get GTX 580 performance out of a measly 35 watts; stay in denial if you wish. Go and figure out what I mean by "it can do more with less". If you think that's some get-out-of-jail card, then don't try to refute it when you don't even know what I'm on about. The one with an agenda here is you, applying more damage control in this thread.


Deny what exactly? The only one making claims between us is you. If you read my posts, I haven't mentioned any GPU, CPU, or GTX 580, or claimed the power is X amount, and so on. I game on PC for the most part and have built a high-end PC; I'm only telling you that power usage is not a concrete indicator, and you're rattled by this revelation. Look back at all your posts to everyone here versus the few I've made; you come across as having an agenda at this point, given your replies, with other people even pulling you up on this. I only said that speculation is never entirely concrete, as anyone can do it, and even highlighted that no offence was intended, but it seems to have put you on tilt.

I'm not taking offence at anything. Power usage is a pretty clear indicator of how powerful a system is, disregarding other differences like process nodes and dissimilar architectures. If you gamed on PC then you would know where I'm going with this, but instead of hearing out what I meant by "it can do more with less", you only revealed your intent to damage control in this thread. You made this way too obvious with the "I game on PC" cop-out statement, trying to gain more credibility.



fatslob-:O said:

I'm not taking offence at anything. Power usage is a pretty clear indicator of how powerful a system is, disregarding other differences like process nodes and dissimilar architectures. If you gamed on PC then you would know where I'm going with this, but instead of hearing out what I meant by "it can do more with less", you only revealed your intent to damage control in this thread. You made this way too obvious with the "I game on PC" cop-out statement, trying to gain more credibility.


Hahahaha, talk about being rattled. Look back at all your posts: you're the only one doing damage control, and you just don't like being pulled up on it. Consoles do not work the same as PCs, so building one of the latter doesn't make me an expert on console components. Me gaming on PC is not a cop-out; it's telling you that, even from my point of view, all this speculation is just that in the end. People who are experts couldn't fathom the GPU design and said it's not like anything else, throwing all speculation out the window at that point. You know what is a cop-out? Your "more with less" stance, and your earlier gem of "I don't need concrete evidence, someone else said the same" logic. I don't need credibility, because I have made no claims about what any hardware is, unlike you. That's the difference, and something you're struggling to grasp while trying to turn the tables to look smart.

You've made a claim based on one thing, power usage, and declared it fact. It's not me who needs credibility; it's you and your anonymous source.



Daisuke72 said:
The Wii U is more efficient than the PS3 and Xbox 360, and has more RAM as well, so of course it'll have better-looking games, but honestly, just barely.

In short, Ninjablade was right, the Wii U is on par with current gen, Nintendo fans seriously owe him an apology.

We owe him nothing; he was wrong, and he was a childish brat about it.



fatslob-:O said:
Wyrdness said:
fatslob-:O said:

I couldn't care less about GAF. It doesn't matter if I have concrete evidence; I look for what's most consistent. The performance so far says something very different from the possible specs! Why are you getting so defensive?


You're the one getting defensive. I just pointed out that throughout the whole year we've had speculation from people declaring "it's this" or "it's that" using the same logic you're applying, yet months later the whole speculation would change; you yourself, from your own reply, seem upset about this being brought up. At this point most of us are only taking note of credible sources, not people on the internet who claim to be working inside and to know secrets about Area 51 and so on. Anyone on the net can come across as an insider by taking a bit of info here and some speculation there and wording it right; that's the point, and one you seem annoyed about. If these so-called insiders were really who they say they are, we wouldn't have conflicting speculation to begin with; they would have got the info right from the off. As it stands, they could be the cleaner in Gearbox's offices who overheard someone else speculating, for all we know, and the whole cycle seems to be: one person says something, other people repeat it and follow it until it changes much later. Whether you care about GAF is irrelevant, as that's where all of this started.

Your own official sources will never hand out secret info to anyone, so why bother?

@Bold: I ain't too sure about that, LOL. After all, people keep saying the internet is filled with Sony fans.

Conflicting speculation? You either take it or leave it. BTW, some hackers already confirmed what the Wii U's CPU is; I wonder if they're just speculating too? /sarcasm

FYI, some of those rumors never change and in fact turn out to be true.

Your guys at Shin'en didn't deny the specs the Wii U had; they only confirmed that it wasn't downgraded. Part of the "anonymous" rumors is backed up by the truth, such as the console's low power consumption and the performance shown so far.

A console's performance in its first year is not an accurate assessment of its capabilities. Developers are unfamiliar with the hardware, and performance will improve as they become accustomed to it, just like with any console.



F0X said:
joora said:
All this talk about technology is actually starting to bother me.

I'm sure that implying that better technology is the main ingredient behind a better-looking game is a bit insulting to a game artist. A great game is rarely bound by technology constraints, because those constraints were taken into consideration before development. A great game artist will immerse the player so deeply in the game that the number of pixels or polygons will stop mattering. The console gaming community has indeed become worse than the digital photography one... and that is worrying.

As a (first and foremost) PC gamer, I find this whole "pissing contest" quite amusing. And sad. Very sad. Ranking gaming systems primarily by their technical abilities is the worst thing a "gamer" can do. I can easily remember piles of moving squares that were more fun and immersive than many of today's HD "masterpieces". Any gamer who dismisses a game solely because it's "only" 720p or looks "too childish" should stop calling himself a gamer. Not to mention that it's very disrespectful to the people who put a good part of their lives into making it.

Put your prejudices aside and open your mind to new experiences. You might be surprised what you find.


I agree with this. I game mainly on PC these days with 3DS as my secondary system, but I don't have the money or patience to build a high-end gaming PC right now. And I don't really care. My $500 ultrabook can run most of the PC games I want to play (Minecraft, Civilization, XCOM, Sims, Batman, Garry's Mod, and more), and I don't see the benefit in upgrading for at least a couple more years.


I had the chance to put together a new rig a month ago, and I decided to just save the money and go with a smaller-footprint ITX case to free up some space, with a lower-mid spec. I honestly couldn't justify the expense, since nothing in the near future seems that interesting to me, except maybe Star Citizen, and I doubt I'll have the time and motivation necessary to get some mileage out of it. I have a huge backlog on Steam, and indies seem to be putting together some very interesting game concepts.

I bought a 3DS XL less than a month ago too, and thought the low screen resolution would drive me nuts, since I've grown accustomed to hi-res screens everywhere. Luckily, that was not the case, and there are some great games on the system; I'm playing it more than my PC currently.




Wyrdness said:
fatslob-:O said:

I'm not taking offence at anything. Power usage is a pretty clear indicator of how powerful a system is, disregarding other differences like process nodes and dissimilar architectures. If you gamed on PC then you would know where I'm going with this, but instead of hearing out what I meant by "it can do more with less", you only revealed your intent to damage control in this thread. You made this way too obvious with the "I game on PC" cop-out statement, trying to gain more credibility.


Hahahaha, talk about being rattled. Look back at all your posts: you're the only one doing damage control, and you just don't like being pulled up on it. Consoles do not work the same as PCs, so building one of the latter doesn't make me an expert on console components. Me gaming on PC is not a cop-out; it's telling you that, even from my point of view, all this speculation is just that in the end. People who are experts couldn't fathom the GPU design and said it's not like anything else, throwing all speculation out the window at that point. You know what is a cop-out? Your "more with less" stance, and your earlier gem of "I don't need concrete evidence, someone else said the same" logic. I don't need credibility, because I have made no claims about what any hardware is, unlike you. That's the difference, and something you're struggling to grasp while trying to turn the tables to look smart.

You've made a claim based on one thing, power usage, and declared it fact. It's not me who needs credibility; it's you and your anonymous source.

How exactly am I doing damage control by giving an analysis? Talk about insecurity.

Newsflash: these newer consoles are almost PCs. I guess you aren't keeping up much, then?

How exactly is "more with less" a cop-out, when you were clearly trying to gain more credibility with that statement to gain some ground?

I didn't exactly say that I didn't "need" concrete evidence! I said that it wouldn't "matter" much, because developers will NOT give out ANY official specs. Better to go off what we have than to wait, hence why I would rather look for something "consistent".

Power usage correlates with performance, whether you like it or not!

"You've made a claim based on one thing, power usage, and declared it fact. It's not me who needs credibility; it's you and your anonymous source."

Hey guys, I have some unknown 1-watt GPU that can destroy a 300-watt beast known as the R9 290X. /sarcasm

So I guess I can't make a claim based on power usage, eh?
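The power-draw argument can be put in rough numbers. The sketch below uses approximate published figures (GTX 580: ~244 W board power, ~1581 theoretical GFLOPS; the 35 W whole-console draw cited above); these are ballpark values, and perf-per-watt varies enormously across process nodes and architectures, which is exactly the caveat raised in the exchange:

```python
# Ballpark perf-per-watt comparison. TDP and GFLOPS figures are
# approximate published numbers, not measurements from this thread.
gtx580_gflops = 1581.0   # theoretical peak, 40 nm Fermi
gtx580_watts = 244.0     # published board power
wii_u_budget_watts = 35.0  # whole-console draw cited above

gtx580_eff = gtx580_gflops / gtx580_watts        # ~6.5 GFLOPS/W
needed_eff = gtx580_gflops / wii_u_budget_watts  # ~45 GFLOPS/W

# A 35 W box would need roughly 7x Fermi's efficiency to match a GTX 580.
print(f"required efficiency multiple: {needed_eff / gtx580_eff:.1f}x")
```

That multiple is why a large efficiency gap between same-era parts is implausible, even granting that watts alone never fix a performance number exactly.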



curl-6 said:
fatslob-:O said:
Wyrdness said:
fatslob-:O said:

I couldn't care less about GAF. It doesn't matter if I have concrete evidence; I look for what's most consistent. The performance so far says something very different from the possible specs! Why are you getting so defensive?


You're the one getting defensive. I just pointed out that throughout the whole year we've had speculation from people declaring "it's this" or "it's that" using the same logic you're applying, yet months later the whole speculation would change; you yourself, from your own reply, seem upset about this being brought up. At this point most of us are only taking note of credible sources, not people on the internet who claim to be working inside and to know secrets about Area 51 and so on. Anyone on the net can come across as an insider by taking a bit of info here and some speculation there and wording it right; that's the point, and one you seem annoyed about. If these so-called insiders were really who they say they are, we wouldn't have conflicting speculation to begin with; they would have got the info right from the off. As it stands, they could be the cleaner in Gearbox's offices who overheard someone else speculating, for all we know, and the whole cycle seems to be: one person says something, other people repeat it and follow it until it changes much later. Whether you care about GAF is irrelevant, as that's where all of this started.

Your own official sources will never hand out secret info to anyone, so why bother?

@Bold: I ain't too sure about that, LOL. After all, people keep saying the internet is filled with Sony fans.

Conflicting speculation? You either take it or leave it. BTW, some hackers already confirmed what the Wii U's CPU is; I wonder if they're just speculating too? /sarcasm

FYI, some of those rumors never change and in fact turn out to be true.

Your guys at Shin'en didn't deny the specs the Wii U had; they only confirmed that it wasn't downgraded. Part of the "anonymous" rumors is backed up by the truth, such as the console's low power consumption and the performance shown so far.

A console's performance in its first year is not an accurate assessment of its capabilities. Developers are unfamiliar with the hardware, and performance will improve as they become accustomed to it, just like with any console.

Relax, I'm starting shit with another dude, as per usual.

You know the console can perform better, and so do I.



fatslob-:O said:

How exactly am I doing damage control by giving an analysis? Talk about insecurity.

Newsflash: these newer consoles are almost PCs. I guess you aren't keeping up much, then?

How exactly is "more with less" a cop-out, when you were clearly trying to gain more credibility with that statement to gain some ground?

I didn't exactly say that I didn't "need" concrete evidence! I said that it wouldn't "matter" much, because developers will NOT give out ANY official specs. Better to go off what we have than to wait, hence why I would rather look for something "consistent".

Power usage correlates with performance, whether you like it or not!

"You've made a claim based on one thing, power usage, and declared it fact. It's not me who needs credibility; it's you and your anonymous source."

Hey guys, I have some unknown 1-watt GPU that can destroy a 300-watt beast known as the R9 290X. /sarcasm

So I guess I can't make a claim based on power usage, eh?


Look at your own posts: you're basing assumptions on first-year games, and lol, no, these consoles may use a different architecture than before, but they're far from PCs in both power and how they operate; go to PC-specific sites and they'll crucify you for the comparison. Your "more with less" is a cop-out because it's a means of dodging other people's stances, like when someone brought up Shin'en or when someone brings up a much better-looking game like X; it's a little get-out-of-jail card to hide behind.

Again, I don't need credibility; I never made any hardware claims, and your failure to understand this concept further highlights the flaws in your approach to any argument. You made the claim and tried to pass it off as concrete, not anyone else. Oh, and your earlier post on concrete evidence:

"I couldn't care less about GAF. It doesn't matter if I have concrete evidence; I look for what's most consistent. The performance so far says something very different from the possible specs! Why are you getting so defensive?"

Your sarcastic comment to gain ground further highlights how on tilt you are. I didn't make claims about power and so on, so your R9 290X nonsense is as pointless as the anonymous sources running rampant on GAF. You, however, did make claims and used first-year games to try to back them up; that's more damage control than speculation, as everyone is aware we won't see properly optimized performance for a few years, though chances are you'll have disappeared by then, given how speculators are.



fatslob-:O said:
Viper1 said:
fatslob-:O said:

If the system had an HD 5650, the ports wouldn't be so troublesome!

Early ports were troublesome because of the CPU, not the GPU. Games developed on the PS3 or X360 are built with game engines designed around a 3.2 GHz clock rate. The Wii U's CPU is just 1.24 GHz. The game engines themselves had problems on Wii U early on. Notice that it's not such an issue with newer games, as developers have either begun using their next-generation engines (which are built around the much lower clock speeds of the PS4 and XOne) or have learned to better optimize the older-gen engines on Wii U.

Don't forget that the Wii U also pushes an extra 400,000 pixels over the X360 and PS3 with each game. That's almost half of a 720p image, on top of the game itself.

Clock rates don't mean a whole lot, and they don't tell the full story! All that matters is whether or not the CPU has enough performance. You know this! The IBM Espresso has nothing more to offer other than more performance compared to the IBM Broadway; its architecture remains largely unchanged since the IBM Gekko days. Essentially, what the Wii U has is a scaled-up GameCube processor. Nintendo was dumb enough to settle for backward compatibility rather than just move forward with a more improved architecture. *rolls eyes* I don't recall devs having issues with the CPU; in fact, the only one I can remember complaining was 4A Games. The next-generation game engines only have improvements in multithreading.

I also realize that some resources are being hogged by the GamePad. Nonetheless, CPUs in the coming generation will mostly be a non-factor. I still stand by my view that the Wii U can achieve more with less! (I'm sure someone pretty technical like Pemalite would stand by my statement, even though he dislikes all consoles equally.)

I think we've had enough of saying that the Wii U is weak just by looking at the specs. Nylevia would most likely agree that the Wii U can get a better output than the sub-HD twins.



Espresso isn't a scaled-up Gekko or Broadway. There are similarities, but both of those are single-core CPUs that can't clock over 1 GHz. It's a common misconception that Espresso is "3 x Broadways duct-taped together".

All we know for certain is that both the CPU and GPU have been customised up the wazoo.
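The "extra 400,000 pixels" figure quoted above is easy to verify, assuming the commonly cited 854×480 GamePad panel resolution:

```python
# GamePad screen vs. a 720p frame. The 854x480 panel resolution is
# the commonly cited figure for the Wii U GamePad.
gamepad_pixels = 854 * 480   # 409,920 extra pixels per frame
frame_720p = 1280 * 720      # 921,600 pixels in a full 720p image

# The GamePad adds ~44.5% of a 720p frame: "almost half", as stated.
print(gamepad_pixels, gamepad_pixels / frame_720p)
```

So the claim in the quote checks out: roughly 410k added pixels, just under half of a 1280×720 render target.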



snowdog said:
fatslob-:O said:
Viper1 said:
fatslob-:O said:

If the system had an HD 5650, the ports wouldn't be so troublesome!

Early ports were troublesome because of the CPU, not the GPU. Games developed on the PS3 or X360 are built with game engines designed around a 3.2 GHz clock rate. The Wii U's CPU is just 1.24 GHz. The game engines themselves had problems on Wii U early on. Notice that it's not such an issue with newer games, as developers have either begun using their next-generation engines (which are built around the much lower clock speeds of the PS4 and XOne) or have learned to better optimize the older-gen engines on Wii U.

Don't forget that the Wii U also pushes an extra 400,000 pixels over the X360 and PS3 with each game. That's almost half of a 720p image, on top of the game itself.

Clock rates don't mean a whole lot, and they don't tell the full story! All that matters is whether or not the CPU has enough performance. You know this! The IBM Espresso has nothing more to offer other than more performance compared to the IBM Broadway; its architecture remains largely unchanged since the IBM Gekko days. Essentially, what the Wii U has is a scaled-up GameCube processor. Nintendo was dumb enough to settle for backward compatibility rather than just move forward with a more improved architecture. *rolls eyes* I don't recall devs having issues with the CPU; in fact, the only one I can remember complaining was 4A Games. The next-generation game engines only have improvements in multithreading.

I also realize that some resources are being hogged by the GamePad. Nonetheless, CPUs in the coming generation will mostly be a non-factor. I still stand by my view that the Wii U can achieve more with less! (I'm sure someone pretty technical like Pemalite would stand by my statement, even though he dislikes all consoles equally.)

I think we've had enough of saying that the Wii U is weak just by looking at the specs. Nylevia would most likely agree that the Wii U can get a better output than the sub-HD twins.



Espresso isn't a scaled-up Gekko or Broadway. There are similarities, but both of those are single-core CPUs that can't clock over 1 GHz. It's a common misconception that Espresso is "3 x Broadways duct-taped together".

All we know for certain is that both the CPU and GPU have been customised up the wazoo.

marcan42, the hacker, says differently!

Why would AMD go out of their way to customize the shit out of the GPU to give anyone a new architecture? Both of the current-gen sub-HD twins had off-the-shelf PC parts that just added some extra memory rather than new processing components, like the ATI Xenos, or had disabled ROPs, like the Nvidia RSX.
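On the clock-rate point argued throughout this exchange: sustained throughput is roughly clock frequency times instructions retired per cycle (IPC), so a lower-clocked core with high IPC can keep pace with a faster-clocked, stall-prone in-order design. The IPC values below are purely hypothetical illustrations, not measured figures for Xenon or Espresso:

```python
# Illustrative only: per-core throughput ~ clock (GHz) x IPC.
# The IPC numbers are made-up examples, not real measurements.
def per_core_throughput(clock_ghz: float, ipc: float) -> float:
    """Billions of instructions retired per second for one core."""
    return clock_ghz * ipc

# e.g. a stall-prone in-order core at a high clock (hypothetical IPC)...
in_order = per_core_throughput(3.2, 0.5)
# ...vs a short-pipeline out-of-order core at a low clock (hypothetical IPC).
out_of_order = per_core_throughput(1.24, 1.3)

print(in_order, out_of_order)  # comparable despite very different clocks
```

Under these illustrative numbers the two cores land within a few percent of each other, which is why "1.24 GHz vs 3.2 GHz" alone settles nothing.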