
NeoGAF: Wii U GPU 160 ALUs, 8 TMUs and 8 ROPs

Pemalite said:
Wyrdness said:

It's common knowledge that consoles are more efficient with their specs than PCs; it's how, even after 7 years, the 360/PS3 can still run games on their specs compared to the higher requirements on PC. "LMAO" at that notion calls you out hard, and you're starting to fall apart here. Your logic that because developers had a handle on previous hardware they should be perfect on new hardware is, I'm sorry to say, beyond misguided; it simply doesn't work that way, as there are a number of factors.

That's good about the PS2/GC, except they weren't the consoles I was talking about, as they were the start of consoles using CPUs a lot more; older consoles were more GPU dependent. As for your last part, it proves my earlier point, and some of it is not even coherent.


Nope.avi

Consoles aren't always more efficient.
Take Oblivion, for instance: it can run on PC hardware that's a fraction of the Xbox 360's and still run better.
Take Minecraft: on a PC equipped similarly to the Xbox 360, you can still have unlimited worlds and a heap more players.
Take the original Crysis on PC: it had far more foliage and effects than the console version on hardware that was equivalent to the consoles.
Take pretty much every Call of Duty: it could run on hardware similar to the consoles with better image quality.
Take Bioshock 1+2: they could run on PC hardware that's significantly less than the consoles' and still look better.

I could continue on forever, but you get the idea.
If you take a game like Battlefield 3 or 4, it's almost an entirely new game on the PC compared to the Xbox 360 and PlayStation 3: you get larger maps, more players, higher resolutions, better framerates, effects such as tessellation, massively improved textures and other bonuses. Of course, that comes at the cost of better hardware being required.
Games like Civilization and StarCraft 2 are actually impossible on the Xbox 360 and PlayStation 3 due to the lack of CPU horsepower.

Mantle, however, will remove the API overhead, allowing PCs to be as efficient as the next generation of consoles when it comes to graphics tasks. Optimisation for the CPU isn't really required, considering the multiple-times-faster CPU speeds the PC has had for years, even compared to the next generation.
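To put rough numbers on that overhead point, here's a back-of-the-envelope sketch; every figure in it is an illustrative assumption, not a measurement of any real API:

```python
# Back-of-the-envelope model of draw-call overhead. Every number here is an
# illustrative assumption, not a measurement of any real API.

FRAME_BUDGET_MS = 16.7  # one frame at 60 fps

def max_draw_calls(overhead_us_per_call: float, cpu_share: float = 0.5) -> int:
    """Draw calls that fit if `cpu_share` of the frame goes to API overhead."""
    budget_us = FRAME_BUDGET_MS * 1000 * cpu_share
    return int(budget_us / overhead_us_per_call)

# Hypothetical per-call CPU costs: a thick PC API path vs. a thin,
# console/Mantle-style path.
for name, cost_us in [("thick PC API", 40.0), ("thin console-style API", 4.0)]:
    print(f"{name}: ~{max_draw_calls(cost_us):,} draw calls per 60 fps frame")
```

Even with these made-up costs the point stands: cutting per-call CPU overhead by an order of magnitude multiplies how much work the CPU can hand the GPU each frame.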

The PC is also not limited to the old 1920x1080@60fps, and Steam provides games for cheap, especially during sales.


How disingenuous. You conveniently omit the amount of RAM in your biased analysis. And again, you mention that with better hardware, PCs are better. That was not the point, but you just can't help shoving this down everyone's throat. Just like that 1080p comment you like to boast about at every single turn, for no reason other than to brag about your setup. ¬_¬

The point was that, similar specs for similar specs, consoles are more efficient than PCs. Put 512MB of RAM in those PC builds you talked about and then come back with a fair analysis.

Now, you mentioned Mantle. That may change all this (and it's very nice tech if it works as intended), but let's wait until it's out before passing judgment. ¬_¬


And by the way: Crysis, on a PC with specs similar to the Xbox 360's, ran like horse shit.



Pemalite said:
Wyrdness said:

It's common knowledge that consoles are more efficient with their specs than PCs; it's how, even after 7 years, the 360/PS3 can still run games on their specs compared to the higher requirements on PC. "LMAO" at that notion calls you out hard, and you're starting to fall apart here. Your logic that because developers had a handle on previous hardware they should be perfect on new hardware is, I'm sorry to say, beyond misguided; it simply doesn't work that way, as there are a number of factors.

That's good about the PS2/GC, except they weren't the consoles I was talking about, as they were the start of consoles using CPUs a lot more; older consoles were more GPU dependent. As for your last part, it proves my earlier point, and some of it is not even coherent.


Nope.avi

Consoles aren't always more efficient.
Take Oblivion, for instance: it can run on PC hardware that's a fraction of the Xbox 360's and still run better.
Take Minecraft: on a PC equipped similarly to the Xbox 360, you can still have unlimited worlds and a heap more players.
Take the original Crysis on PC: it had far more foliage and effects than the console version on hardware that was equivalent to the consoles.
Take pretty much every Call of Duty: it could run on hardware similar to the consoles with better image quality.
Take Bioshock 1+2: they could run on PC hardware that's significantly less than the consoles' and still look better.

I could continue on forever, but you get the idea.
If you take a game like Battlefield 3 or 4, it's almost an entirely new game on the PC compared to the Xbox 360 and PlayStation 3: you get larger maps, more players, higher resolutions, better framerates, effects such as tessellation, massively improved textures and other bonuses. Of course, that comes at the cost of better hardware being required.
Games like Civilization and StarCraft 2 are actually impossible on the Xbox 360 and PlayStation 3 due to the lack of CPU horsepower.

Mantle, however, will remove the API overhead, allowing PCs to be as efficient as the next generation of consoles when it comes to graphics tasks. Optimisation for the CPU isn't really required, considering the multiple-times-faster CPU speeds the PC has had for years, even compared to the next generation.

The PC is also not limited to the old 1920x1080@60fps, and Steam provides games for cheap, especially during sales.

It looks like you came here to crush more minds and myths, hehehe. Master race FTW. Mantle will bring even more goodness in the meantime.



I thought after Need for Speed, it was agreed that the Wii U was more powerful than the 360/PS3.



AgentZorn said:
I thought after Need for Speed, it was agreed that the Wii U was more powerful than the 360/PS3.

It depends on what you mean by "more powerful". As was explained in this thread, the Wii U likely has a deficit in raw shading power, i.e. theoretical floating-point performance, but it can still achieve better graphics thanks to a more modern architecture with more memory to play around with.



I have a few points to make here that seem to usually get overlooked when people examine the Wii U in detail:

1) It seems like the millionth time I am saying this, but the Wii U uses FLASH memory, not a hard disk drive. This drives down power consumption immensely, increases performance, and massively improves reliability. This memory is pricey, and Nintendo chose to use it for the aforementioned reasons, as opposed to shoving in a gigantic hard drive that had more negatives than positives.

2) From what I have seen of 1st-party Nintendo titles on Wii U, they use 2x AA by default and V-Sync across ALL games. The V-Sync is HUGE, and especially noticeable next to multi-plats, because the 360 and PS3 have quite obvious tearing in so many games.

3) Though the CPU is slower than the XOne's or PS4's due to fewer cores and a slightly lower clock speed, the chip itself actually has a better architecture and is more powerful per core; if Nintendo were to use six cores and match the clock speed of the XOne or PS4 (either one), it would be a MORE POWERFUL CPU. The developer of Nano Assault Neo (and some others I can't remember off the top of my head) has stated they don't even use one core fully, and there are THREE. The PowerPC chip they are using is pretty robust, and many people are quick to jump on the low clock speed or smaller number of cores without thinking it through. Remember that the XOne uses Kinect and the PS4 uses the PS Eye, and both have MUCH larger OSes.

4) The Wii U has 2 GB of RAM, compared to the 8 GB found in each of the PS4 and XOne, and its RAM is also slower than the other two consoles'. That said, the actual deficit isn't as great as it seems on paper: 1 GB (and sometimes a few MB more for 1st-party software) is dedicated to games on the U, whereas maybe 3-3.5 GB is available at any time on the other two, whether due to Kinect or OS use; see the sketch after this list. It's a deficit, but not catastrophic.

5) Clearly, the Wii U's GPU is beefier than everyone has realized. I mentioned V-Sync and AA earlier, and I wasn't kidding about the always-on V-Sync. The only people used to V-Sync are PC players like myself, and it's awesome not having to worry about screen tear. It's sweet. Plus, from what I hear, the RGB lock on certain colours has been lifted with the latest update, so it no longer restricts colour output.
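To put point 4 in numbers, here's a quick sketch using the figures claimed above (my rough estimates, not confirmed platform specs):

```python
# Game-visible RAM, using the figures claimed in point 4 above
# (my rough estimates, not confirmed platform specs).
consoles = {
    "Wii U":    {"total_gb": 2.0, "game_gb": 1.0},
    "PS4":      {"total_gb": 8.0, "game_gb": 3.5},
    "Xbox One": {"total_gb": 8.0, "game_gb": 3.0},
}

for name, c in consoles.items():
    share = c["game_gb"] / c["total_gb"]
    print(f"{name}: {c['game_gb']} of {c['total_gb']} GB for games ({share:.0%})")

# The pool is 4x larger on paper, but the game-visible gap is only ~3-3.5x.
gap = consoles["PS4"]["game_gb"] / consoles["Wii U"]["game_gb"]
print(f"PS4 vs Wii U game RAM: {gap:.1f}x (vs 4.0x total)")
```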



Hynad said:


How disingenuous. You conveniently omit the amount of RAM in your biased analysis. And again, you mention that with better hardware, PCs are better. That was not the point, but you just can't help shoving this down everyone's throat. Just like that 1080p comment you like to boast about at every single turn, for no reason other than to brag about your setup. ¬_¬

The point was that, similar specs for similar specs, consoles are more efficient than PCs. Put 512MB of RAM in those PC builds you talked about and then come back with a fair analysis.

Now, you mentioned Mantle. That may change all this (and it's very nice tech if it works as intended), but let's wait until it's out before passing judgment. ¬_¬


And by the way: Crysis, on a PC with specs similar to the Xbox 360's, ran like horse shit.


Side-stepping the issue.
If you take Oblivion, for instance: on the PC you could scale it down to 512MB of RAM, a GeForce 3 and a Pentium 3 - that's original-Xbox-level GPU/CPU, and Oblivion was an Xbox 360 game.

In general, of course the PC needs more RAM; it's doing 9000x more things at once than the consoles are, and with better image quality. That kind of thing isn't free, you know.
How about those next-generation consoles and their OSes using more RAM for the OS and a couple of applications than my PC does? And they still can't do 1/10th of what my PC can.

As for Crysis running like crap on hardware similar to the consoles, that simply isn't true. You could achieve 30fps by dropping the game to DirectX 9 and applying a few tweaks to get Ultra quality - it was well documented on the Crysis forums.

Heck, a 6-year-old PC can still run every Xbox 360 and PlayStation 3 multiplatform game at 720p with at least 30fps given a $30 GPU upgrade, Crysis included.

As for 27" 1440p panels, they're now at the point where they're reasonably cheap, i.e. a couple hundred bucks. If you have a high-end GPU, you shouldn't really be stuck with last-decade 1080p unless you want 120Hz.

If you think I'm being biased, so be it. I really couldn't care less; I've been accused of being both an Xbox and a PlayStation fanboy on these forums in the past, which is ironic, really.



--::{PC Gaming Master Race}::--

fatslob-:O said:

Sorry, but that optimization excuse doesn't work out, bra. The PS360 could outdo their own predecessors, so why isn't the Wii U doing the same thing in EVERY game? Even the PS4 and X1 can shit on their current-gen counterparts easily, bra. If the Wii U had more brute-force power than the PS360, why isn't it performing better? All that's needed to make a game look and run better is a significantly more powerful GPU, and clearly the Wii U lacks this. This generation isn't exactly over yet. Consoles this generation started to use the GPU more and became more PC-like in their philosophies, and next-generation consoles are outright dressed-up PCs. Hell, it's thanks to the Wii U's different CPU that it isn't branded a dressed-up PC yet, but that doesn't matter too much when much of its power comes from an off-the-shelf PC GPU, LMAO. If you're referring to even older consoles like the PS1 and the N64, they also relied a lot on the CPU, and the N64 didn't even have a GPU in the modern sense!

The PS4 and XB1 have literally no excuse not to outperform their predecessors (leaving aside the bad CoD PS4 port with its terrible framerate); we all know the Wii U is not a huge leap over the PS360, so I don't see why we need to compare it the same way...

Off-the-shelf PC GPU? Just to put this out there: no 40nm, 160-shader GPU on the market has as much GPU logic as Latte does. All of them (including lower-end 6000, 7000, 8000 and the newer R 200 series cards) are around ~67 mm² in size, with around ~330 million transistors each. Latte is far larger than any 160-shader part at 156.21 mm², and it has well over 700 million transistors (excluding eDRAM). It doesn't make much sense for Nintendo to have taken an off-the-shelf GPU, reduced the shaders, TMUs or ROPs, and left it at that, considering the size, process and transistor count. Latte doesn't compare to any 160-shader part, at all, period.
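A quick sanity check on those numbers, naively assuming uniform transistor density and using only the estimates above:

```python
# Sanity check on the die-size argument, using only the estimates quoted
# above (die-shot figures, not official specs). This naively assumes uniform
# transistor density and ignores how much of the die the eDRAM occupies.
ref_area_mm2 = 67.0      # typical 40nm 160-shader part
ref_transistors = 330e6  # claimed transistor count for such a part

density = ref_transistors / ref_area_mm2  # transistors per mm^2 at 40nm

latte_area_mm2 = 156.21
implied = density * latte_area_mm2
print(f"density: {density / 1e6:.1f}M transistors per mm^2")
print(f"implied Latte budget: ~{implied / 1e6:.0f}M transistors")
# ~770M at the same density - consistent with "well over 700 million", and
# far beyond any off-the-shelf 160-shader part.
```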

Just to note, I'm not saying it has more shaders, more texture mapping units (if it even has any) or ROPs; I'm just saying that Latte has a ton of logic that you simply cannot explain without the aid of newer documentation. As for 3rd-party ports running badly on Wii U, you can easily blame documentation. Yes, it's an excuse, and a very relevant one. It was terrible before launch, and if you don't have the entire API well documented, you're stuck with the kind of half-hearted development effort the PS4 version of Ghosts is demonstrating. For the PS4, there is literally no excuse for a cheap-looking game like Ghosts to run with a terrible framerate on that hardware. I don't blame the hardware, though; I blame the developers. The same applies to the Wii U, only to a worse degree, considering I'm sure the PS4 is easier to develop for than the Wii U and the PS4's documentation is a lot easier to follow than the Wii U's.



forethought14 said:
fatslob-:O said:

Sorry, but that optimization excuse doesn't work out, bra. The PS360 could outdo their own predecessors, so why isn't the Wii U doing the same thing in EVERY game? Even the PS4 and X1 can shit on their current-gen counterparts easily, bra. If the Wii U had more brute-force power than the PS360, why isn't it performing better? All that's needed to make a game look and run better is a significantly more powerful GPU, and clearly the Wii U lacks this. This generation isn't exactly over yet. Consoles this generation started to use the GPU more and became more PC-like in their philosophies, and next-generation consoles are outright dressed-up PCs. Hell, it's thanks to the Wii U's different CPU that it isn't branded a dressed-up PC yet, but that doesn't matter too much when much of its power comes from an off-the-shelf PC GPU, LMAO. If you're referring to even older consoles like the PS1 and the N64, they also relied a lot on the CPU, and the N64 didn't even have a GPU in the modern sense!

The PS4 and XB1 have literally no excuse not to outperform their predecessors (leaving aside the bad CoD PS4 port with its terrible framerate); we all know the Wii U is not a huge leap over the PS360, so I don't see why we need to compare it the same way...

Off-the-shelf PC GPU? Just to put this out there: no 40nm, 160-shader GPU on the market has as much GPU logic as Latte does. All of them (including lower-end 6000, 7000, 8000 and the newer R 200 series cards) are around ~67 mm² in size, with around ~330 million transistors each. Latte is far larger than any 160-shader part at 156.21 mm², and it has well over 700 million transistors (excluding eDRAM). It doesn't make much sense for Nintendo to have taken an off-the-shelf GPU, reduced the shaders, TMUs or ROPs, and left it at that, considering the size, process and transistor count. Latte doesn't compare to any 160-shader part, at all, period.

Just to note, I'm not saying it has more shaders, more texture mapping units (if it even has any) or ROPs; I'm just saying that Latte has a ton of logic that you simply cannot explain without the aid of newer documentation. As for 3rd-party ports running badly on Wii U, you can easily blame documentation. Yes, it's an excuse, and a very relevant one. It was terrible before launch, and if you don't have the entire API well documented, you're stuck with the kind of half-hearted development effort the PS4 version of Ghosts is demonstrating. For the PS4, there is literally no excuse for a cheap-looking game like Ghosts to run with a terrible framerate on that hardware. I don't blame the hardware, though; I blame the developers. The same applies to the Wii U, only to a worse degree, considering I'm sure the PS4 is easier to develop for than the Wii U and the PS4's documentation is a lot easier to follow than the Wii U's.

You're forgetting that the eDRAM takes up a significant amount of die space; after all, cache doesn't cost a small number of transistors. BTW, I don't literally mean "off the shelf"; that was slight hyperbole. By that I mean pretty damn similar. If it truly had around 700 million transistors of enabled logic, then why is it so hard for the Wii U to completely beat the PS360, and why does it consume 35 watts in total, not including the disc drive etc.? Right now 600 million transistors of logic makes sense, because the actual graphics processing component is around 100 mm², not the 156 mm² you initially thought. It could easily compare to a 320-shader part with some disabled shaders, and BTW none of those 160-shader parts make sense anyway because they only have 4 ROPs, so don't just assume I'm referring to 160-shader parts. An HD 5550 is looking pretty likely right now as what Nintendo used for a base.

Oh, and as for your "Latte" having more logic: do you even know whether all of that is ENABLED logic, i.e. shaders that ACTUALLY WORK? It's very common for GPU manufacturers to disable parts of the die that are NOT WORKING. The "Latte" probably has around 900 million transistors in total: a third of it is probably reserved for things like the eDRAM, half of it is probably GPU logic, and the rest is extra eDRAM and GPU logic kept so that the chip doesn't end up with worse yields.
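Spelling that budget out as arithmetic (pure speculation on my part, not official figures):

```python
# My rough Latte transistor budget as explicit arithmetic
# (pure speculation, not official figures).
total = 900e6  # guessed total for the whole die

edram = total / 3              # ~300M on the eDRAM pool
logic = total / 2              # ~450M of enabled GPU logic
spare = total - edram - logic  # ~150M extra eDRAM/logic kept for yield

for name, t in [("eDRAM", edram), ("enabled GPU logic", logic),
                ("redundancy for yield", spare)]:
    print(f"{name}: ~{t / 1e6:.0f}M transistors")
# ~450M of enabled logic sits far closer to a 320-shader part like the
# HD 5550 than to the 700M+ of logic argued above.
```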

Do we even have a DF or LoT analysis to come to the conclusion that it runs worse on the PS4? (It doesn't matter anyway, since the PS4 is just 5 days away from analysis.)



Pemalite said:
Hynad said:


How disingenuous. You conveniently omit the amount of RAM in your biased analysis. And again, you mention that with better hardware, PCs are better. That was not the point, but you just can't help shoving this down everyone's throat. Just like that 1080p comment you like to boast about at every single turn, for no reason other than to brag about your setup. ¬_¬

The point was that, similar specs for similar specs, consoles are more efficient than PCs. Put 512MB of RAM in those PC builds you talked about and then come back with a fair analysis.

Now, you mentioned Mantle. That may change all this (and it's very nice tech if it works as intended), but let's wait until it's out before passing judgment. ¬_¬


And by the way: Crysis, on a PC with specs similar to the Xbox 360's, ran like horse shit.


Side-stepping the issue.
If you take Oblivion, for instance: on the PC you could scale it down to 512MB of RAM, a GeForce 3 and a Pentium 3 - that's original-Xbox-level GPU/CPU, and Oblivion was an Xbox 360 game.

In general, of course the PC needs more RAM; it's doing 9000x more things at once than the consoles are, and with better image quality. That kind of thing isn't free, you know.
How about those next-generation consoles and their OSes using more RAM for the OS and a couple of applications than my PC does? And they still can't do 1/10th of what my PC can.

As for Crysis running like crap on hardware similar to the consoles, that simply isn't true. You could achieve 30fps by dropping the game to DirectX 9 and applying a few tweaks to get Ultra quality - it was well documented on the Crysis forums.

Heck, a 6-year-old PC can still run every Xbox 360 and PlayStation 3 multiplatform game at 720p with at least 30fps given a $30 GPU upgrade, Crysis included.

As for 27" 1440p panels, they're now at the point where they're reasonably cheap, i.e. a couple hundred bucks. If you have a high-end GPU, you shouldn't really be stuck with last-decade 1080p unless you want 120Hz.

If you think I'm being biased, so be it. I really couldn't care less; I've been accused of being both an Xbox and a PlayStation fanboy on these forums in the past, which is ironic, really.

You must be joking about Crysis. The internet was crying back then about how demanding and badly optimised its engine was. As for the rest of your points: sure, consoles don't have to run as many things, which makes them more efficient. You talk about scaling things down on PC to achieve better performance, but you have to scale them down to the point where the games look the same as, if not worse than, they do on consoles (with the same specs). And then you invoke mods and ini tweaks to make sure the game doesn't even run as intended. As I said, you're being disingenuous.

As for your bias, one has to be totally blind not to see you're a PC elitist.



Daisuke72 said:
snowdog said:
Daisuke72 said:
snowdog said:
He was wrong. More than 3 times the cache for the GPU, 3 times the cache for the CPU, a DSP, an out-of-order-execution CPU, a CPU with half the stages in its pipeline, a DX11-equivalent feature set and 4 times the RAM say he was wrong.

And if that isn't enough, then Sonic Racing Transformed, Need For Speed: Most Wanted, the gimping of Rayman Legends (removing half of those black things from the swarm so that it would run on the PS3 and 360), Pikmin 3, The Wonderful 101, Super Mario 3D World, Bayonetta 2, X, Mario Kart 8 and SSBU all say he's wrong.


He always said the Wii U had the advantage in RAM and overall efficiency. So? At the end of the day the Wii U is closer to the current gen than to the next, and a lot closer at that, which was his initial point, which you tried arguing against and still are.



He was wrong. It's as simple as that. Being somewhere between the 360 and PS4 in terms of power isn't 'on par' with the 360... unless you change the meaning of 'on par'. The Wii U is around 3-4 times more powerful than the 360, but that doesn't for a second mean that games will look 3-4 times better.

We've already seen the PS4 and One struggling to run games at a resolution higher than 720p, and the difference between the 360 and the PS4 and One in terms of power is a great deal bigger.
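To illustrate with simple pixel arithmetic why a power multiple doesn't translate into an equal visual multiple (the 3-4x figure above is an estimate, and resolution is only one cost among many):

```python
# Pixel counts per frame: resolution alone eats most of a raw-power
# multiplier. (The "3-4 times more powerful" figure above is an estimate.)
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}

base = 1280 * 720
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 720p)")

# 1080p alone is 2.25x the pixel work of 720p, so a console 3-4x faster
# that also moves to 1080p has well under 2x left for better-looking pixels.
```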

Four times as powerful? Not even close, but keep telling yourself that. I would go in-depth, but last time we had this discussion you pretty much speculated on everything. Hell, every reply started with something like:

 

"I think..."

 

"Nintendo has complex..."

 

But like I said, just putting it out there, Ninjablade was right. Ninjablade won. RIP my nigga Ninja. 

Are you Ninjablade?... I see you created a new profile!!! ;)



34 years playing games.