
NeoGAF: Wii U GPU 160 ALUs, 8 TMUs and 8 ROPs

jonathanalis said:
How many times better than the PS360 does a system have to be to show graphics like the two Wii U tech demos shown at E3 2011?

After seeing that, I just can't believe that the Wii U isn't at least 3 or 4 times more powerful than the PS360.


Don't you realise how mad that comment is? Nothing shown at E3 is important; what matters is the results on real hardware. Microsoft used Nvidia GPU based PCs to demonstrate the Xbox One at E3. The benchmark of what the Wii U is capable of is an actual Wii U. We are post-launch now.



curl-6 said:
bonzobanana said:
With any discussion about the Latte fabrication you have to bear in mind the huge amount of stuff included in Latte: not only is a huge part of it the eDRAM, but there is also the ARM CPU, the audio DSP, 1MB of Wii U texture memory, the 2MB Wii frame buffer, the Wii GPU, and sections designed for high-speed compression of the Wii U GPU frame buffer, downscaled to fit the Wii frame buffer. There is a huge amount of additional stuff.

The 176 GFLOPS figure is completely realistic; not only does it fit with the power consumption figures, it fits with how the Wii U is actually performing. Let's not forget the generational difference in performance between the 360/PS3 GPUs and the later Radeons is quite significant.

If Latte really is 352 GFLOPS and the architecture is much improved, plus you have 32MB of high-speed eDRAM, then what the hell is going wrong that the Wii U is struggling to outperform the 360/PS3 graphically? The figure of 176 GFLOPS makes total sense; it simply works with all the information we have. If Latte really is 352 GFLOPS then something has gone horribly wrong in the design of the Wii U that is creating major issues. I don't believe this. I believe Nintendo have designed a console at the absolute minimum price to merely match current-gen performance overall. However, as a Wii U owner I'm more than happy to be proved wrong, but the evidence surely dictates that of the possible range of 176-352 GFLOPS, where once we were clinging to the belief that it was 352 GFLOPS, in fact the lower figure is much more realistic.
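For reference, both figures come from the same simple arithmetic; the only difference is the assumed shader count. A quick sketch, assuming the commonly reported ~550 MHz GPU clock and 2 floating-point ops per ALU per cycle (one multiply-add):

    # Where the 176 vs 352 GFLOPS figures come from.
    # Assumptions: ~550 MHz GPU clock, 2 FLOPs per ALU per cycle (one multiply-add).
    def gflops(shader_alus, clock_mhz=550, flops_per_cycle=2):
        return shader_alus * flops_per_cycle * clock_mhz / 1000.0

    print(gflops(160))  # 160 ALUs -> 176.0 GFLOPS
    print(gflops(320))  # 320 ALUs -> 352.0 GFLOPS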

I guess an alternative view is that the CPU is so weak that the compute functionality of the Radeon GPU is being utilised for practically every game, compromising graphics output. I don't believe this myself. I also don't believe the Wii U console is hard to develop for; I believe the complete opposite is true, and I don't believe all developers are being lazy on Wii U either.

Ultimately I believe the Wii U is a low-performance console designed to a specification for good-quality cartoon graphics for Nintendo games and a huge profit for Nintendo and its shareholders (if it sold well). A continuation of their 'withered technology' philosophy that was so successful for the Game Boy, Wii, etc.

http://en.wikipedia.org/wiki/Gunpei_Yokoi

Current PS3/360 games are built on 8 years of optimization for their specific hardware. Wii U hasn't even been out for a year yet, no console is maxed out that fast. You need exclusives designed around a console's  specific hardware to really show off what it can do, and no Wii U exclusive so far has aimed for high end graphics, they all adopt simpler styles that preclude technological boundary-pushing.

The Wii U has yet to have its Uncharted 2, its Gears of War, its Mario Galaxy, that one game that really puts its chipset through its paces and shatters its established graphical standards.

We don't know how mature the development software is, but the Wii U is not a complicated design. It's using a well known and well documented Radeon GPU (which one we aren't sure of) and I don't think there are any surprises when it comes to the 32-bit CPU, which dates back to 1997. The Wii U is not a PS3 or even an Xbox 360; it's a much simpler design. When the PS3 and 360 came out they were cutting-edge hardware, new to market; that is not the case with the Wii U. Also, just about the most impressive GameCube/Wii title is Rogue Squadron, despite being a launch title. Also, Halo represents a very high level of optimisation on the original Xbox. The Wii U is not cutting edge; it's a combination of well documented and mature components, and clearly this will have an effect on what level of optimisation it achieves. Did the Wii make great strides over the GameCube in optimisation beyond the extra 64MB of memory and 50% overclock, plus DVD storage capacity? I certainly don't think so.
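As an aside, the "50% overclock" figure checks out against the commonly cited clocks for both machines; the numbers below are the usual published figures, nothing new:

    # Wii vs GameCube clock ratio, using the commonly cited figures:
    # GameCube: Gekko CPU ~485 MHz, Flipper GPU 162 MHz
    # Wii:      Broadway CPU 729 MHz, Hollywood GPU 243 MHz
    gamecube = {"cpu_mhz": 485, "gpu_mhz": 162}
    wii      = {"cpu_mhz": 729, "gpu_mhz": 243}

    for part in ("cpu_mhz", "gpu_mhz"):
        print(part, round(wii[part] / gamecube[part], 2))  # ~1.5x for both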

The GameCube itself hit the ground running with great performance, unlike the PS2, because it was a much better design to develop for, as is the Wii U.



fatslob-:O said:

Sorry but that optimization excuse don't work out, bra. The PS360 could outdo their own predecessors, so why isn't the WII U doing the same thing in EVERY game? Even the PS4 and X1 could shit on their current gen counterparts easily, bra. If the WII U had more brute-force power to outdo the PS360, why ain't it performing better? All that's needed to make a game look and run better is a significantly more powerful GPU, and clearly the WII U lacks this. This generation ain't exactly over yet. Consoles this generation started to use the GPU more and became more PC-like in their philosophies, and next generation's consoles are basically just dressed-up PCs. Hell, it's thanks to the WII U's different CPU that it ain't branded as a dressed-up PC yet, but that don't matter too much when much of its power comes from an off-the-shelf PC GPU LMAO. If you're referring to even older consoles like the PS1 and the N64, they also relied a lot on the CPU too, and the N64 didn't even have a graphics processor!


Did you see the first few 360 games? They were far from a huge leap; I even named one earlier. This is what calls you out on your damage control; selective memory is a common trait with people doing this. It wasn't until Oblivion and Gears arrived (two games that were optimized for the hardware) that we saw a big leap. You're starting to sound desperate at this point, and you should look up what off the shelf actually means. The N64 had a GPU; for you to even try and pass that off shows how off key you are. The N64 had one of the first programmable GPUs in the RCP, with features that even PCs were only just getting. Look up how Ken Kutaragi fought tooth and nail to make sure the PS3 had a GPU because Sony wanted it to run purely off a CPU, and he said it just can't work without one; consoles have to have a GPU of some kind. Forethought and Hynid have addressed everything else, so there's no need for me to repeat their posts.

Scoobe

GCN is a GPGPU architecture like I said; AMD supplied each company with a base GPU which is GCN hardware, and each company made their custom changes to it depending on their console design. MS have a video of their customization that they released before E3 for reference. The "closer to PC than before" term is a reference to how both consoles aren't using ground-up proprietary hardware, in order to allow GCN, which makes development across all platforms a lot easier; that's all it means. AMD will introduce Mantle soon as well for better optimization across all platforms.



bonzobanana said:
jonathanalis said:
How many times better than the PS360 does a system have to be to show graphics like the two Wii U tech demos shown at E3 2011?

After seeing that, I just can't believe that the Wii U isn't at least 3 or 4 times more powerful than the PS360.


Don't you realise how mad that comment is? Nothing shown at E3 is important; what matters is the results on real hardware. Microsoft used Nvidia GPU based PCs to demonstrate the Xbox One at E3. The benchmark of what the Wii U is capable of is an actual Wii U. We are post-launch now.


It was important for N64 and GC.

On those consoles we saw games with far better visuals than the tech demos shown at E3.

That image motivated me to say this madness:

http://3.bp.blogspot.com/-FZ_OA-ITDUk/UMtHx5cFc1I/AAAAAAAADIE/Ls0kZ7URz-E/s1600/Zelda+Wii+U+-+Nintendo+Blast.png

It's a Nintendo characteristic to be truthful at E3 about what is possible to do on the console; I suppose they would be truthful this time again.



forethought14 said:
fatslob-:O said:

You're forgetting the fact that the eDRAM takes up a significant amount of die space; after all, cache doesn't cost a small amount of transistors. BTW I don't literally mean "off the shelf", that was a slight hyperbole. By that I mean pretty damn similar. If it truly had around 700 million transistors of enabled logic, then why is it so hard for the Wii U to completely beat the PS360, and why does it consume 35 watts in total, not including the disc drive etc.? Right now 600 million transistors of logic makes sense, because the actual graphics processing component is around 100mm^2, not the 156mm^2 you initially thought. It could easily compare to a 320 shader part that has some disabled shaders too, and BTW none of those 160 shader parts make sense because they only have 4 ROPs, so don't just assume that I am referring to 160 shader parts. An HD 5550 is looking pretty likely right now for what Nintendo has used as a base.
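To illustrate why that 35-watt figure carries so much weight in this argument, here is a back-of-envelope power budget. The total is the figure quoted above; the PSU efficiency and every per-component split below it are assumptions for illustration, not measurements:

    # Rough power budget starting from the ~35 W whole-console figure above.
    total_wall_draw_w = 35.0            # whole-console draw while gaming (figure from the post)
    psu_efficiency    = 0.85            # assumed AC->DC conversion efficiency
    dc_budget_w       = total_wall_draw_w * psu_efficiency

    assumed_other_w = {                 # rough guesses for the non-GPU consumers
        "CPU (Espresso)": 5.0,
        "DDR3 + misc board logic": 4.0,
        "disc drive, fan, WiFi, GamePad radio": 5.0,
    }
    gpu_budget_w = dc_budget_w - sum(assumed_other_w.values())
    print(round(gpu_budget_w, 1))       # ~15.8 W left for the GPU under these assumptions

Whether roughly 15 W is enough for a 160- or a 320-shader part at these clocks on 40nm is exactly what the thread is arguing about; the sketch only shows how tight the envelope is either way.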

Oh, as for your "Latte" having more logic, do you even know if all of that is ENABLED logic, i.e. the shaders that ACTUALLY WORK? It's very common to see a lot of GPU manufacturers disable a part of the die that is NOT WORKING. The "Latte" probably has around 900 million transistors in total, and a third of it is probably reserved for things like the eDRAM. Half of it is probably used for things like GPU logic, and the rest is used to create extra eDRAM and GPU logic so that the chip doesn't end up with lower yields.

Do we even have a DF or LOT analysis to come to the conclusion that it runs worse on the PS4? (Doesn't matter anyway, since the PS4 is just 5 days away from analysis.)

I can't answer that, because I simply don't know. I think the question that should be asked is: what functions are being used right now? It has a tessellator on fixed-function silicon that's likely not being used right now, it has extra GPRs compared to a conventional R700 series GPU, and all of that extra cache (Nintendo has mentioned that the Wii U was heavily reliant on memory); how much of that is actually being used? Unfortunately, with Wii U dev kits being badly documented, I wouldn't be surprised if a lot of GX2 functions of Latte are being left unused. Like how Two Tribes suddenly discovered a hardware feature that reduced memory usage and saved them 100MB. Was this feature not documented? How would they have just "discovered" something otherwise?

OK, then say it some other way. Saying "off-the-shelf part" makes it seem like Nintendo took, for example, a 5550, looked at it, and thought, "Hmm... this one's good, get rid of the useless stuff and put in eDRAM". They've worked on the console for over 3 years; that's a lot of time to have done all sorts of things to the GPU.

The eDRAM is 40nm Renesas eDRAM; we know information about this DRAM as it comes directly from the Renesas website. It should take around 220 million transistors (+/- a few million), leaving 717 million for the GPU (depending on the transistor density per mm2). You're forgetting that off-the-shelf parts use die space for things they need as PC parts; Latte won't need those components, and that space can easily be used for something else. And no, Latte has a DX 10.1-like (plus extras) feature set compatibility for GX2 (based on documentation); it's not based on a 5000 series GPU (full DX 11 compatibility). It has an R700 base, and they modified it heavily from there. AMD does work with customers, and would allow modifications of stuff like shaders to work better on particular hardware.
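A quick sketch of how those numbers hang together; the transistor density is an assumption borrowed from other 40nm Radeon dies (Redwood packs roughly 627M transistors into about 104 mm^2, i.e. ~6M per mm^2), not a known property of Latte:

    # Rough reconstruction of the transistor budget described above.
    latte_die_mm2     = 156.21      # die size figure used in this thread
    density_per_mm2   = 6.0e6       # assumed transistors/mm^2 for 40nm Radeon-class logic
    edram_transistors = 220e6       # Renesas eDRAM estimate from the post

    total_estimate = latte_die_mm2 * density_per_mm2
    remaining      = total_estimate - edram_transistors
    print(round(total_estimate / 1e6))   # ~937M transistors in total
    print(round(remaining / 1e6))        # ~717M left after the eDRAM, matching the post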

And I never said that all of that 156.21mm2 was GPU logic; I already subtracted all of the eDRAM, and I apologize for the bad wording. I'm referring to "logic" as anything usable for gaming, everything on the die that can technically be used for gaming purposes. However, eDRAM is a factor for gaming too; conventional GPUs don't have that, and it can be very useful for increasing performance. You seem to be passing it off like it's disposable and not part of the entire GPU system. Heck, it even helps the CPU, since it also has direct access. ROPs, though, would not make a big difference in transistors, even if they added in 4 more.

And I wasn't thinking about certain Wii-related functions in the GPU that are likely used for BC purposes, so I suppose it is possibly lower than what I have calculated. Though, going by Shiota's comment (from Iwata Asks), he makes it seem like they simply took Wii U parts and modified them so that they could be used for Wii BC as well. Of course, we don't know exactly what they're talking about, but since they're talking about BC for Wii on Wii U, that could be the case. And I believe Marcan said that Wii emulation wasn't being run by an implanted "Hollywood" GPU, so this could mean Wii GPU emulation is done by the same Wii U parts, meaning that not much would have been wasted for Wii GPU emulation at all, if any was. Though if that's not true, putting the transistors at around 600 million (Wii BC logic + GamePad compression shouldn't take up much space at 40nm) would still put it above some parts with more shaders. Plus, Latte is produced on a more mature 40nm process than any 5000 series GPU was...

You know, that "unusable logic" idea applies more to a commercial off-the-shelf GPU; a console GPU will not have as much, if any, otherwise it would be a terribly designed GPU with so much unused logic.

And no, we don't have a DF analysis, but so far, reviewers have complained about these framerate issues. If those framerate issues are there (unless the consoles playing those games are malfunctioning in some way), then I would blame the developers, not the hardware. 

Why would Latte be based off of the HD 4000 series? The only 40nm HD 4000 series parts were the 4750 and the 4770, and they have way too much power consumption, even accounting for lower clocks, and what's more, they take up too much die space.

Just so y'know, the Intel Iris Pro, aka the GT3e, has eDRAM too. I also realize that the CPU can access the eDRAM; so what?

@Bold That's a terrible assumption you made right there about photolithography! This applies to EVERY CHIP. Defects are very common in the world of complex integrated circuits. The reason manufacturers create extra logic in the first place is to fight off bad yields and defective parts of the chip. The HD 5670, 5570 and 5550 all use the very SAME DIE, but y'know what, the HD 5550 is SIGNIFICANTLY weaker than the 5670.

http://www.tomshardware.com/reviews/radeon-hd-5550-radeon-hd-5570-gddr5,2704-7.html

BTW the WII U has a smaller DIE too. 

So don't just go off assuming that this mostly applies to PC GPUs; in fact consoles are guilty of defects too, and what's more, the WII U has around 900 million transistors, so it's pretty prone to mistakes also. YOU CAN'T HAVE ALL 900 MILLION TRANSISTORS PERFECT. A FEW OF THEM HAVE TO HAVE SOME DEFECTS.
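The point about extra and disabled logic is standard practice, and a toy yield model shows why; the defect density below is an assumed number purely for illustration, not a figure for this process:

    # Classic Poisson yield model: Y = exp(-A * D),
    # A = die area in cm^2, D = defect density in defects/cm^2 (assumed).
    import math

    die_area_cm2   = 1.5621     # ~156.21 mm^2, the die size used in the thread
    defect_density = 0.4        # assumed defects per cm^2

    perfect_die_yield = math.exp(-die_area_cm2 * defect_density)
    print(round(perfect_die_yield, 2))  # ~0.54: only about half the dies are defect-free

    # Dies where the one defect lands in a redundant block (a spare SIMD, spare
    # eDRAM rows) can still be sold as a cut-down part instead of being scrapped --
    # which is exactly how the 5670/5570/5550 can all share the same die.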

Doesn't matter; we'll get a DF analysis eventually to settle this.



Wyrdness said:
fatslob-:O said:

Sorry but that optimization excuse don't work out, bra. The PS360 could outdo their own predecessors, so why isn't the WII U doing the same thing in EVERY game? Even the PS4 and X1 could shit on their current gen counterparts easily, bra. If the WII U had more brute-force power to outdo the PS360, why ain't it performing better? All that's needed to make a game look and run better is a significantly more powerful GPU, and clearly the WII U lacks this. This generation ain't exactly over yet. Consoles this generation started to use the GPU more and became more PC-like in their philosophies, and next generation's consoles are basically just dressed-up PCs. Hell, it's thanks to the WII U's different CPU that it ain't branded as a dressed-up PC yet, but that don't matter too much when much of its power comes from an off-the-shelf PC GPU LMAO. If you're referring to even older consoles like the PS1 and the N64, they also relied a lot on the CPU too, and the N64 didn't even have a graphics processor!


Did you see the first few 360 games? They were far from a huge leap; I even named one earlier. This is what calls you out on your damage control; selective memory is a common trait with people doing this. It wasn't until Oblivion and Gears arrived (two games that were optimized for the hardware) that we saw a big leap. You're starting to sound desperate at this point, and you should look up what off the shelf actually means. The N64 had a GPU; for you to even try and pass that off shows how off key you are. The N64 had one of the first programmable GPUs in the RCP, with features that even PCs were only just getting. Look up how Ken Kutaragi fought tooth and nail to make sure the PS3 had a GPU because Sony wanted it to run purely off a CPU, and he said it just can't work without one; consoles have to have a GPU of some kind. Forethought and Hynid have addressed everything else, so there's no need for me to repeat their posts.

Scoobe

GCN is a GPGPU architecture like I said; AMD supplied each company with a base GPU which is GCN hardware, and each company made their custom changes to it depending on their console design. MS have a video of their customization that they released before E3 for reference. The "closer to PC than before" term is a reference to how both consoles aren't using ground-up proprietary hardware, in order to allow GCN, which makes development across all platforms a lot easier; that's all it means. AMD will introduce Mantle soon as well for better optimization across all platforms.

Nonetheless the Xbox 360 had ALL of the superior multiplats! The one damage controlling is you. FYI the RCP is more of a coprocessor, like the Xeon Phi, rather than a full-blown GPU. The only thing programmable about the RCP is the RSP, which is based off the MIPS R4000 with an integer vector unit; hence why it's a co-processor rather than a GPU. As for the RDP, that's fixed function. The RCP gets credit for those things because it's practically a CPU that acts like a GPU for the most part, and y'know what CPUs did back in the day? (You probably don't know. Answer carefully, otherwise you'll be branded as a fraud for that "I game on PC" cop-out statement.) When did Ken Kutaragi fight tooth and nail for a GPU? If anything that dude wanted the PS3 to be a Cell-processor-only machine. It was the engineers with Ken Kutaragi that argued that the GPU was necessary.

What more delusional crap do you want to bring up now?



fatslob-:O said:

Nonetheless the Xbox 360 had ALL of the superior multiplats! The one damage controlling is you. FYI the RCP is more of a coprocessor, like the Xeon Phi, rather than a full-blown GPU. The only thing programmable about the RCP is the RSP, which is based off the MIPS R4000 with an integer vector unit; hence why it's a co-processor rather than a GPU. As for the RDP, that's fixed function. The RCP gets credit for those things because it's practically a CPU that acts like a GPU for the most part, and y'know what CPUs did back in the day? (You probably don't know. Answer carefully, otherwise you'll be branded as a fraud for that "I game on PC" cop-out statement.) When did Ken Kutaragi fight tooth and nail for a GPU? If anything that dude wanted the PS3 to be a Cell-processor-only machine. It was the engineers with Ken Kutaragi that argued that the GPU was necessary.

What more delusional crap do you want to bring up now?


I think you should look up what damage control means before you use it, as a number of your posts contain incoherent uses of terms and words. The RCP was dedicated to doing the work of a GPU, hence why I pointed out all consoles need a GPU of some form; that's the point. Yeah, I got the engineers mixed up, yet it still proves my point that something dedicated to graphical duties has to be there. What's hilarious is your constant babble about gaming on PC, highlighting that you must be well beyond upset about it. Explain where gaming on PC means you have to be a tech head; I have over 100 games (concrete proof, something you're not familiar with) on my Steam account, as well as several on my Battle.Net account, and I even stream Diablo 3. I'd honestly be surprised if you knew what a CPU is, tbh, let alone what it does. For your knowledge, a CPU executes the logic and input/output of a system by following the instructions of a program, seeing as you ask.

Gaming on PC means just that, kiddo; I'm a gamer on the PC platform. Go back and read and you'll see I never claimed to be a tech head. The irony in your post is that you've formed some delusional view of this, thinking there's an underlying meaning, and are trying to disprove it with words like fraud, yet your own self-professed technical know-how has been called out and questioned, and quite effectively as well. For all your attempts to look slick, you just look mad now.



Wyrdness said:
fatslob-:O said:

Nonetheless the Xbox 360 had ALL of the superior multiplats! The one damage controlling is you. FYI the RCP is more of a coprocessor, like the Xeon Phi, rather than a full-blown GPU. The only thing programmable about the RCP is the RSP, which is based off the MIPS R4000 with an integer vector unit; hence why it's a co-processor rather than a GPU. As for the RDP, that's fixed function. The RCP gets credit for those things because it's practically a CPU that acts like a GPU for the most part, and y'know what CPUs did back in the day? (You probably don't know. Answer carefully, otherwise you'll be branded as a fraud for that "I game on PC" cop-out statement.) When did Ken Kutaragi fight tooth and nail for a GPU? If anything that dude wanted the PS3 to be a Cell-processor-only machine. It was the engineers with Ken Kutaragi that argued that the GPU was necessary.

What more delusional crap do you want to bring up now?


I think you should look up what damage control means before you use it, as a number of your posts contain incoherent uses of terms and words. The RCP was dedicated to doing the work of a GPU, hence why I pointed out all consoles need a GPU of some form; that's the point. Yeah, I got the engineers mixed up, yet it still proves my point that something dedicated to graphical duties has to be there. What's hilarious is your constant babble about gaming on PC, highlighting that you must be well beyond upset about it. Explain where gaming on PC means you have to be a tech head; I have over 100 games (concrete proof, something you're not familiar with) on my Steam account, as well as several on my Battle.Net account, and I even stream Diablo 3. I'd honestly be surprised if you knew what a CPU is, tbh, let alone what it does. For your knowledge, a CPU executes the logic and input/output of a system by following the instructions of a program, seeing as you ask.

Gaming on PC means just that, kiddo; I'm a gamer on the PC platform. Go back and read and you'll see I never claimed to be a tech head. The irony in your post is that you've formed some delusional view of this, thinking there's an underlying meaning, and are trying to disprove it with words like fraud, yet your own self-professed technical know-how has been called out and questioned, and quite effectively as well. For all your attempts to look slick, you just look mad now.

Still haven't answered why CPUs were important back in the day, eh? You're still side-stepping the fact that the RCP was a co-processor instead of a GPU, eh? Just because some processor is dedicated to doing the work of a GPU doesn't mean that it's a GPU. (Ever heard of software rendering, fraud?)

@Bold Dat all you know ? LOLOL



bonzobanana said:
curl-6 said:
bonzobanana said:
With any discussion about the Latte fabrication you have to bear in mind the huge amount of stuff included in Latte: not only is a huge part of it the eDRAM, but there is also the ARM CPU, the audio DSP, 1MB of Wii U texture memory, the 2MB Wii frame buffer, the Wii GPU, and sections designed for high-speed compression of the Wii U GPU frame buffer, downscaled to fit the Wii frame buffer. There is a huge amount of additional stuff.

The 176 GFLOPS figure is completely realistic; not only does it fit with the power consumption figures, it fits with how the Wii U is actually performing. Let's not forget the generational difference in performance between the 360/PS3 GPUs and the later Radeons is quite significant.

If Latte really is 352 GFLOPS and the architecture is much improved, plus you have 32MB of high-speed eDRAM, then what the hell is going wrong that the Wii U is struggling to outperform the 360/PS3 graphically? The figure of 176 GFLOPS makes total sense; it simply works with all the information we have. If Latte really is 352 GFLOPS then something has gone horribly wrong in the design of the Wii U that is creating major issues. I don't believe this. I believe Nintendo have designed a console at the absolute minimum price to merely match current-gen performance overall. However, as a Wii U owner I'm more than happy to be proved wrong, but the evidence surely dictates that of the possible range of 176-352 GFLOPS, where once we were clinging to the belief that it was 352 GFLOPS, in fact the lower figure is much more realistic.

I guess an alternative view is that the CPU is so weak that the compute functionality of the Radeon GPU is being utilised for practically every game, compromising graphics output. I don't believe this myself. I also don't believe the Wii U console is hard to develop for; I believe the complete opposite is true, and I don't believe all developers are being lazy on Wii U either.

Ultimately I believe the Wii U is a low-performance console designed to a specification for good-quality cartoon graphics for Nintendo games and a huge profit for Nintendo and its shareholders (if it sold well). A continuation of their 'withered technology' philosophy that was so successful for the Game Boy, Wii, etc.

http://en.wikipedia.org/wiki/Gunpei_Yokoi

Current PS3/360 games are built on 8 years of optimization for their specific hardware. Wii U hasn't even been out for a year yet, no console is maxed out that fast. You need exclusives designed around a console's  specific hardware to really show off what it can do, and no Wii U exclusive so far has aimed for high end graphics, they all adopt simpler styles that preclude technological boundary-pushing.

The Wii U has yet to have its Uncharted 2, its Gears of War, its Mario Galaxy, that one game that really puts its chipset through its paces and shatters its established graphical standards.

We don't know how mature the development software is, but the Wii U is not a complicated design. It's using a well known and well documented Radeon GPU (which one we aren't sure of) and I don't think there are any surprises when it comes to the 32-bit CPU, which dates back to 1997. The Wii U is not a PS3 or even an Xbox 360; it's a much simpler design. When the PS3 and 360 came out they were cutting-edge hardware, new to market; that is not the case with the Wii U. Also, just about the most impressive GameCube/Wii title is Rogue Squadron, despite being a launch title. Also, Halo represents a very high level of optimisation on the original Xbox. The Wii U is not cutting edge; it's a combination of well documented and mature components, and clearly this will have an effect on what level of optimisation it achieves. Did the Wii make great strides over the GameCube in optimisation beyond the extra 64MB of memory and 50% overclock, plus DVD storage capacity? I certainly don't think so.

The GameCube itself hit the ground running with great performance, unlike the PS2, because it was a much better design to develop for, as is the Wii U.

We don't know for a fact that the Wii U GPU is well documented; Nintendo traditionally uses customized parts, and the same is likely true here.

Also, it is very different from PS3/360, which all current multiplats are designed around. Forcing a game designed for one architecture onto another never gets the best results.

Wii was familiar hardware, yet it did not show its full power at launch. Its most technically advanced games came in its 3rd, 4th, and 5th years.

Rogue Squadron 2 was a freak occurrence, and was in fact surpassed technically by its sequel two years later.

Halo 1 was nowhere near the most advanced Xbox game; it pales in comparison to its sequel, and to games like Conker: Live & Reloaded, which came much later in the system's lifespan.



The Wii itself was never even fully utilized^