
NeoGAF: Wii U GPU 160 ALUs, 8 TMUs and 8 ROPs

Pemalite said:
oni-link said:

Yawn... *looks around* Yup, still the same old people going around and around and around the same topic!!! Anyway, while you guys fight over transistors, power draw, ALUs, TMUs, etc. (I find that pretty boring now), the real gamers are gonna be enjoying this in less than 2 weeks!!!:

 


One thing Nintendo has *always* excelled at, IMHO, is art style. They don't need crazy polygon counts to make something look pleasing to the eye.
With that in mind... talk about crazy amounts of aliasing, limited shadows and lighting, and low-resolution textures! :P

Still, I genuinely hope you enjoy it in 2 weeks' time!


Who cares? The game looks gorgeous, except to the haters!!! YES, I will be enjoying this game for a looooong time. I'll actually probably take a few weeks off posting to enjoy this game to its full potential!!! 3D Mario in HD is really something to behold!!!

 



oni-link said:
Pemalite said:
oni-link said:

Yawn... *looks around* Yup, still the same old people going around and around and around the same topic!!! Anyway, while you guys fight over transistors, power draw, ALUs, TMUs, etc. (I find that pretty boring now), the real gamers are gonna be enjoying this in less than 2 weeks!!!:

 


One thing Nintendo has *always* excelled at, IMHO, is art style. They don't need crazy polygon counts to make something look pleasing to the eye.
With that in mind... talk about crazy amounts of aliasing, limited shadows and lighting, and low-resolution textures! :P

Still, I genuinely hope you enjoy it in 2 weeks' time!


Who cares? The game looks gorgeous, except to the haters!!! YES, I will be enjoying this game for a looooong time. I'll actually probably take a few weeks off posting to enjoy this game to its full potential!!! 3D Mario in HD is really something to behold!!!

 

Best 3D World screen:



curl-6 said:
oni-link said:
Pemalite said:
oni-link said:

Yawn... *looks around* Yup, still the same old people going around and around and around the same topic!!! Anyway, while you guys fight over transistors, power draw, ALUs, TMUs, etc. (I find that pretty boring now), the real gamers are gonna be enjoying this in less than 2 weeks!!!:

 


One thing Nintendo has *always* excelled at, IMHO, is art style. They don't need crazy polygon counts to make something look pleasing to the eye.
With that in mind... talk about crazy amounts of aliasing, limited shadows and lighting, and low-resolution textures! :P

Still, I genuinely hope you enjoy it in 2 weeks' time!


Who cares? The game looks gorgeous, except to the haters!!! YES, I will be enjoying this game for a looooong time. I'll actually probably take a few weeks off posting to enjoy this game to its full potential!!! 3D Mario in HD is really something to behold!!!

 

Best 3D World screen:

They all look beautiful!!!



fatslob-:O said:

Why would Latte be based off of the HD 4000 series?...

Just so y'know, the Intel Iris Pro, a.k.a. the GT3e, has eDRAM too...

@Bold That's a terrible assumption...

Doesn't matter; we'll get a DF analysis eventually to settle this.

"Based off of" is not the best phrase, more like "modeled" after. I'm pretty sure it's modeled after the R700 series, and that's because:

1. The leaked specs back in July of 2012 stated as such, and and those are accurate. ("modeled after" Open GL and R700, that doesn't mean take a 4770 and cut stuff from it)

2. Wii U development began in April 2009 according to Iwata Asks, the 5000 series wasn't even out yet, and I extremely doubt Nintendo would be thinking about using the most modern stuff (considering their track record lately, it's unlikely they would do that). 

Smaller die than the 55xx? 156.21 mm² is not smaller than 104 mm². Also, don't be surprised if Latte has even more transistors per mm² than those cards, since it's produced on a more mature 40nm process than the 55xx cards were on in 2010.
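(To make the density comparison concrete, here's a trivial sketch of the arithmetic. The Redwood figures below, roughly 627M transistors in 104 mm², are AMD's published numbers; Latte's transistor count was never disclosed, so the value used for it is purely a hypothetical placeholder.)

```cpp
// Transistors-per-mm^2 comparison sketch. Redwood's numbers are AMD's public
// figures; LATTE_TRANSISTORS_M is a made-up placeholder since Nintendo/AMD
// never disclosed Latte's transistor count.
#include <cstdio>

int main() {
    const double REDWOOD_TRANSISTORS_M = 627.0;   // millions (HD 55xx/56xx class, 40nm)
    const double REDWOOD_AREA_MM2      = 104.0;   // mm^2

    const double LATTE_TRANSISTORS_M   = 800.0;   // hypothetical, for illustration only
    const double LATTE_AREA_MM2        = 156.21;  // mm^2, die size quoted in this thread

    std::printf("Redwood density: %.2f M transistors per mm^2\n",
                REDWOOD_TRANSISTORS_M / REDWOOD_AREA_MM2);
    std::printf("Latte density (hypothetical count): %.2f M transistors per mm^2\n",
                LATTE_TRANSISTORS_M / LATTE_AREA_MM2);
    return 0;
}
```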

It's not a terrible assumption (I'm not assuming anything), because it all depends on how it's being manufactured, where it's being manufactured, how Nintendo is combating the issue, and what parts are being used. The 40nm process is very mature now, so yield issues will be minimal at most. The worst of TSMC's 40nm yield problems were in 2009-2010; Latte entered manufacturing around mid-2012, so it won't face the same yield issues TSMC had with the 5000 series. But yield issues or no yield issues, it doesn't change anything: if both have issues, as you say, then why are you making this a big deal? Just let it go.

My point in mentioning the eDRAM is that you're writing off the eDRAM and forgetting that it's part of the entire Wii U hardware system. Latte has the advantage of the extra potential performance of on-die eDRAM; conventional GPUs don't have that luxury.

OK, now I'm done with this. I'm just going to let the games do the talking, because obviously I can't speak for the GPU completely; let it speak for itself.

curl-6 said:

Best 3D World screen:

Pretty!



fatslob-:O said:
Hynad said:
fatslob-:O said:
Hynad said:

Frame rate is bad in most games. Super Mario Galaxy runs very slowly, as does The Last Story, and Xenoblade is a mixed bag.

https://www.youtube.com/watch?v=cmeWmtH96js

https://www.youtube.com/watch?v=zx4rHGENSSM

https://www.youtube.com/watch?v=DsU3nm7t2dk

These videos, from PCs with specs similar to yours, say very differently. Are you sure you aren't being dishonest about your results? For Shadow of the Colossus you would need an i5-2500K overclocked to 4.3 GHz and a GTX 660 with SPEED HACKS to get it running at 1080p.

I assure you I play Final Fantasy XII, both Kingdom Hearts games, and Dragon Quest VIII on my PC, and they run surprisingly well in 1080p, except for Dragon Quest VIII, the most demanding of them, which runs in 720p. I don't have SotC, so I can't say how it runs on my PC.

I don't know what else to say. xD Maybe I could try to make a video of them running, but I've never tried that before, so I may need some time. =P

One thing you need to know about Dolphin: it runs better in DirectX 9 mode.

@Bold That part is true; a few games will miss some graphical effects when rendering in DX9 mode, but that's a trade-off made for speed gains.

Were you running the Wii games in DX11 mode? BTW, those games don't look very demanding to me, and PCSX2 can run them pretty well, but games like MGS3 and Zone of the Enders will teach you a lesson about not having anything lower than an overclocked i5-2500K.

When I say that PS2 emulation is CPU-limited, I mean it's really CPU LIMITED. CPU power matters so much for PS2 emulation because the CPU does most of the work: it has to emulate a lot of components, such as the EE and its VUs, which are very complex. Although the PS2 has multiple processors, those components need very tight communication, so a lower thread count was preferable, since each component has to pass information to another very quickly. That's why PCSX2 initially used one core, but fairly quickly transitioned to two once the developers figured out that the GS could be isolated and made threadable for more performance gains. The reason PCSX2 can now use three cores is that they have figured out a way to isolate VU1 as well.

Hence it's easier to emulate the Wii than the PS2: gains in CPU performance haven't skyrocketed the way GPU performance has, and the Wii, like the GameCube, depends on the GPU for most of its graphics work.

That "lecture" was not needed. -__-

I'll try ZotE and MGS3 when I get back from work later today, and share the results with you. 
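For anyone curious what "isolating the GS" onto its own thread actually looks like, here's a minimal, purely illustrative C++ sketch of that producer/consumer split, assuming a simple packet queue: one thread stands in for the EE and pushes display-list packets, while a separate "GS" thread drains them. None of the names or structures below are PCSX2's real code; it just shows the idea behind the MTGS-style split described in the quoted post above.

```cpp
// Illustrative EE/GS thread split: the "EE" thread produces packets, the "GS"
// thread consumes them. This mirrors the idea of isolating the GS work onto
// its own core, not PCSX2's actual implementation.
#include <condition_variable>
#include <cstdint>
#include <cstdio>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

struct GsPacket {
    std::vector<std::uint8_t> data;  // pretend display-list data
    bool quit;                       // sentinel to stop the GS thread
};

class GsQueue {
    std::queue<GsPacket> q_;
    std::mutex m_;
    std::condition_variable cv_;
public:
    void push(GsPacket p) {
        { std::lock_guard<std::mutex> lock(m_); q_.push(std::move(p)); }
        cv_.notify_one();
    }
    GsPacket pop() {
        std::unique_lock<std::mutex> lock(m_);
        cv_.wait(lock, [&] { return !q_.empty(); });
        GsPacket p = std::move(q_.front());
        q_.pop();
        return p;
    }
};

int main() {
    GsQueue queue;

    // "GS" thread: drains packets and would rasterise them.
    std::thread gs([&] {
        for (;;) {
            GsPacket p = queue.pop();
            if (p.quit) break;
            std::printf("GS: drawing packet of %zu bytes\n", p.data.size());
        }
    });

    // "EE" side: emulate CPU work, then hand packets over instead of drawing itself.
    for (int frame = 0; frame < 3; ++frame)
        queue.push(GsPacket{std::vector<std::uint8_t>(1024u * (frame + 1)), false});

    queue.push(GsPacket{{}, true});  // shut the GS thread down
    gs.join();
    return 0;
}
```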



Pemalite said:
curl-6 said:
fatslob-:O said:

We all know that it wasn't exactly off the shelf, but its design is pretty outdated nowadays. It's still closer to PC GPUs than you think. The GameCube basically represents a lot of pre-DX8 PC GPUs. It wasn't using anything exotic like SGI's coprocessor from the N64, or PowerVR graphics, which didn't support hardware T&L like the ATI Flipper did.

I'm aware Flipper and Hollywood had similarities to DX7-era PC GPUs; the point I'm making is that Nintendo has never used an off-the-shelf GPU for its home consoles.

If the Wii U's GPU were off the shelf, we would have known its exact make and model ages ago; we would know every little thing about it. The reason we don't is that it can't be identified as any existing GPU. It's a modified design built to meet Nintendo's specific demands regarding power consumption, heat, processing power, etc.


Only on the surface. The TEV could do things the GeForce 3/4 Ti couldn't do and vice versa, and could even pull off some of the same SM1.0/SM1.1 effects with a bit of work, so I wouldn't really say it's similar to DirectX 7 GPUs at all; when programmed its way, Flipper could easily pull its weight.
With that said, even DirectX 7 cards such as the Radeon 7500 had programmable pixel shaders to a certain degree; games just never used them because they weren't standard in any major APIs that games targeted. (Which is much the same story as tessellation on the Radeon 8500 and 9700 series, though at least a few games used that.)

Ironically, Flipper and Hollywood aren't based on any of AMD's or ATI's prior generations of GPUs, as they were designed by a company known as ArtX before ATI bought them out. That team then went on to build the R300 series of GPUs, which pretty much dominated nVidia's GeForce FX. There are some minor similarities between R300 and Flipper on a very, very low level, though I still wouldn't even call them cousins thrice removed.

As for the Wii U's GPU, we know it's based on the Radeon 5000/6000 series of GPUs since it includes tessellation and geometry shaders. To what extent Nintendo chose to modify that class of chip remains to be seen, but it's certainly a closer relation to PC GPUs than the GameCube's or Wii's ever was.

Wait a minute. Does this mean that AMD also has Silicon Graphics engineers?



globalisateur said:
Scoobes said:

What customizations, though? The APUs are based almost entirely on parts normally found in PC architecture. Based on what's been revealed, I only know of two customizations on the APUs themselves: on PS4, the increase in ACEs and in the number of compute queues each ACE can handle, whilst the X1 has the eSRAM. On both you have a few minor additions (mainly for audio, which would normally be handled on the motherboard anyway).

They're as PC-like as you can get.

I completely disagree.

How can you arrive at this conclusion? Why do you ignore the most important stuff? There are more than those 2 customizations. The PS4 APU (and the whole console) is highly customized by Sony:

- A volatile bit in the GPU caches, for better simultaneous compute/graphics use.

- The Onion bus, which can completely bypass the GPU caches.

- Unified GDDR5 memory, which is a first for a CPU (obviously not for a GPU).

- A low-power mode that can power just the RAM and nothing else, not even the CPU (nor the GPU, obviously). (Not sure here; can recent PCs do that?)

 

To my knowledge, none of those elements had ever been done on an APU before. Even the PS4 motherboard's gloriously sleek design is very far from a PC motherboard, as seen in the PS4 teardown. That's not even mentioning the small size of the machine or its ingenious cooling: an oblique shape to ensure air release, circular "slits" that ensure air intake even if the console is blocked by stuff, passive cooling of the GDDR5 chips against the EM shield (the ones on the back of the motherboard!! so usual on a PC!!), and advanced active cooling by a state-of-the-art Sony centrifugal fan. Finally, the PS4 APU is the biggest ever designed.

It is actually about as customized as you can get (nowadays).

In my opinion.

Considering the rest of the architecture (the vast majority), those are fairly minor things that have been in AMD's pipeline (and probably would be regardless of AMD's work on consoles). As for the low-power mode, most laptops have had a sleep mode since before the start of the current gen; this is pretty much the same thing, from what I'm aware.

The APU is based on the latest AMD tech, but it's still based on PC tech.

And the cooling? There are so many different ways to cool PCs that, whilst the PS4's cooling is clever (not surprising considering Sony is a hardware company), it isn't anything revolutionary either. The PS3's cooling was more impressive, IMO.



fatslob-:O said:

Wait a minute. Does this mean that AMD also has Silicon Graphics engineers?

Sure does.



--::{PC Gaming Master Race}::--

Pemalite said:
fatslob-:O said:

Wait a minute. Does this mean that AMD also has Silicon Graphics engineers?

Sure does.

I wonder if they're the ones who were more involved in designing the R7 260X and the R9 290, as well as the R9 290X chips.

I still wonder why Silicon Graphics went belly up.



fatslob-:O said:
Pemalite said:
fatslob-:O said:

Wait a minute. Does this mean that AMD also has Silicon Graphics engineers?

Sure does.

I wonder if they're the ones who were more involved in designing the R7 260X and the R9 290, as well as the R9 290X chips.

I still wonder why Silicon Graphics went belly up.


They went belly-up more than once. :P
Basically not enough design wins.

Matrox took over the professional market; nVidia and AMD went for the gamer and then compute markets.
3dfx was bought by nVidia, PowerVR went mobile, and S3 was bought by VIA then sold off to HTC.
XGI was spun off from SiS, tried to break into the market and failed; SiS then bought them out.
Intel and AMD started including a free GPU in all their platforms, which squeezed IGPs from VIA, nVidia, and SiS out of the market.
NEC faltered and left.
Tseng was bought out by the then-ATI.
Cirrus Logic, Trident and Texas Instruments sold off/left the GPU market.

In the end we are left with a duopoly.
PC gamers essentially fund the technology that goes into the consoles; consoles are a low-profit business for GPU manufacturers, and it wouldn't be enough to cover R&D costs if the big three wanted a new GPU built from scratch. AMD might make a couple of billion dollars over 10 years from all three consoles if they're lucky, which for AMD is a great thing; they need all the cash they can get.

As for who designed the R9 290 series, well, not a lot of *new* engineering went into it; most of the architectural work was done with Bonaire, and AMD just took advantage of the mature 28nm process to blow up the die size.
AMD has also cut engineers to try to become profitable again, which might explain WHY we've had the same GPUs for three goddamn years.



--::{PC Gaming Master Race}::--