NeoGAF: Wii U GPU 160 ALUs, 8 TMUs and 8 ROPs


curl-6 said:
fatslob-:O said:
curl-6 said:
fatslob-:O said:
curl-6 said:
fatslob-:O said:

Well, we all know the Wii U can probably support pixel shaders too, so what other excuse is left?

Wii U supports pixel shaders, that is a given, going by the effects we're seeing it pull off. But that doesn't mean it is an off-the-shelf GPU, as Nintendo tends not to use those. If it were an off-the-shelf part it would have been identified immediately.

It'd be foolish to think that Nintendo would customize the GPU by themselves when they have even less experience designing hardware than Sony or AMD. That's why Nintendo looks to IBM and AMD to do the job for them. BTW, Latte likely comes from an existing AMD GPU architecture. Think outside the box, curl. (Nintendo couldn't possibly have the experience to simply change the architecture by itself.)

Of course Nintendo doesn't customize Latte themselves; they simply tell AMD "we want X figures for power consumption, heat, processing power, etc., make us a chip that does this and we'll buy them from you."

I also don't think AMD would waste hundreds of millions of dollars to build a new GPU architecture specifically for the Wii U, because after all AMD is in the red LMAO. The only addition AMD made to their VLIW5 architecture was the eDRAM, and that's not that complicated. Just like how Intel was able to get 128 MB into their Iris Pro chips, and we all know Intel has a history of designing extremely craptastic graphics chips.
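For context on the figures in the thread title, here's a back-of-envelope sketch of what 160 ALUs, 8 TMUs, and 8 ROPs would mean for peak throughput. The ~550 MHz clock is an assumption based on figures reported elsewhere, not something confirmed in this thread, and it assumes each ALU does one fused multiply-add (2 FLOPs) per cycle, as on AMD's VLIW5 parts:

```python
# Back-of-envelope peak throughput for the rumored Wii U "Latte" figures.
# Assumptions (not confirmed in-thread): ~550 MHz core clock, each ALU
# retiring one FMA (2 FLOPs) per cycle, one texel/pixel per TMU/ROP per cycle.
ALUS = 160
TMUS = 8
ROPS = 8
CLOCK_HZ = 550e6  # assumed clock

gflops = ALUS * 2 * CLOCK_HZ / 1e9   # peak single-precision GFLOPS
gtexels = TMUS * CLOCK_HZ / 1e9      # peak texture fill rate, Gtexels/s
gpixels = ROPS * CLOCK_HZ / 1e9      # peak pixel fill rate, Gpixels/s

print(f"{gflops:.0f} GFLOPS, {gtexels:.1f} GT/s, {gpixels:.1f} GP/s")
# -> 176 GFLOPS, 4.4 GT/s, 4.4 GP/s
```

These are theoretical peaks only; real-world throughput depends on bandwidth, the eDRAM, and how well code fills the VLIW slots.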

Making a modified chip would not cost hundreds of millions of dollars. And they were happy to produce unique chips for the Wii.

They weren't exactly unique for the most part, because the ATI Flipper had a lot in common with PC GPUs compared to what was in the PS2 and N64. BTW, ATI back in the day (which is now part of AMD, since AMD bought them out) had a lot of money to do it, but the reason I say AMD didn't modify the GPU architecture for the Wii U is that AMD isn't doing so hot compared to its older days. AMD would be willing to create a new architecture IF they had the money.



fatslob-:O said:

They weren't exactly unique for the most part, because the ATI Flipper had a lot in common with PC GPUs compared to what was in the PS2 and N64. BTW, ATI back in the day (which is now part of AMD, since AMD bought them out) had a lot of money to do it, but the reason I say AMD didn't modify the GPU architecture for the Wii U is that AMD isn't doing so hot compared to its older days. AMD would be willing to create a new architecture IF they had the money.

A console GPU deal is a lucrative prospect; the amount of money they must have made on Wii's GPU, printed over 100 million times, was no doubt a strong incentive to make a GPU for its successor. Flipper may have been less alien than PS2's Graphics Synthesizer, but it was still a unique part, not off the shelf.



curl-6 said:
fatslob-:O said:

They weren't exactly unique for the most part, because the ATI Flipper had a lot in common with PC GPUs compared to what was in the PS2 and N64. BTW, ATI back in the day (which is now part of AMD, since AMD bought them out) had a lot of money to do it, but the reason I say AMD didn't modify the GPU architecture for the Wii U is that AMD isn't doing so hot compared to its older days. AMD would be willing to create a new architecture IF they had the money.

A console GPU deal is a lucrative prospect; the amount of money they must have made on Wii's GPU, printed over 100 million times, was no doubt a strong incentive to make a GPU for its successor. Flipper may have been less alien than PS2's Graphics Synthesizer, but it was still a unique part, not off the shelf.

Ahhhhh, but even after AMD made lots of money manufacturing GPUs for console makers, they are still losing money. Like Nvidia said, "consoles have low profit margins," and that statement holds true for the most part, because over time these GPUs get cheaper to manufacture, so the profit margins shrink. The real reason AMD sought out every console manufacturer is so developers would tailor their work towards GCN-based GPUs, giving AMD a performance advantage over Nvidia in the PC market space, where there are higher profit margins. AMD's Mantle and hUMA are stepping stones to gaining a lot of market share, and I have hopes that they can redeem themselves by doing this. You're right about Flipper being slightly unique, but it's much closer to PC GPUs than you think.



fatslob-:O said:
curl-6 said:
fatslob-:O said:

They weren't exactly unique for the most part, because the ATI Flipper had a lot in common with PC GPUs compared to what was in the PS2 and N64. BTW, ATI back in the day (which is now part of AMD, since AMD bought them out) had a lot of money to do it, but the reason I say AMD didn't modify the GPU architecture for the Wii U is that AMD isn't doing so hot compared to its older days. AMD would be willing to create a new architecture IF they had the money.

A console GPU deal is a lucrative prospect; the amount of money they must have made on Wii's GPU, printed over 100 million times, was no doubt a strong incentive to make a GPU for its successor. Flipper may have been less alien than PS2's Graphics Synthesizer, but it was still a unique part, not off the shelf.

Ahhhhh, but even after AMD made lots of money manufacturing GPUs for console makers, they are still losing money. Like Nvidia said, "consoles have low profit margins," and that statement holds true for the most part, because over time these GPUs get cheaper to manufacture, so the profit margins shrink. The real reason AMD sought out every console manufacturer is so developers would tailor their work towards GCN-based GPUs, giving AMD a performance advantage over Nvidia in the PC market space, where there are higher profit margins. AMD's Mantle and hUMA are stepping stones to gaining a lot of market share, and I have hopes that they can redeem themselves by doing this. You're right about Flipper being slightly unique, but it's much closer to PC GPUs than you think.

Flipper had some elements in common with DX7 PC GPUs, but it wasn't off the shelf; the chips were made specifically for the GameCube and weren't used or sold for anything else. By the time the Wii rolled around, this architecture was long gone from the PC space; Wii's Hollywood GPU was, and is, the only register-combiner-era GPU still being made.



curl-6 said:
fatslob-:O said:
curl-6 said:
fatslob-:O said:

They weren't exactly unique for the most part, because the ATI Flipper had a lot in common with PC GPUs compared to what was in the PS2 and N64. BTW, ATI back in the day (which is now part of AMD, since AMD bought them out) had a lot of money to do it, but the reason I say AMD didn't modify the GPU architecture for the Wii U is that AMD isn't doing so hot compared to its older days. AMD would be willing to create a new architecture IF they had the money.

A console GPU deal is a lucrative prospect; the amount of money they must have made on Wii's GPU, printed over 100 million times, was no doubt a strong incentive to make a GPU for its successor. Flipper may have been less alien than PS2's Graphics Synthesizer, but it was still a unique part, not off the shelf.

Ahhhhh, but even after AMD made lots of money manufacturing GPUs for console makers, they are still losing money. Like Nvidia said, "consoles have low profit margins," and that statement holds true for the most part, because over time these GPUs get cheaper to manufacture, so the profit margins shrink. The real reason AMD sought out every console manufacturer is so developers would tailor their work towards GCN-based GPUs, giving AMD a performance advantage over Nvidia in the PC market space, where there are higher profit margins. AMD's Mantle and hUMA are stepping stones to gaining a lot of market share, and I have hopes that they can redeem themselves by doing this. You're right about Flipper being slightly unique, but it's much closer to PC GPUs than you think.

Flipper had some elements in common with DX7 PC GPUs, but it wasn't off the shelf; the chips were made specifically for the GameCube and weren't used or sold for anything else. By the time the Wii rolled around, this architecture was long gone from the PC space; Wii's Hollywood GPU was, and is, the only register-combiner-era GPU still being made.

We all know it wasn't exactly off the shelf, but its design is pretty outdated nowadays. It's still closer to PC GPUs than you think. The GameCube basically represents a lot of pre-DX8 PC GPUs. It wasn't using anything exotic like SGI's coprocessor in the N64, or PowerVR graphics, which didn't support hardware T&L like the ATI Flipper did.



fatslob-:O said:

We all know it wasn't exactly off the shelf, but its design is pretty outdated nowadays. It's still closer to PC GPUs than you think. The GameCube basically represents a lot of pre-DX8 PC GPUs. It wasn't using anything exotic like SGI's coprocessor in the N64, or PowerVR graphics, which didn't support hardware T&L like the ATI Flipper did.

I'm aware Flipper and Hollywood had similarities with DX7-era PC GPUs; the point I'm making is that Nintendo has never used an off-the-shelf GPU for its home consoles.

If Wii U's GPU was off the shelf, we would have known its exact make and model ages ago; we would know every little thing about it. The reason we don't is that it can't be identified as any existing GPU. It's a modified design made to meet Nintendo's specific demands regarding power consumption, heat, processing power, etc.



curl-6 said:
fatslob-:O said:

We all know it wasn't exactly off the shelf, but its design is pretty outdated nowadays. It's still closer to PC GPUs than you think. The GameCube basically represents a lot of pre-DX8 PC GPUs. It wasn't using anything exotic like SGI's coprocessor in the N64, or PowerVR graphics, which didn't support hardware T&L like the ATI Flipper did.

I'm aware Flipper and Hollywood had similarities with DX7-era PC GPUs; the point I'm making is that Nintendo has never used an off-the-shelf GPU for its home consoles.

If Wii U's GPU was off the shelf, we would have known its exact make and model ages ago; we would know every little thing about it. The reason we don't is that it can't be identified as any existing GPU. It's a modified design made to meet Nintendo's specific demands regarding power consumption, heat, processing power, etc.


Only on the surface. The TEV could do things the GeForce 3/4 Ti couldn't and vice versa, and could even pull off some of the same SM1.0/SM1.1 effects with a bit of work, so I wouldn't really say it's similar to DirectX 7 GPUs at all; when programmed its way, Flipper could easily pull its weight.
With that said, even DirectX 7 cards such as the Radeon 7500 had programmable pixel shading to a certain degree; games just never used it because it wasn't standard in any major APIs they targeted. (Which is the same story with tessellation on the Radeon 8500 and 9700 series, though at least a few games used it.)

Ironically, Flipper and Hollywood aren't based on any of ATI's prior generations of GPUs, as Flipper was designed by a company known as ArtX before ATI bought them out. That team then went on to build the R300 series of GPUs, which pretty much dominated Nvidia's GeForce FX. There are some minor similarities between R300 and Flipper at a very low level, though I still wouldn't even call them cousins thrice removed.

As for the Wii U's GPU, we know it's based on the Radeon 5000/6000 series since it includes tessellation and geometry shaders. To what extent Nintendo chose to modify that class of chip remains to be seen, but it's certainly a closer relation to PC GPUs than the GameCube or Wii ever was.
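If the Radeon 5000/6000 lineage is right, the 160-ALU figure from the thread title also maps neatly onto AMD's VLIW5 layout. This is a sketch assuming the standard Evergreen-style organization (5 lanes per stream core, 16 stream cores per SIMD engine), not something derived from a die shot:

```python
# Mapping 160 ALUs onto AMD's VLIW5 layout (Radeon 5000/6000 era).
# Assumes standard Evergreen organization -- an illustration, not a confirmed spec.
total_alus = 160
lanes_per_vliw = 5     # 4 simple ALUs + 1 transcendental "T" unit per stream core
units_per_simd = 16    # stream cores per SIMD engine in Evergreen parts

vliw_units = total_alus // lanes_per_vliw     # five-wide stream cores
simd_engines = vliw_units // units_per_simd   # SIMD engines

print(vliw_units, simd_engines)  # 32 2
```

That would make Latte roughly a 2-SIMD Evergreen-class part, consistent with it being a cut-down relative of low-end Radeon 5000/6000 GPUs rather than a wholly new architecture.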



fatslob-:O said:

All of those things are used for hUMA. Make no mistake, these things will appear in AMD's Kaveri, except for the GDDR5 of course, but we still have that on our dedicated graphics cards.

That's the key word: will. PS4's customizations are so great that they will be used in future off-the-shelf cards. So those customizations are worthwhile innovations at the same time.

Anyway, the strength of the Wii U will not be the hardware but, as always, its Nintendo exclusives: future Zelda, X, Mario Kart, Mario, Donkey Kong, (Metroid???) which I, and probably everyone in this thread, will get eventually. But we can't judge Wii U power by unreleased game screenshots. So far, Wii U games are on average unfortunately on par with the current gen: see current multiplat comparisons and (non-first-party) developers' opinions.



Yawn...*looks around* yups still the same old people going around and around and around the same topic!!! Anyways while you guys fight over transistors, power draw, ALU, TMU etc (I find that pretty boring now) the real gamers are gonna be enjoying this in less than 2 weeks!!!:



oni-link said:

Yawn...*looks around* yups still the same old people going around and around and around the same topic!!! Anyways while you guys fight over transistors, power draw, ALU, TMU etc (I find that pretty boring now) the real gamers are gonna be enjoying this in less than 2 weeks!!!:

 


One thing Nintendo has *always* excelled in, IMHO, is art style. They don't need crazy polygon counts to make something look pleasing to the eye.
With that in mind... talk about crazy amounts of aliasing, limited shadows and lighting, and low-resolution textures! :P

Still, I genuinely hope you enjoy it in 2 weeks' time!