curl-6 said:
fatslob-:O said:
curl-6 said:
fatslob-:O said:
curl-6 said:
fatslob-:O said:

Well, we all know the Wii U can probably support pixel shaders too, so what other excuse is left?

Wii U supports pixel shaders, that is a given, going by the effects we're seeing it pull off. But that doesn't mean it is an off-the-shelf GPU, as Nintendo tends not to use these. If it was an off-the-shelf part it would have been identified immediately.

It'd be foolish to think that Nintendo would have customized the GPU by themselves when they have even less experience designing hardware than Sony or AMD. That's why Nintendo looks to IBM and AMD to do the job for them. BTW, Latte likely comes from an existing AMD GPU architecture. Think outside the box, curl. (Nintendo couldn't possibly have had the experience to change the architecture by itself.)

Of course Nintendo doesn't customize Latte themselves; they simply tell AMD "we want x figures for power consumption, heat, processing power, etc. Make us a chip that does this and we'll buy them from you."

I also don't think AMD would waste hundreds of millions of dollars building a new GPU architecture specifically for the Wii U, because after all, AMD is in the red LMAO. The only addition AMD integrated into their VLIW5 architecture was the eDRAM, and that's not complicated. It's just like how Intel was able to get 128 MB into their Iris Pro chips, and we all know Intel has a history of designing extremely craptastic graphics chips.

Making a modified chip would not cost hundreds of millions of dollars. And they were happy to produce unique chips for the Wii.

They weren't exactly unique for the most part, because the ATI Flipper had a lot in common with PC GPUs compared to what was in the PS2 and N64. BTW, ATI back in the day (which is now part of AMD, since AMD bought them out) had a lot of money to do it, but the reason I say AMD didn't modify the GPU architecture for the Wii U is that AMD ain't doing so hot compared to its older days. AMD would be willing to create a new architecture IF they had the money.