fatslob-:O said:

Why would the latte be based off of the HD 4000 series ?.....

Just so y'know the intel iris pro or the GT3e has eDRAM too.....

@Bold That's a terrible assumption .......

Doesn't matter we'll get a DF analysis eventually to conclude this. 

"Based off of" is not the best phrase, more like "modeled" after. I'm pretty sure it's modeled after the R700 series, and that's because:

1. The leaked specs back in July 2012 stated as much, and those have proven accurate. ("Modeled after" OpenGL and R700 doesn't mean taking a 4770 and cutting stuff out of it.)

2. Wii U development began in April 2009 according to Iwata Asks, before the 5000 series was even out, and I highly doubt Nintendo would have been considering the most modern parts (given their track record lately, it's unlikely they would do that).

Smaller die than the 55xx? 156.21 mm² is not smaller than 104 mm². Also, don't be surprised if Latte packs even more transistors per mm² than those cards, since it's produced on a more mature 40nm process than the one those 55xx cards were built on in 2010.
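To make the density comparison concrete, here's a back-of-the-envelope sketch. The Redwood (HD 55xx-class) figures are the commonly cited ~627M transistors on a ~104 mm² die; Latte's transistor count has never been published, so the number used for it below is a pure placeholder assumption, there only to show the arithmetic.

```python
def density(transistors_millions, die_area_mm2):
    """Transistor density in millions of transistors per square millimetre."""
    return transistors_millions / die_area_mm2

# Commonly cited Redwood (HD 5570-class) figures.
redwood = density(627, 104.0)

# Hypothetical Latte figure: 156.21 mm^2 die is the measured area,
# but the transistor count here is an ASSUMED placeholder.
latte = density(1000, 156.21)

print(f"Redwood: {redwood:.2f} M transistors/mm^2")
print(f"Latte (assumed count): {latte:.2f} M transistors/mm^2")
```

With those placeholder numbers Latte would come out slightly denser, which is all the original point claims is plausible on a mature 40nm process; swap in a real transistor count if one ever surfaces.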

It's not a terrible assumption (I'm not assuming anything), because it all depends on how and where the chip is being manufactured, how Nintendo is combatting the issue, and what parts are being used. The 40nm process is very mature now, so yield issues will be minimal at worst. TSMC's biggest 40nm yield problems ran from 2009 to 2010; Latte didn't enter manufacturing until around mid 2012, so it won't face the same yield issues TSMC had with the 5000 series. But yield issues or no yield issues, it doesn't change anything: if both chips have issues by your account, why are you making this a big deal? Just let it go.

My point in mentioning the eDRAM is that you're fixating on it and forgetting that it's part of the entire Wii U hardware system. Latte has the advantage of the extra potential performance of on-die eDRAM, a luxury conventional GPUs don't have.

OK, now I'm done with this. I'm just going to let the games do the talking; since I obviously can't speak for the GPU completely, I'll let it speak for itself.

curl-6 said:

Best 3D World screen:

Pretty!