drake4 said:
http://gamrconnect.vgchartz.com/thread.php?id=175934&page=1 All you have to do is read this thread. There is just no way developers would be saying these things if it was a modern 320-shader GPU with double the RAM; it would crush the 360/PS3, it wouldn't even be close, and every port would be better with no optimization. Every tech head has already come to an agreement in the Wii U die thread on NeoGAF that it is indeed a 160-shader GPU. Of course, you're probably smarter than everybody else, so you dismiss everybody who says it's a 160-shader GPU and call them Sony/Microsoft fanboys and trolls. The funniest part is that the tech heads who called that are some of the biggest Nintendo fans; just look at their post history on NeoGAF.
I read the thread and it's laughable... Two of your links are outdated and debunked, while the other two are vague.
The first article is the typical "anonymous source" piece that is likely a troll, so you need to take it with a grain of salt. Also, the Wii U's final specifications and development kits were not out, nor was the console in mass production: the article was written in April, while the Wii U went into mass production in May or June.
The second article... These guys worked on an older development kit, and resources (shaders) could have been locked away (unavailable) at the time they developed the Wii U version of Darksiders 2; that team was a mere 4 to 6 people. Final development kits with the latest specifications, APIs, firmware and OS arrived very near launch. They also made this statement well before launch, and before the Wii U's final specifications and mass production.
The third article is not the negative you see it as... These developers stated that the Wii U is not much more powerful than the PS3/X360, which is factual: it is not the kind of leap the Xbox One or PlayStation 4 represent. But comparing the Wii U to the Xbox 360/PlayStation 3 is not fair in the first place, since the Wii U is the successor to the Wii, not to those two consoles.
The EA engineer was just FOS, and I suspect that engineer never worked on anything related to Nintendo's hardware in his whole career...
I am sorry, but you need to update yourself, since NeoGAF denied and debunked the 160-shader theory. That theory was a rumor, yet you state a rumor as fact while dismissing hard evidence, namely the Chipworks die shot of the Wii U's GPU, codenamed Latte. Congrats, you have no credibility.
If you had actually looked into AMD's GPU architecture, as I already explained to you, you would not be spreading misinformation.
It is 320 SPUs for sure. For example, a Bobcat VLIW5 iGPU has 16 SRAM cells per block, and each block has 20 shaders. There is also no interpolator in the Wii U's GPU, so we know it is not based on the Radeon HD 4xxx series but rather the 5xxx or 6xxx series, since developers said the Wii U uses a DirectX 11-equivalent feature set. And since Bobcat has 16 SRAM cells per 20 shaders, while in the Wii U's GPU die shot we see double that per block, it is likely that the Wii U's GPU is VLIW4.
Could you explain to me why this supposed 160-shader GPU takes 85+ mm² of die space at 40 nm when it should be around 60 mm²? (Shaders take a very small amount of space compared to compute units, texture mapping units, raster output units and so on.)
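For what it's worth, the SRAM-cell argument above is just proportional arithmetic. Here is a quick sketch of it; the 8-block count is my own assumption added to make the 320 total explicit (the die-shot discussion above doesn't state it), while the per-block numbers are the ones quoted:

```python
# Reference point (from the post): Bobcat's VLIW5 iGPU packs
# 20 shaders (SPUs) per block alongside 16 SRAM cells.
BOBCAT_SHADERS_PER_BLOCK = 20
BOBCAT_SRAM_CELLS_PER_BLOCK = 16

# Observation from the Chipworks die shot of Latte (per the post):
# each shader block shows roughly double the SRAM cells.
latte_sram_cells_per_block = 2 * BOBCAT_SRAM_CELLS_PER_BLOCK  # 32

# If register-file SRAM scales with shader count, each Latte block
# would hold about twice the shaders of a Bobcat block.
latte_shaders_per_block = BOBCAT_SHADERS_PER_BLOCK * (
    latte_sram_cells_per_block / BOBCAT_SRAM_CELLS_PER_BLOCK
)  # 40.0

# ASSUMPTION (mine, not from the die-shot analysis): 8 such blocks
# on the die, which is what a 320-SPU total would imply.
assumed_block_count = 8
total_spus = int(latte_shaders_per_block) * assumed_block_count
print(total_spus)  # 320
```

This is only a consistency check of the counting logic, not proof of the layout; if the block count or the SRAM-to-shader scaling assumption is wrong, the total changes accordingly.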
I love your confirmation bias... If, say, Donkey Kong Country: Tropical Freeze runs at 1080p/60fps, and Mario Kart 8, Smash Bros. and Bayonetta 2 do as well, then it surely cannot be 160 shaders, because no matter how efficient the hardware was, that would be impossible even if every asset and game engine were coded to the metal (low-level code)...
Thanks for the laughs...