drake4 said:
eyeofcore said:
drake4 said:
eyeofcore said: I got in contact with one homebrew programmer and asked him if he has proof that there is a dual-core ARM Cortex-A8 in the Wii U, and he replied that he got the entire OS from a hacker/cracker who cracked the OS and ripped it from the internal flash storage. He said that the entire OS is written in pure ARM code. I estimate the Wii U's CPU performance at 15 GFLOPS, which is a bare minimum, not accounting for the fact that it is heavily modified to implement multi-core support and increase its efficiency/performance without breaking backward compatibility with the Wii. The homebrew programmer I contacted said it has nothing to do with POWER7; it has to do with POWER6, from which it got the features needed for multi-core and other things. The GPU, according to the die shot, has at least 256 shaders/SPUs, and it is likely 320 shaders. It is not 160 shaders, because then there would have been fewer SRAM caches in all 8 of the blocks where the shaders are; 160 shaders is just a myth and misinformation spread intentionally as FUD by extreme/radical Sony/Microsoft fans. I have two computer engineers to back me up, plus AMD's documents covering their GPU design. One user at IGN spotted that both the Xbox One and the Wii U have a second layer that you can barely see on the die shot from Chipworks; we can only see the first layer, because the second layer is below it, so we would have to look at the stacked chips/silicon.
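(For what it's worth, a 15 GFLOPS ballpark like the one above can be reproduced with the usual peak-throughput formula, peak = cores * clock * FLOPs per cycle. The figures below, 3 cores at ~1.24 GHz doing 4 FLOPs per cycle, are illustrative assumptions, not numbers taken from the post:)

```python
# Rough sanity check of a peak-GFLOPS estimate.
# ASSUMED figures for illustration only (not from the post above):
# 3 cores at ~1.24 GHz, each retiring one 2-wide fused multiply-add
# per cycle (2 lanes * 2 ops = 4 FLOPs/cycle).
def peak_gflops(cores: int, clock_ghz: float, flops_per_cycle: int) -> float:
    """Theoretical peak throughput in GFLOPS."""
    return cores * clock_ghz * flops_per_cycle

print(peak_gflops(3, 1.24, 4))  # 14.88, roughly the 15 GFLOPS claimed
```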
It's been confirmed to be 160 shaders by many sources: the Project CARS dev confirmed it's 160 shaders, Black Forest Games as well, and NeoGAF too, and the posters who confirmed 160 shaders are the biggest Nintendo fans.
No. Are you telling me that the Wii U die shot from Chipworks is fake? Are you going against a legitimate source that provided a die shot of the Wii U's GPU? In that shot, in the blocks where the GPU shaders are, you can see 32 SRAM cells per block, which means 256 SRAM cells total across the 8 blocks. For reference, the VLIW5 GPU in the Bobcat APUs has 16 SRAM cells per 20 stream processing units (shaders), while the Wii U has 32 SRAM cells per block, which means 40 stream processing units (shaders) per block. Are you really going to go against AMD's official documents on their GPU architecture?
EDIT:
Do you have sources showing that these developers you cite confirm the 160-shader theory (which is misinformation spread by trolls/extreme Sony or Microsoft fans)? Why would these developers break NDA? Please explain that to me. Keep believing that misinformation, a myth with no foundation and no evidence. Already debunked...
http://gamrconnect.vgchartz.com/thread.php?id=175934&page=1 All you have to do is read this thread. There is just no way developers would be saying these things if it were a modern 320-shader GPU with double the RAM; it would crush the 360/PS3, it wouldn't even be close, and every port would be better with no optimization. And every tech head in the Wii U die thread on NeoGAF has already come to the agreement that it is indeed a 160-shader GPU. Of course, you're probably smarter than everybody else, so you dismiss everybody who says it's a 160-shader GPU and call them Sony/Microsoft fanboys and trolls. The funniest part is that the tech heads who called it are some of the biggest Nintendo fans; just look at their post history on NeoGAF.
I read the thread and it's laughable... Two of your links are outdated and debunked, while the other two are vague.
The first article cites a typical "anonymous" source, which is likely a troll, so you need to take it with a grain of salt. Also, the Wii U's final specifications and development kits were not out yet, nor was the console in mass production: the article was written in April, while the Wii U went into mass production in May or June.
The second article... Those guys worked on an older development kit, and resources (shaders) could have been locked away (unavailable) at the time they developed the Wii U version of Darksiders 2, and that team was a mere 4 to 6 people. The final development kits, with the latest specifications, APIs/firmware/OS, arrived very near launch. Also, they made this statement well before launch and before the Wii U's final specifications and mass production.
The third article is not as negative as you make it out to be... Those developers stated that the Wii U is not much more powerful than the PS3/X360, which is factual, since it is not the kind of leap over the Xbox 360/PlayStation 3 that the Xbox One or PlayStation 4 represent. That comparison is not fair anyway, since the Wii U is not the successor to those two consoles but to its predecessor, the Wii.
The EA engineer was just FOS, and I don't think that engineer has worked on anything related to Nintendo hardware in his whole career...
I am sorry, but you need to update yourself, since NeoGAF denied and debunked the 160-shader theory. That theory was a rumor, yet you state a rumor as fact and dismiss hard evidence, namely the Chipworks die shot of the Wii U GPU, codenamed Latte. Congrats, you have no credibility.
If you had ever looked into AMD's GPU architecture, which I already explained to you, you would not be spreading misinformation.
It is 320 SPUs for sure. For example, a Bobcat VLIW5 iGPU has 16 SRAM cells per block, and each block has 20 shaders. Plus, there is no interpolator in the Wii U's GPU, so we know it is not based on the Radeon HD 4xxx series but rather the 5xxx or 6xxx series, since developers said that the Wii U uses a DirectX 11-equivalent feature set. And since Bobcat has 16 SRAM cells per 20 shaders while in the Wii U GPU die shot we see double that, it is likely that the Wii U's GPU is VLIW4.
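If anyone wants to follow the arithmetic behind that claim, here is the scaling argument spelled out. The Bobcat ratio (20 shaders per 16 SRAM cells) and the die-shot counts (32 cells in each of 8 blocks) are the post's own figures, so treat them as assumptions:

```python
# Sketch of the SRAM-cell scaling argument from the post (figures are
# the post's claims, not independently verified): Bobcat's VLIW5 iGPU
# shows 16 SRAM cells per block of 20 shaders, and the Latte die shot
# shows 32 cells in each of its 8 shader blocks.
bobcat_shaders, bobcat_cells = 20, 16
latte_cells_per_block, latte_blocks = 32, 8

shaders_per_cell = bobcat_shaders / bobcat_cells              # 1.25
shaders_per_block = latte_cells_per_block * shaders_per_cell  # 40.0
total_shaders = shaders_per_block * latte_blocks              # 320.0
print(int(shaders_per_block), int(total_shaders))  # 40 320
```

The whole argument rests on SRAM-cell count scaling linearly with shader count across two different GPU designs, which is exactly the assumption the rest of the thread is disputing.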
Could you explain to me why these supposed 160 shaders take up 85+ mm^2 of die space at 40nm when they should take about 60 mm^2 (shaders take a very small amount of space compared to the compute units, texture mapping units, raster output units and so on)?
I love your confirmation bias... Suppose Donkey Kong Country: Tropical Freeze runs at 1080p/60fps, and so do Mario Kart 8, Smash Bros. and Bayonetta 2. That would mean it is surely not 160 shaders, because no matter how efficient the GPU was, that would be impossible, even with every asset and game engine coded to the metal (low-level code)...
Thanks for the laughs...