
Forums - Gaming Discussion - Wii U vs PS4 vs Xbox One FULL SPECS (January 24, 2014)

eyeofcore said:
drake4 said:
eyeofcore said:
I got in contact with a homebrew programmer and asked him if he has proof that there is a dual-core ARM Cortex-A8 in the Wii U, and he replied that he got the entire OS from a hacker/cracker who cracked the OS and ripped it from the internal flash storage. He said that the entire OS is written in pure ARM code.

I estimate the Wii U's CPU performance at 15 GFLOPS, which is a bare minimum that does not account for the CPU being heavily modified to implement multi-core support and increase its efficiency/performance while not breaking backward compatibility with the Wii. The homebrew programmer I contacted said that it has nothing to do with POWER7; it has to do with POWER6, from which it got the features needed for multi-core and other things.
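For what it's worth, a ~15 GFLOPS floor is consistent with a simple peak-rate calculation. A minimal sketch: the three cores and ~1.24 GHz clock are commonly reported figures, and the 4 FLOPs per cycle per core (a 2-wide paired-single multiply-add) is an assumption, not something stated in the post:

```python
# Hypothetical back-of-envelope for the ~15 GFLOPS estimate above.
# Assumptions: 3 cores, ~1.24 GHz, 4 single-precision FLOPs/cycle/core
# (a 2-wide paired-single fused multiply-add counts as 4 FLOPs).
cores = 3
clock_ghz = 1.24
flops_per_cycle = 4

peak_gflops = cores * clock_ghz * flops_per_cycle
print(round(peak_gflops, 1))  # 14.9, close to the quoted 15 GFLOPS floor
```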

According to the die shot, the GPU has at least 256 shaders/SPUs, and it is likely 320 shaders.

It is not 160 shaders, because then there would be fewer SRAM caches in the 8 blocks where the shaders are. 160 shaders is just a myth and misinformation caused by extreme/radical Sony/Microsoft fans who intentionally spread FUD. I have two computer engineers to back me up, plus AMD's documents on their GPU design.

One user at IGN spotted that both the Xbox One and the Wii U have a second layer that you can barely see on the die shot from Chipworks. We can only see the first layer because the second layer is below it, so we could be looking at stacked chips/silicon.

It's been confirmed to be 160 shaders by many sources; the Project CARS dev confirmed it's 160 shaders, Black Forest Games as well, and NeoGAF. And the posters who confirmed 160 shaders are the biggest Nintendo fans.


No. Are you telling me that the Wii U die shot from Chipworks is fake? Are you going against a legitimate source that has provided a die shot of the Wii U's GPU, in which you can see 32 SRAM cells in each block where the GPU shaders are, which means 256 SRAM cells total since there are 8 blocks? For reference, the VLIW5 GPU in Bobcat APUs had 16 SRAM cells per block with 20 stream processing units (shaders), while the Wii U has 32 SRAM cells per block, which means 40 stream processing units (shaders). Are you really going against AMD's official documents on their GPU architecture?
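The block arithmetic behind that claim can be written out explicitly. This is purely illustrative and rests on the post's own premise that shader count scales linearly with the SRAM cells visible per block, with Bobcat's VLIW5 ratio as the reference:

```python
# Scale Bobcat's VLIW5 ratio (16 SRAM cells : 20 shaders per block)
# to the 32 SRAM cells per block counted on the Latte die shot.
# Linear scaling is the post's assumption, not an established fact.
BOBCAT_CELLS, BOBCAT_SHADERS = 16, 20

latte_cells_per_block = 32
latte_blocks = 8

shaders_per_block = latte_cells_per_block * BOBCAT_SHADERS // BOBCAT_CELLS
total_shaders = shaders_per_block * latte_blocks
print(shaders_per_block, total_shaders)  # 40 320
```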

EDIT:

Do you have sources where these developers you cite confirm the 160-shader theory (which is misinformation caused by trolls/extreme Sony or Microsoft fans)? Why would these developers break NDA? Please explain that to me. Keep believing in that misinformation, a myth with no foundation and no evidence. Already debunked...

http://gamrconnect.vgchartz.com/thread.php?id=175934&page=1 All you have to do is read this thread. There is just no way developers would be saying these things if it were a modern 320-shader GPU with double the RAM; it would crush the 360/PS3, it wouldn't even be close, and every port would be better with no optimization. And every tech head has already come to an agreement that it is indeed a 160-shader GPU, in the Wii U die thread on NeoGAF. Of course, you're probably smarter than everybody else, so you dismiss everybody who says it's a 160-shader GPU and call them Sony/Microsoft fanboys and trolls. And the funniest part: the tech heads who called that are some of the biggest Nintendo fans; just look at their post history on NeoGAF.



drake4 said:

http://gamrconnect.vgchartz.com/thread.php?id=175934&page=1 All you have to do is read this thread. There is just no way developers would be saying these things if it were a modern 320-shader GPU with double the RAM; it would crush the 360/PS3, it wouldn't even be close, and every port would be better with no optimization. And every tech head has already come to an agreement that it is indeed a 160-shader GPU, in the Wii U die thread on NeoGAF. Of course, you're probably smarter than everybody else, so you dismiss everybody who says it's a 160-shader GPU and call them Sony/Microsoft fanboys and trolls. And the funniest part: the tech heads who called that are some of the biggest Nintendo fans; just look at their post history on NeoGAF.

I read the thread and it's laughable... You have two links that are outdated and debunked, while the other two are vague.

The first article is a typical "anonymous" source that is likely a troll, and you need to take it with a grain of salt. Also, the Wii U's final specifications and development kits were not out, nor was it in mass production; the article was written in April, while the Wii U went into mass production in May or June.

The second article... These guys worked on an older development kit, and resources (shaders) could have been locked away (unavailable) at the time they developed the Wii U version of Darksiders 2, and that team was a mere 4 to 6 people. Final development kits with the latest specifications, APIs/firmware/OS came very near launch. Also, they made this statement well before launch and before the Wii U's final specifications and mass production.

The third article is not as negative as you make it out to be... These developers stated that the Wii U is not much more powerful than the PS3/X360, which is factual, since it is not the leap that the Xbox One or PlayStation 4 is. Comparing the Wii U to the Xbox 360/PlayStation 3 is not fair anyway, since the Wii U is not a successor to those two consoles; it is the successor to the Wii.

The EA engineer was just FOS, and I think that engineer never worked on anything related to Nintendo's hardware in his whole career...

I am sorry, but you need to update yourself, since NeoGAF denied and debunked the 160-shader theory. That theory was a rumor, yet you state a rumor as fact and dismiss hard evidence, namely the Chipworks die shot of the Wii U's GPU, codenamed Latte. Congrats, you have no credibility.

If you had even looked into AMD's GPU architecture, which I already explained to you, then you would not be spreading misinformation.

It is 320 SPUs for sure, since, for example, a Bobcat VLIW5 iGPU has 16 SRAM cells per block and each block has 20 shaders. Plus, there is no interpolator in the Wii U's GPU, so we know it is not based on the Radeon HD 4xxx series but rather the 5xxx or 6xxx series, since developers said that the Wii U uses a DirectX 11 equivalent feature set. And since Bobcat has 16 SRAM cells per 20 shaders, while in the Wii U GPU die shot we see double that, it is likely that the Wii U's GPU is VLIW4.

Could you explain to me why these supposed 160 shaders take 85+ mm^2 of die space at 40 nm when they should take 60 mm^2? (Shaders take a very small amount of space compared to compute units, texture mapping units, raster output units and so on.)

I love your confirmation bias... If, say, Donkey Kong Country: Tropical Freeze runs at 1080p/60fps, and Mario Kart 8, Smash Bros. and Bayonetta 2 do as well, then it is surely not 160 shaders, because no matter how efficient it was, that would be impossible even if you coded every asset and the game engine to the metal (low-level code)...

Thanks for the laughs...



eyeofcore said:

I am sorry, but you need to update yourself, since NeoGAF denied and debunked the 160-shader theory. That theory was a rumor, yet you state a rumor as fact and dismiss hard evidence, namely the Chipworks die shot of the Wii U's GPU, codenamed Latte.

Show me your source where 160 shaders was debunked. Which reliable poster? As far as I know, you were banned from both NeoGAF and Beyond3D for being a fanboy, because you had no idea what you were talking about. All you do is label people, even big Nintendo fans, as trolls or fanboys, because you think it's a 320-shader GPU.
drake4 said:

Show me your source where 160 shaders was debunked. Which reliable poster?

I was banned on Beyond3D because of my opinion, and nobody gave any evidence against my claims; and I was never registered on NeoGAF. I am a "fanboy" because of my opinion, and I was labeled a "Nintendo fanboy" without any evidence, yet I have never experienced any game or console from Nintendo, and I don't own any Nintendo software or platforms. Boo hoo, I am a "fanboy" because my opinion conflicts with the opinion of the masses... Boo hoo... Should I cry for not being a sheep like everyone else? No.

And I don't just think it is 320 shaders... It is a fact. You have Latte's die shot from Chipworks; you just need to do research on AMD's GPUs. And do you have a friend who is a computer engineer?



eyeofcore said:

I don't just think it is 320 shaders... It is a fact. You have Latte's die shot from Chipworks; you just need to do research on AMD's GPUs. And do you have a friend who is a computer engineer?

I'm still waiting for your source where 160 shaders was debunked. I checked the NeoGAF thread where it was confirmed, http://www.neogaf.com/forum/showthread.php?t=710765, and there is no debunking, just everybody agreeing.



eyeofcore said:

I got in contact with a homebrew programmer and asked him if he has proof that there is a dual-core ARM Cortex-A8 in the Wii U, and he replied that he got the entire OS from a hacker/cracker who cracked the OS and ripped it from the internal flash storage. He said that the entire OS is written in pure ARM code.

References and links would be useful.



drake4 said:

I'm still waiting for your source where 160 shaders was debunked. I checked the NeoGAF thread where it was confirmed, http://www.neogaf.com/forum/showthread.php?t=710765, and there is no debunking, just everybody agreeing.

*facepalm*

The member who created that thread stated in the thread title that it is a rumor; he said it is a rumor and confirmed it is a rumor, yet you state a rumor as fact. And the people who agreed are Sony/Microsoft fans with little to no knowledge of hardware, let alone software, who only validate the lowest common denominator.

A majority of people agreeing about something =/= fact.

Being in the majority saying one thing does not mean that you and they are right. You are basing all your claims on a rumor, without any evidence or foundation, while the die shots contradict the claims you support.

The consensus at NeoGAF about the 160-shader rumor is that there is doubt about it because of games that were or will be released on the Wii U. Need for Speed: Most Wanted U has little to no framerate issue compared to the PlayStation 3 and Xbox 360 versions, no screen tearing, far better lighting (especially at night), and high-quality textures. Trine 2: Director's Cut looks better and runs better, and the developers themselves said that the Wii U version of Trine 2 would not be possible on Xbox 360 or PlayStation 3 hardware without a downgrade. That's one example...

Also, do you care to explain to me why Bayonetta 2 looks way better than the original Bayonetta, since Bayonetta 2 should not be possible on a 160-SPU GPU even if it is more efficient...



Badassbab said:
The ESRAM figure for the Xbox One needs updating. I'm pretty sure it's 109 GB/s as standard due to the GPU speed increase, but it has a theoretical peak of 192 GB/s?

http://www.eurogamer.net/articles/digitalfoundry-the-complete-xbox-one-interview

Others, please correct me if I'm wrong, but a change in the clock speed of a CPU/GPU does not mean there is any change to the bus speed of the memory and/or its pipelines.
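For what it's worth, the 109 GB/s figure is consistent with the ESRAM bandwidth scaling with the GPU clock. A minimal sketch, assuming a 128-byte-per-cycle ESRAM path and a clock bump from 800 MHz to 853 MHz (both figures are assumptions from press reports, not from this thread):

```python
# Illustrative only: if ESRAM bandwidth tracks the GPU clock, a clock
# bump reproduces the 109 GB/s figure.  Assumed (not from this thread):
# a 128-byte-per-cycle ESRAM path, clock raised from 800 MHz to 853 MHz.
BYTES_PER_CYCLE = 128

def esram_bandwidth_gbps(clock_mhz: float) -> float:
    """Bandwidth in GB/s for a given clock in MHz."""
    return BYTES_PER_CYCLE * clock_mhz * 1e6 / 1e9

print(esram_bandwidth_gbps(800))  # 102.4 GB/s at the original clock
print(esram_bandwidth_gbps(853))  # ~109.2 GB/s after the bump
```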



RE the shaders...

Can someone post a source of actual devs saying it has 160 vs 320? I've had 320 for a very long time, based on Chipworks, the NeoGAF thread discussions, and various tech sites.



superchunk said:
RE the shaders...

Can someone post a source of actual devs saying it has 160 vs 320? I've had 320 for a very long time, based on Chipworks, the NeoGAF thread discussions, and various tech sites.


A typical AMD GPU with 160 shaders would take 67 mm^2, while I dissected the Wii U die shot by measuring each block and came to the conclusion that there is at least 85 mm^2 available purely for GPU-related silicon (stream processing units, raster output units, texture mapping units, compute units, the DDR3 memory controller, etc.).
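The gap being pointed at can be made explicit. Both areas are the post's own estimates (not verified measurements), and linear area scaling is an assumption:

```python
# Compare the post's expected logic area for a 160-shader AMD GPU at
# 40 nm (~67 mm^2) against its measured estimate of GPU-related
# silicon on Latte (~85 mm^2).  Both figures are the post's estimates.
expected_160_mm2 = 67.0
measured_mm2 = 85.0

ratio = measured_mm2 / expected_160_mm2
print(round(ratio, 2))  # 1.27: about 27% more area than a 160-shader layout
```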

Another indication that it is not a 160-shader GPU is that the Wii U GPU, codenamed "Latte", has 8 blocks where the SPUs/shaders are located, each with 32 SRAM cells/cache rather than 16. AMD's VLIW5 GPU architecture has 16 SRAM cells/cache connected to 20 SPUs/shaders, while in the Wii U GPU die shot we can clearly see 32 SRAM cells, which indicates 40 SPUs/shaders per block.

Anyway, the Wii U's GPU is fully custom, as Chipworks said. I have found die shots of various AMD dGPUs and iGPUs, and the Wii U's GPU does not look even remotely similar except in the SRAM cell/cache placement. You won't find any sources of developers actually stating shader/SPU counts, because they would break NDA and be sued.