
The Foxconn leaks are reliable as far as anything visible to the naked eye goes; all of those details were proven accurate. It's only the architecture stuff that is still just speculation.
However, the clock speeds seen during the stress test, a 1.8GHz CPU with a 921MHz GPU, don't seem unrealistic for the retail unit. He said the console stayed stable under the stress test for 8 days straight. Knowing that, I don't see any reason why Nintendo would choose the lower clock speeds that Eurogamer reported as the maximum frequencies for docked mode.
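To put the two reported docked GPU clocks in perspective, here's a rough FP32 throughput comparison. This assumes a Tegra-X1-class Maxwell GPU with 256 CUDA cores counting a fused multiply-add as 2 ops per core per cycle; the core count is my assumption for illustration, not something from the leak itself:

```python
# Rough peak FP32 throughput at the two reported docked GPU clocks.
# Assumes 256 Maxwell CUDA cores (Tegra X1 class), 2 FP32 ops per
# core per cycle (FMA); these figures are assumptions, not leak data.
CUDA_CORES = 256
OPS_PER_CORE_PER_CYCLE = 2

def gflops(clock_mhz):
    """Peak GFLOPS at a given GPU clock in MHz."""
    return CUDA_CORES * OPS_PER_CORE_PER_CYCLE * clock_mhz / 1000.0

print(gflops(921))  # Foxconn stress-test clock -> 471.552 GFLOPS
print(gflops(768))  # Eurogamer docked clock    -> 393.216 GFLOPS
```

Even under those assumptions, the gap between the two reported clocks is about 20% of peak throughput, which is why the choice matters.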

Now for memory: Eurogamer said 4GB of RAM.
The Foxconn guy speculated 4GB for the retail version; he saw 2 chips and assumed 2GB each. He saw 2 more on the devkit version and says it has 8GB. Now, if the reason for going with 1GB per die were cost, there would be no reason for Nintendo to do the same on the devkit version.
And if we look at Zelda on Switch, it clearly has better textures and more objects at a distance than the Wii U version, both of which point to the Switch having more RAM. Also, about bandwidth: Maxwell 2 in the X1 is several steps ahead of the RV7xx in the Wii U; it uses tile-based rendering, color compression, and more GPU cache than the Wii U, all of which help reduce its memory bandwidth requirements.
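For a sense of scale on the memory side, here's a back-of-envelope peak bandwidth calculation for the rumored two-chip setup. Both the 32-bit-per-chip bus width and the 3200 MT/s transfer rate are my assumptions for illustration (typical LPDDR4 figures), not confirmed specs:

```python
# Back-of-envelope peak memory bandwidth.
# Assumes two 32-bit LPDDR4 chips at 3200 MT/s effective; the bus
# width and transfer rate are illustrative assumptions only.
def peak_bandwidth_gbs(bus_bits, transfer_rate_mts):
    """Peak bandwidth in GB/s: bytes per transfer times transfers/s."""
    return bus_bits / 8 * transfer_rate_mts / 1000.0

print(peak_bandwidth_gbs(64, 3200))  # two 32-bit chips -> 25.6 GB/s
```

Bandwidth-saving features like tile-based rendering and color compression matter precisely because a figure in this range is modest for a console GPU.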
As for the Wii U, we still don't know how much bandwidth its eDRAM provides. What extra graphical power was it supposed to bring? We haven't seen anything special such as free MSAA or 4x AA. If anything, imo, it's just there to offset the low main-memory bandwidth and to help with backward compatibility. It let Nintendo use cheaper memory, reduce power draw, and provide BC, so it met all their requirements.
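On the "free AA" point, a quick check shows a 4x MSAA 720p framebuffer would at least fit in the Wii U's 32MB of eDRAM. The 32-bit color and 32-bit depth/stencil formats are the usual assumption, not something we know about any specific game:

```python
# Would a 720p framebuffer with 4x MSAA fit in 32 MB of eDRAM?
# Assumes 32-bit color and 32-bit depth/stencil per sample, with
# all samples stored uncompressed; these are illustrative assumptions.
W, H = 1280, 720
BPP_COLOR, BPP_DEPTH = 4, 4  # bytes per sample
MSAA = 4

fb_bytes = W * H * (BPP_COLOR + BPP_DEPTH) * MSAA
print(fb_bytes / 2**20)  # ~28.1 MiB, under the 32 MiB eDRAM
```

So capacity alone wouldn't have ruled out 4x AA at 720p; whatever the reason we didn't see it, it wasn't the eDRAM size.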