
Just about the name:
Nintendo Extra sounds a bit stupid. NEX doesn't though. It's somewhere between NES and next.

About hardware:
The Xbone and PS4 CPUs are quite close: the same Jaguar cores, with the Xbone clocked slightly higher, about 9.4% more.
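
As a quick sanity check on that figure, here's a rough sketch in Python using the commonly cited clocks (1.6 GHz for the PS4, 1.75 GHz for the Xbone):

# Quick check of the CPU clock difference between the two Jaguar setups.
ps4_cpu_ghz = 1.6     # commonly cited PS4 CPU clock
xbone_cpu_ghz = 1.75  # commonly cited Xbone CPU clock

clock_advantage = (xbone_cpu_ghz / ps4_cpu_ghz - 1) * 100
print(f"Xbone CPU clock advantage: {clock_advantage:.1f}%")  # ~9.4%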

The GPUs are a different story: the PS4 has 50% more shaders, but with the Xbone again clocked slightly higher, the PS4 ends up with roughly 42% more shader power.
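
Same idea for the shader math, using the standard published figures (1152 shaders at 800 MHz for the PS4, 768 shaders at 853 MHz for the Xbone); it lands at around 41%, close to the figure above:

# Rough shader throughput comparison: FLOPS = shaders * 2 ops per cycle * clock.
ps4_shaders, ps4_gpu_mhz = 1152, 800
xbone_shaders, xbone_gpu_mhz = 768, 853

ps4_gflops = ps4_shaders * 2 * ps4_gpu_mhz / 1000
xbone_gflops = xbone_shaders * 2 * xbone_gpu_mhz / 1000
print(f"PS4: {ps4_gflops:.0f} GFLOPS, Xbone: {xbone_gflops:.0f} GFLOPS")
print(f"PS4 advantage: {(ps4_gflops / xbone_gflops - 1) * 100:.0f}%")  # ~41%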

RAM is yet another story. It must be fast enough, especially to get data in and out of the GPU quickly, so we need bandwidth. Bandwidth comes from clock speed and bus width. To keep PCB costs etc. down you usually go with clock; that is what Sony did.
One option is to just run the most important data very fast through eDRAM or ESRAM as a cache, which is the Wii U, 360 and Xbone way.
Some GPUs are a bit more bandwidth-efficient than others though.
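
To make the clock vs. bus width point concrete, here's a small sketch of the usual bandwidth formula (effective transfer rate times bus width), with the published PS4 and Xbone main-RAM figures plugged in:

# Bandwidth (GB/s) = effective transfer rate (MT/s) * bus width (bits) / 8 / 1000
def bandwidth_gbs(mt_per_s, bus_bits):
    return mt_per_s * bus_bits / 8 / 1000

print(bandwidth_gbs(5500, 256))  # PS4: GDDR5 at 5.5 GT/s on 256-bit -> 176 GB/s
print(bandwidth_gbs(2133, 256))  # Xbone: DDR3-2133 on 256-bit -> ~68 GB/s (plus ESRAM on top)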

So, CPU:
More modern process, more modern architecture: better IPC, better energy efficiency and thus higher clock speed, plus some minor improvements. If there's an ARM core handling the actual OS, that might free up CPU power as well.
GPU: Might be Polaris, though AMD's APUs so far are still planned with older GCN cores. With the PS4 basically being GCN 1.0 and the Xbone GCN 1.1, the NX GPU could be GCN 1.2, which had some decent improvements.
About the 50% more, that might actually depend on what the GPU is doing.
There are differences in fillrate, tessellation efficiency, shader power etc.

Now for bandwidth: it's unlikely that Nintendo will be going with an expensive PCB. But if we take AMD's Tonga, that design is a bit more bandwidth-efficient than Pitcairn (the PS4's GPU).
Then there's that 12GB RAM thing: it implies a 192- or 384-bit memory interface. The latter would be 50% wider than the PS4's and Xbone's 256-bit buses. So yes, in theory you could go with DDR3.
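
To show what that would actually buy you, same formula as above with DDR3-2133 on those two bus widths; the 2133 speed grade is purely my assumption here, nothing confirmed:

# Hypothetical DDR3-2133 (assumed speed grade) on the bus widths a 12GB setup suggests.
def bandwidth_gbs(mt_per_s, bus_bits):
    return mt_per_s * bus_bits / 8 / 1000

print(bandwidth_gbs(2133, 192))  # ~51 GB/s -- well short of the PS4's 176 GB/s
print(bandwidth_gbs(2133, 384))  # ~102 GB/s -- better, but still not GDDR5 territory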

There are some serious questions though. For example:
Why that huge amount of RAM? 8GB total would probably be enough, and you could go the Sony way with GDDR5 instead.
The NEX wouldn't seem that much more powerful, but it would have almost double the free RAM, while probably at least risking bandwidth issues.