Soundwave said:

I still wonder if a Tegra X2 can match up with an older Nvidia mobile GPU ... like, let's take the Nvidia 830M from 2014:

This is a laptop GPU. Nvidia 830M specs:

DDR3 RAM (64-bit memory bus), 14.4 GB/sec

256 CUDA Cores, 16 TMUs, 8 ROPs

Maxwell Architecture, 28nm

554 GFLOPS floating point performance 

17.31 Gigatexels/second

8.66 Gpixels/second

Under 25 watts power usage


The Tegra X1/X2 (custom Nintendo chip?)

LPDDR4 RAM (128-bit memory bus?), 51 GB/sec

256 CUDA Cores, 16 TMUs, 16 ROPs

Pascal Architecture (?), 16nm FinFET

625 GFLOPS floating point performance (supposed Tegra X2 figure)

16 Gigatexels/second (Tegra X1; X2 could be better)

16 Gpixels/second (Tegra X1; X2 might be better)
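If you want to sanity-check those figures, they all fall straight out of unit counts × clock speed. Here's a quick Python sketch; note the clock speeds are my own assumptions, picked because they reproduce the numbers quoted above, and Nintendo's final clocks are anyone's guess:

```python
# Back-of-envelope math behind the spec figures above.
# Clock speeds are assumptions chosen to match the quoted numbers.

def gflops(cuda_cores, clock_mhz):
    # FP32 throughput: each CUDA core does 2 FLOPs/cycle (one FMA).
    return cuda_cores * 2 * clock_mhz / 1000

def fill_rate(units, clock_mhz):
    # One texel (TMU) or pixel (ROP) per unit per cycle.
    return units * clock_mhz / 1000

def bandwidth(bus_bits, mtps):
    # Bus width in bytes x effective transfer rate (MT/s) -> GB/s.
    return bus_bits / 8 * mtps / 1000

# Nvidia 830M (assuming a ~1082 MHz boost clock)
print(gflops(256, 1082))     # ~554 GFLOPS
print(fill_rate(16, 1082))   # ~17.3 Gtexels/s (16 TMUs)
print(fill_rate(8, 1082))    # ~8.7 Gpixels/s (8 ROPs)
print(bandwidth(64, 1800))   # 14.4 GB/s (64-bit DDR3-1800)

# Tegra X1-class (assuming a ~1000 MHz GPU clock)
print(gflops(256, 1000))     # 512 GFLOPS FP32
print(fill_rate(16, 1000))   # 16 Gtexels/s and 16 Gpixels/s
print(bandwidth(128, 3200))  # 51.2 GB/s (128-bit LPDDR4-3200)

# The 625 GFLOPS X2 figure would imply roughly a 1.22 GHz clock:
print(gflops(256, 1220))     # ~625 GFLOPS
```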

An Nvidia 830M is capable of running a lot of PS4/XB1 ports, even at pretty decent resolutions ...

Batman: Arkham Knight, 1366x768 resolution, 30 frames per second (PS4/XB1-only title):

https://www.youtube.com/watch?v=AQxNpEaCsiw

Assassin's Creed Unity, 1360x768 resolution (PS4/XB1-only game):

https://www.youtube.com/watch?v=4oOa66_FIHM

Star Wars: Battlefront:

https://www.youtube.com/watch?v=7MG5d2N73U4

I mean, in portable mode these games wouldn't really even have to run at this high of a resolution; 1024x600 or even 960x540 would be good enough for a little 6-inch display. I hope Nintendo really pushes this Nvidia tech, though I half suspect they'll just kinda settle for so-so performance.
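Just to put rough numbers on how much lighter those portable resolutions are, here's a quick sketch (pure pixel-count math):

```python
# Pixel counts relative to the 768p the 830M videos run at.
base = 1366 * 768  # ~1.05M pixels

for w, h in [(1366, 768), (1024, 600), (960, 540)]:
    pixels = w * h
    print(f"{w}x{h}: {pixels / 1e6:.2f}M pixels "
          f"({pixels / base:.0%} of 1366x768)")
```

960x540 is roughly half the pixels of 768p, so the fill-rate and shading load should drop by about half as well.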

I think it's possible Nvidia will change the 1:1 ratio of TMUs to ROPs that's on the X1 for the X2.