TheBraveGallade said:
NextGen_Gamer said:

I mean, it wouldn't cost anything? What post did MVG say all this in? I would like to read it.

Again, all of my speculation is based on Nintendo contracting NVIDIA to make a new, much better SoC for it. People don't realize just how awful the current Switch SoC is - and I say that as a huge fan of my Switch. From a hardware-only perspective, Nintendo took an already out-of-date mobile chip that was never known for being efficient or fast (which is why virtually no phones used it and NVIDIA ended up putting its stockpile into the Shield), and then made it worse for the Switch. The Tegra X1 inside it has 4 x ARM Cortex-A57 cores, another 4 x ARM Cortex-A53 cores, and a "Maxwell"-era GeForce GPU with 256 CUDA cores. But for the Switch, Nintendo disabled the four Cortex-A53 cores and severely downclocked the GPU and the remaining CPU cores in order to get decent battery life out of it.

If Nintendo asked NVIDIA to do a modern "re-do" of it, I could imagine it being 4 x ARM Cortex-A78 cores and an "Ampere"-based GPU with 512 CUDA cores, built on Samsung's 8 nm process. Those specs could then run at VERY low clock speeds, like 1 GHz for the CPU and 1 GHz for the GPU, while still WAY outperforming the current Switch in its docked mode, and probably only consuming 2 or 3 watts of power. For reference - again, because the Tegra X1 was never very good to begin with - the Switch uses about 9 watts in portable mode and about 16 watts while docked. That would mean my theoretical SoC would deliver better battery life while running Switch games in their higher graphics modes. Some of that battery-life gain would go back into driving a 1080p display, but even so, it would probably still work out to around 30% more longevity than the current model.
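That ~30% figure is easy to sanity-check with back-of-the-envelope arithmetic. A quick sketch - every number here is an assumption for illustration (a roughly 16 Wh battery, the ~9 W portable draw cited above, and a pure guess of ~7 W net draw once the more efficient SoC and a hungrier 1080p panel cancel out):

```python
# Back-of-the-envelope battery-life estimate. All figures are
# assumptions for illustration, not measured values.

BATTERY_WH = 16.0  # rough original Switch battery capacity (~4310 mAh @ 3.7 V)

def runtime_hours(system_watts, battery_wh=BATTERY_WH):
    """Hours of play at a constant whole-system power draw."""
    return battery_wh / system_watts

current = runtime_hours(9.0)       # ~9 W portable draw cited above
# Guess: new SoC shaves ~3 W off the total, a 1080p panel adds ~1 W back.
hypothetical = runtime_hours(7.0)

print(f"current:      {current:.1f} h")       # -> current:      1.8 h
print(f"hypothetical: {hypothetical:.1f} h")  # -> hypothetical: 2.3 h
print(f"gain:         {hypothetical / current - 1:.0%}")  # -> gain: 29%
```

Under those guessed numbers the gain lands right around the +30% ballpark, so the claim is at least internally consistent.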

Nintendo doesn't typically spend a lot in the R&D department, but one would hope that, with the runaway success of the Switch, Nintendo would invest at least some of those profits into a big SoC upgrade.

While I think Ninty would invest SOME money into R&D, I think 1080p might be overkill for a portable anyway, unless they go to a full 8-inch screen size (which has its own issues). 720p at 7 inches is enough, and with DLSS, 720p60 would pretty easily upscale to 4K60 with a proper implementation. It would probably run 1080p30 natively at minimum, or at least 720p60 upscaled to 1080p60.
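For context on why upscaling matters here, the raw pixel math behind those resolution targets (standard resolution definitions, nothing Switch-specific - a quick sketch):

```python
# Pixel counts behind the upscaling targets above. 4K is a 9x pixel
# increase over 720p (3x per axis), which is why DLSS-style
# reconstruction is so much cheaper than rendering 4K natively.

RESOLUTIONS = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
}

def pixels(name):
    w, h = RESOLUTIONS[name]
    return w * h

for name in RESOLUTIONS:
    factor = pixels(name) / pixels("720p")
    print(f"{name:>5}: {pixels(name):>9,} px ({factor:.2f}x 720p)")
# ->  720p:   921,600 px (1.00x 720p)
# -> 1080p: 2,073,600 px (2.25x 720p)
# ->    4K: 8,294,400 px (9.00x 720p)
```

So a native 1080p target costs 2.25x the shading work of 720p, while reconstructing 4K from a 720p internal render sidesteps a 9x cost.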

And also remember, cost is an issue. Going up that much in performance on a VERY congested node is going to be expensive... it might be better to go with 10 nm just for that.

Then you have to consider battery life and the cooling system. They probably want you to be able to play something as intensive as BOTW for about 4 hours in a single playthrough. Does DLSS build up heat in the console? If so, how would the cooling system have to work?