
How The Switch 2 Could Do 4K@120fps

JEMC said:
Ck1x said:

The only problem I see with this is that even Nvidia has moved on from the Carmel cores idea; they just recently licensed Hercules, so that probably has the better chance of making it into the Switch 2. When it comes to Nvidia, we currently don't have a real-world example of just how efficient their technology is on 7nm or lower, so it can be a little hard to imagine how many cores Nvidia could squeeze into a chip the size of the Tegra X1 or slightly larger.

All of the new GPU advancements that Nvidia is showing and proving to work couldn't benefit anyone more right now than Nintendo with their next system. It reminds me of the GameCube days, when that system was so well built and thought out that it did some graphical effects in hardware that the others just weren't set up for. That allowed some impossible-looking games to run on it, while also punching way above its weight spec-wise...

Correct me if I'm wrong (I know very little about ARM processors), but aren't those Hercules chips designed for security tasks? I wouldn't rule out the possibility that Nvidia has licensed them just to make their own SoC more secure.

As for the node and TDP of the SoC inside the Switch's successor, we should remember that we're not only talking about Nvidia, but also Nintendo. The Switch launched with the original Tegra X1 chip, made on a 20nm node (already dated at the time), and despite that chip being rated at a 15W TDP, Nintendo lowered the clocks to reduce heat and power consumption even further.

I don't think Nintendo will jump to 7nm, much less 5nm tech, and the chip will also have to use very little power to make it suitable for a handheld without burning the owner's hands.

Considering it's the codename on the ARM client CPU roadmap, I'm not sure it can have multiple purposes like that. Whether or not Nintendo jumps to 7nm or 5nm would really depend on what kind of deal Nvidia and Nintendo can leverage from Samsung, since Samsung is actively looking to take more fabrication business away from TSMC.

Again, the X1 wasn't that old considering the Switch was originally supposed to release holiday 2016, and I'm sure Nvidia probably gave Nintendo a great deal, seeing as how the Switch easily cleared out whatever X1 stockpiles they might have had (maybe even with a future promise to deliver a cutting-edge chip to Nintendo for the next Switch). Not to mention those were stagnant times for node reductions; many of the foundries were struggling to get FinFET up and running.

I mentioned before that Nintendo did a billion-dollar deal with IBM just for CPU tech, so they aren't strangers to spending money if it benefits the long game. They're all in on Switch, as it's currently their only avenue for generating longevity in the hardware space, so we might take this as a sign of them getting more aggressive rather than less. I only mention that because they don't have another successful platform to fall back on like they did with the 3DS when the Wii U was failing...

https://www.arm.com/company/news/2018/08/accelerating-mobile-and-laptop-performance

Last edited by Ck1x - on 15 April 2020


The Switch 2 doesn't need to be 4K in handheld mode; we can barely see the difference between 1080p and 4K on small screens. In TV mode, though, it's possible. What they need to focus on is big cartridges, around 100 GB, to hold all of the high-quality 4K assets. Even the PS4 Pro is struggling because it doesn't have an Ultra HD Blu-ray drive, and the Xbox One X needs to download more than 100 GB to upgrade some games to run at 4K.
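To put some rough numbers behind the "small screen" argument (just a back-of-the-envelope sketch; the 7-inch panel size and the ~300 PPI rule of thumb are assumptions, not anything Nintendo has confirmed):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a panel of the given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

DIAGONAL = 7.0  # assumed handheld panel size in inches (the current Switch is 6.2")

for name, w, h in [("720p", 1280, 720), ("1080p", 1920, 1080), ("4K", 3840, 2160)]:
    print(f"{name:>5}: {ppi(w, h, DIAGONAL):.0f} PPI")

# Output:  720p: ~210 PPI | 1080p: ~315 PPI | 4K: ~629 PPI
```

A 1080p panel is already around the ~300 PPI point where most eyes stop resolving extra pixels at handheld viewing distance, so a 4K panel would mostly be wasted battery.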



HollyGamer said:
The Switch 2 doesn't need to be 4K in handheld mode; we can barely see the difference between 1080p and 4K on small screens. In TV mode, though, it's possible. What they need to focus on is big cartridges, around 100 GB, to hold all of the high-quality 4K assets. Even the PS4 Pro is struggling because it doesn't have an Ultra HD Blu-ray drive, and the Xbox One X needs to download more than 100 GB to upgrade some games to run at 4K.

Well, we do know that both Microsoft and Sony are expecting file sizes to go down on the next systems, because the massively improved seek times of SSDs remove the need for the redundant copies of files stored on the HDD... I do agree, though: it would be pretty pointless for the Switch 2 to try, or need, to produce native 4K visuals with such advanced AI reconstruction techniques available.
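For a sense of how much work AI reconstruction saves (plain pixel arithmetic, nothing vendor-specific assumed):

```python
# Shaded pixels per frame, and per second at 60 and 120 fps.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080), "4K": (3840, 2160)}

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>5}: {pixels / 1e6:.1f} MPix/frame, "
          f"{pixels * 60 / 1e6:.0f} MPix/s @60fps, "
          f"{pixels * 120 / 1e6:.0f} MPix/s @120fps")

# 4K is 4x the pixels of 1080p and 9x the pixels of 720p per frame.
```

Rendering internally at 1080p and reconstructing to 4K cuts the shaded pixel count by 4x per frame, which is the whole appeal for a handheld power budget.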



Ck1x said:
HollyGamer said:
The Switch 2 doesn't need to be 4K in handheld mode; we can barely see the difference between 1080p and 4K on small screens. In TV mode, though, it's possible. What they need to focus on is big cartridges, around 100 GB, to hold all of the high-quality 4K assets. Even the PS4 Pro is struggling because it doesn't have an Ultra HD Blu-ray drive, and the Xbox One X needs to download more than 100 GB to upgrade some games to run at 4K.

Well, we do know that both Microsoft and Sony are expecting file sizes to go down on the next systems, because the massively improved seek times of SSDs remove the need for the redundant copies of files stored on the HDD... I do agree, though: it would be pretty pointless for the Switch 2 to try, or need, to produce native 4K visuals with such advanced AI reconstruction techniques available.

Microsoft and Sony have invested big in compression for next-gen; that should bring file sizes down.



--::{PC Gaming Master Race}::--

Bigger cards would be good, but Nintendo really needs to put more storage in the console. 32GB was already too little when the Wii U launched, and it's a joke to have that much on the Switch.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Pemalite said:




Bofferbrauer2 said:

Orin is rated for a TDP of 65W; that's way too much for a handheld format like the Switch. Also, it's heavily geared towards deep learning and AI, so much so that its performance would either be squandered or very hard to tap into (similar to the Cell in the PS3). I'd sooner expect a new Tegra chip with more consumer-oriented hardware, which Nvidia probably wants to develop anyway for its Shield line of products, which uses the same chips as the Switch does.

In other words, I'm more expecting something with 4-8 CPU cores (Carmel or Hercules), 384-512 CUDA cores (Turing or next-gen) and a 128-bit bus with 8-16GB of LPDDR4X or LPDDR5, without too many bells and whistles, as those are currently too taxing for a handheld format, even on 7 or 5nm. And that should be enough to get PS4-like performance from a handheld anyway.

Yeah. Nintendo will go where the technology goes - they don't give a crap about making a clean break and doing what Nintendo does.

But Tegra is still improving.

Orin will be scalable just like Tegra X1/Maxwell.
Configurable TDPs and semi-custom alterations are a thing, you know.

Tegra Xavier, for example, has 10W, 15W and 30W operating modes.

Plus we don't have all the information on what Orin actually is.


Bofferbrauer2 said:

But wouldn't that make taking the Switch out of the dock risky? I mean, it should get pretty hot like that...

As for the Surface Pro X, again, try giving that one a consistent high load. 2.1 TFLOPS is its peak performance, but it's far from holding that under full load. Case in point: it's supposed to be at 2.1 TFLOPS, but its graphics score just edges out the UHD 620 in Intel 8250U laptops in 3DMark's Night Raid graphics benchmark and gets easily beaten by the Vega 8 in a Ryzen 5 2500U, despite the latter only claiming 1.1 TFLOPS.

In other words, the SQ1 at ~7W wouldn't do much better than the Tegra, if at all.

The emphasis on FLOPS is misleading; it's not the be-all and end-all.

While Xavier can be configured down, it loses a lot of performance along the way. At 10W, its GPU performance is barely above the Switch's under sustained stress, though the CPU is still way above the A57s in the Switch. If tweaked properly, I think Xavier could achieve a 40-50% increase over OG Switch performance at the same power draw.



Bofferbrauer2 said:

While Xavier can be configured down, it loses a lot of performance along the way. At 10W, its GPU performance is barely above the Switch's under sustained stress, though the CPU is still way above the A57s in the Switch. If tweaked properly, I think Xavier could achieve a 40-50% increase over OG Switch performance at the same power draw.

Xavier is based on Volta.
Either way... It's certainly more than 50%.

The move from Tegra X1 to Tegra X2 would bring about a 50% improvement at the same power level... because Pascal (Tegra X2) basically takes Maxwell (Tegra X1) and re-engineers it (i.e. more transistors) to sustain higher clock rates without a significant power penalty.

We saw this occur on the PC: significant clock rate improvements at the same power level.

https://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/6

Now Tegra Volta (let's call it Tegra X3 for simplicity's sake!) takes what was good about Pascal and makes a plethora of quality-of-life changes - more CUDA cores, Tensor cores (AI upscaling!) - and that was able to provide a 10-25% uplift over Titan Pascal on the PC. And that was with shit drivers.
https://www.anandtech.com/show/12170/nvidia-titan-v-preview-titanomachy/7
https://www.anandtech.com/show/13282/nvidia-turing-architecture-deep-dive/4

I wouldn't be surprised if we saw a 75-100% performance increase as a low-ball estimate once we start to account for the architectural efficiency gains, clock rates and additional functional units.

We also need to keep in mind that Tegra X3 is built on 12nm, not a modern 7nm, so there is still some potential left on the table to reduce power consumption, increase clock rates or add functional units (with the 512 CUDA core variant).
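As a very rough way to frame those percentages (a back-of-the-envelope sketch: the CUDA core counts are published figures, but the 1.0 GHz clock for the hypothetical part is purely an assumption):

```python
def fp32_gflops(cuda_cores, clock_ghz):
    """Theoretical peak FP32 throughput: 2 ops per core per clock (FMA)."""
    return cuda_cores * 2 * clock_ghz

switch_docked = fp32_gflops(256, 0.768)  # Tegra X1 in the docked Switch: ~393 GFLOPS
hypothetical  = fp32_gflops(512, 1.0)    # 512 CUDA core part at an assumed 1.0 GHz

print(f"Switch docked : {switch_docked:.0f} GFLOPS")
print(f"512-core part : {hypothetical:.0f} GFLOPS "
      f"({hypothetical / switch_docked:.1f}x on paper, before any architectural gains)")
```

Paper FLOPS aren't everything, as noted above, but even this crude count lands comfortably above the 75-100% low-ball.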

https://www.anandtech.com/show/13282/nvidia-turing-architecture-deep-dive/4

https://www.anandtech.com/show/13584/nvidia-xavier-agx-hands-on-carmel-and-more

https://www.anandtech.com/show/15070/nvidia-gives-jetson-xavier-a-trim-announces-nanosized-jetson-xavier-nx


Orin changes it all up again of course.




--::{PC Gaming Master Race}::--

Pemalite said:
Bofferbrauer2 said:

While Xavier can be configured down, it loses a lot of performance along the way. At 10W, its GPU performance is barely above the Switch's under sustained stress, though the CPU is still way above the A57s in the Switch. If tweaked properly, I think Xavier could achieve a 40-50% increase over OG Switch performance at the same power draw.

Xavier is based on Volta.
Either way... It's certainly more than 50%.

The move from Tegra X1 to Tegra X2 would bring about a 50% improvement at the same power level... because Pascal (Tegra X2) basically takes Maxwell (Tegra X1) and re-engineers it (i.e. more transistors) to sustain higher clock rates without a significant power penalty.

We saw this occur on the PC: significant clock rate improvements at the same power level.

https://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/6

Now Tegra Volta (let's call it Tegra X3 for simplicity's sake!) takes what was good about Pascal and makes a plethora of quality-of-life changes - more CUDA cores, Tensor cores (AI upscaling!) - and that was able to provide a 10-25% uplift over Titan Pascal on the PC. And that was with shit drivers.
https://www.anandtech.com/show/12170/nvidia-titan-v-preview-titanomachy/7
https://www.anandtech.com/show/13282/nvidia-turing-architecture-deep-dive/4

I wouldn't be surprised if we saw a 75-100% performance increase as a low-ball estimate once we start to account for the architectural efficiency gains, clock rates and additional functional units.

We also need to keep in mind that Tegra X3 is built on 12nm, not a modern 7nm, so there is still some potential left on the table to reduce power consumption, increase clock rates or add functional units (with the 512 CUDA core variant).

https://www.anandtech.com/show/13282/nvidia-turing-architecture-deep-dive/4

https://www.anandtech.com/show/13584/nvidia-xavier-agx-hands-on-carmel-and-more

https://www.anandtech.com/show/15070/nvidia-gives-jetson-xavier-a-trim-announces-nanosized-jetson-xavier-nx


Orin changes it all up again of course.


The result came from putting a Jetson TX1 and a Jetson Xavier, both capped at 10W, up against each other in gaming tasks and benchmarks. Believe me when I say that Xavier doesn't have many reserves at low power in these tasks.

Both Xavier and Orin are geared towards AI and deep learning and away from gaming. They are much more powerful than the X1, but can't tap into that power in gaming tasks unless specifically programmed to, which would be the same problem the PS3 had with the Cell processor. Hence why I said before that I expect Nvidia to create a new Tegra design based on Turing or next-gen, geared more towards gaming, instead of using a Xavier or Orin variant for the Switch and their Shield line of products.

About your last link: did you see how much the performance came crashing down in the Jetson Xavier NX compared to the full Xavier? And how the price shot up compared to the Jetson TX1 or even the original Nano (which is cut down and weaker than the Switch, by the way)? Because of this, I just don't see even cut-down versions of Xavier or Orin as viable in a handheld console.



Bofferbrauer2 said:

The result came from putting a Jetson TX1 and a Jetson Xavier, both capped at 10W, up against each other in gaming tasks and benchmarks. Believe me when I say that Xavier doesn't have many reserves at low power in these tasks.

To be fair, we also have 7nm on the table, which Xavier doesn't use, but could.
12nm is based on 14/16nm, which in turn is based on 20nm.

Bofferbrauer2 said:

Both Xavier and Orin are geared towards AI and deep learning and away from gaming. They are much more powerful than the X1, but can't tap into that power in gaming tasks unless specifically programmed to, which would be the same problem the PS3 had with the Cell processor. Hence why I said before that I expect Nvidia to create a new Tegra design based on Turing or next-gen, geared more towards gaming, instead of using a Xavier or Orin variant for the Switch and their Shield line of products.

In a console environment, we can assume developers would be building games around the hardware's various nuances... Plus, Nintendo can drop a chunk of that TDP by underclocking, undervolting and disabling some CPU cores, or even the Tensor cores if they don't want AI upscaling.
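The reason underclocking and undervolting buy so much is that dynamic power scales roughly with frequency times voltage squared (the textbook P ≈ C·V²·f approximation; the percentages below are illustrative, not measured Tegra numbers):

```python
def relative_dynamic_power(freq_scale, volt_scale):
    """P_dyn ~ C * V^2 * f, so relative power = freq_scale * volt_scale^2."""
    return freq_scale * volt_scale ** 2

# Illustrative only: cut the clock by 25% and the voltage by 15%.
print(f"{relative_dynamic_power(0.75, 0.85):.2f}x of the original dynamic power")
# -> ~0.54x, i.e. nearly half the power for a quarter less clock speed.
```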

Orin isn't even out yet though; it's too early to tell what its electrical characteristics will be like.

Bofferbrauer2 said:

About your last link: did you see how much the performance came crashing down in the Jetson Xavier NX compared to the full Xavier? And how the price shot up compared to the Jetson TX1 or even the original Nano (which is cut down and weaker than the Switch, by the way)? Because of this, I just don't see even cut-down versions of Xavier or Orin as viable in a handheld console.

nVidia can charge a premium for their "drive" hardware because their name is the best in the business right now, so to speak.

Plus it's nVidia with the nVidia tax.



--::{PC Gaming Master Race}::--