How The Switch 2 Could Do 4K@120fps

Ck1x said:
Bofferbrauer2 said:

Well, no node is mentioned, but considering it's only supposed to come out next year, 7nm would pretty much be the minimum, if not 5nm or some variation in between the two.

On the other hand, it's pretty clear why the TDP is so large: a 12-core CPU based on the A77. And while no clock speeds or GPU details beyond "next gen" have been given, the bandwidth target of 200 GB/s is about 50% higher than on Xavier's top-end chip, which suggests the GPU will get even bigger and probably come close to GTX 1650 size - and performance.

For comparison's sake, what's in Xavier sits pretty much right in between the MX250 and MX350, both in size and performance, and those already draw about 20W, which leaves 10W for the 8-core Carmel CPU if it stays within its 30W target.
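A quick back-of-the-envelope check of that 50% figure, using Xavier's public spec-sheet numbers (256-bit LPDDR4X-4266) against the rumored 200 GB/s target:

```python
# Memory bandwidth = bus width in bytes * data rate in MT/s
xavier_bus_bytes = 256 // 8        # Xavier: 256-bit LPDDR4X interface
xavier_data_rate = 4266            # MT/s (LPDDR4X-4266)
xavier_bw = xavier_bus_bytes * xavier_data_rate / 1000  # ~137 GB/s

orin_target = 200                  # GB/s, the rumored target
print(f"Xavier: {xavier_bw:.0f} GB/s")
print(f"Target is {orin_target / xavier_bw - 1:.0%} higher")  # ~47%, i.e. roughly +50%
```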

@bolded: yeah, in laptops, which have a much higher TDP since they can dissipate heat more easily through the keyboard and have space for a larger battery. And even then, to hold the clock speeds necessary to beat the PS4, they still need to use H chips; the 15W U chips are not enough. Ryzen 4000 U-series could change this, but we'll have to wait for tests of those.

Well, I think realistically a Switch 2 could get away with pulling 20-30 watts in docked mode if they actually go with a magnesium alloy frame (which would help dissipate heat). The current Switch is about as thick as the new Surface Pro X, and that sports the SQ1 putting out 2.1 TFLOPS @ 20W max. So the possibilities are there, but many factors would determine what they could put into this system, such as the console's material build.
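For context on where such TFLOPS figures come from: peak FP32 throughput is just ALU count × 2 FLOPs (one fused multiply-add per cycle) × clock speed. A minimal sketch using the Switch's well-documented Tegra X1 configuration:

```python
def peak_fp32_tflops(alus: int, clock_ghz: float) -> float:
    # Each ALU retires one fused multiply-add (2 FLOPs) per cycle.
    return alus * 2 * clock_ghz / 1000

# Tegra X1 in the Switch: 256 CUDA cores, 768 MHz docked / 307 MHz handheld.
print(peak_fp32_tflops(256, 0.768))  # ~0.39 TFLOPS docked
print(peak_fp32_tflops(256, 0.307))  # ~0.16 TFLOPS handheld
```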

But of course it wouldn't need max power draw in handheld mode, and something closer to the current Switch's draw of 7-8W total would be more than enough. Especially with techniques like DLSS and VRS on a smaller 1080p screen, image quality would look pretty amazing, and most wouldn't notice the visual imperfections and artifacts as much since they aren't blown up in size...
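To put rough numbers on why reconstruction techniques help so much: shading cost scales roughly with pixel count, so rendering internally below 1080p and reconstructing the rest skips most of the work (a sketch; actual savings vary per game):

```python
target = 1920 * 1080  # 1080p output
for name, (w, h) in {"540p": (960, 540), "720p": (1280, 720)}.items():
    print(f"{name} internal: {w * h / target:.0%} of 1080p's pixels")
# 540p: 25%, 720p: 44% - the reconstruction pass fills in the rest.
```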

But wouldn't that make taking the Switch out of the dock risky? I mean, it would get pretty hot running like that...

As for the Surface Pro X, again, try giving that one a consistent high load. 2.1 TFLOPS is its peak performance, but it's far from holding that under full load. Case in point: it's supposed to be at 2.1 TFLOPS, but its graphics score just edges out the UHD 620 in Intel 8250U laptops in 3DMark's Night Raid graphics benchmark and gets easily beaten by the Vega 8 in a Ryzen 5 2500U, despite the latter only claiming 1.1 TFLOPS.

In other words, the SQ1 at ~7W wouldn't do much better than the Tegra, if at all.
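The peak-versus-sustained gap is simple to model: sustained throughput is the peak scaled by whatever clock the power and thermal limits allow. The throttle fraction below is purely hypothetical, to illustrate the argument, not a measured SQ1 figure:

```python
peak_tflops = 2.1               # SQ1's advertised peak
sustained_clock_fraction = 0.4  # hypothetical clock ratio under sustained load
print(peak_tflops * sustained_clock_fraction)  # ~0.84 TFLOPS sustained
# Which would land in the same ballpark as the 1.1 TFLOPS Vega 8 it loses to.
```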



OneTime said:
I doubt that Nvidia will invest in mobile GPUs at this point. The Tegra never really caught on in mobile: the Switch is basically the only customer for it.

Nintendo will need to find another vendor for the next Switch (of which there are plenty in the mobile space).

nVidia is continuing to invest in and improve its mobile Tegra lineup.

The Switch used Tegra Maxwell, technology that came out in 2015.

From there nVidia released Tegra Pascal in 2016...

Tegra Volta in 2019...

And we'll have Tegra Orin in 2021, likely using Ampere or Hopper.

Tegra is also used in embedded systems (Jetson), vehicles (Drive), set-top boxes (Shield TV), VR headsets (Magic Leap) and compute/AI applications; it's not just about tablets and phones.

Heck, even when the Switch released, nVidia had Tegra GPUs that could have been 50% faster at the same TDP.

OneTime said:
Leynos said:

Nintendo basically stuck with ATI/AMD for close to that long. Nintendo is not stupid enough to build the tools with Nvidia to support the Switch and then suddenly start over with someone else.

If Nvidia doesn't produce a decent chip, I can guarantee that Nintendo will go elsewhere. Potentially there may be an Nvidia laptop chip they could switch to, but that wouldn't be a direct Tegra replacement either.

Mobile chips are mostly compatible with each other (it's all just the Vulkan toolkit; there isn't really an Nvidia-only toolset these days). Nintendo could look to Samsung or maybe a custom ARM Mali GPU - those guys have the mass market to fund the R&D. Nvidia never built marketshare in that space - I always assumed they just gave Nintendo a good deal to cut their losses on Tegra...

Yeah. Nintendo will go where the technology goes - they don't give a crap about making a clean break and doing their own thing, as Nintendo always does.

But Tegra is still improving.

Bofferbrauer2 said:

Orin is rated for a TDP of 65W; that's way too much for a handheld format like the Switch. Also, it's heavily geared towards deep learning and AI, too much for its performance not to end up squandered or very hard to tap into (similar to the Cell in the PS3). I'd rather expect a new Tegra chip with more consumer-oriented hardware, which NVidia probably wants to develop anyway for its Shield line of products, which uses the same chips as the Switch does.

In other words, I'm expecting something with 4-8 CPU cores (Carmel or Hercules), 384-512 CUDA cores (Turing or next-gen) and a 128-bit bus with 8-16GB of LPDDR4X or LPDDR5, without too many bells or whistles, as those are currently too taxing for a handheld format, even at 7 or 5nm. And that should be enough to get PS4-like performance out of a handheld anyway.
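For reference, here is what a 128-bit bus would actually deliver at standard LPDDR data rates (a sketch; the Switch comparison uses its known 64-bit LPDDR4 interface):

```python
def bandwidth_gb_s(bus_bits: int, data_rate_mts: int) -> float:
    # Bandwidth = bus width in bytes * transfers per second
    return bus_bits / 8 * data_rate_mts / 1000

print(bandwidth_gb_s(128, 4266))  # LPDDR4X-4266: ~68 GB/s
print(bandwidth_gb_s(128, 6400))  # LPDDR5-6400: ~102 GB/s
print(bandwidth_gb_s(64, 3200))   # Switch's LPDDR4-3200: ~25.6 GB/s
```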

Orin will be scalable, just like Tegra X1/Maxwell.
Configurable TDPs and semi-custom alterations are a thing, you know.

Tegra Xavier, for example, has 10W, 15W and 30W operating modes.

Plus we don't have all the information on what Orin actually is.

JEMC said:

I don't think Nintendo will jump to 7nm, much less 5nm tech, and the chip will also have to use very little power to make it suitable for a handheld without burning the owner's hands.

7nm should be very old, mature and cheap by the time we start looking at a Switch 2.

Bofferbrauer2 said:

But wouldn't that make taking the Switch out of the dock risky? I mean, that should get pretty hot like this...

As for the Surface Pro-X, again, try giving that one consistent high load. 2.1TFlops are it's peak performance, but it's far from holding those under full load. Case in point: It's supposed to be at 2.1 TFlops, but it's graphics score just edges out the UHD 620 in Intel 8250U laptops in 3D Mark's Night Raid graphics benchmark and get's easily beaten by a Vega 8 in a Ryzen 5 2500U despite the latter only claiming 1.1 TFlops.

In other words, the SQ1 at ~7W wouldn't do much better than the Tegra, if at all.

The emphasis on FLOPS is misleading; it's not the be-all and end-all.



--::{PC Gaming Master Race}::--

4K is set back a few years, and thank God, because they won't be pushing 8K now, we won't have this haves-and-have-nots BS, and developers get a more unified target. We'll see mass adoption of 4K once the economy starts ramping up again in the second half of the decade.



 

China Numba wan!!

JEMC said:
Ck1x said:

The only problem I see with this is that even Nvidia has moved off the Carmel cores idea; they just recently licensed Hercules, so that probably has the better chance of making it into a Switch 2. When it comes to Nvidia, we currently don't have a real-world example of just how efficient their technology is at 7nm or lower, so it can be a little hard to imagine how many cores Nvidia could squeeze into a chip the size of the Tegra X1 or slightly larger.

All of the new GPU advancements that Nvidia is showing and proving out couldn't benefit anyone more right now than Nintendo with their next system. It reminds me of the GameCube days, when that system was so well built and thought out that it did some graphical effects in hardware that the others just weren't set up for. That allowed some impossible-looking games to run on it, while it also punched way above its status spec-wise...

Correct me if I'm wrong (I know very little about ARM processors), but aren't those Hercules chips designed for security tasks? I wouldn't rule out the possibility that Nvidia has licensed them just to make their own SoCs more secure.

As for the node and TDP of the SoC inside the Switch's successor, we should remember that we're not only talking about Nvidia, but also Nintendo. The Switch launched with original Tegra X1 chips made on a 20nm node (already dated at the time) and, despite that chip being rated at a 15W TDP, Nintendo lowered clocks to reduce heat and power consumption even further.

I don't think Nintendo will jump to 7nm, much less 5nm tech, and the chip will also have to use very little power to make it suitable for a handheld without burning the owner's hands.

Considering Hercules is a codename on the ARM client CPU roadmap, I'm not sure it can have multiple purposes. And whether or not Nintendo jumps to 7nm or 5nm would really depend on what kind of deal Nvidia and Nintendo can leverage from Samsung, since they are actively looking to win more fabrication business away from TSMC.

Again, the X1 wasn't that old, considering the Switch was originally supposed to release holiday 2016, and I'm sure Nvidia probably gave Nintendo a great deal, seeing as how the Switch easily cleared out whatever X1 stockpiles they might have had (maybe even with a promise to deliver a cutting-edge chip to Nintendo for the next Switch). Not to mention those were stagnant times for node reductions; many of the foundries were struggling to get FinFET up and running.

I mentioned before that Nintendo did a $1 billion deal with IBM just for CPU tech, so they aren't strangers to spending money if it benefits the long game. They're all-in on Switch, as it's currently their only avenue for generating longevity in the hardware space, so we might take this as a sign of them getting more aggressive instead of less. I only mention that because they don't have another successful platform to fall back on, like they did with the 3DS when the Wii U was failing...

https://www.arm.com/company/news/2018/08/accelerating-mobile-and-laptop-performance

Last edited by Ck1x - on 15 April 2020

The Switch 2 doesn't need to be 4K in handheld mode; we can barely see the difference between 1080p and 4K on small screens. In TV mode, though, it's possible. What they need to focus on is big cartridges that can hold 100 GB, to contain all of the high-quality 4K assets; even the PS4 Pro struggles because it doesn't have an Ultra HD Blu-ray drive, and the Xbox One X needs to download more than 100 GB to upgrade some games to run at 4K.
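There's simple math behind the small-screens point: at handheld sizes, 1080p already approaches the ~300 PPI rule of thumb for visual acuity at arm's length. A sketch assuming a hypothetical 8-inch panel:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    # Pixels per inch = diagonal resolution / diagonal panel size
    return math.hypot(width_px, height_px) / diagonal_inches

print(ppi(1920, 1080, 8.0))  # ~275 PPI at 1080p
print(ppi(3840, 2160, 8.0))  # ~551 PPI at 4K - far beyond what the eye resolves
```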



HollyGamer said:
The Switch 2 doesn't need to be 4K in handheld mode; we can barely see the difference between 1080p and 4K on small screens. In TV mode, though, it's possible. What they need to focus on is big cartridges that can hold 100 GB, to contain all of the high-quality 4K assets; even the PS4 Pro struggles because it doesn't have an Ultra HD Blu-ray drive, and the Xbox One X needs to download more than 100 GB to upgrade some games to run at 4K.

Well, we do know that both Microsoft and Sony are expecting file sizes to go down on the next systems, because redundant copies of files no longer need to be stored on the drive thanks to the massively improved seek times of SSDs... I do agree, though: it would be pretty pointless for Switch 2 to try or need to produce native 4K visuals with such advanced AI reconstruction techniques available.



Ck1x said:
HollyGamer said:
The Switch 2 doesn't need to be 4K in handheld mode; we can barely see the difference between 1080p and 4K on small screens. In TV mode, though, it's possible. What they need to focus on is big cartridges that can hold 100 GB, to contain all of the high-quality 4K assets; even the PS4 Pro struggles because it doesn't have an Ultra HD Blu-ray drive, and the Xbox One X needs to download more than 100 GB to upgrade some games to run at 4K.

Well, we do know that both Microsoft and Sony are expecting file sizes to go down on the next systems, because redundant copies of files no longer need to be stored on the drive thanks to the massively improved seek times of SSDs... I do agree, though: it would be pretty pointless for Switch 2 to try or need to produce native 4K visuals with such advanced AI reconstruction techniques available.

Microsoft and Sony have also invested big in compression for next-gen; that should bring file sizes down.
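For a feel of what lossless compression does to on-disk sizes, here's a minimal sketch with zlib (the DEFLATE codec the Series X's decompression block supports; the PS5 uses Oodle Kraken, which isn't in Python's standard library). The sample data is only illustrative; real asset ratios fall in between:

```python
import os
import zlib

incompressible = os.urandom(1_000_000)            # e.g. already-compressed audio
repetitive = b"grass_tile_variant_03 " * 50_000   # e.g. duplicated level data

for name, data in [("random", incompressible), ("repetitive", repetitive)]:
    ratio = len(zlib.compress(data, level=9)) / len(data)
    print(f"{name}: {ratio:.0%} of original size")
# random: ~100% (no gain), repetitive: ~1% - real game data lands in between.
```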



--::{PC Gaming Master Race}::--

Bigger cards would be good, but Nintendo really needs to put more storage in the console. 32GB was already too little when the Wii U launched, and it's a joke to have that much on the Switch.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Pemalite said:




Bofferbrauer2 said:

Orin is rated for a TDP of 65W; that's way too much for a handheld format like the Switch. Also, it's heavily geared towards deep learning and AI, too much for its performance not to end up squandered or very hard to tap into (similar to the Cell in the PS3). I'd rather expect a new Tegra chip with more consumer-oriented hardware, which NVidia probably wants to develop anyway for its Shield line of products, which uses the same chips as the Switch does.

In other words, I'm expecting something with 4-8 CPU cores (Carmel or Hercules), 384-512 CUDA cores (Turing or next-gen) and a 128-bit bus with 8-16GB of LPDDR4X or LPDDR5, without too many bells or whistles, as those are currently too taxing for a handheld format, even at 7 or 5nm. And that should be enough to get PS4-like performance out of a handheld anyway.

Yeah. Nintendo will go where the technology goes - they don't give a crap about making a clean break and doing their own thing, as Nintendo always does.

But Tegra is still improving.

Orin will be scalable, just like Tegra X1/Maxwell.
Configurable TDPs and semi-custom alterations are a thing, you know.

Tegra Xavier, for example, has 10W, 15W and 30W operating modes.

Plus we don't have all the information on what Orin actually is.


Bofferbrauer2 said:

But wouldn't that make taking the Switch out of the dock risky? I mean, it would get pretty hot running like that...

As for the Surface Pro X, again, try giving that one a consistent high load. 2.1 TFLOPS is its peak performance, but it's far from holding that under full load. Case in point: it's supposed to be at 2.1 TFLOPS, but its graphics score just edges out the UHD 620 in Intel 8250U laptops in 3DMark's Night Raid graphics benchmark and gets easily beaten by the Vega 8 in a Ryzen 5 2500U, despite the latter only claiming 1.1 TFLOPS.

In other words, the SQ1 at ~7W wouldn't do much better than the Tegra, if at all.

The emphasis on FLOPS is misleading; it's not the be-all and end-all.

While Xavier can be configured down, it loses a lot of performance along the way. At 10W, its GPU performance is barely above the Switch under sustained stress, though the CPU is still way above the A57s in the Switch. If tweaked properly, I think one could achieve a 40-50% increase over OG Switch performance with Xavier at the same power draw.



Bofferbrauer2 said:

While Xavier can be configured down, it loses a lot of performance along the way. At 10W, its GPU performance is barely above the Switch under sustained stress, though the CPU is still way above the A57s in the Switch. If tweaked properly, I think one could achieve a 40-50% increase over OG Switch performance with Xavier at the same power draw.

Xavier is based on Volta.
Either way... It's certainly more than 50%.

The move from Tegra X1 to Tegra X2 would bring about a 50% improvement at the same power level... because Pascal (Tegra X2) basically takes Maxwell (Tegra X1) and re-engineers it (i.e. more transistors) to sustain higher clock rates without a significant power penalty.

We saw this occur on the PC: significant clock rate improvements at the same power level.

https://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/6

Now, Tegra Volta (let's call it Tegra X3 for simplicity's sake!) takes what was good about Pascal but makes a plethora of quality-of-life changes - more CUDA cores, plus Tensor cores (AI upscaling!) - and that was able to provide a 10-25% uplift over the Pascal Titan on the PC. And that was with shit drivers.
https://www.anandtech.com/show/12170/nvidia-titan-v-preview-titanomachy/7
https://www.anandtech.com/show/13282/nvidia-turing-architecture-deep-dive/4

I wouldn't be surprised if we saw a 75%-100% performance increase as a low-ball estimate once we account for the architectural efficiency gains, clock rates and additional functional units.
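Those individual gains compound multiplicatively, which is where that kind of figure comes from. Illustrative midpoints taken from the claims above, not measurements:

```python
pascal_over_maxwell = 1.50  # X1 -> X2 at the same power level (claim above)
volta_arch_uplift = 1.20    # midpoint of the 10-25% Titan V uplift cited above
combined = pascal_over_maxwell * volta_arch_uplift
print(f"{combined:.2f}x")   # 1.80x, i.e. +80% - inside the 75-100% low-ball range
```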

We also need to keep in mind that Tegra X3 is built at 12nm, not a modern 7nm, so there is still some potential left on the table to reduce power consumption, increase clock rates or add functional units (with the 512 CUDA core variant).


https://www.anandtech.com/show/13584/nvidia-xavier-agx-hands-on-carmel-and-more

https://www.anandtech.com/show/15070/nvidia-gives-jetson-xavier-a-trim-announces-nanosized-jetson-xavier-nx


Orin changes it all up again, of course.




--::{PC Gaming Master Race}::--