
How Will the Switch 2 Be Performance-Wise?

 

Switch 2 is out! How do you classify it?

Terribly outdated! - 3 votes (5.26%)
Outdated - 1 vote (1.75%)
Slightly outdated - 14 votes (24.56%)
On point - 31 votes (54.39%)
High tech! - 7 votes (12.28%)
A mixed bag - 1 vote (1.75%)

Total: 57 votes
OdinHades said:

They could just release a "Pro" Dock with an eGPU in it. Slam some mid-tier Nvidia GPU into it, sell it for the same price as the console itself, and I think they would have quite a nice offering, and most certainly a bigger jump than just an all-new handheld in a few years.

But whatever, what do I know.

If it had USB4 or Thunderbolt it might be possible, but I imagine the Switch 2 is using USB 3.2. I say might because I don't know the technicalities of using an eGPU on an ARM-powered device (I don't think it's ever been done before), or whether it would actually be possible on Nvidia's T234/T239 chip.

It would have been an interesting upgrade option though.






Looking at some old posts of mine, I found this funny. 

sc94597 said:

I'd love for the Switch 2 to have a 40Hz mode like the Steam Deck. That would be the sweet spot in my opinion: 40Hz is a huge latency reduction over 30Hz while still being pretty attainable for the hardware.

Would be nice to play Metroid Prime 4 with ray-tracing, DLSS 1080p, at 40Hz.

Even as moderately optimistic as I was, I didn't think Nintendo would go for a high frame-rate setup for their games. Still, maybe Metroid Prime 5 (let's hope we get it) will have a ray-tracing mode at a variable 40-60Hz, although that is roughly what Cyberpunk 2077's performance mode has turned out to be (absent the ray-tracing).

My prediction from September 2023 wasn't far off, though given that we knew pretty much everything about the hardware by then, it's not much of a "prediction." Still, I didn't think we'd get 4K output in a modern 3D title when I said "Some simple 2d/2.5d titles would do 1080p -> 1800p or 4k."



sc94597 said:

Looking at some old posts of mine, I found this funny. 

sc94597 said:

I'd love for the Switch 2 to have a 40Hz mode like the Steam Deck. That would be the sweet spot in my opinion: 40Hz is a huge latency reduction over 30Hz while still being pretty attainable for the hardware.

Would be nice to play Metroid Prime 4 with ray-tracing, DLSS 1080p, at 40Hz.

Even as moderately optimistic as I was, I didn't think Nintendo would go for a high frame-rate setup for their games. Still, maybe Metroid Prime 5 (let's hope we get it) will have a ray-tracing mode at a variable 40-60Hz, although that is roughly what Cyberpunk 2077's performance mode has turned out to be (absent the ray-tracing).

My prediction from September 2023 wasn't far off, though given that we knew pretty much everything about the hardware by then, it's not much of a "prediction." Still, I didn't think we'd get 4K output in a modern 3D title when I said "Some simple 2d/2.5d titles would do 1080p -> 1800p or 4k."

Retro Studios games always run at 60fps; even Metroid Prime 3 on Wii was 60fps. If there's a Prime 5 by them, it will be too.

Heck, even on Wii or Switch 1, a lot of Nintendo games targeted those systems' maximum 60fps output, from Mario Galaxy to Odyssey to Splatoon to Metroid Dread to ARMS.



Zippy6 said:
OdinHades said:

They could just release a "Pro" Dock with an eGPU in it. Slam some mid-tier Nvidia GPU into it, sell it for the same price as the console itself, and I think they would have quite a nice offering, and most certainly a bigger jump than just an all-new handheld in a few years.

But whatever, what do I know.

If it had USB4 or Thunderbolt it might be possible, but I imagine the Switch 2 is using USB 3.2. I say might because I don't know the technicalities of using an eGPU on an ARM-powered device (I don't think it's ever been done before), or whether it would actually be possible on Nvidia's T234/T239 chip.

It would have been an interesting upgrade option though.

Latency is the killer.
USB is higher latency than PCI-E.

USB can take anywhere from 100-300 microseconds to do a single transaction.
PCI-E can take anywhere from 1-3 microseconds to do a single transaction.

In short, whilst you can use an external GPU via USB, you will introduce significant latency into the rendering, which will cause frame pacing issues.

The technology world has moved away from multi-GPU because of glaring issues like the ones I alluded to above.

You are better off just making a bigger Tegra GPU and disabling half of the GPU whilst in portable mode, which is far more feasible and efficient if you power-gate that part of the chip.

Or just drive the clock speeds up and down to hit various power targets, which has been Nintendo's approach; then the only investment they need to make is in the cooling.
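
To put rough numbers on the latency point, here's a back-of-envelope sketch. The per-transaction latencies are the figures above; the number of round-trips per frame is an assumed, illustrative figure, not a measurement.

```python
# Back-of-envelope: per-transaction latency vs. a 60fps frame budget.
# Per-transaction latencies come from the figures above; the round-trip
# count per frame is an assumed, illustrative number.

FRAME_BUDGET_MS = 1000 / 60  # ~16.7 ms per frame at 60fps

links_us = {
    "USB":   (100, 300),  # microseconds per transaction
    "PCI-E": (1, 3),
}

# Assume a frame involves ~50 synchronous round-trips to the GPU
# (command submission, fences, readbacks) -- a made-up figure.
ROUND_TRIPS = 50

for name, (lo, hi) in links_us.items():
    lo_ms = lo * ROUND_TRIPS / 1000
    hi_ms = hi * ROUND_TRIPS / 1000
    print(f"{name}: {lo_ms:.2f}-{hi_ms:.2f} ms of the {FRAME_BUDGET_MS:.1f} ms budget")

# USB:   5.00-15.00 ms -> most of the frame budget, hence the frame pacing issues
# PCI-E: 0.05-0.15 ms  -> effectively free
```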




www.youtube.com/@Pemalite


Lol damn, this made me remember back in the day when people had dual Nvidia 8800s.



Bite my shiny metal cockpit!

Leynos said:

Lol damn, this made me remember back in the day when people had dual Nvidia 8800s.

The last time I had a multi-GPU setup was back in the Radeon HD 6950 era; I unlocked the shaders to turn them into 6970s and ran four of them in CrossFire... and I was driving a 5760x1080 resolution.

It was mental, way ahead of the PS4, before the PS4 even launched.

But it was a nightmare to get everything running great.

The first multi-GPU setup I had was a pair of 3dfx Voodoo 2s in SLI back in the 90's. It absolutely obliterated consoles like the PS1 and Nintendo 64; 1024x768 @ 60fps was achievable when the N64 was struggling with 640x480 @ 30fps.
I also had an ATI Rage Fury MAXX at one point. Mental card: two Rage chips on a single PCB running in an early precursor to ATI CrossFire.


...Then it stopped when PCs migrated to AGP, since boards only had a single AGP slot.
...Then multi-GPU came back when all the slots ended up being the same again with PCI-E.

History loves to repeat itself, even with technology.




www.youtube.com/@Pemalite

curl-6 said:

Retro Studios games always run at 60fps, even on Wii Metroid Prime 3 was 60fps. If there's a Prime 5 by them, it will be too.

Heck, even on Wii or Switch 1, a lot of Nintendo games targeted those system's maximum 60fps output, from Mario Galaxy to Odyssey to Splatoon to Metroid Dread to ARMS.

While this was true for their past titles, the ability to support framerates between 30fps and 60fps (or a variable rate between the two) means Retro is no longer constrained to 60fps to achieve their artistic vision, which includes responsive controls/gameplay.

For example, 40fps gets you exactly halfway between 30fps and 60fps in terms of frame-time/latency (16.7ms at 60fps vs. 25ms at 40fps vs. 33.3ms at 30fps) without that much performance cost (+33% over 30fps). It is definitely a viable option they never had before, and it would be silly for them to ignore it solely to keep up with tradition.
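
A quick sanity check of that arithmetic (pure frame-time math; nothing assumed beyond the target framerates themselves):

```python
# Frame-time arithmetic for the 30/40/60fps comparison above.
for fps in (30, 40, 60):
    print(f"{fps}fps -> {1000 / fps:.1f} ms per frame")
# 30fps -> 33.3 ms, 40fps -> 25.0 ms, 60fps -> 16.7 ms

# 25 ms is exactly the midpoint of 33.3 ms and 16.7 ms...
print((1000 / 30 + 1000 / 60) / 2)  # 25.0

# ...while the extra rendering cost over 30fps is only a third more frames.
print(f"{40 / 30 - 1:.0%}")  # 33%
```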

Also, I was talking about 120fps when I mentioned high frame-rates being something I wasn't expecting. Sixty is pretty much average these days, although your point stands in that this wasn't necessarily the case in the 6th and 7th generations.

Last edited by sc94597 - on 22 April 2025

Pemalite said:

Latency is the killer.
USB is higher latency than PCI-E.

USB can take anywhere from 100-300 microseconds to do a single transaction.
PCI-E can take anywhere from 1-3 microseconds to do a single transaction.

In short, whilst you can use an external GPU via USB, you will introduce significant latency into the rendering, which will cause frame pacing issues.

Pretty much all Intel Thunderbolt controllers since the release of Titan Ridge, and especially ASMedia's ASM2464PD in recent years, have greatly reduced latency over TB4/USB4, to the point where you can get quite stable framerates with little jitter above 60fps (when outputting to an external display), albeit with a performance penalty (~10-20% for the ASM2464PD and 20-40% for Titan Ridge).

We haven't gotten any tests yet, since no eGPUs have been released with TB5 (though Asus has announced one for release soon), but Barlow Ridge is likely not going to bottleneck anything but the highest-end GPUs, and only by single-digit percentages at that. It'll probably be very close to PCIe 4.0 x4 OCuLink in terms of performance.

Alpine Ridge controllers (which are nearly a decade old now) did have significant latency issues. The same goes for any platform where the controller is a discrete chip separate from the CPU. 

The big problem with eGPU setups is that they really aren't as plug-and-play/hot-swappable as people think; even the Thunderbolt/USB4 setups have issues. Nintendo would have to go through a lot of R&D to get it to work seamlessly, and the juice just isn't worth the squeeze when, as you mentioned, they could just build better cooling and clock the CPU and GPU higher. 
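
To put rough numbers on the interconnects in question, here's a sketch using nominal bandwidth figures. These are from memory, so treat them as assumptions rather than benchmarks; protocol overhead and controller latency are ignored.

```python
# Nominal PCIe-data bandwidth of the links discussed above.
# Figures are from memory (assumptions, worth double-checking) and
# ignore protocol overhead and controller latency.

links_gbps = {
    "USB 3.2 Gen 2":            10,
    "TB4/USB4 (PCIe tunnel)":   32,   # 40 Gb/s link, ~32 Gb/s usable for PCIe
    "TB5 (PCIe tunnel)":        64,   # Barlow Ridge, 80 Gb/s link
    "OCuLink (PCIe 4.0 x4)":    64,
    "Desktop PCIe 5.0 x16":    512,   # what a high-end GPU gets internally
}

for link, gbps in links_gbps.items():
    print(f"{link:<24} ~{gbps / 8:5.1f} GB/s")

# TB5 and OCuLink PCIe 4.0 x4 land on the same nominal figure, which is
# why TB5 eGPUs should come very close to OCuLink in practice.
```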

Last edited by sc94597 - on 22 April 2025

Leynos said:

Lol damn, this made me remember back in the day when people had dual Nvidia 8800s.

SLI days? :)

I remember back when video cards were standalone, and 3dfx cards were these "extra" cards that you ran side by side with the video card.
You then had a VGA cable that hooked from your video card into the 3dfx card and out towards the monitor.

That was how you ran 3D games back then.
Before 3dfx was bought out by Nvidia.

My bro got a Voodoo card; I was jealous of what it could do.
I remember being blown away seeing games run in 3D, back when everything else was 2D.