
Forums - Nintendo - How Will Switch 2 Be, Performance-Wise?

 

Switch 2 is out! How do you classify it?

Terribly outdated! - 3 (5.26%)
Outdated - 1 (1.75%)
Slightly outdated - 14 (24.56%)
On point - 31 (54.39%)
High tech! - 7 (12.28%)
A mixed bag - 1 (1.75%)

Total: 57
borisr said:

What do you do on the Deck except gaming? I don't understand. It's a gaming platform, like consoles. A usual desktop PC is not a gaming platform at all.

The Steam Deck does have a desktop mode, so essentially it can also function as an ordinary PC (Linux kernel).

You can install apps and all, including emulators up to the PS3/Wii U.


I can only see the Switch 2 version of DLSS being better than the regular RTX DLSS.
If I had to guess, the training pairs would be specific to the Switch's context: 540p input to a 1080p target, while on RTX the typical pattern is 1080p native with targets above that.
So, being specialized to those settings, it could perform much better in the scenario it was trained for.
We have seen how bad 540p to 1080p (or higher) looks on Ampere, from DF's old RTX 2060 test.
Now it is much better on Switch 2. Lots of artifacts that are seen in PC DLSS (even considering the 2060 test) are gone. Even considering the higher-end 5000-series boards, if we start from 540p for a fair comparison, I still think Switch 2 will be better; the common artifacts don't seem to be present consistently, and it is fooling the specialists into thinking there is no DLSS at all.



Yes, Switch 2 with its meager Tensor cores is no doubt doing better than nVidia's PC GPUs...

/Picard-Riker-Worf triple facepalm



HoloDust said:

Yes, Switch 2 with its meager Tensor cores is no doubt doing better than nVidia's PC GPUs...

/Picard-Riker-Worf triple facepalm

To be fair, there are odd things about Switch 2's DLSS implementation and its performance costs.

For example, based on their simulated testing of DLSS latency on an RTX 2050, Digital Foundry thought upscaling to 4K 60fps would be impossible due to the frametime penalty (18.6 ms), but it looks like Fast Fusion is doing exactly that. (Although DF's simulation was probably flawed from the start, in its assumptions.)

https://youtu.be/JJUn7Kc3W3A?si=3naNeN2QiM5T_gwf
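DF's reasoning is easy to sanity-check with back-of-envelope arithmetic: at 60fps, the quoted upscale cost alone already exceeds the entire frame budget. A quick sketch (the 18.6 ms figure is DF's measurement; everything else is plain arithmetic):

```python
# Sanity-check of DF's frametime argument. The 18.6 ms DLSS-to-4K cost
# comes from their RTX 2050 simulation; the budgets are just 1000/fps.
def frame_budget_ms(fps: float) -> float:
    """Per-frame time budget in milliseconds at a given framerate."""
    return 1000.0 / fps

dlss_cost_ms = 18.6                # DF's simulated DLSS upscale cost
budget_60 = frame_budget_ms(60)    # ~16.67 ms
budget_30 = frame_budget_ms(30)    # ~33.33 ms

# At 60 fps the upscale alone overshoots the whole frame budget, before
# any actual rendering work -- hence DF's "impossible" verdict.
print(dlss_cost_ms > budget_60)    # True
# At 30 fps it fits, leaving ~14.7 ms for rendering.
print(dlss_cost_ms < budget_30)    # True
```

Which is exactly why Fast Fusion apparently hitting 4K 60fps suggests either a cheaper model or a flawed simulation.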

When Nvidia trains these models, they are probably training dozens to hundreds, most of which they reject. It is possible that one of those rejected models, a distillation of a non-rejected model, or even a brand-new model is what is running on Switch 2. A lot might also be achieved by pruning a model when targeting a given input range, which is far narrower on Switch 2 than on PC. Basically, you might be able to reduce parameter count (and therefore inference compute) without reducing quality too much, because certain nodes might only activate when the input resolution is higher than what Switch 2 typically achieves. Pruning can improve the model's efficiency. Although this raises the question of why Nvidia hasn't tiered DLSS like this on PC for different hardware ranges. Development complexity might be part of why. Or it might just be that there isn't much to prune.
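The pruning idea can be sketched in a few lines. This is a toy magnitude-pruning pass on a random weight matrix, purely illustrative; nobody outside Nvidia knows whether, or how, the Switch 2 model is actually pruned:

```python
import numpy as np

# Toy magnitude pruning: zero out the smallest-magnitude weights of a
# layer, keeping only the most "important" ones. Illustrative only.
rng = np.random.default_rng(0)
weights = rng.normal(size=(256, 256))   # stand-in for one layer's weights

def prune_by_magnitude(w: np.ndarray, keep_fraction: float) -> np.ndarray:
    """Keep the largest-|w| fraction of weights; zero out the rest."""
    k = int(w.size * keep_fraction)
    # k-th largest absolute value becomes the cutoff threshold
    threshold = np.sort(np.abs(w), axis=None)[-k]
    return np.where(np.abs(w) >= threshold, w, 0.0)

pruned = prune_by_magnitude(weights, keep_fraction=0.5)
print(np.count_nonzero(pruned) / weights.size)   # 0.5
```

Half the parameters (and, on hardware that exploits sparsity, roughly half the multiply-accumulates) are gone; the open question is how much quality survives, which is why real pruning is followed by fine-tuning and evaluation.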

Personally, having played a few games (Cyberpunk, Hogwarts Legacy), I don't think the Switch 2's DLSS implementations are much better or worse than what we've seen in PC performance modes at 1080p. Even if the tell-tale artifacts are hard to identify, it could just be that we are making wrong assumptions about the models being used. We still don't know what sort of model or models are even running on Switch 2. The use of DRS and unique post-processing techniques makes this even trickier.



sc94597 said:
HoloDust said:

Yes, Switch 2 with its meager Tensor cores is no doubt doing better than nVidia's PC GPUs...

/Picard-Riker-Worf triple facepalm

To be fair, there are odd things about Switch 2's DLSS implementation and its performance costs.

For example, based on their simulated testing of DLSS latency on an RTX 2050, Digital Foundry thought upscaling to 4K 60fps would be impossible due to the frametime penalty (18.6 ms), but it looks like Fast Fusion is doing exactly that. (Although DF's simulation was probably flawed from the start, in its assumptions.)

https://youtu.be/JJUn7Kc3W3A?si=3naNeN2QiM5T_gwf

When Nvidia trains these models, they are probably training dozens to hundreds, most of which they reject. It is possible that one of those rejected models, a distillation of a non-rejected model, or even a brand-new model is what is running on Switch 2. A lot might also be achieved by pruning a model when targeting a given input range, which is far narrower on Switch 2 than on PC. Basically, you might be able to reduce parameter count (and therefore inference compute) without reducing quality too much, because certain nodes might only activate when the input resolution is higher than what Switch 2 typically achieves. Pruning can improve the model's efficiency. Although this raises the question of why Nvidia hasn't tiered DLSS like this on PC for different hardware ranges. Development complexity might be part of why. Or it might just be that there isn't much to prune.

Personally, having played a few games (Cyberpunk, Hogwarts Legacy), I don't think the Switch 2's DLSS implementations are much better or worse than what we've seen in PC performance modes at 1080p. Even if the tell-tale artifacts are hard to identify, it could just be that we are making wrong assumptions about the models being used. We still don't know what sort of model or models are even running on Switch 2. The use of DRS and unique post-processing techniques makes this even trickier.

I guess we'll see the results in time - IIRC (I watched that two days ago), DF specifically called out DLSS disocclusion artifacts and image-quality problems in Fast Fusion, so I'm waiting to see how Hogwarts fares in their tests, since that was the worst-case scenario, with edges falling apart in quite a few places in that video they were commenting on some time ago.






Hadn't realized the docked-mode internal resolution goes all the way up to 1080p. That definitely explains why the game looks very clean at certain points.

Overall, the Switch 2 settings seem to span the whole gamut, from worse than PS4 (shadows) to PS5-equivalent (textures).

Makes you wonder what a fully Lovelace Switch 2 could've achieved.



Yeah, it looks very decent - it seems in Phantom Liberty it does suffer from severe frame drops on occasion, and they really had to dial down traffic and NPC counts overall, but it's quite a good port.



Cyberpunk is pretty on Switch 2 and beats the Series consoles in certain aspects, which is great for a handheld.



sc94597 said:

Makes you wonder what a fully Lovelace Switch 2 could've achieved.

Makes me wonder what the UDNA PlayStation Portable will achieve (which will double as a preview for the Switch 3, assuming Nintendo goes for an N2/18A/16A node at lower power). I just hope it's not too expensive...