Pemalite said:

1. sc94597 said:

Sure, there is a compute-time cost, but it's a cost that was made quite trivial with Ampere, which the Switch 2's GPU is most likely an implementation of. 

If you are wondering what one should probably expect with the Switch 2, you can look up any video showcasing an RTX 3050 Laptop GPU with DLSS on vs. off. 

Consider, for example, that an RTX 3050 mobile chip and a GTX 1650 mobile chip are very similar in performance (within 25% of each other) without DLSS, and about comparable to what we should expect from the Orin ostensibly in the Switch 2. Yet almost every tech YouTuber recommends the 3050 mobile solely because DLSS improves performance by a lot.

Consider that with DLSS the 3050 mobile was able to get over the 60fps threshold in Red Dead Redemption 2, whereas it was averaging 55fps without DLSS. The same was true in Watch Dogs Legion, Control, and Call of Duty Warzone, and there was also a significant boost in Shadow of the Tomb Raider. 

We have no idea how the Switch 2's Tegra will stack up against the 3050. It could be cut down several times over and thus worse.

2. sc94597 said:

I know it isn't a technological barrier, but was XeSS support announced for consoles? 

Intel has made it platform agnostic. I think the only hard requirement is INT4 support and DP4a when XMX instructions are not available.

I doubt it will ever see wide adoption.

3. sc94597 said:

And yes, while the Series S does support FSR 2.0, FSR 2.0 still isn't quite as good as DLSS 2.0, so image quality might not be that different either. A Switch 2 game upscaling from, say, 900p -> 1440p (using DLSS) would probably have better image quality than a Series S game going from 900p -> 1440p (with FSR 2.0), and if they both target a locked 30fps, then the Series S having a better CPU probably won't matter much. Modern CPUs barely bottleneck at sub-60fps framerates, and the predicted Switch CPU (8-core A78AE) is a decent enough ARM chip that at those lowish frame-rates it wouldn't matter. 

FSR 3.0 is rolling out currently, and it brings with it a plethora of improvements that will benefit the Series S.

4. sc94597 said:

ARM is also a much more efficient architecture than x86 at low power profiles. 

...Ryzen seems to be doing well on that front.


5. sc94597 said:

I'd love for the Switch 2 to have a 40hz mode like the Steam Deck. That would be the best sweet-spot in my opinion. 40hz is a huge latency reduction over 30hz, while still being pretty attainable for the hardware.

Would be nice to play Metroid Prime 4 with ray-tracing, DLSS 1080p, at 40hz.

Honestly I would just like a variable refresh rate display in the Switch 2 rather than any fixed arbitrary refresh rate.

That way the display -always- matches the game's output. Dropped frame? Doesn't matter. You won't notice it.

We can't trust developers to ensure a consistent framerate at 30fps, let alone 40 or 60fps; let the display make up for that instead.



6. sc94597 said:

The newest rumor is that it has 12GB of unified memory.

https://www.notebookcheck.net/Consumer-Nintendo-Switch-2-rumored-to-have-more-RAM-than-the-Xbox-Series-S.747820.0.html

Then Nintendo is likely employing a clamshell memory layout, where 4GB of RAM will operate slower than the other 8GB and will likely be partitioned for the OS/background tasks, similar to the Series X... because a 192-bit memory bus is a bit much for a cost-sensitive mobile chip.

Either way, 12GB isn't unheard of in mobile devices at the moment.

1. While it is true we have no idea what the Switch 2's Tegra will be like, a low-TDP mobile 3050 level of performance (like the one in the video) is in line with the upper-end rumors, especially if they switched from Ampere to Lovelace as some of the more recent rumors allude to. Furthermore, it doesn't invalidate the point I was making that DLSS is a significant improvement even for the lowest-end Ampere chips. The 25W mobile 2050 (which is technically GA107, despite the 20-series name) also benefits significantly from DLSS despite being significantly cut down relative to the 3050 mobile. It's often the difference between a game being unplayable and being lockable to 30fps (or 60fps).
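To put rough numbers on why DLSS helps so much on weak GPUs: the per-axis scale factors below are the commonly cited ones for DLSS 2.x's modes, and the 1080p output target is just an example I picked. The snippet only counts shaded pixels and ignores the fixed cost of the DLSS pass itself, so treat it as a sketch of the pixel-cost saving, nothing more.

```python
# Rough pixel-count math behind DLSS's performance win at a 1080p output.
# Per-axis scale factors are the commonly cited values for DLSS 2.x modes.
target_w, target_h = 1920, 1080

modes = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

for mode, scale in modes.items():
    render_w, render_h = int(target_w * scale), int(target_h * scale)
    pixel_ratio = (target_w * target_h) / (render_w * render_h)
    print(f"{mode:>17}: renders {render_w}x{render_h}, "
          f"~{pixel_ratio:.1f}x fewer pixels shaded than native 1080p")
```

Even Quality mode shades less than half the pixels of native 1080p, and Performance mode a quarter, which is why a heavily cut-down chip can still clear a 30fps or 60fps lock with it.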

2. Right, the question of whether or not it can technically support XeSS is moot if no games use it. Nvidia has the incentive (with Nintendo) to push DLSS hard on the Switch 2 in a way Intel doesn't have with respect to XeSS and the current-gen consoles. 

3. We still don't know how well FSR 3 will compare to DLSS. Like DLSS 3.0, it seems to be mostly an Optical Frame Generation release (bleh), but I am sure they are indeed improving their TAAU solution too. The concern I have with AMD's ability to keep up is that as these deep-learning models get better (especially once the modeling process itself is automated by AI), it's going to be very hard for the hand-tuned heuristic methods they've relied on in the past to keep pace (rough sketch of what I mean by that below). AMD will either have to go the machine-learning route themselves, or find some new innovation beyond TAAU. 

It's unfortunate because it gives us less competition in the GPU space, but the work Nvidia is putting into deep-learning inference for 3D graphics (and gaming) is very wide and deep. Every other month they release a new paper about some method of inference. These are all potential future DLSS implementations.
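To make the "heuristic methods" point concrete: TAAU-style upscalers (FSR 2 included) are built around hand-tuned temporal accumulation, i.e. jittered samples blended into a history buffer, with something like a neighbourhood clamp to reject stale history. Here's a toy sketch of that general shape in Python/NumPy. It is not AMD's actual algorithm, it skips reprojection/motion vectors entirely, and it only does the accumulation/history-rejection part (no actual resolution increase); it's just meant to show the kind of hand-picked knobs an ML model learns instead.

```python
import numpy as np

def taau_accumulate(history, current, alpha=0.1):
    """One step of a toy heuristic temporal accumulator.

    history: previously accumulated frame (H, W), assumed already
             reprojected to the current camera (omitted in this toy).
    current: the new, jittered noisy frame (H, W).
    alpha:   hand-tuned blend weight -- exactly the kind of heuristic
             knob a learned upscaler replaces with trained weights.
    """
    # Neighbourhood clamp: constrain history to the min/max of the
    # current frame's 3x3 neighbourhood to reject stale/ghosting data.
    padded = np.pad(current, 1, mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(padded, (3, 3))
    lo = windows.min(axis=(-2, -1))
    hi = windows.max(axis=(-2, -1))
    clamped_history = np.clip(history, lo, hi)

    # Exponential blend of the clamped history with the new sample.
    return alpha * current + (1.0 - alpha) * clamped_history

# Toy usage: accumulate a noisy signal over a few "frames".
rng = np.random.default_rng(0)
truth = rng.random((64, 64))
frame = truth + rng.normal(0, 0.2, truth.shape)  # first noisy frame
for _ in range(16):
    noisy = truth + rng.normal(0, 0.2, truth.shape)
    frame = taau_accumulate(frame, noisy)

print("single-frame error:", np.abs(rng.normal(0, 0.2, truth.shape)).mean())
print("accumulated error :", np.abs(truth - frame).mean())
```

Every constant in there (the blend weight, the clamp window, the rejection rule) is something a human has to tune per failure case, which is the part I don't see scaling against models that learn those decisions from data.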

4. Between 12W and 30W they are doing well indeed. Sub-12W, ARM is still king. 

5. Yeah, true, just having a capable VRR display is better. 
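For reference, the frame-time arithmetic behind that 40Hz "sweet spot" idea (simple math, nothing Switch-specific, assuming perfectly paced frames):

```python
# Frame time in milliseconds for each refresh rate.
for hz in (30, 40, 60):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")
# 30 Hz -> 33.3 ms, 40 Hz -> 25.0 ms, 60 Hz -> 16.7 ms:
# 40 Hz sits exactly halfway between 30 and 60 in frame time,
# which is why it feels like a big latency win over 30 while
# only asking for ~33% more rendering work per second.
```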

6. The dev kits are ostensibly coming with 16GB. Having 8GB dedicated to graphics fits the performance spec of the device. Anything more probably wouldn't make sense anyway.
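And for context on the bus-width point in the quote, the generic bandwidth arithmetic looks like this. The 6400 MT/s LPDDR5 data rate is just an assumed example for illustration, not anything from the rumor:

```python
# Generic memory-bandwidth arithmetic: GB/s = MT/s * (bus width / 8) / 1000.
# The 6400 MT/s LPDDR5 data rate here is an assumed example, not a leak.
data_rate_mts = 6400
for bus_bits in (64, 128, 192):
    gbps = data_rate_mts * (bus_bits / 8) / 1000
    print(f"{bus_bits:>3}-bit bus @ {data_rate_mts} MT/s -> {gbps:.1f} GB/s")
```

Wider buses buy bandwidth linearly, but every extra 32 bits means more pins, more memory packages, and more board cost, which is the whole tension behind the 128-bit vs. 192-bit question on a cost-sensitive mobile chip.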