Forums - Nintendo Discussion - Could Next-Gen Switch Use Nvidia DLSS AI Upscaling?


https://www.youtube.com/watch?v=coOzBPGl-O8

Nvidia's DLSS upscaler is starting to become seriously good - and AI upscaling has been added to the latest Shield Android TV. So what happens if we apply *both* of these technologies to Switch titles? Could AI inferencing be a good match for a next-gen Switch? We've got some practical tests here that deliver some fascinating results.

---

End result, magic does not exist.
But it could work on some titles.



@Twitter | Switch | Steam

You say tomato, I say tomato 

"¡Viva la Ñ!"


Yes they can, because the Switch 2 will be using a new GPU that supports AI scaling.



If the Switch 2 has Tensor cores, then yes it can have DLSS.
If it doesn't, then no. No it can not have DLSS.

Basically, the technology already exists today for a hypothetical Switch 2 to have DLSS... if it used Tegra Xavier or Tegra Orin.



--::{PC Gaming Master Race}::--

Overall I think DLSS kinda sucks; there are other methods just as good or better.

But he makes a good point... a more powerful Switch using DLSS might be able to match a base PS4 in terms of image quality.
Maybe the Switch 2 ends up around what a base PS4 is today, via scaling tricks.



At uni 13 years ago I played with neural networks for my project on image and video enhancement. The videos used to take hours to process lol.

To think that this can now be done in a video game, in real time, is insane. I'm sure people will come up with their own methods for this in the future, and Nintendo should definitely look into it. It might be the thing their next-gen Switch needs to compensate for the lack of raw power.
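As a toy illustration of the train-on-pairs idea (my own numpy sketch, nothing from the video; all names are made up), even a two-weight "network" can learn an upscaling rule by gradient descent from (low-res, high-res) pairs of a 1-D signal, rediscovering plain linear interpolation:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_pair(n=64):
    """A smooth-ish 'high-res' 1-D signal and its 2x-downsampled version."""
    hi = 0.2 * np.cumsum(rng.standard_normal(n))
    return hi, hi[::2]

# Learn two weights that predict each dropped sample from its surviving
# left/right neighbours: a two-parameter "upscaling network".
w = np.zeros(2)
lr = 0.01
for _ in range(2000):
    hi, lo = make_pair()
    left, right = lo[:-1], lo[1:]
    target = hi[1:-1:2]          # the samples removed by downsampling
    err = w[0] * left + w[1] * right - target
    grad = np.array([(err * left).mean(), (err * right).mean()])
    w -= lr * grad

print(w)  # ends up near [0.5, 0.5]: it has learned linear interpolation
```

DLSS is this idea scaled up enormously: many more weights, 2-D images, and training on frames from the actual game instead of random signals.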



 

 


As far as I understand DLSS, the games that use it need to be processed intensively by a server farm before the AI rendering result is as good as a higher native res while being faster in performance. I don't think we've heard how long it really takes to get there (we just know the earliest DLSS-enabled titles didn't really beat using a slightly lower native res).

Obviously server farms will get beefier and this whole process should become better and faster in the future, but I'm still quite sure it won't get to the point that everyone can use it for their games any time soon.

Last edited by Lafiel - on 06 February 2020

Lafiel said:

As far as I understand DLSS, the games that use it need to be processed intensively by a server farm before the AI rendering result is as good as a higher native res while being faster in performance. I don't think we've heard how long it really takes to get there (we just know the earliest DLSS-enabled titles didn't really beat using a slightly lower native res).

Obviously server farms will get beefier and this whole process should become better and faster in the future, but I'm still quite sure it won't get to the point that everyone can use it for their games any time soon.

It's like checkerboarding... it's a technique to use less to show more.
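For reference, the checkerboarding idea mentioned here can be sketched in a few lines (an illustrative numpy toy, not any console's actual implementation): render only half the pixels each frame in an alternating checkerboard pattern, and fill the holes from the previous frame.

```python
import numpy as np

def checkerboard_mask(h, w, phase):
    """True where the current frame actually renders a pixel; the
    phase flips every frame so coverage alternates."""
    yy, xx = np.indices((h, w))
    return (yy + xx) % 2 == phase

def reconstruct(current, previous, phase):
    """Keep the half rendered this frame, fill the rest from last frame."""
    mask = checkerboard_mask(*current.shape, phase)
    return np.where(mask, current, previous)

# Two consecutive "frames" of a static 4x4 scene:
truth = np.arange(16, dtype=float).reshape(4, 4)
frame0 = np.where(checkerboard_mask(4, 4, 0), truth, 0.0)  # even-phase pixels
frame1 = np.where(checkerboard_mask(4, 4, 1), truth, 0.0)  # odd-phase pixels
full = reconstruct(frame1, frame0, 1)
print(np.array_equal(full, truth))  # True: a static scene reconstructs exactly
```

On a static scene the two phases tile perfectly; real implementations additionally use motion vectors to reproject the previous frame's pixels when the camera or objects move.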

Honestly, most games that ship with a resolution scaling option (like 1080p at 70%) plus a sharpening filter
could be near equal to, or better than, DLSS.


At least it was like that early on. DLSS isn't magic; it's a lot of work for little gain.


https://www.youtube.com/watch?v=dADD1I1ihSQ

https://www.youtube.com/watch?v=3DOGA2_GETQ

DLSS is overrated by far.
Resolution scaling + sharpening filters beat it in a lot of games
(without games needing to bundle additional DLSS data, or server farms making these optimisations).
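The res-scaling-plus-sharpening pipeline being described is simple enough to sketch (a minimal numpy toy assuming a grayscale framebuffer; the function names are mine): render at ~70% resolution, bilinearly upscale to the output size, then apply an unsharp mask.

```python
import numpy as np

def bilinear_upscale(img, scale):
    """Naive bilinear upscale of a 2-D grayscale image."""
    h, w = img.shape
    H, W = round(h * scale), round(w * scale)
    ys = np.linspace(0, h - 1, H)
    xs = np.linspace(0, w - 1, W)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

def unsharp_mask(img, amount=0.5):
    """Sharpen by adding back the difference from a 3x3 box blur."""
    pad = np.pad(img, 1, mode="edge")
    blur = sum(pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
               for dy in range(3) for dx in range(3)) / 9.0
    return img + amount * (img - blur)

rng = np.random.default_rng(0)
low = rng.random((14, 14))                         # stand-in for a 70%-scale framebuffer
out = unsharp_mask(bilinear_upscale(low, 10 / 7))  # back to full resolution
print(out.shape)  # (20, 20)
```

Unlike DLSS, this needs no per-game training data: it is a fixed filter, which is exactly why it is cheap and universal, and also why it cannot invent plausible detail the way a trained network can.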

People need to understand that DLSS has downsides too.
The "deep learning" of the machine determines how things turn out.
Sometimes this manifests as a slight blur plus an oil-painting effect when you look at things.

Last edited by JRPGfan - on 07 February 2020

JRPGfan said:

It's like checkerboarding... it's a technique to use less to show more.

Honestly, most games that ship with a resolution scaling option (like 1080p at 70%) plus a sharpening filter
could be near equal to, or better than, DLSS.


At least it was like that early on. DLSS isn't magic; it's a lot of work for little gain.

I agree that the early implementations weren't worth it, but the most recent DLSS implementations in Wolfenstein: Youngblood and Deliver Us the Moon show that the technique can do what it promised to do: show visual results on par with a given native res while offering clearly better performance (and clearly looking better than res scaling + sharpening at an equivalent performance).

We unfortunately don't know whether these results are easily reproducible with the majority of titles, or whether Ws:YB/DutM are just particularly well suited to DLSS. Neither do we know how long / how much processing time it takes Nvidia to train the AI enough to get these results, which would determine how feasible (from a cost/time/energy point of view) it is to implement it in 100+ very demanding games per year.



Yes.
The problem with the first test was that he used an AI trained on generic movies.
The key point is to train the network on the specific content.
The developers would provide tons of 4K, antialiased, etc. images to the network along with the corresponding 360p/720p/1080p native images, train for a week or two, and load the model along with the game. The result would be far, far better. Training it to avoid jaggies while adding detail would be a piece of cake, especially since it only has to resolve the types of images that actually occur in the game.
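The data-preparation step described above can be sketched like this (illustrative numpy only; `box_downsample` here is a crude stand-in for actually rendering the same frame at a lower native resolution):

```python
import numpy as np

def box_downsample(hi, factor):
    """Average factor x factor pixel blocks; a crude stand-in for
    rendering the same frame at a lower native resolution."""
    h, w = hi.shape
    assert h % factor == 0 and w % factor == 0
    return hi.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def make_training_set(frames, factor):
    """Pair every ground-truth frame with its low-res counterpart;
    these (input, target) pairs are what the network trains on."""
    return [(box_downsample(f, factor), f) for f in frames]

rng = np.random.default_rng(1)
frames = [rng.random((12, 12)) for _ in range(4)]  # stand-ins for 4K captures
pairs = make_training_set(frames, factor=3)
print(pairs[0][0].shape, pairs[0][1].shape)  # (4, 4) (12, 12)
```

Training then means adjusting the network's weights until its output on each low-res input matches the paired ground-truth frame; the shipped game only needs the final weights, not the training set.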

I don't know how much processing power forwarding an image through a network would take; I first supposed it was a lot, so it could be incorporated into the dock (since it happens after rendering time): a higher-priced dock that upscales to 4K, sold as an optional accessory for high-end users.
However, the video reminded me that Nvidia's new mobile chips also bring the Tensor cores embedded in the SoC; that could be in the Switch system itself, and far cheaper than including them in a dock specially for it...

Anyway, I had already considered this, even before the RTX boards with DLSS:
http://gamrconnect.vgchartz.com/thread.php?id=236112&page=1



I don't know, but AI scaling sounds interesting.