zeldaring said:
Thank you Pemalite. Honestly, I blame DF; they basically explained DLSS like an advertisement and didn't talk about any of the negatives. Everyone seems to think DLSS magically gives you double the GPU power lol. |
There seems to be a subset of users on this forum who cling to buzzwords and run with them without actually understanding what they are or what they mean.
We saw it with the Cell.
We saw it with GDDR5.
We saw it with the ACE units.
We saw it with the PS5's SSD.
We saw it with the Wii U's eDRAM.
We saw it with the Xbox One's eSRAM and the Power of the Cloud.
We literally see it with every single console that gets released. What does it amount to in the end? Stuff all.
When will people stop falling for it?
sc94597 said:
Sure, there is a compute time cost. A cost that was made quite trivial with Ampere, which the Switch 2's GPU is most likely to be an implementation of. If you are wondering what to expect from the Switch 2, you can look up any video showcasing an RTX 3050 Laptop GPU with DLSS on vs. off. Consider, for example, that an RTX 3050 mobile chip and a GTX 1650 mobile chip are very similar in performance (within 25% of each other) without DLSS (and about comparable to what we should expect from the Orin ostensibly in the Switch 2). But almost every tech YouTuber recommends the 3050 mobile, solely because DLSS improves performance by a lot. Consider that with DLSS the 3050 mobile was able to get over the 60fps threshold in Red Dead Redemption 2, whereas it averaged 55fps without DLSS. Similar results were seen in Watch Dogs Legion, Control, and Call of Duty Warzone. There was also a significant boost in Shadow of the Tomb Raider. |
We have no idea how the Switch 2's Tegra will stack up against the 3050. It could be cut down by a large margin and thus worse.
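To put rough numbers on the trade-off, here's a toy frame-time model of how a fixed DLSS cost can still net a higher framerate. Every figure is an illustrative assumption, not a measurement from those games.

```python
# Toy model: frame time = resolution-bound work (scales with pixel count)
#            + resolution-independent work + fixed DLSS cost.
# All millisecond values are made-up assumptions for illustration.
def frame_time_ms(bound_ms, fixed_ms, scale, dlss_ms):
    return bound_ms * scale**2 + fixed_ms + dlss_ms

BOUND, FIXED = 8.0, 10.2   # ~18.2 ms total at native, i.e. ~55 fps
DLSS_COST = 1.5            # assumed fixed upscale cost on an Ampere-class GPU

for label, scale in (("native", 1.0), ("Quality", 0.67), ("Performance", 0.5)):
    dlss = 0.0 if scale == 1.0 else DLSS_COST
    t = frame_time_ms(BOUND, FIXED, scale, dlss)
    print(f"{label:>11}: {t:5.1f} ms -> {1000.0 / t:4.1f} fps")
# native ~55 fps; Quality ~65 fps; Performance ~73 fps under these assumptions.
```

The upscale cost is fixed per frame, so the net win depends on how much of the frame is actually resolution-bound; that is why gains vary so much from game to game.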
Soundwave said:
I've seen DLSS from 360p; it's not some super secret that only you know about. It's not that bad at all. Yes, there are some artifacts, but what are we talking about here? Playing high-end games on a portable machine? Lots of Switch games today look like a borderline blurry/hazy mess in undocked mode, and this looks as good or better, image-quality wise, than several titles on the Switch I could name. On a 7-inch screen in undocked mode this wouldn't be bad at all; it looks better than the likes of DOOM Eternal, Witcher 3, and Xenoblade 3 undocked on Switch, even with only 360p to draw from. 540p looks completely playable even on a 4K TV. |
Thanks for the video. If you watch it full screen on a decent display, it looks absolutely shocking: extremely blurry and undefined.
There simply isn't enough data to infer a clean and sharp image going from 480p to 1440p/4K.
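Some quick pixel-count math shows the scale of the deficit (assuming an 854x480 frame for "480p"):

```python
# Megapixels per frame at common output resolutions, relative to 480p.
resolutions = {"480p": (854, 480), "1080p": (1920, 1080),
               "1440p": (2560, 1440), "4K": (3840, 2160)}

base = 854 * 480  # ~0.41 megapixels
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:>5}: {px / 1e6:4.1f} MP, {px / base:5.1f}x the pixels of 480p")
# At 4K the upscaler has to invent roughly 19 out of every 20 output pixels.
```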
And I would expect it to look better than native 360p-540p games on Switch, considering how much less powerful that hardware is, but that doesn't mean 480p DLSS looks "good" by any stretch of the imagination... Native 1080p on PS4 looks far cleaner.
It may be "playable" for you, but it seems I have higher standards.
zeldaring said:
I mean, almost every impression I read, and even NVIDIA itself, recommends using it at 1440p. It has to be that while you're playing, it doesn't look good at all when upscaling from a low resolution. |
When moving, there is less temporal data to draw from, so image quality actually degrades; hence the need for higher base resolutions.
But when you stand still, the algorithm is able to accumulate data across similar frames and infer a higher-quality output.
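A minimal sketch of that accumulation idea, with made-up alpha and rejection values; real DLSS is a learned model, not this simple blend:

```python
import numpy as np

def accumulate(history, current, motion, alpha=0.1, reject=0.5):
    """Blend the new frame into per-pixel history; discard history under motion."""
    blended = (1.0 - alpha) * history + alpha * current  # converges when static
    return np.where(motion > reject, current, blended)   # moving pixels keep 1 frame

# Demo: a static pixel sampled with noise for 60 frames averages toward the truth.
rng = np.random.default_rng(0)
truth = 0.7
history = np.zeros(1)
for _ in range(60):
    sample = np.array([truth + rng.normal(0.0, 0.1)])
    history = accumulate(history, sample, motion=np.zeros(1))
print(history.round(3))  # ~[0.7] -- the still image converges and sharpens
```

Moving pixels fail the history check and fall back to a single frame's worth of data, which is exactly why the image degrades in motion.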
Oneeee-Chan!!! said:
I wanted to ask why Pemalite determined that the original upscaled image was 360p or 480p without mentioning the next-generation Switch's specifications. What would happen to the Series S in that case? I am sorry, but his text was too long and I could not quote only the necessary parts. |
I was debating those who think they can just run Switch 2 games at 360p and reconstruct them into a 4K image without any issues.
It doesn't look great in the real world.
Obviously Switch 2 specifications haven't been released.
But a mobile device has lower TDP headroom than a fixed console, so the Series S will always retain an advantage... Especially as time goes on and FSR continues to improve.
sc94597 said:
I know it isn't a technological barrier, but was XeSS support announced for consoles? |
Intel has made it platform-agnostic. I think the only hard requirements are INT4 support and DP4a when XMX instructions are not available.
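For reference, DP4a is just a four-element int8 dot product accumulated into an int32; a plain-Python stand-in for the single instruction:

```python
def dp4a(a, b, acc=0):
    """Emulate DP4a: dot product of four int8 pairs, accumulated into an int32."""
    assert len(a) == len(b) == 4
    assert all(-128 <= v <= 127 for v in list(a) + list(b))  # int8 range
    return acc + sum(x * y for x, y in zip(a, b))

print(dp4a([1, -2, 3, 4], [5, 6, -7, 8]))  # 5 - 12 - 21 + 32 = 4
```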
I doubt it will ever see wide adoption.
sc94597 said:
And yes, while the Series S does support FSR 2.0, FSR 2.0 still isn't quite as good as DLSS 2.0. So image quality might not be that different either. A Switch 2 game upscaling from, say, 900p -> 1440p (using DLSS) would probably have better image quality than a Series S game going from 900p -> 1440p (with FSR 2.0), and if they both target a locked 30fps, then the Series S having a better CPU probably won't matter much. Modern CPUs barely bottleneck at sub-60fps framerates, and the predicted Switch CPU (8-core A78AE) is a decent enough ARM chip that at those lowish framerates it wouldn't matter. |
FSR 3.0 is currently rolling out, and it brings with it a plethora of improvements that will benefit the Series S.
sc94597 said:
ARM is also a much more efficient architecture than x86 at low power profiles. |
...Ryzen seems to be doing well on that front.
sc94597 said:
I'd love for the Switch 2 to have a 40Hz mode like the Steam Deck. That would be the best sweet spot in my opinion. 40Hz is a huge latency reduction over 30Hz, while still being pretty attainable for the hardware.
Would be nice to play Metroid Prime 4 with ray tracing, DLSS 1080p, at 40Hz. |
Honestly I would just like a variable refresh rate display in the Switch 2 rather than any fixed arbitrary refresh rate.
That way the display -always- matches the game's output. Dropped frame? Doesn't matter; you won't notice it.
We can't trust developers to ensure a consistent framerate at 30fps, let alone 40 or 60fps, so let the display make up for that instead.
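The frame-time arithmetic behind both points:

```python
# Milliseconds per frame at each refresh rate.
for hz in (30, 40, 60):
    print(f"{hz}Hz -> {1000 / hz:.1f} ms/frame")
# 30Hz -> 33.3, 40Hz -> 25.0, 60Hz -> 16.7: in frame-time terms, 40Hz sits
# exactly halfway between 30 and 60. On a fixed 60Hz panel, a missed 16.7 ms
# deadline costs a whole extra refresh; a VRR panel presents the frame
# whenever it is ready, so a dropped frame never snaps to a visible stutter.
```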
JimmyFantasy said:
- It should perform in the range of 2-3 tflops when docked.
|
2-3 Teraflops of what?
RDNA3, where each compute unit is now dual-issue, which introduces resource contention and reduces performance per teraflop relative to prior hardware?
Or are you talking about rapid packed math, where you get two half-precision teraflops for every single-precision teraflop if the instructions are compatible? (They aren't always compatible, so you never get linear scaling.)
Teraflops truly is a bullshit denominator.
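To show how gameable the single number is, here's the standard peak-FP32 formula with purely hypothetical configurations (Switch 2 specs are unannounced):

```python
def tflops(cores, ghz, ops_per_clock=2):
    """Peak TFLOPS = shader cores x ops/clock (2 for FMA) x clock (GHz) / 1000."""
    return cores * ops_per_clock * ghz / 1000.0

print(tflops(1536, 1.0))                    # 3.07 "plain" FP32 TFLOPS
print(tflops(1536, 1.0, ops_per_clock=4))   # 6.14 counting dual-issue/RPM peaks
# Same hypothetical silicon, double the headline teraflops.
```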
Then Nintendo is likely employing a clamshell memory layout, where 4GB of RAM will operate slower than the other 8GB, likely partitioned for the OS/background tasks, similar to the Series X... because a 192-bit memory bus is a bit much for a cost-sensitive mobile chip.
Either way, 12GB isn't unheard of in mobile devices at the moment.
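Some back-of-the-envelope bandwidth math for such a split; the bus width and data rate here are assumptions, not announced specs:

```python
def bandwidth_gbs(bus_bits, mtps):
    """Peak bandwidth = (bus width in bytes) x (transfers per second)."""
    return bus_bits / 8 * mtps / 1000  # MB/s -> GB/s

print(bandwidth_gbs(128, 6400))  # 102.4 GB/s if all 12GB sat on a full 128-bit bus
print(bandwidth_gbs(64, 6400))   #  51.2 GB/s for a 4GB region spanning only half
                                 #  the channels, Series X style
```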