
How much do you care about the graphical leap between consoles at this point?

DonFerrari said:
Soundwave said:

Well, DLSS 2.0 is only supported by RTX-range cards, so you can't use it on a $200 card. Nvidia will probably then move up to DLSS 3.0 and simply state you must now have an Ampere-based card (3060 or better), and so on and so on. They can cover themselves that way.

But for a closed software/hardware ecosystem like the Switch, Nintendo can simply build it into every development kit so that it's used in basically every game. On a system like the Switch there's no benefit to not using it most of the time.

Guess you missed the point. If DLSS could cover very big gaps in power, then small ones wouldn't even exist, so even the lowest-grade card with DLSS would basically remove any reason to buy the cards above it.

And you also ignore that other reconstruction techniques are already in use; the one on PS4 already made it hard to notice when reconstructing from under 1440p to 4K.

And in the UE5 demo, DF couldn't really see the difference between 1440p and 4K.

Also, you are ignoring that pixel count is just a small part of a game's image quality, and of the graphics budget.

That's for Nvidia to manage how they want; the fact is the technology works. People are hacking games like Control on PC to get it to reconstruct from resolutions as low as 512x288, and it works. My guess is Nvidia will quietly pass the technology on to Nintendo and not push it too hard for the PC market. That would be most sensible for them: it makes a lot of sense for a Switch, while a company that wants to sell higher-end PCs will probably quietly bury or de-emphasize the feature as time goes on.

The PS4 sharpening filter/checkerboard effect is nowhere near as good as DLSS 2.0. Shit, DLSS 1.0 is nowhere near as good as DLSS 2.0. You're talking a different ball game here: this is an AI algorithm that can reconstruct even badly damaged photos, and it looks to me like once they worked out a few kinks they saw a massive improvement in image quality. It will probably get even better with DLSS 3.0 to boot.

Pixel count, i.e. screen resolution, is not a "small part" of the overall graphics pipeline. Anyone who drops from 1440p or 4K settings on their PC down to "only" 720p knows this is not some small difference. At 720p you can crank graphical effects to much higher settings and/or get a much higher frame rate than you could at 1440p or 4K, way higher. That's not some minor part of the puzzle. The downside was always that you were stuck with a lower-resolution image (blurrier, less detailed); that was always the trade-off. But DLSS 2.0 basically cheats that, bringing the image quality back up close to the 1440p/4K one.
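
To put rough numbers on that trade-off, here is a quick back-of-envelope sketch (it assumes shading cost scales roughly linearly with pixel count, which is a simplification):

```python
# Back-of-envelope pixel counts, assuming fragment shading cost scales
# roughly linearly with the number of rendered pixels -- a simplification,
# since geometry, CPU and bandwidth costs don't all scale the same way.
RESOLUTIONS = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = RESOLUTIONS["720p"][0] * RESOLUTIONS["720p"][1]

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name:>6}: {pixels:>9,} pixels (~{pixels / base:.2f}x the shading work of 720p)")
```

Under that assumption, 4K is roughly 9x the shading work of 720p and 1440p roughly 4x, which is why dropping resolution frees up so much headroom for effects or frame rate.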

Last edited by Soundwave - on 24 May 2020

Pemalite said:



There are no guarantees that Nintendo will use DLSS with Switch 2 either; Nintendo does what Nintendo does.

My thinking exactly; Nintendo tends to make weird hardware decisions and not prioritize visual performance, so it would not surprise me if Switch 2 forewent DLSS.



curl-6 said:
Pemalite said:



There are no guarantees that Nintendo will use DLSS with Switch 2 either; Nintendo does what Nintendo does.

My thinking exactly; Nintendo tends to make weird hardware decisions and not prioritize visual performance, so it would not surprise me if Switch 2 forewent DLSS.

The Tegra X1 was a pretty damn powerful chip for a portable device that was supposed to release in 2016; Nintendo's hardware decisions are not that weird of late. The days of Miyamoto getting to do whatever he wants on hardware just because he has some preference are, I think, over.

DLSS is not about cutting-edge power ... it's the exact opposite ... it lets lower-powered hardware punch way above its weight. It's basically tailor-made for a system like the Switch. Nvidia doesn't have a big incentive to push this for PC GPUs, because then people would just buy cheap GPUs, but for a Nintendo system it makes perfect sense.

DLSS on a Switch would mean they could keep their games locked at insanely low resolutions; you might never have to render above 960x540, really (docked). And Tensor cores are basically part of all modern Nvidia graphics chips; even modern Tegra designs like Tegra Xavier have Tensor cores.

I'm sure Nintendo would like to have games like Monster Hunter World 2 and Resident Evil 4 Remake; they stand to make a nice $10/unit profit if the hardware can run those games, so there's no reason not to do it.

Even for "non graphics intensive" games DLSS can have a benefit ... a Switch 2 game that you want to run 1080p undocked for example (like a Kirby game platformer) could just run at 540p instead and lower the power consumption of the chip and consume far less battery. The DLSS algorithm doesn't care whether your image is a high end one or some simple 2D Kirby game, it will reconstruct them the same way, so it will take that 540p image and give you a 1080p one. I'm not sure on this, but I'm pretty sure this could also save from games having to be so large if they can release them with such low native resolutions and let the DLSS reconstruct it, maybe a game that would otherwise be 32GB can now fit on a 16GB cartridge. That saves Nintendo $$$$. 

There are lots of benefits.

Last edited by Soundwave - on 25 May 2020

Can anyone tell me how big of a leap the CPU is compared to the base PS4? 10x, 20x? More?



Soundwave said:

The Tegra X1 was a pretty damn powerful chip for a portable device that was supposed to release in 2016; Nintendo's hardware decisions are not that weird of late. The days of Miyamoto getting to do whatever he wants on hardware just because he has some preference are, I think, over.

Powerful, yes. Class leading? Not really.

Tegra X1 was not the fastest SoC on the market at the time the Switch debuted; Tegra X2 could have offered 50% more performance at the same TDP easily enough. (Your typical Maxwell to Pascal jump.)

Not to mention the Switch castrated its clock rates to ensure that power consumption remained in check.

And... additionally, it only had 4GB of RAM, which could have been boosted to 6-8GB fairly cheaply.

Soundwave said:

DLSS is not about cutting-edge power ... it's the exact opposite ... it lets lower-powered hardware punch way above its weight. It's basically tailor-made for a system like the Switch. Nvidia doesn't have a big incentive to push this for PC GPUs, because then people would just buy cheap GPUs, but for a Nintendo system it makes perfect sense.

DLSS is reliant on the cloud and Tensor cores.
No internet? No DLSS.

What can the Switch not guarantee? That's right: internet, due to its reliance on WiFi connectivity.

...And because DLSS requires internet connectivity, it increases data traffic, which not only reduces the console's portable battery life, but also means extra data charges if you are using mobile data. (Important if you have data caps.)

Soundwave said:

DLSS on a Switch would mean they could keep their games locked at insanely low resolutions; you might never have to render above 960x540, really (docked). And Tensor cores are basically part of all modern Nvidia graphics chips; even modern Tegra designs like Tegra Xavier have Tensor cores.

Nintendo likes to have a "clean" presentation.
That means minimal post-process effects and "enhancements" to the image, like anti-aliasing. - We need to remember that DLSS does bring with it various visual artifacts due to over-sharpening; it's not perfect in every game.
https://www.eurogamer.net/articles/digitalfoundry-2020-control-dlss-2-dot-zero-analysis

Not only that, but hardware tends to have an "efficiency curve" - once you exceed a certain resolution threshold, there is a disproportionate hit to performance.
The Switch only has 25GB/s of memory bandwidth, so that is a fairly chunky bottleneck, and one of the bigger factors preventing it from achieving 1080p in docked mode in the majority of games.

If Switch 2 targets 1080p and has 150GB/s or more of memory bandwidth, then running games at 540p probably isn't going to net developers much additional performance to bolster visual fidelity in other areas, as you clearly aren't going to be fillrate-limited.
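
One rough way to picture that bandwidth argument, using the figures quoted above (the 60fps target for a hypothetical Switch 2 is an assumption, and treating all memory traffic as available to the GPU is a simplification):

```python
# Rough "bandwidth budget" sketch: bytes of memory traffic available per
# output pixel per frame (textures, G-buffer, shadow maps, post-processing,
# everything). Illustrative only; real engines also share this with the CPU.
def bytes_per_pixel_per_frame(bandwidth_gb_s, width, height, fps):
    return bandwidth_gb_s * 1e9 / (width * height * fps)

cases = [
    ("Switch docked, 1080p30, 25 GB/s",           25.0, 1920, 1080, 30),
    ("Switch docked, 540p30, 25 GB/s",            25.0,  960,  540, 30),
    ("Hypothetical Switch 2, 1080p60, 150 GB/s", 150.0, 1920, 1080, 60),
]

for label, bw, w, h, fps in cases:
    print(f"{label}: ~{bytes_per_pixel_per_frame(bw, w, h, fps):,.0f} bytes per pixel per frame")
```

With roughly three times the per-pixel budget at native 1080p60 compared to the current machine at 1080p30, the win from dropping to 540p shrinks accordingly.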

KratosLives said:
Can anyone tell me how big of a leap the CPU is compared to the base PS4? 10x, 20x? More?

Between PS4 and PS5? Around 10x when leveraging Ryzen's newer capabilities, maybe even more; I did a deep dive ages ago with a breakdown of its capabilities.
Probably closer to around 5-8x when dealing with more traditional workloads.
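
As a crude sanity check on that range (the IPC and SMT multipliers below are ballpark assumptions, not measurements):

```python
# Crude Jaguar -> Zen 2 scaling sketch: clock ratio times an assumed IPC
# ratio and an assumed SMT uplift.
jaguar_clock = 1.6   # GHz, base PS4
zen2_clock   = 3.5   # GHz, PS5 (variable frequency, up to ~3.5)

ipc_ratio  = 2.0     # assumed per-clock gain of Zen 2 over Jaguar
smt_uplift = 1.25    # assumed throughput gain from 16 threads vs 8

estimate = (zen2_clock / jaguar_clock) * ipc_ratio * smt_uplift
print(f"~{estimate:.1f}x multi-threaded throughput under these assumptions")  # ~5.5x
```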

But that is a colossal leap either way.



--::{PC Gaming Master Race}::--


Be glad the Switch even had 4GB. Nintendo wanted 3GB; Capcom asked them to bump it to 4GB of RAM.



Bite my shiny metal cockpit!

Leynos said:
Be glad the Switch even had 4GB. Nintendo wanted 3GB; Capcom asked them to bump it to 4GB of RAM.

And then proceeded to only put games on Switch that ran on systems with 512MB of RAM or less.



RolStoppable said:
Pemalite said:

DLSS is reliant on the cloud and Tensor cores.
No internet? No DLSS.

What can the Switch not guarantee? That's right: internet, due to its reliance on WiFi connectivity.

...And because DLSS requires internet connectivity, it increases data traffic, which not only reduces the console's portable battery life, but also means extra data charges if you are using mobile data. (Important if you have data caps.)

First time I am reading this. So DLSS is dependent on external processing, a.k.a. the cloud. It's not an advanced form of resolution upscaling that is performed by the local device alone.

For a full detailing of how it works:
https://www.nvidia.com/en-au/geforce/news/nvidia-dlss-your-questions-answered/

https://www.anandtech.com/show/15648/nvidia-intros-dlss-20-adds-motion-vectors

https://au.pcmag.com/sound-cards/66143/testing-nvidias-dlss-20-higher-frame-rates-for-free

In short, Nvidia's supercomputers pre-calculate data from 16K images of each game, which the locally rendered frames are then compared against.
That is a lot of data, and DLSS is only supported in a few PC titles... and that data is constantly being updated.
Whereas on a console you would expect it to be supported universally... That's 4,804 games on the Switch already, and that's only at the half-life of the console; that number could double. - That's a ton of data.

Leynos said:
Be glad the Switch even had 4GB. Nintendo wanted 3GB; Capcom asked them to bump it to 4GB of RAM.

I should reiterate that I am a tech enthusiast. - I always want more, I always want better.
RAM is cheap.



--::{PC Gaming Master Race}::--

curl-6 said:
Pemalite said:



There are no guarantees that Nintendo will use DLSS with Switch 2 either; Nintendo does what Nintendo does.

My thinking exactly; Nintendo tends to make weird hardware decisions and not prioritize visual performance, so it would not surprise me if Switch 2 forewent DLSS.

Let's hope that's not the case. That would be such an easy win for them, and would "future-proof" a Switch 2 for a longer lifespan.



Retro Tech Select - My Youtube channel. Covers throwback consumer electronics with a focus on "vid'ya games."

Latest Video: Top 12: Best Games on the N64 - Special Features, Episode 7

KratosLives said:
Can anyone tell me how big of a leap the CPU is compared to the base PS4? 10x, 20x? More?

Just going off of Jaguar vs Zen2.

Closest examples I could find (single-thread performance in Cinebench R20):

AMD A4-5100 (1.55GHz Jaguar, mobile) scores: 68
AMD Ryzen 3 4300U (up to 3.7GHz boost, Zen 2 mobile) scores: 422    (PS5 is supposedly 3.5GHz)

Then factor in 8 threads vs 16 threads.
Also, things like AVX workloads ran at half rate on Jaguar, while on Zen 2 they'll run at double the rate.

Zen is bound to be a big upgrade, probably 6+ times in typical workloads.
In some unique and limited situations you might actually see something like 20x.

The Jaguar core is pretty damn weak by today's standards.
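
Putting those quoted Cinebench numbers together in a quick back-of-envelope (the SMT and AVX multipliers are assumptions, not benchmark results):

```python
# Combining the quoted Cinebench R20 single-thread scores with an assumed
# SMT uplift and an assumed doubling for AVX-bound code.
jaguar_1t = 68    # AMD A4-5100 score quoted above
zen2_1t   = 422   # Ryzen 3 4300U score quoted above

single_thread = zen2_1t / jaguar_1t     # ~6.2x
smt_uplift    = 1.25                    # assumed gain from 16 threads vs 8
avx_uplift    = 2.0                     # assumed, for AVX-heavy workloads only

typical      = single_thread * smt_uplift   # ~7.8x
vector_heavy = typical * avx_uplift         # ~15.5x in the friendliest cases

print(f"single-thread ~{single_thread:.1f}x, typical ~{typical:.1f}x, "
      f"AVX-bound up to ~{vector_heavy:.1f}x")
```

Which lands in the same ballpark as the 6+ times typical / up-to-20x best-case estimate above.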