
Forums - Gaming - Nvidia reveals DLSS 5, essentially applies AI filter to games in real time.

sc94597 said:
Soundwave said:

To be honest, I'm kind of wondering why we even bother with ray tracing/path tracing/Lumen at all. Those techniques still take a ton of compute. 

The one example in the DF video that stood out was Starfield, which doesn't support ray tracing/path tracing/Lumen at all and is, by DF's own labelling, a "flat looking game" lighting-wise. Even to their trained eyes, they admitted the generative AI lighting made the scenes look like a path-traced game. 

I mean, if that's their initial reaction, that's likely more than good enough for "regular or even hardcore gamer Joe". DF was "fooled", and it's their job to pixel-count things. 

That's another takeaway I see here: why bother with extremely compute-expensive technologies like path tracing and Lumen at all if you can get an image to "pop" lighting-wise like that? Now, I know there is a group of people who, rightfully so, hate that overlit look. But if you eventually get games where you have no reference for what the "normal" lighting is supposed to look like, just a game with a neural rendering light engine (generative AI), you won't even know what the original looked like and will simply take the lighting of the scene at face value. 

The problem is that DLSS 5 is vendor-locked, and it would be weird to have a game look different on AMD vs. Nvidia vs. Intel. 

Also, the technology isn't magic. There is a tradeoff between consistency and input data: the more useful data features you have, the more consistent the output. 

Nvidia has instead been going the route of using deep learning to accelerate path tracing/ray tracing, and that makes sense. 

That may only be a problem for a temporary period, though. AMD, I'm sure, will make its own knockoff of this, just as developers already support both DLSS upscaling and FSR in the same game; and Nvidia has something like a 90% share of the PC GPU market anyway.

And then on a closed system like a hypothetical Switch 3, with exclusive Nintendo games that aren't on other platforms, you may never, ever know any better. In fact, I'd say that for a system like that it would be stupid to even try to focus on path tracing/ray tracing at all. Let the generative AI handle all the lighting: if PS4/PS5-era baked lighting provides enough reference data for it to create a look that imitates path tracing well enough to fool Digital Foundry, then it's hard to justify the performance cost of ray/path tracing. 

Now, it does look to me like the algorithm Nvidia is using is deliberately trained to overlight and create a vivid picture that "pops"; it looks at every scene and goes "Imma make this look flashy as fuck." But I hate to say it: most consumers are going to be happy with that. Most people, even graphics enthusiasts, don't really care whether a scene looks "accurate"; they want it to look eye-pleasing, and I think that's all Nvidia is going for. 



Soundwave said:
sc94597 said:

The problem is that DLSS 5 is vendor-locked, and it would be weird to have a game look different on AMD vs. Nvidia vs. Intel. 

Also, the technology isn't magic. There is a tradeoff between consistency and input data: the more useful data features you have, the more consistent the output. 

Nvidia has instead been going the route of using deep learning to accelerate path tracing/ray tracing, and that makes sense. 

That may only be a problem for a temporary period, though. AMD, I'm sure, will make its own knockoff of this, just as developers already support both DLSS upscaling and FSR in the same game; and Nvidia has something like a 90% share of the PC GPU market anyway.

And then on a closed system like a hypothetical Switch 3, with exclusive Nintendo games that aren't on other platforms, you may never, ever know any better. In fact, I'd say that for a system like that it would be stupid to even try to focus on path tracing/ray tracing at all. Let the generative AI handle all the lighting: if PS4/PS5-era baked lighting provides enough reference data for it to create a look that imitates path tracing well enough to fool Digital Foundry, then it's hard to justify the performance cost of ray/path tracing. 

Now, it does look to me like the algorithm Nvidia is using is deliberately trained to overlight and create a vivid picture that "pops"; it looks at every scene and goes "Imma make this look flashy as fuck." But I hate to say it: most consumers are going to be happy with that. Most people, even graphics enthusiasts, don't really care whether a scene looks "accurate"; they want it to look eye-pleasing, and I think that's all Nvidia is going for. 

Then most consumers are idiots, and idiots, as usual, are ruining everything.



CaptainExplosion said:
Soundwave said:

That may only be a problem for a temporary period, though. AMD, I'm sure, will make its own knockoff of this, just as developers already support both DLSS upscaling and FSR in the same game; and Nvidia has something like a 90% share of the PC GPU market anyway.

And then on a closed system like a hypothetical Switch 3, with exclusive Nintendo games that aren't on other platforms, you may never, ever know any better. In fact, I'd say that for a system like that it would be stupid to even try to focus on path tracing/ray tracing at all. Let the generative AI handle all the lighting: if PS4/PS5-era baked lighting provides enough reference data for it to create a look that imitates path tracing well enough to fool Digital Foundry, then it's hard to justify the performance cost of ray/path tracing. 

Now, it does look to me like the algorithm Nvidia is using is deliberately trained to overlight and create a vivid picture that "pops"; it looks at every scene and goes "Imma make this look flashy as fuck." But I hate to say it: most consumers are going to be happy with that. Most people, even graphics enthusiasts, don't really care whether a scene looks "accurate"; they want it to look eye-pleasing, and I think that's all Nvidia is going for. 

Then most consumers are idiots, and idiots, as usual, are ruining everything.

You're spot on with everything you're saying. 

I remember working at Best Buy for a holiday season ages ago, and I recall a lot of people (the majority by far) preferring a TV's "vivid" mode over proper color calibration. People just want a flashy, bright image. So much so that one family came back into the store and berated a co-worker for making the "TV look worse," when he had just calibrated it properly; we had to flip the settings back to vivid mode, and they were happy with that, lol. 



Soundwave said:
CaptainExplosion said:

Then most consumers are idiots, and idiots, as usual, are ruining everything.

You're spot on with everything you're saying. 

I remember working at Best Buy for a holiday season ages ago, and I recall a lot of people (the majority by far) preferring a TV's "vivid" mode over proper color calibration. People just want a flashy, bright image. So much so that one family came back into the store and berated a co-worker for making the "TV look worse," when he had just calibrated it properly; we had to flip the settings back to vivid mode, and they were happy with that, lol. 

One good thing about stupid people is how easy it is for them to die. I'm not saying I'm gonna kill these idiots, but at least they're not long for this world.



CaptainExplosion said:
Soundwave said:

You're spot on with everything you're saying. 

I remember working at Best Buy for a holiday season ages ago, and I recall a lot of people (the majority by far) preferring a TV's "vivid" mode over proper color calibration. People just want a flashy, bright image. So much so that one family came back into the store and berated a co-worker for making the "TV look worse," when he had just calibrated it properly; we had to flip the settings back to vivid mode, and they were happy with that, lol. 

One good thing about stupid people is how easy it is for them to die. I'm not saying I'm gonna kill these idiots, but at least they're not long for this world.

I open the thread, lots of new replies, jump to the newest one and...

What the actual fuck is even this post, lol



BraLoD said:
CaptainExplosion said:

One good thing about stupid people is how easy it is for them to die. I'm not saying I'm gonna kill these idiots, but at least they're not long for this world.

I open the thread, lots of new replies, jump to the newest one and...

What the actual fuck is even this post, lol

Just trying to understand what we can do to survive the AI future that's being forced upon us.



CaptainExplosion said:
sc94597 said:

The problem with this stance is that it implies nothing can be done other than somehow reversing technology in a Dune-esque fashion. I can't think of a historical example of that happening. 

But we can do something else. We can change the social system and move beyond capitalism. Capitalism is what is creating these x-risks. 

How do you move beyond capitalism when it's so ingrained in most of the world?

And aren't there some jobs that can't be done with AI? An AI therapist is out of the question, considering ChatGPT made Adam Raine and Sewell Setzer III kill themselves, and AI judges would be at too big a risk of giving the wrong people life or death sentences.

While workers still have labor power, they need to form solidaristic associations that withhold labor until demands/protections are met. That's the first step.

While some jobs will be resilient, how are they going to exist without consumer demand? Even people who remain employable will be affected by that decrease in demand. 

Even if your goal is to get rid of AI data centers, you're not going to be able to do that while capitalism is still the present socio-economic system. It needs to be displaced to achieve that goal. 



sc94597 said:
CaptainExplosion said:

How do you move beyond capitalism when it's so ingrained in most of the world?

And aren't there some jobs that can't be done with AI? An AI therapist is out of the question, considering ChatGPT made Adam Raine and Sewell Setzer III kill themselves, and AI judges would be at too big a risk of giving the wrong people life or death sentences.

While workers still have labor power, they need to form solidaristic associations that withhold labor until demands/protections are met. That's the first step.

While some jobs will be resilient, how are they going to exist without consumer demand? Even people who remain employable will be affected by that decrease in demand. 

Even if your goal is to get rid of AI data centers, you're not going to be able to do that while capitalism is still the present socio-economic system. It needs to be displaced to achieve that goal. 

How do we do that?



Soundwave said:

That may only be a problem for a temporary period, though. AMD, I'm sure, will make its own knockoff of this, just as developers already support both DLSS upscaling and FSR in the same game; and Nvidia has something like a 90% share of the PC GPU market anyway.

And then on a closed system like a hypothetical Switch 3, with exclusive Nintendo games that aren't on other platforms, you may never, ever know any better. In fact, I'd say that for a system like that it would be stupid to even try to focus on path tracing/ray tracing at all. Let the generative AI handle all the lighting: if PS4/PS5-era baked lighting provides enough reference data for it to create a look that imitates path tracing well enough to fool Digital Foundry, then it's hard to justify the performance cost of ray/path tracing. 

Now, it does look to me like the algorithm Nvidia is using is deliberately trained to overlight and create a vivid picture that "pops"; it looks at every scene and goes "Imma make this look flashy as fuck." But I hate to say it: most consumers are going to be happy with that. Most people, even graphics enthusiasts, don't really care whether a scene looks "accurate"; they want it to look eye-pleasing, and I think that's all Nvidia is going for. 

The issue is that AMD and Nvidia will have two different models that achieve different results, and third-party developers would have to work with both to get their desired result. That requires more labor, not less. 

My point about consistency was specifically about temporal consistency. Image-to-image and text-to-image models aren't very temporally consistent in real time, largely because they don't have enough input features to work from. DLSS 5 at the very least has the buffer data. There will be real limits to how little input data the DLSS 5 method can get away with. 
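To make the "more input features, more consistency" point concrete, here's a toy numpy sketch (purely illustrative, not Nvidia's actual model or pipeline): a shading pass that is a pure function of per-pixel buffer data (depth/normals here) gives identical output for identical inputs, while a model that has to hallucinate missing information from fresh per-frame noise flickers between frames.

```python
import numpy as np

rng = np.random.default_rng(0)

def gbuffer_frame(t):
    """Toy 'G-buffer' for frame t: depth and normals drift slowly."""
    x = np.linspace(0.0, 1.0, 64)
    depth = np.sin(2 * np.pi * (x + 0.01 * t))   # slowly moving scene
    normal = np.cos(2 * np.pi * (x + 0.01 * t))
    return np.stack([depth, normal])

def shade_from_gbuffer(gb):
    """Shading as a deterministic function of the buffer data:
    identical inputs always give identical output."""
    return 0.6 * gb[0] + 0.4 * gb[1]

def shade_with_hallucination(gb, rng):
    """Stand-in for a model with too few input features: it fills the
    gap with fresh noise every frame."""
    return 0.6 * gb[0] + 0.4 * gb[1] + 0.3 * rng.standard_normal(gb.shape[1])

def flicker(frames):
    """Mean frame-to-frame absolute difference: a crude stability metric."""
    return float(np.mean([np.abs(a - b).mean() for a, b in zip(frames, frames[1:])]))

gb_frames   = [gbuffer_frame(t) for t in range(30)]
stable      = [shade_from_gbuffer(gb) for gb in gb_frames]
hallucinate = [shade_with_hallucination(gb, rng) for gb in gb_frames]

print(f"flicker, buffer-conditioned: {flicker(stable):.4f}")
print(f"flicker, noise-driven:       {flicker(hallucinate):.4f}")
```

The buffer-conditioned pass only changes as much as the scene does, while the noise-driven one adds a large constant floor of frame-to-frame churn, which is the flicker you see in real-time image-to-image models.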

Nvidia seems to be moving in the direction of smaller, more numerous, modular, pretrained shaders and materials, plus online learners that accelerate path tracing by filling in gaps. If you follow their white-paper history, that has been the consistent trend. This is also the direction AMD is following, and it has been built directly into APIs like DirectX 12 and Vulkan. That allows neural rendering to avoid vendor lock-in, while still letting vendors use their own hardware implementations to accelerate it. It also means developers don't have to do vendor-specific implementations. 
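For a sense of what a "small pretrained shader" means in practice, here's a minimal, purely illustrative numpy sketch (no real Nvidia/AMD/DirectX API involved): a tiny one-hidden-layer MLP trained to approximate an analytic specular lobe, the kind of per-material function a neural shader could evaluate as a couple of small matrix multiplies instead of an expensive lookup or simulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target "material response": a narrow specular lobe over cos(angle).
def specular(x):
    return np.maximum(x, 0.0) ** 16

# Tiny MLP: 1 input -> 16 ReLU units -> 1 output.
W1 = rng.standard_normal((1, 16)) * 0.5
b1 = np.zeros(16)
W2 = rng.standard_normal((16, 1)) * 0.5
b2 = np.zeros(1)

x = np.linspace(0.0, 1.0, 256).reshape(-1, 1)
y = specular(x)

lr = 0.05
for step in range(20000):
    h = np.maximum(x @ W1 + b1, 0.0)          # hidden activations
    pred = h @ W2 + b2
    err = pred - y
    # Hand-rolled backprop for a mean-squared-error fit.
    gW2 = h.T @ err / len(x); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (h > 0)
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((np.maximum(x @ W1 + b1, 0.0) @ W2 + b2 - y) ** 2))
print(f"MSE of tiny MLP vs. analytic lobe: {mse:.6f}")
```

The point of the sketch is scale: a network this small fits in a few hundred weights, which is why the modular approach maps onto shader hardware so naturally; the real vendor work is in training these pieces offline and executing them efficiently inside the rendering pipeline.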

DLSS 5 seems to be more of a complementary technology and stop-gap until actual neural rendering pipelines are fully mature.



CaptainExplosion said:
sc94597 said:

While workers still have labor power, they need to form solidaristic associations that withhold labor until demands/protections are met. That's the first step.

While some jobs will be resilient, how are they going to exist without consumer demand? Even people who remain employable will be affected by that decrease in demand. 

Even if your goal is to get rid of AI data centers, you're not going to be able to do that while capitalism is still the present socio-economic system. It needs to be displaced to achieve that goal. 

How do we do that?

There is a field of history that explores how it has been done in the past. Your question is answered by exploring that history. There is no simple answer that can fit in a video game forum post, nor is there a shortcut for what needs to be done.

The sooner a critical mass of working people realize that, the more time and leverage we'll have.