sc94597 said:
Soundwave said:

To be honest, I'm kind of wondering why we even bother with ray tracing/path tracing/Lumen at all. Those things still take a ton of compute.

The one example in the DF video that stood out was Starfield, which doesn't support ray tracing/path tracing/Lumen at all and is, by DF's own labelling, a "flat looking game" lighting-wise. Even to their trained eyes, they admitted, the generative AI lighting added to the scenes made the game look like a path-traced game.

I mean, if that's their initial reaction, that's likely good enough for the regular or even hardcore gamer. DF was "fooled," and it's literally their job to pixel-count these things.

That's the other takeaway I see here: why bother with extremely compute-expensive technologies like path tracing and Lumen at all if you can get an image to "pop" lighting-wise like that? Now, I know there's a group of people who, rightfully so, hate that overlit look. But if you eventually get games where you don't even have a reference for what the "normal" lighting is supposed to look like, and you just have a game with a neural rendering light engine (generative AI), you won't know what the original looks like and will just take it at face value that this is the lighting of the scene.

The problem is that DLSS 5 is vendor-locked, and it would be weird to have a game look different on AMD vs. Nvidia vs. Intel.

Also, the technology isn't magic. There is a tradeoff between consistency and input data: the more useful data features you have, the more consistent the output.

Nvidia has instead been going the route of using deep learning to accelerate path tracing/ray tracing, and that makes sense.

That may only be a problem temporarily, though. I'm sure AMD will make its own knockoff, just as developers already support both DLSS upscaling and FSR in the same game. Besides, Nvidia has something like 90% of the PC GPU market anyway.

And then on a closed system like a hypothetical Switch 3, with exclusive Nintendo games that aren't on other platforms, you may never, ever know any better. In fact, I would say for a system like that it would be stupid to even try to focus on path tracing/ray tracing at all. Let the generative AI handle all the lighting. If PS4/PS5-era baked lighting provides enough reference data for it to create a look that imitates path tracing well enough that Digital Foundry was fooled, then it's hard to justify the performance cost of ray/path tracing.

Now, it does look to me like the algorithm Nvidia is using is trained to overlight and create a vivid picture that "pops" on purpose; it looks at every scene and goes "Imma make this look flashy as fuck." But I hate to say it: most consumers are going to be happy with that. Most people, even graphics enthusiasts, don't really care whether a scene looks "accurate"; they want it to look eye-pleasing, and I think that's all Nvidia is going for.