
Forums - Gaming - Nvidia reveals DLSS 5, essentially applies AI filter to games in real time.

Random_Matt said:

Looks completely like someone used Google Gemini. As much as I hate it, you do not have to use DLSS 5; although that is what is going to sell Nvidia GPUs.

Sure, but at what cost? I'm serious. This needs two 5090 cards. It's starting to feel like they are going to use Instagram filters to push $6000 cards. The practice of generating value by forcing the AI-generated look onto games and pretending this is what people seek.




God, this looks uncanny and outright awful. This must be one of, if not the worst, demos nVidia has ever released.



Development costs are through the roof. AI will be leveraged significantly to lower costs.

Like it or not, it is coming.



“Consoles are great… if you like paying extra for features PCs had in 2005.”

I'm a little torn on this. The effect is pretty amazing, there's no denying that. But it's also a little uncanny valley and weird. Feels like an Instagram filter that has to hover around in real time and try to keep up with what's happening.
Of course, as with anything, this will get better and better as we go. I think this will work better as a ground-up solution rather than after the fact.



sc94597 said:

If this were shown four years ago I think the perspective would be totally different. People online are, for many good reasons but also many bad ones, skeptical of anything that has "AI" associated with it. For some reason DLSS has flown under the radar until now, despite not being much different from any other Deep Learning technology. On the other hand, I think among the "normies" the general perspective will be that it looks good on net, and Nvidia will have a success with this.

There is also this bizarre idea that some rendering workloads are more "pure" than others, but at the end of the day everything is machine code at the lowest level. Game rendering has always had "shortcuts" in the name of efficiency.

I predict developers will find a good equilibrium and even the enthusiasts will appreciate neural-rendering, probably as soon as two years from now when the next generation consoles release with it as a major feature. 

"Normies" are going to gobble this up. 

Beyond AI skepticism though, there is just genuine criticism over what it's representing. Even 4 years ago this bottom image would look fake or photoshopped.

DLSS <4 is different because it maintains the vision of a game with only the slightest misrepresentations (occasional texture/particle FX issues, etc.). This is several orders of magnitude more disruptive.

Just dissecting some of what people are responding to:
 
The level of fidelity in no way matches the movements/animations, and it comes across as extra weird/uncanny valley.

The lighting itself in the bottom remains stylised, but in a way that isn't coherent with the environment. This hero lighting is sometimes done intentionally in games, but here it just looks like the AI model is applying it to everyone and making each frame look like a photoshopped movie poster, another hallmark of generic generative AI.

The AI model is clearly trying to exact a certain "look" out of the characters as opposed to just increasing the quality of lighting. Looking at the guy on the left, is there a genuine light source making his right side so brightly contrasted? I'm not sure. Again, even before AI this would come across as a highly doctored image.

You're 100% right that many normies will eat this up, but I think the criticism is also legit and not just kneejerk. There'll be a huge divide in how people take to this kind of execution, similar to how you see some people cream their pants over fan-made Mario/Zelda UE5 demonstrations whilst other people point out how soulless and generic they often look. I emphasise this execution because this could definitely be refined to actually look authentic; right now it doesn't.

I'm not necessarily concerned because this is a toggle, but it does reflect a direction of games away from art and laboured/intentional outcomes, into benign realism. That Assassin's Creed screenshot example to me is actually the best exhibit. All tone is sucked out of the image.

IMO this was a poor showing of the technology, and Nvidia and the developers should have been a bit more careful and held back until it looked right.

Last edited by Otter - 3 hours ago

curl-6 said:
sc94597 said:

People say this, but I can't think of a single game where you "need frame gen or reconstruction" to play it at say console-level settings with modern hardware. High-end features might require these to be able to play the game at enthusiast settings (like with path tracing), but even horribly optimized games still play okay without DLSS or frame-gen. Usually the actual culprit of poor optimization isn't some developer intent but invasive technologies like Denuvo, financial/labor constraints, hardware limitations (lack of mesh-shaders or decent HW RT acceleration) or poorly suited game engines (thinking of Monster Hunter Wilds and RE Engine for open world games in general, as an example.) 

Which concrete example are you thinking of and why do you think it is a developer intention to depend on upscaling and not an actual technical or organizational limitation? 

I'm not a PC gamer so you'd have to ask them which are the most egregious examples, but off the top of my head, Monster Hunter Wilds last year was a horribly unoptimized title that all but demanded reconstruction/frame gen/etc. if you didn't have high-end kit.

It's less that it's the developer's intention and more that any avenue to cut corners will inevitably end up being exploited and abused by greedy suits who don't care about quality and just want to wring out every last penny of profit.

Let's take the view though that it's about options; part of that is that when you offer people options, people are allowed to say "I don't want that, fuck off."

Oblivion Remaster was a train wreck.  Final Fantasy 16 was piss poor.  And TLoU Part 1 left a lot to be desired.  

I tend to agree with your take; I think AI makes most developers lazy. They will sit back and expect people to use AI to fix things.




What I find funny is how the water/rain effects in Resident Evil specifically look worse. The alley behind Grace is dark and wet; then the filter is turned on, and it looks lighter and all the water reflections vanish. It basically Gaussian-blurred the water effects.



Hmm, pie.

I think this is interesting tech, but I'm skeptical about how it will work in practice.

Let's be frank: everybody making fun of it would try it out if it's simply an option in the graphics settings.




curl-6 said:

The legal stuff and other work done by AI is plagued with errors though, because LLMs hallucinate and can't differentiate true from false data.

Given the flipside is deepfake revenge/child porn, rampant misinformation, a pretense for suits to lay off workers, slop infesting every corner of the internet, scams, environmental destruction, prices for RAM and stuff going through the roof, a bubble that threatens to crash the economy and more, I'd say the bad still far outweighs any good.

It's amusing that people keep repeating this since they have no idea how much water is used by typical agricultural and industrial processes. A large datacenter consumes roughly as much water as the beef used by six burger joints, and golf courses in the US consume around 25x more of it than all datacenters combined. Not to mention the runoff from a datacenter is warmer water... the runoff from a blue jeans factory will literally kill you.

As for errors, retrieval-augmented generation has by and large reduced the rate of hallucinations in the past year or so. Even in offline mode, the hallucination rate varies considerably across models. They can actually learn to distinguish between what is encoded in their weights and what is not, though no one besides maybe Anthropic is paying much mind to solving it in reinforcement learning.

CosmicSex said:

Sure, but at what cost? I'm serious. This needs two 5090 cards. It's starting to feel like they are going to use Instagram filters to push $6000 cards. The practice of generating value by forcing the AI-generated look onto games and pretending this is what people seek.

The algorithmic cost of deep learning techniques drops massively year on year. It'll probably be running on most (50-series?) consumer cards by the time it launches. At least one would hope so; Nvidia probably isn't stupid enough to release it for just a single very expensive, supply-constrained card.



Exactly my thoughts. The original version actually looks more natural, even if the lighting is less detailed.