DonFerrari said:
Pemalite said:

Yeah, I did. But essentially you are decreasing it by more than half. - I couldn't be bothered to redo the calculation, but my point still stands.

Sure thing. Nintendo capped the performance of both the CPU and GPU, either for cost or for battery consumption. The display, on the other hand, was purely a cost decision and actually worsened consumption.

I didn't see your opinion on DLSS itself and how much it will really change the landscape. I have a hard time believing you'll be able to make a $100 USD card and a $1,000 USD card from the same generation look almost the same just because one has DLSS and the other doesn't. Or, if the $1,000 card also goes for DLSS, what its GPU budget would be spent on instead.

DLSS doesn't increase lighting, texture detail, geometric complexity and such.
It just cleans up the image so it seems higher resolution than it actually is... This is a path we have been going down for years; Sony put the approach front-and-center with checkerboard rendering on the Playstation 4 Pro... and, like DLSS, it also brought caveats such as artifacts.

So it will never turn a $100 USD GPU into a $1,000 one; nVidia wouldn't cannibalize its Geforce Titan profit margins like that.

Not only that, but nothing is stopping anyone from taking that high-end GPU, setting all the graphics to 11 and doing the exact same thing, just better.

The idea with any frame reconstruction is that you can sit at a decent resolution (Again, every GPU has an optimal resolution range!) in order to bolster visual effects and then just rely on upscaling the image.
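To put some rough code behind that idea: below is a toy Python/numpy sketch of the render-low-then-upscale loop. It is not DLSS (which runs a trained neural network on the tensor cores), just plain bilinear filtering, and render_frame is a stand-in for whatever the engine would actually draw. Purely illustrative numbers.

```python
import numpy as np

def render_frame(width, height):
    """Stand-in for the engine's renderer: returns an RGB frame of shape (H, W, 3)."""
    x = np.linspace(0.0, 1.0, width, dtype=np.float32)
    y = np.linspace(0.0, 1.0, height, dtype=np.float32)
    xx, yy = np.meshgrid(x, y)
    # A simple gradient "scene" so the example is self-contained.
    return np.stack([xx, yy, 0.5 * (xx + yy)], axis=-1)

def bilinear_upscale(frame, out_w, out_h):
    """Upscale an (H, W, 3) frame to (out_h, out_w, 3) with bilinear filtering."""
    in_h, in_w, _ = frame.shape
    # Map each output pixel centre back into the lower-resolution source image.
    xs = (np.arange(out_w) + 0.5) * in_w / out_w - 0.5
    ys = (np.arange(out_h) + 0.5) * in_h / out_h - 0.5
    x0 = np.clip(np.floor(xs).astype(int), 0, in_w - 2)
    y0 = np.clip(np.floor(ys).astype(int), 0, in_h - 2)
    fx = np.clip(xs - x0, 0.0, 1.0)[None, :, None]
    fy = np.clip(ys - y0, 0.0, 1.0)[:, None, None]
    top = frame[y0][:, x0] * (1 - fx) + frame[y0][:, x0 + 1] * fx
    bottom = frame[y0 + 1][:, x0] * (1 - fx) + frame[y0 + 1][:, x0 + 1] * fx
    return top * (1 - fy) + bottom * fy

# Render internally at ~67% of the output resolution on each axis, then upscale.
RENDER_SCALE = 0.67
OUT_W, OUT_H = 1920, 1080          # swap in 3840x2160 for the "fake 4K" case
low_res = render_frame(int(OUT_W * RENDER_SCALE), int(OUT_H * RENDER_SCALE))
output = bilinear_upscale(low_res, OUT_W, OUT_H)
print(low_res.shape, "->", output.shape)
```

The freed-up GPU time is what gets spent on the extra effects; the reconstruction step just has to hide the fact that fewer pixels were actually shaded.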

Soundwave said:

What you're saying here doesn't really make sense. A $100 DLSS card versus a $1000 DLSS card... the $1000 Nvidia card will still have DLSS, so it will be able to process high-end ray tracing effects and things of that nature.

Not that $100 modern Nvidia cards are even available; the cheapest DLSS-capable card that's in production, the RTX 2060 Super, is $400.

It's only when compared to AMD that it becomes a valid comparison. AMD doesn't have DLSS, which means cheaper Nvidia cards can outperform AMD's more expensive ones. But an Nvidia-to-Nvidia comparison doesn't work the same way.

AMD has its own alternative technologies to DLSS, such as DirectML. - AMD's approach shifts the burden from itself (as it's relying on Microsoft's technology) to developers though, whereas nVidia is using a proprietary approach.
https://www.overclock3d.net/news/software/microsoft_s_directml_is_the_next-generation_game-changer_that_nobody_s_talking_about/1

DirectML, which next-gen consoles will have:


You also have other approaches like AMD's image sharpening:
https://www.techspot.com/article/1873-radeon-image-sharpening-vs-nvidia-dlss/
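Sharpening is a far simpler trick than DLSS, for what it's worth. Here's a rough unsharp-mask style sketch in Python/numpy just to show the flavour of it; AMD's actual Radeon Image Sharpening uses its own contrast-adaptive sharpening (CAS) maths, so treat this as illustrative only.

```python
import numpy as np

def box_blur(img, radius=1):
    """Cheap box blur used as the low-pass reference for unsharp masking."""
    pad = np.pad(img, ((radius, radius), (radius, radius), (0, 0)), mode="edge")
    out = np.zeros_like(img)
    count = (2 * radius + 1) ** 2
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            out += pad[radius + dy : radius + dy + img.shape[0],
                       radius + dx : radius + dx + img.shape[1]]
    return out / count

def unsharp_mask(img, amount=0.5, radius=1):
    """Sharpen by adding back the difference between the image and a blurred copy."""
    blurred = box_blur(img, radius)
    return np.clip(img + amount * (img - blurred), 0.0, 1.0)

# Example: sharpen a random "frame" after it has been upscaled.
frame = np.random.rand(1080, 1920, 3).astype(np.float32)
sharpened = unsharp_mask(frame, amount=0.5)
print(sharpened.shape)
```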

Obviously temporal reconstruction, checkerboarding and other frame-reconstruction techniques are other approaches developers have taken/will take to achieve fake 4K. - Which is "good enough".
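And since checkerboarding keeps coming up, here's a very stripped-down Python/numpy illustration of the core idea: shade only half the pixels in a checkerboard pattern each frame and fill the holes from their neighbours. The PS4 Pro's real implementation also reprojects the previous frame using motion vectors and ID buffers, which this toy version deliberately ignores.

```python
import numpy as np

def checkerboard_mask(height, width, frame_index):
    """True where the GPU actually shades a pixel this frame; the pattern
    alternates between even and odd checkerboard cells on successive frames."""
    yy, xx = np.mgrid[0:height, 0:width]
    return (xx + yy + frame_index) % 2 == 0

def reconstruct(shaded, mask):
    """Fill unshaded pixels with the average of their left/right shaded neighbours.
    Real checkerboard rendering also reuses the previous frame; this toy version
    only does spatial interpolation."""
    out = shaded.copy()
    left = np.roll(shaded, 1, axis=1)
    right = np.roll(shaded, -1, axis=1)
    holes = ~mask
    out[holes] = 0.5 * (left[holes] + right[holes])
    return out

# Toy "full" frame standing in for what a renderer would have produced natively.
H, W = 8, 8
full = np.random.rand(H, W, 3).astype(np.float32)

mask = checkerboard_mask(H, W, frame_index=0)
shaded = np.where(mask[..., None], full, 0.0)   # only half the pixels are shaded
approx = reconstruct(shaded, mask)
print("mean reconstruction error:", float(np.abs(approx - full).mean()))
```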



--::{PC Gaming Master Race}::--