hinch said:
Captain_Yuri said:

I think the main issue is gonna be input latency, which is why Nvidia didn't put it on pre-4000 cards. The 4090 is already very close to native input latency in some games. Doing frame generation on Ampere, let alone Turing, might make the input latency unbearable: 80 fps frames with 15 fps-like input latency.
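The worry above boils down to simple frame-time arithmetic. As a rough sketch (my own numbers and simplification, not anything from Nvidia or DF): frame generation roughly doubles the *displayed* frame rate, but input is only sampled on real rendered frames, so latency still tracks the base render rate.

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds per frame at a given frame rate."""
    return 1000.0 / fps

def with_frame_generation(base_fps: float) -> tuple[float, float]:
    """Return (displayed_fps, approx_input_latency_ms).

    Simplifying assumption: generated frames double the displayed fps,
    while input latency stays pegged to the base render rate.
    """
    displayed_fps = base_fps * 2
    latency_ms = frame_time_ms(base_fps)
    return displayed_fps, latency_ms

# A 40 fps base render shows as 80 fps on screen, but input
# still feels like a 40 fps game (~25 ms per sampled frame).
print(with_frame_generation(40))  # (80, 25.0)
```

On a strong card the base rate is already high, so the extra latency is small; on a weaker card the base rate is low and the latency gap gets much more noticeable, which is the "80 fps frames, 15 fps feel" scenario.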

Yeah, that's true. It could be buggy and sub-optimal right now with current drivers, and it may be a hardware limitation.

Still not convinced it couldn't be done (well) on previous cards with some work, but we're going to have to take Nvidia's word for it.

Yeah, and personally, I'm not impressed with the current version of DLSS 3 anyway. Even though I'm getting the 4090, I'll stick with DLSS 2.0 for now until they iterate on DLSS 3 some more. Spider-Man's webs in the DF videos were popping in and out, and there was some slight but weird artifacting lol. While DLSS 3 will eventually be great, I think DLSS 2 will remain the go-to even for the 4000 series, despite the framerate smoothness increases.

You also can't have V-sync on, otherwise you get a big input latency penalty of 100+ ms according to DF's video. So maybe in six months it may be worth using for 4000-series buyers.

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850