zeldaring said:
Soundwave said:

From that test, they're running Cyberpunk 2077 at PlayStation 5-equivalent settings (PS5, not PS4) and the 2050 was getting about 30 fps. It looks to me like if they used a dynamic mix of DLSS Balanced and Performance (dropping to Performance for more demanding scenes) they would get closer to a locked 30 (a rough sketch of that kind of mode switching follows this quote). 

For Control, they state PS5/XSX run it at 1440p on low settings (yes, low) with medium ray-traced reflections ... the 2050 was matching that with ray-traced reflections at 1080p DLSS Balanced. 

It was also able to run one of the better-looking next-gen exclusives, A Plague Tale: Requiem, quite well (30 fps even in the rat swarm scene). 

None of these games has the benefit of being optimized by a dev team specifically for 2050 hardware either, the way PS5/XSX (and Switch 2) games do. Performance would be better if a team of 20-30 people sat down with the hardware and spent 3-6 months on a port built just for it; you invariably get better performance when developers do that.  

That video is not one I'd want to be sharing if your point is that a 2050 isn't an extremely capable chip. It held its own in several PS5-equivalent comparisons there and showed quite well for a low-power chip with only 4 GB of VRAM. 
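
For what it's worth, the dynamic mode switching described above could look roughly like this. This is a minimal sketch, assuming a 30 fps target; the thresholds, function names, and the idea of driving it off recent frame times are all assumptions here, and a real engine would do this through NVIDIA's DLSS SDK rather than anything like a Python API:

```python
# Minimal sketch of dynamic DLSS mode switching keyed off GPU frame time.
# Hypothetical names and thresholds; real engines integrate DLSS via
# NVIDIA's SDK, this just illustrates the decision logic.

TARGET_FRAME_MS = 33.3  # frame budget for a 30 fps target

def pick_mode(recent_frame_ms, current_mode):
    """Drop to Performance when frames run long; return to Balanced with headroom."""
    avg = sum(recent_frame_ms) / len(recent_frame_ms)
    if avg > TARGET_FRAME_MS * 0.95:   # at or near budget -> cheaper mode
        return "Performance"
    if avg < TARGET_FRAME_MS * 0.80:   # comfortable headroom -> better image
        return "Balanced"
    return current_mode                # dead zone to avoid flip-flopping

# Example: a heavy scene averaging ~34 ms pushes the game into Performance mode.
print(pick_mode([34.1, 35.0, 33.8], "Balanced"))  # Performance
```

The dead zone between the two thresholds matters; without it the mode would oscillate every few frames and the resolution changes themselves would become visible.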

Someone said it would be near a Series S; from that video it obviously isn't. DLSS is not magic and really depends on the game. Besides, if you saw the Cyberpunk DLC, it had to run at 720p with upscaling to get 60 fps. Games are only getting more demanding as we leave the last-gen consoles behind. 

The last point of your post is the aspect I think people are missing.  There was a massive GPU shortage and game requirements leveled off.  The shortage is over, and requirements will jump from here.  We are already seeing it now.  

Either way, Pikmin 5 and LM4 on a 2050...  yes please.  

The irony of DLSS is the intent.  It is an AI model: the more data points it gets as input, the better the results.  It works well for 1080p to 1440p or 1440p to 4K.  I have no idea why people think it takes a low resolution and magically makes it high resolution.  A low-resolution input doesn't have enough data points to be effective, and it creates artifacts when used at low resolutions.
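
To put numbers on that point: DLSS renders internally below the output resolution and upscales. A quick calculation with the commonly cited per-axis scale factors for each mode (treat the exact figures as approximate) shows how little input the model gets at low output resolutions:

```python
# Approximate per-axis render-scale factors commonly cited for DLSS modes.
MODES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

for out_w, out_h in [(1920, 1080), (3840, 2160)]:
    print(f"Output {out_w}x{out_h}:")
    for mode, scale in MODES.items():
        w, h = int(out_w * scale), int(out_h * scale)
        pct = 100 * (w * h) / (out_w * out_h)
        print(f"  {mode:<17} renders {w}x{h} ({pct:.0f}% of output pixels)")
```

At 4K output, even Performance mode feeds the model a full 1080p frame; at 1080p output, the same mode feeds it 960x540, a quarter of the pixels, which is exactly where the artifacts come from.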

Last edited by Chrkeller - on 13 February 2024

i7-13700k

Vengeance 32 GB

RTX 4090 Ventus 3x E OC

Switch OLED