Jizz_Beard_thePirate said:
Bofferbrauer2 said:

While I mostly agree, a high-end card from just one generation ago (even with the next one just around the corner) should still be able to play any game at max settings at just 1080p. So the fact that the original 3080 can't is a disgrace for Nvidia, and it really shows how the entire Ampere lineup, apart from the 3090 and the 3060, was simply too memory-constrained.

I mean, if you're gonna make the "max settings" argument... Radeon gets destroyed by the 3080 in many games with ray tracing on, such as Cyberpunk, which released in 2020. So by that logic, Radeon should be ashamed for doing so terribly at "max settings" even at launch, since it should have been able to manage it in "every game" at 1080p in 2020. Hell, the $1000 6900 XT only matches the two-year-old 2080 Ti in Cyberpunk. And remember, back in those days Radeon only had the FSR 1 garbage that looked horrid, while DLSS 2.0 came out in 2020, so using upscaling on Radeon GPUs required eye bleach.

The reality is that game optimization has gone to shit this gen. Some games require a lot of VRAM, which cucks certain Ampere GPUs, while other games require upscaling, which cucks Radeon GPUs. That's not to say Ampere had the right amount of VRAM for a 3080/3070-class GPU; I think the 3080 should have launched in 12GB flavours at minimum instead of getting one later on. But Radeon also should have released an AI upscaler midway through RDNA 2 instead of taking until RDNA 4 to (hopefully) ship one. Intel really shouldn't be beating Radeon in the upscaling race, yet here we are.

To be fair, I noticed signs of things going to shit way back when the last Deus Ex came out. Seeing a bunch of high-end cards struggle with that game's higher settings felt like such a joke, and even now it's obvious that SE didn't bother to optimise that game even for the higher end (to this day its settings can still cripple performance here and there, and there's zero excuse for a game that old by now).

The more we end up having to rely on FSR/DLSS, the more the devs and GPU manufacturers will just count on us using it ourselves instead of doing basic optimisation passes. At this moment in time we've really no clear-cut way, or even an organised group, to point out that this band-aid solution is quickly becoming a problem, and that optimisation for high-end all the way down to low-end cards is an issue.

Until that changes, we're just going to be stuck with higher and higher reliance on upscaling tech, and the sad thing is that even with that band-aid you still won't be getting max settings, or the advertised "cherry on top" quality setting these games try to show off. Like yes, we can still get path tracing in 2077, but without upscaling we get far worse results and it still isn't stable, while with upscaling we get artifacting, some blur here and there, and input delay. I honestly don't see trading those three things for some fake frames as being all that worth it (I say this as a guy who has always run native since forever, and I find artificial means over the real deal kind of off-putting).

Optimisation this gen (combined with Epic's latest engine) is shit here and there, yes, but this isn't going to change until people pipe up on both the game end and the GPU end as one, not as a smattering of voices airing their woes on just Twitter (I honestly find social media more fragmented and problematic now than ever, so I can see why things aren't changing at a decent pace for nearly anything these days).


