Bofferbrauer2 said:
While I mostly agree, a high-end card from just 1 generation ago (though the next one is just around the corner) should still be able to play any game in max settings at just 1080p. So the fact that the original 3080 can't is just a disgrace for NVidia and really shows how that entire Ampere lineup apart from the 3090 and the 3060 were simply too memory-constrained.
I mean, if you're gonna make the "max settings" argument... Radeon gets destroyed in plenty of games with ray tracing on versus the 3080, like Cyberpunk, which released in 2020. So by that logic, Radeon should be ashamed for doing so terribly at "max settings" even at launch, since it should have been able to handle "every game" at 1080p back in 2020. Hell, the $1,000 6900 XT only matches the two-year-old 2080 Ti in Cyberpunk. And remember, back in those days Radeon only had the FSR 1 garbage that looked horrid, while DLSS 2.0 came out in 2020, so using upscaling on Radeon GPUs required eye bleach.
The reality is that game optimization has gone to shit this gen. Some games demand a ton of VRAM, which cucks certain Ampere GPUs, while other games lean on upscaling, which cucks Radeon GPUs. That's not to say Ampere had the right amount of VRAM for a 3080/3070-class GPU; I think the 3080 should have launched in 12GB flavours minimum from the start instead of getting one later on. But Radeon also should have released an AI upscaler midway through RDNA 2 instead of taking until RDNA 4 to hopefully get one out. Intel really shouldn't be beating Radeon in the upscaling race, yet here we are.
PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850