hinch said:
Pains me to say, but with AMD dropping the ball this gen and Nvidia doing Nvidia things, Intel Arc could be a good alternative in the lower price brackets for GPUs. Assuming they can get the A770 closer to the RTX 3070, this wouldn't be a bad buy at all. More performant than the RX 6000 series in RT and double the VRAM of the 3070, at a lower cost.
Yee. Hopefully there will be some Raja fine wine kicking in sooner rather than later. I remember the days when a $300 1060 was as powerful as, if not more powerful than, the last-gen $549 980. Incredible value, but alas those days are gone. If Intel can get their shit together, hopefully those days will one day come back.
Chazore said:
I was just about to say that I think it's that time for me where AAA gaming is greatly outpacing my 1080 Ti, but then you made me realise it's not my GPU, it's the stupid lack of optimisation these days. I still remember what a marvel Battlefront I and II were, and while they did put strain on my GPU at 1440p, I was still able to run both games at my fps cap (72) and still have headroom left over. I think I'ma just bow out of AAA gaming, y'know, because this year is off to a shitty start and last year didn't fare any better. AAA pricing is already pushing me out of my price range, but bad optimisation is what makes any purchase too much of a risk for me. I feel like we're fast becoming reliant on the band-aids that are FSR/DLSS instead of what we previously relied on (raw power and native res), to the point that GPU power levels are completely meaningless if the cards can't do what's stated on their tin labels.
Yea, the requirements relative to the visual output make little to no sense anymore. Cyberpunk imo is the benchmark of this generation for PC gaming because both the raster and ray tracing look incredible in that game, and the fact that it's open world makes it even more impressive. Yet for some reason, Cyberpunk runs better on a 4090 than Dead Space does in raster... Like what? And Dead Space uses the old Frostbite engine.
And tbh the same goes for Forspoken. Both of them use old engines that should be running like a dream, yet both of these games perform like ass on every platform. Makes zero sense whatsoever. So yes, I think you are correct. The devs have largely phoned it in and are passing these games off as complete, even though they're unoptimized, just because we now have good upscaling methods. Quite the shame really.
PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850