hinch said:
Yeah, we see games already going high on VRAM, and we're just scratching the surface for graphically demanding games this generation. With more engines using RT, we're going to need as much as we can get. And true, Nvidia skimped on VRAM across the stack for Ampere and even the lower tiers of Ada. Outside the flagship tier and refreshes (plus the 3060 12GB lol), it didn't take long before people had to start lowering settings to play comfortably. Definitely worth getting as much VRAM as you can for longevity. A 4090 will definitely last you a long, long time. With the latest DLSS tech and frame generation, even better. I'm half tempted to get the 7900XT, with the recent price drop and free game. 20GB of VRAM will last a while, one should hope. Since AMD and Nvidia have been playing games with the market and drawing out the mid-range launches, I really need an upgrade that's not old-gen hardware lol.
The 7900XT is perhaps the best value for money on the market right now, especially in sheer raster performance. The 4070 Ti only has a leg up when DLSS is used, which is likely used alongside RT (which it can't handle all that well at any rate). If I were sticking with 1440p for another cycle, I think I'd get either the 7900XT or the 7900XTX. I only had AMD cards before my current one (good ol' 980 Ti), and I have no brand loyalty or shame. :P I go where the performance and value is.
I first considered it ludicrous to even think of getting a 4090, but the rest of the 40xx lineup is simply worse value, all things considered. This goes double for 4K gaming.
As for VRAM: I remember maxing out mine on the 980 Ti when it was brand new. The game was GTA V. The way developers, and indeed GPU manufacturers, approach visual fidelity right now reminds me of the automotive industry in the '90s and early 2000s. They tacked on huge turbos and bored out the cylinders rather than apply finesse and tuning to existing solutions. Without any proper regulation beyond requiring ABS brakes and catalytic converters, that increased performance and fun factor across the board, but at the cost of constant breakdowns, high fuel consumption, and expensive part replacements. Not to mention the toll on the environment.
I think we'll be seeing a strict regime of regulations hit the electronics industry in the coming 5-7 years. They've already started with TVs here in the EU, enforcing a maximum wattage on new sets. Gaming rigs that require 1-1.2 kW to power, crypto farms, and the constant use and charging of smart devices whose battery life relative to usage hasn't really improved for years will ensure this (capacity increases, but so do power requirements and the number of demanding features).