hinch said:
I'd say most people could get away with a 3060Ti/3070-class GPU for a whole generation (of console ports), depending on what you play and at what resolution. And like you said, more games are using some sort of upscaling tech, and let's not forget things like mesh shaders, VRS and other DX features that will give cards more performance. Something like a 3080 would be absolutely fine for several years of 4K gaming too. Don't get me wrong, it'd be interesting to see how these cards perform, and I'm clearly not the target audience for an XX90-class GPU anyway, so there's that. As long as the 80- and 70-class cards aren't clearly overpriced and don't take a nuclear reactor to run, all is well xD
To showcase how powerful the 3090 is, Nvidia used 8K gaming benchmarks. Who the f*ck games at 8K? And now think of that beast's rumored successor, said to deliver 2-2.5 times the performance. Why would you need that? Sure, game requirements will increase as the console generation goes on, and there must be someone out there gaming on two or even three 4K monitors, but it feels like we've reached a point where hardware has, once again, outpaced the need for it.
And you mention the 3060Ti/3070, but what about their successors? With a 4060Ti that could deliver close to 3080 performance (only with 8GB, because Nvidia has to cripple them somehow), you'd be set for years to come at a fraction of the price and power consumption of those leaked monsters.
It's like both AMD and Nvidia are so obsessed with holding the performance crown that they don't care what sacrifices it takes to get there.
Please excuse my bad English.
Former gaming PC: i5-4670k @ stock (for now), 16GB RAM 1600 MHz and a GTX 1070
Current gaming PC: R5-7600, 32GB RAM 6000MT/s (CL30) and a RX 9060XT 16GB
Steam / Live / NNID : jonxiquet Add me if you want, but I'm a single player gamer.