haxxiy said:
Come on, it's that sort of thinking that yielded absolute computing 'gems' like NetBurst, Fermi, or Vega. Great designs like the Athlon 64, Maxwell, or Zen began by increasing efficiency. It's one of the main drivers of innovation in just about any market, after all. So, I'd say that's very relevant. Much like lower resolutions seem relevant in a context where less than 3% of people play at 4K, according to the most recent hardware survey. And Nvidia surely wants to increase the adoption of DLSS, which will be rendering games at 1080p or even lower. In this context, it just feels weird that the architecture apparently chokes at lower resolutions - and looking at the framerates, it isn't a CPU bottleneck either, so there's that.
I mean, one of the main causes of Ampere's efficiency issues is going with Samsung's 8N node, which we know is most likely going to change with Hopper. Nvidia made a meh move going with Samsung, which is something we all know, and it's not like anyone is denying the power-hungry aspect of Ampere.
But a lot of it comes down to whether that power consumption actually buys you performance, which was one of the main issues with Vega. Vega cards drew a lot of power while delivering similar or lower performance than cards that drew less, and in the case of the Radeon VII, they also lacked features like RT cores and Tensor cores. The Radeon VII at 313 W vs. the 2080 at 226 W, for example. If RDNA 2 can make Ampere look like that, then people will get on that hype train. Efficiency obviously matters, but so does actual performance.
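For a rough sense of what those two wattages mean in performance-per-watt terms, here's a quick back-of-the-envelope sketch. It assumes roughly equal average performance between the two cards (the premise above) and treats the quoted figures as typical board power, both of which are simplifications:

```python
# Back-of-the-envelope perf/watt comparison from the wattages cited above.
# Assumption: both cards deliver roughly the same average performance.
radeon_vii_watts = 313
rtx_2080_watts = 226

# How much more power the Radeon VII draws for the same work
extra_power = (radeon_vii_watts / rtx_2080_watts - 1) * 100
# Equivalent shortfall in performance per watt
perf_per_watt_deficit = (1 - rtx_2080_watts / radeon_vii_watts) * 100

print(f"Radeon VII draws ~{extra_power:.0f}% more power for similar performance,")
print(f"i.e. ~{perf_per_watt_deficit:.0f}% worse performance per watt.")
```

That works out to roughly 38% more power draw, or about 28% worse performance per watt, which is the kind of gap that made Vega a hard sell.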
Yes, but Steam's hardware survey also shows the 2080 and 2080 Ti at less than 1% each... The types of people who will be spending $700+ on a GPU are not gonna be the xx60 crowd...
PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850