haxxiy said:
Captain_Yuri said:

I mean, one of the main causes of Ampere's efficiency issues is going with Samsung's 8N, which we know will most likely change with Hopper. Going with Samsung was a meh move by Nvidia, which we all know, and it's not like anyone is denying the power-hungry aspect of Ampere.

But a lot of it has to do with whether that power consumption actually buys you performance, which was one of the main issues with Vega. Vega cards had high power consumption while giving you similar or worse performance than cards that drew less, and in the case of the Radeon VII, missing features like RT cores and Tensor cores. VII at 313 watts vs the 2080 at 226 watts, for example. If RDNA 2 can make Ampere look like that, then people will get on that hype train. Efficiency obviously matters, but so does actual performance.
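To put those numbers in perspective, here's a rough perf-per-watt calc using the board power figures above and assuming roughly equal raster performance between the two cards (that equal-performance figure is an assumption, not a measured result):

```python
# Rough perf/W comparison from the board power figures quoted above.
# rel_perf = 1.0 assumes the Radeon VII and RTX 2080 perform about
# the same in raster, which matches the "similar or worse" point above.
radeon_vii_watts = 313
rtx_2080_watts = 226
rel_perf = 1.0  # assumption: VII roughly equals the 2080 in raster

ratio = (rel_perf / radeon_vii_watts) / (1.0 / rtx_2080_watts)
print(f"Radeon VII perf/W is ~{ratio:.2f}x the RTX 2080's")
# ~0.72x, i.e. the VII burns ~38% more power for the same work
```

So even granting equal performance, that's roughly 38% more power for the same frames, before you count the missing RT and Tensor hardware.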

Yes, but Steam's hardware survey also shows the 2080 and 2080 Ti at less than 1% each... The type of people who will spend $700+ on a GPU are not gonna be the xx60 crowd...

Samsung's 8 nm has a 10 nm BEOL versus a 20 nm BEOL on TSMC's 12 nm process... huge difference right there. Going by feature size, that's at least a node and a half of improvement.
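As a sanity check on "a node and a half", here's a back-of-envelope sketch assuming the conventional ~0.7x linear shrink per full node generation (node names are mostly marketing, so treat this as illustrative only):

```python
from math import log

# How many "full node" generations does halving the BEOL feature
# size represent, assuming ~0.7x linear shrink per generation?
old_beol_nm = 20     # TSMC 12 nm-class BEOL, per the figures above
new_beol_nm = 10     # Samsung 8 nm-class BEOL, per the figures above
shrink_per_node = 0.7

generations = log(old_beol_nm / new_beol_nm) / log(1 / shrink_per_node)
print(f"~{generations:.1f} node generations of BEOL shrink")  # ~1.9
```

That lands at roughly two generations, so "at least a node and a half" is, if anything, conservative.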

Perhaps AMD is using a better node right now, perhaps they aren't... but we can't compare that, since we have literally nothing to base a comparison on. I'm just pointing out that Ampere shows almost no improvement even with a new architecture on top of a newer node. That was the point of the Radeon VII comparison. Vega was bad compared to Turing, yes. But at least it demonstrated a response to a more advanced node, with the same architecture to boot, as immature as 7 nm was back then.

As for the number of people gaming on 2080s and above, that's a fair point. But do remember those cards were horribly overpriced and sold poorly as a result. Perhaps Nvidia will have more success with Ampere, I don't know.

Yea, that's a good point. It will be interesting to see where GPUs go as node shrinks become harder and harder.

Yea, they did, and rightfully so, but go look at other $700+ GPUs and they also sit at very low percentages. Not to mention, the 1080p results get skewed by the laptop crowd, since the majority of laptops from $400 to $3,000 have 1080p screens. Hardly anyone gets a 4K screen in a laptop, and 1440p is hella rare. It's for reasons like this that I don't bother paying attention to Steam's hardware survey, as the results are always heavily skewed. Personally, I have three computers with Steam on all of them: a desktop at 3440x1440 and two laptops, both 1080p.

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850