haxxiy said:
  Captain_Yuri said:

Alright, from what I can tell, Nvidia's claims were pretty on point. While it's not a straight 2x over the 2080 in every game, it's still a massive gain in a lot of them. DF's percentages also match what we're seeing in the final reviews. It's a pretty big performance gain over the 2080 Ti as well.

This is gonna be one fap worthy card boys!

Compared to the RTX 2080, that's 25% faster at 1080p, 51% at 1440p, and 66% at 2160p.

Compared to the RTX 2080 Ti, it's 14% to 31% faster, depending on the resolution.

Not that great to be honest, considering how much power it's guzzling on a more advanced node. In TechPowerUp's benchmark average, it's actually comparable to, or even less efficient than, RDNA 1.0 and Turing at anything but 4K.

AMD actually managed a larger perf/watt increase going from Vega to the Radeon VII, and that was with the exact same architecture on a nascent node optimized for low power...
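For anyone who wants to sanity-check these kinds of percentages, the arithmetic is simple enough to script. The FPS and power figures below are made-up placeholders for illustration, not numbers from any actual review:

```python
# Hypothetical figures for illustration only -- NOT real benchmark data.
fps = {"RTX 2080": 60.0, "RTX 3080": 99.6}      # assumed 4K average FPS
power = {"RTX 2080": 215.0, "RTX 3080": 320.0}  # assumed board power draw (W)

def uplift_pct(new: float, old: float) -> float:
    """Percentage gain of `new` over `old`."""
    return (new / old - 1.0) * 100.0

# Raw performance gain: ~66% with these placeholder numbers.
perf_gain = uplift_pct(fps["RTX 3080"], fps["RTX 2080"])

# Perf/watt tells a different story: a big FPS gain bought with a big
# power increase shrinks to a much smaller efficiency gain.
ppw_2080 = fps["RTX 2080"] / power["RTX 2080"]
ppw_3080 = fps["RTX 3080"] / power["RTX 3080"]
efficiency_gain = uplift_pct(ppw_3080, ppw_2080)  # only ~12% here
```

This is the crux of the perf/watt complaint: with these assumed numbers, a 66% performance lead at roughly 49% higher power works out to only about a 12% efficiency improvement.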

Yea, at lower resolutions it's not as massive a leap, but who the heck is gonna buy this card to play games at 1080p? This is a card clearly targeted at 4K, where it sees quite a massive improvement. So I'm not surprised it's less efficient at resolutions below 4K.

And yea, the performance/watt gain vs Turing isn't very good, but I'd like to see AMD bring better performance than what we're seeing with the 3080, because if all they end up doing is giving us a more power-efficient card at lower performance, I couldn't care less. If RDNA 2 is more powerful, though, then it would pique my interest.


PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850