
The 6000 series are pretty terrible products when you realize that AMD is asking people to spend $650/$1,000 on a GPU only to turn settings off. It's amusing that in games with heavy ray tracing, even Turing does better than the 6800 XT, and that's before DLSS. You know... the GPUs everyone said would age badly?

And the biggest kicker is that the raster performance isn't even that good, depending on where you look. At 4K, the 6800 XT loses to a 3080. At 1440p, it trades blows or leads by a small margin depending on the game. At 1080p, it wins; in VR, it loses. Hell, it gets killed in some older titles like The Witcher 3 at every resolution. And looking at Cyberpunk, ray tracing isn't even supported on these cards at launch. You know, one of the most highly anticipated games of the generation?

The 6000 series is just for AMD fans and no one else. The VRAM capacity argument was nonsense in the past and continues to be nonsense today, as the benchmarks prove. For most people, it's better to wait until AMD has a GPU that is actually worth the asking price instead of forking over $650/$1,000 for potato ray tracing performance, good raster performance, and no answer to DLSS. Because at those prices, you really shouldn't need to turn off settings to justify a purchase.

As for TSMC vs Samsung: I do think Nvidia going back to TSMC might give them a Pascal-type leap on all fronts.

Last edited by Jizz_Beard_thePirate - on 22 November 2020


PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850