
Forums - PC Discussion - AMD RX 6000 GPU's revealed, seem to go head to head with Geforce 30 series at lower prices

Captain_Yuri said:
JRPGfan said:

I heard a rumor that these cards do ray tracing better than the 2080 Ti does.
Not as fast as the 3080 etc., but more than fast enough to use it in games.

AMD also seems to have a sizeable edge in 1440p gaming (a bit smaller at 4K).

It's crazy that AMD is back at the top end again; it's been so long (it felt like AMD just let Nvidia always be faster and make bigger-die cards).

As for there being no counterpart to DLSS yet, it's been said that it'll launch sometime in early 2021.

The question, of course, is how much better and on which card, because the 3080 is 25-40% faster than the 2080 Ti with RT enabled, depending on the game.

These are cherry-picked benchmarks, so I will wait for the actual reviews. The 1440p benchmarks are actually closer than I expected, as there are only 4/10 titles where AMD has a big edge over the 3080.

Yeah, it is pretty crazy that after what, 3-4 generations of GPUs, we finally have an AMD GPU that can go toe to toe in raster. There's no way to put it other than amazing.

Then the question is which games will support AMD's version of DLSS. Not to mention, how good is the image quality?

There are only 12 games so far (afaik) that use Nvidia DLSS 2.0 (the older versions look worse than simple image scaling plus a sharpening filter).
Going forward, I suspect DLSS-style techniques shouldn't be a reason to buy or skip a card.

That 3DMark benchmark with ray tracing showed a 22% lead for the 3080 FE over the 6800 XT (which should shrink with driver optimisations going forward). That's a heavy ray-tracing load, more than most games will use, so even in ray-tracing situations the advantage for the Nvidia cards won't be that big.
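For what it's worth, a "22% lead" is just the ratio of the two benchmark scores. A quick illustrative sketch (the scores below are made-up placeholders chosen only to produce a ~22% gap, not real 3DMark results):

```python
def percent_lead(score_a: float, score_b: float) -> float:
    """How much faster card A is than card B, as a percentage."""
    return (score_a / score_b - 1.0) * 100.0

# Hypothetical scores, picked only to illustrate the arithmetic.
rtx_3080_fe = 11000
rx_6800_xt = 9016

print(f"3080 FE lead: {percent_lead(rtx_3080_fe, rx_6800_xt):.1f}%")
```

The same helper also shows why a lead shrinks as the slower card's drivers improve: bumping `score_b` by even a few percent cuts the gap noticeably.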



JRPGfan said:
Captain_Yuri said:

The question, of course, is how much better and on which card, because the 3080 is 25-40% faster than the 2080 Ti with RT enabled, depending on the game.

These are cherry-picked benchmarks, so I will wait for the actual reviews. The 1440p benchmarks are actually closer than I expected, as there are only 4/10 titles where AMD has a big edge over the 3080.

Yeah, it is pretty crazy that after what, 3-4 generations of GPUs, we finally have an AMD GPU that can go toe to toe in raster. There's no way to put it other than amazing.

Then the question is which games will support AMD's version of DLSS. Not to mention, how good is the image quality?

There are only 12 games so far (afaik) that use Nvidia DLSS 2.0 (the older versions look worse than simple image scaling plus a sharpening filter).
Going forward, I suspect DLSS-style techniques shouldn't be a reason to buy or skip a card.

That 3DMark benchmark with ray tracing showed a 22% lead for the 3080 FE over the 6800 XT (which should shrink with driver optimisations going forward). That's a heavy ray-tracing load, more than most games will use, so even in ray-tracing situations the advantage for the Nvidia cards won't be that big.

Right... And there's not much reason to think AMD's version won't be as bad as DLSS 1.0, since they are starting from scratch. Who knows how long it will take AMD to field a competitor to DLSS, let alone one that looks as good as 2.0, while Nvidia keeps improving DLSS. Especially since they did not mention a dedicated hardware accelerator similar to Nvidia's Tensor Cores, unless I missed it.

If the benchmark you are referring to is the one where the 72 CU card performs like a 2080 Ti with RT enabled, that does not bode well for the rest of the 6000 series' RT performance. It would mean the 2080 Ti/3070's raster competitor performs like a 2070 Super with RT enabled, and so on down the stack. In that case, AMD would have an overall worse ray-tracing implementation than Turing.

Considering Watch Dogs: Legion needs a 3090 plus DLSS performance mode to run 4K 60fps with ray tracing on Ultra, according to the leaked videos, I hope that benchmark isn't legit.




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

On this site, we had a Nintendo fan who said Nvidia has better memory bandwidth efficiency than AMD. The Radeon 6900 XT has 512 GB/s of memory bandwidth while the GeForce 3090 has 936 GB/s, yet they have about the same performance. RDNA2's memory bandwidth efficiency is far beyond Ampere's. You are not getting next-gen games on your Switch 2.

For the PS5, AMD didn't reveal the IPC, so we can't figure out the gaming performance yet; we have to wait.

Anyway, RDNA2 is just a small hit on Nvidia. I expect RDNA3 to be the killing blow; it should be some kind of chiplet design with 1.5x the performance of Big Navi, released at the end of next year or the beginning of 2022.
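The efficiency claim can be made concrete as performance delivered per unit of bandwidth. A rough sketch using the bandwidth figures from the post; the relative performance values are an assumption (both set to 1.0, i.e. "about the same performance" as the post claims), not measured data:

```python
# Bandwidth figures as quoted in the post; rel_perf = 1.0 for both
# is an assumption standing in for "about the same performance".
cards = {
    "RX 6900 XT": {"bandwidth_gbs": 512, "rel_perf": 1.0},
    "RTX 3090":   {"bandwidth_gbs": 936, "rel_perf": 1.0},
}

for name, c in cards.items():
    # Performance extracted per GB/s of memory bandwidth.
    efficiency = c["rel_perf"] / c["bandwidth_gbs"]
    print(f"{name}: {efficiency:.5f} perf per GB/s")

# If performance really is equal, the efficiency gap is just the
# inverse bandwidth ratio: 936 / 512 ≈ 1.83x in AMD's favour.
print(f"Efficiency ratio: {936 / 512:.2f}x")
```

The Infinity Cache is the usual explanation offered for how RDNA2 gets away with the narrower bus, which is worth keeping in mind before extrapolating this ratio to other workloads.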

Last edited by Trumpstyle - on 28 October 2020

6x master league achiever in starcraft2

Beaten Sigrun on God of war mode

Beaten DOOM ultra-nightmare with NO endless ammo-rune, 2x super shotgun and no decoys on ps4 pro.

1-0 against Grubby in Wc3 frozen throne ladder!!

Trumpstyle said:

On this site, we had a Nintendo fan who said Nvidia has better memory bandwidth efficiency than AMD. The Radeon 6900 XT has 512 GB/s of memory bandwidth while the GeForce 3090 has 936 GB/s, yet they have about the same performance. RDNA2's memory bandwidth efficiency is far beyond Ampere's. You are not getting next-gen games on your Switch 2.

For the PS5, AMD didn't reveal the IPC, so we can't figure out the gaming performance yet; we have to wait.

Anyway, RDNA2 is just a small hit on Nvidia. I expect RDNA3 to be the killing blow; it should be some kind of chiplet design with 1.5x the performance of Big Navi, released at the end of next year or the beginning of 2022.

Weren't you saying this about Big Navi, though?

Shifting goalposts once again.

EDIT:

Purely based on raster power, it looks like AMD > Nvidia.
But add in RT and DLSS and the narrative changes.
These AMD cards feel like they aren't aimed at gaming, tbh.



 "I think people should define the word crap" - Kirby007

Join the Prediction League http://www.vgchartz.com/predictions

Instead of seeking to convince others, we can be open to changing our own minds, and seek out information that contradicts our own steadfast point of view. Maybe it’ll turn out that those who disagree with you actually have a solid grasp of the facts. There’s a slight possibility that, after all, you’re the one who’s wrong.

The RX 6800 should have been $500 to match the RTX 3070. It's practically the same card, but AMD is asking $80 more.
The RX 6800 XT being on par with the RTX 3080 for $50 less is good, and might make Nvidia either match the price or go lower. It's only a $50 difference, though, so maybe they won't.
The RX 6900 XT, though, oh man. On par with an RTX 3090 and $500 cheaper. Nvidia has no choice but to match that. AMD did real good on this one, massive W!

With that said: one L, one OK, and one W. Now AMD HAS to fix their driver issues if they want to succeed. That is their biggest problem right now, the second being no DLSS equivalent for their GPUs. They have caught up with Nvidia in price to performance, but are still behind in other areas.
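The three matchups above, in numbers. The prices are the announced MSRPs as discussed in this thread; the pairings follow the poster's own comparisons, and the "about equal performance" premise behind each pair is their claim, not a measurement:

```python
# (AMD card, AMD MSRP, Nvidia card, Nvidia MSRP) for each matchup
# the poster compares; prices in USD as announced.
matchups = [
    ("RX 6800",    579, "RTX 3070",  499),
    ("RX 6800 XT", 649, "RTX 3080",  699),
    ("RX 6900 XT", 999, "RTX 3090", 1499),
]

for amd, amd_price, nv, nv_price in matchups:
    delta = nv_price - amd_price
    verdict = "cheaper than" if delta > 0 else "pricier than"
    print(f"{amd} (${amd_price}) is ${abs(delta)} {verdict} {nv} (${nv_price})")
```

Running this reproduces the post's framing: the 6800 costs $80 more than its rival (the "L"), the 6800 XT $50 less (the "OK"), and the 6900 XT $500 less (the "W").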




The 6800XT seems to be the value king in that lineup and likely my next GPU.



--::{PC Gaming Master Race}::--

The RTX 3060 suddenly isn't such an obvious choice anymore! That said, looks like it'll still be a while before anything in its price range comes out, so there's that... At this rate, it looks like I'll be upgrading my CPU next year instead of this one, which is disappointing.



Why are people saying the 6800 should be $500? According to this, it beats the 2080 Ti and has 16GB of VRAM, while the 3070 at best performs as well as the 2080 Ti, loses in many games, and only has 8GB of VRAM. $80 more seems like the right price, since it performs better than the 3070 and has double the VRAM. I expect a 6700 XT to perform at the level of a 3070 and cost less, just like the 3080's competitor does.



It takes genuine talent to see greatness in yourself despite your absence of genuine talent.

That is very good: better performance at a lower price. It seems a very good prospect, even more so given that, after the jumps Nvidia made, people didn't expect it to be surpassable.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

With their hesitation to show RT results and their own version of DLSS, I remain sceptical. We should wait for benchmarks on actual games, from multiple sources (like GN and JayzTwoCents, not just Linus or DF for the fanboys) to see what's going on.

DLSS and RT performance are what I'm looking for in a new card, and if AMD cannot deliver, I'll stay on course for that 3080/Ti model.



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"