
Forums - PC Discussion - Nvidia Ampere announced, 3090, 3080, 3070 coming later this year

Random_Matt said:
Zkuq said:
Will have to wait for more details, but it seems very likely that I'll get an RTX 3060 once it's announced and comes out. It ought to be a huge upgrade from my current GTX 770, which I got back in 2014... That said, I'll probably wait until next summer and get a really fast SSD then, and there's probably a non-zero chance it would make sense to wait until roughly then before upgrading my graphics card as well. Then again, I kind of doubt Nvidia will release anything more suitable by then, and it doesn't seem like AMD could possibly have a reasonable answer to Ampere, so AMD might also be out for this round.

https://mobile.twitter.com/RedGamingTech/status/1301884380747624449

I don't know why people are doubting AMD so much, especially since the leaks described Ampere's performance almost perfectly many months ago. The only thing they got wrong was pricing: each card came in $100 lower than leaked.

If the leaks hold true for AMD as well, they will be head to head with the 3080.



It takes genuine talent to see greatness in yourself despite your absence of genuine talent.

eva01beserk said:
Random_Matt said:

https://mobile.twitter.com/RedGamingTech/status/1301884380747624449

I don't know why people are doubting AMD so much, especially since the leaks described Ampere's performance almost perfectly many months ago. The only thing they got wrong was pricing: each card came in $100 lower than leaked.

If the leaks hold true for AMD as well, they will be head to head with the 3080.

Hope so, but I hope they don't restrict HBM to CDNA.



Random_Matt said:
eva01beserk said:

I don't know why people are doubting AMD so much, especially since the leaks described Ampere's performance almost perfectly many months ago. The only thing they got wrong was pricing: each card came in $100 lower than leaked.

If the leaks hold true for AMD as well, they will be head to head with the 3080.

Hope so, but I hope they don't restrict HBM to CDNA.

I would not be surprised if the top card had it. And if I'm not mistaken, HBM is still locked to 4GB stacks, so the minimum the top end could have is 12GB. If so, that would be faster and have more capacity right off the bat.
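The capacity floor being guessed at here is just stack count times per-stack capacity, and bandwidth scales with stacks the same way. A quick sketch of that arithmetic (assuming the poster's 4GB-per-stack figure, and an illustrative ~256 GB/s per HBM2 stack, which is an assumption, not a confirmed spec):

```python
# HBM capacity and bandwidth both scale with the number of stacks
# on the interposer. Assumptions (illustrative): 4 GB per stack,
# roughly 256 GB/s of bandwidth per HBM2 stack.

GB_PER_STACK = 4
BW_PER_STACK_GBS = 256  # approximate HBM2 per-stack bandwidth

def hbm_config(stacks: int) -> tuple[int, int]:
    """Return (total VRAM in GB, total bandwidth in GB/s) for a stack count."""
    return stacks * GB_PER_STACK, stacks * BW_PER_STACK_GBS

for stacks in (2, 3, 4):
    vram, bw = hbm_config(stacks)
    print(f"{stacks} stacks: {vram} GB VRAM, ~{bw} GB/s")
```

So a hypothetical three-stack card would indeed land at 12GB, with aggregate bandwidth well above a GDDR6 setup of similar width.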



It takes genuine talent to see greatness in yourself despite your absence of genuine talent.

eva01beserk said:
Random_Matt said:

Hope so, but I hope they don't restrict HBM to CDNA.

I would not be surprised if the top card had it. And if I'm not mistaken, HBM is still locked to 4GB stacks, so the minimum the top end could have is 12GB. If so, that would be faster and have more capacity right off the bat.

What about HBM2e?



eva01beserk said:
Random_Matt said:

https://mobile.twitter.com/RedGamingTech/status/1301884380747624449

I don't know why people are doubting AMD so much, especially since the leaks described Ampere's performance almost perfectly many months ago. The only thing they got wrong was pricing: each card came in $100 lower than leaked.

If the leaks hold true for AMD as well, they will be head to head with the 3080.

It's because of AMD's track record.

Look at AMD vs Turing.

With Turing, Nvidia not only raised prices but kept performance essentially the same. A 2080, for example, cost $700, the same as a 1080 Ti's MSRP a year earlier, with a very minimal performance improvement. One of the main reasons for the price increase was the RT cores and Tensor cores, yet at launch no games supported ray tracing, and DLSS 1.0 looked like arse with little to no support as well. 1080 Tis at the same time could be had for $500 while giving you similar performance to the $699 2080, just without ray tracing and DLSS.
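The value gap being described can be put in plain numbers. A sketch under the post's own premise (similar rasterization performance, $500 street price for the 1080 Ti vs $699 for the 2080; RT and DLSS deliberately excluded, as the post does):

```python
# Performance-per-dollar comparison from the paragraph above.
# Premise (the post's, not a benchmark): both cards deliver roughly
# equal rasterization performance, so only price differs.

def perf_per_dollar(relative_perf: float, price: float) -> float:
    return relative_perf / price

gtx_1080_ti = perf_per_dollar(1.0, 500)  # discounted street price
rtx_2080 = perf_per_dollar(1.0, 699)     # launch MSRP

print(f"1080 Ti delivered ~{gtx_1080_ti / rtx_2080:.0%} "
      f"of the 2080's perf per dollar")  # ~140%
```

A roughly 40% perf-per-dollar edge for a two-year-old card is the "why upgrade?" problem the post is pointing at.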

But AMD didn't have an answer at that moment. Fast forward to 2019: AMD released their first 7nm GPU, the Radeon VII, and launched it at $700 MSRP. At best it could keep up with the now almost two-year-old 1080 Ti, but on average it was slower, despite the 1080 Ti being on 16nm vs the VII's 7nm. On top of that, it didn't have ray tracing or DLSS, yet it cost the same as a 2080. Not to mention the tons of driver problems and other issues it had at launch.

Skip ahead a few months and AMD releases their second 7nm GPUs, now based on RDNA: the 5700 and 5700 XT. This is also around the time Nvidia refreshed their GPUs and made them cheaper. The 5700 XT launched initially for I think $450, but AMD lowered it when Nvidia lowered theirs. It still couldn't beat the now 2.5-year-old 1080 Ti on average, but it was better priced and made the VII irrelevant. The problem is... it still doesn't have RT cores or Tensor cores. So on one hand you can get a 5700 XT, which is priced similarly to the 2000 series and performs pretty well at rasterization, or you can get a 2000 series card that gives you RT and Tensor cores as well, with RT getting more adoption. And the launch had driver problems too.

Fast forward to 2020: Nvidia now has more games that show off ray tracing and has released DLSS 2.0, with more and more games adopting both. Meanwhile, RDNA 1 is effectively forgotten, as it doesn't have any noteworthy feature sets. It may as well be a 1000 series competitor in terms of features... Then MS announces the DirectX 12 Ultimate feature set: 2000 series GPUs are fully compatible, but the 5000 series isn't by the looks of things.

https://www.anandtech.com/show/15637/microsoft-intros-directx-12-ultimate-next-gen-feature-set

So in TLDR:

AMD with 7nm vs Nvidia with 16nm. AMD lost.

AMD with 7nm vs Nvidia with 12nm. AMD lost.

AMD with 7nm+ vs Nvidia with Samsung 8N. AMD... ?

Now don't get me wrong. I do think AMD has a chance, but based on their track record... yeah, I'll have to see it to believe it, as they aren't in the same situation as Ryzen vs Intel... I do hope they bring it this time though, as it could only mean PC prices get even better than Ampere's incredible pricing.

Last edited by Jizz_Beard_thePirate - on 04 September 2020

                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

shikamaru317 said:

Summary:

-AMD is aiming for lower power consumption than Nvidia's Ampere, says he previously heard that the highest end RDNA 2 chip may have a 250 watt TDP.

-Radeon 6700 is targeted at the 3070, but unfortunately won't be ready until early 2021. More power efficient than 3070.

-6800 and 6900 aiming for release by the end of October, but could slip to early November.

-His sources aren't 100% sure how these will compete performance wise against Ampere, largely because clock rates haven't been finalized yet.

-Expects the 6800 and 6700 to about match the 3080 and 3070 respectively while using less power, but thinks that 3090 will outperform the 6900, which is likely aimed at the eventual 3080 ti/Super, rather than the 3090

-Unsure about how well RDNA 2 will compete against Ampere on raytracing, but says he thinks that RDNA 2 will surpass Turing's raytracing performance but fall short of Ampere's ray tracing performance

-Says he thinks that RDNA 2 will overall have the edge in both pricing and power efficiency, but thinks that Ampere will have the performance edge on certain GPU tiers, and at certain types of performance, especially raytracing.

-Says that if Ampere had been even 10% slower, that RDNA 2 would have the performance edge on every tier, that is how close he expects the gap between the two to be

-Says the plan is currently for AMD to undercut Nvidia on pricing across the board, but may increase pricing on tiers where they have the performance edge, to price match Nvidia at those tiers.

-Says that he expects the 6700 and 3070 to outperform the RDNA 2 GPU's in Series X and PS5 by a pretty large margin.

This has been my thinking for a long time now: RDNA2 will beat Ampere in perf/W and perf/mm², and the GeForce 3090 will probably have something like a 5% performance edge.
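That perf/W claim can be sanity-checked with simple arithmetic. A sketch using the rumoured numbers in this thread (the 250W top-RDNA 2 TDP from the summary above, Nvidia's announced 350W for the 3090, and the ~5% performance edge guessed here; all three are rumours or estimates, not measurements):

```python
# Rough perf-per-watt comparison under the rumoured numbers above.
# Assumptions: 3090 TDP = 350 W (announced), top RDNA 2 TDP = 250 W
# (rumoured), 3090 ~5% faster overall (the guess in this post).

def perf_per_watt(relative_perf: float, tdp_watts: float) -> float:
    return relative_perf / tdp_watts

rtx_3090 = perf_per_watt(1.05, 350)  # 5% faster, 350 W
big_navi = perf_per_watt(1.00, 250)  # baseline perf, 250 W

advantage = big_navi / rtx_3090
print(f"RDNA 2 perf/W advantage: {advantage:.2f}x")  # ~1.33x
```

In other words, even conceding the 3090 a 5% raw-performance win, a 100W TDP gap would leave RDNA 2 about a third ahead in perf/W.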

The ray-tracing performance is uncertain. I think RDNA2 will have a 1.5x edge here over Turing, but does that mean Big Navi having 50% more ray-tracing performance than a GeForce 2080 Ti, or around 40% more than a GeForce 3090? (Note: I said 20% in a previous post; I got tricked by an Nvidia slide.) We shall see.

For console performance, I expect the PS5 to land at GeForce 2080 Super level.

Forgot to mention: Big Navi might actually have 24GB of VRAM. All the leakers have been saying 16GB, but they might have been talking about Navi 22 without realising it. Check Coreteks' tweets.

Last edited by Trumpstyle - on 04 September 2020

6x master league achiever in starcraft2

Beaten Sigrun on God of war mode

Beaten DOOM ultra-nightmare with NO endless ammo-rune, 2x super shotgun and no decoys on ps4 pro.

1-0 against Grubby in Wc3 frozen throne ladder!!

Any evidence/guesses about when we'll get a 3080ti? I'm undecided about which high end card to get and really wanna know the final specs/price/date for it.



TallSilhouette said:
Any evidence/guesses about when we'll get a 3080ti? I'm undecided about which high end card to get and really wanna know the final specs/price/date for it.

We're in the same boat. I might as well wait for the Ti version, since I'll have to replace more or less every part in my rig anyway. Future-proof!



TallSilhouette said:
Any evidence/guesses about when we'll get a 3080ti? I'm undecided about which high end card to get and really wanna know the final specs/price/date for it.

Unless AMD forces something, most likely it will be mid-late next year.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Random_Matt said:
Zkuq said:
Will have to wait for more details, but it seems very likely that I'll get an RTX 3060 once it's announced and comes out. It ought to be a huge upgrade from my current GTX 770, which I got back in 2014... That said, I'll probably wait until next summer and get a really fast SSD then, and there's probably a non-zero chance it would make sense to wait until roughly then before upgrading my graphics card as well. Then again, I kind of doubt Nvidia will release anything more suitable by then, and it doesn't seem like AMD could possibly have a reasonable answer to Ampere, so AMD might also be out for this round.

https://mobile.twitter.com/RedGamingTech/status/1301884380747624449

I can't stand Red Gaming Tech. He has propagated so much false information in the past, not to mention he has an obvious confirmation bias for "Team Red".

Try to get information from independent sources like Digital Foundry or Anandtech.

eva01beserk said:

I don't know why people are doubting AMD so much, especially since the leaks described Ampere's performance almost perfectly many months ago. The only thing they got wrong was pricing: each card came in $100 lower than leaked.

If the leaks hold true for AMD as well, they will be head to head with the 3080.

AMD has been "talking up" its GPUs for years and has consistently under-delivered; I think that is where the pessimism is coming from... Can't really blame them.
Hopefully AMD knocks it out of the park though; competition is beneficial for us all.

shikamaru317 said:

Summary:

-Finalized specifications for the highest end chipset should be nailed down by 1st week of October

-AMD is aiming for lower power consumption than Nvidia's Ampere, says he previously heard that the highest end RDNA 2 chip may have a 250 watt TDP.

-Radeon 6700 is targeted at the 3070, but unfortunately won't be ready until early 2021. More power efficient than 3070.

-6800 and 6900 aiming for release by the end of October, but could slip to early November.

-His sources aren't 100% sure how these will compete performance wise against Ampere, largely because clock rates haven't been finalized yet.

-Expects the 6800 and 6700 to about match the 3080 and 3070 respectively while using less power, but thinks that 3090 will outperform the 6900, which is likely aimed at the eventual 3080 ti/Super, rather than the 3090

-Unsure about how well RDNA 2 will compete against Ampere on raytracing, but says he thinks that RDNA 2 will surpass Turing's raytracing performance but fall short of Ampere's ray tracing performance

-Also unsure on how well AMD's reported DLSS competitor will compete against DLSS

-Says he thinks that RDNA 2 will overall have the edge in both pricing and power efficiency, but thinks that Ampere will have the performance edge on certain GPU tiers, and at certain types of performance, especially raytracing.

-Says that if Ampere had been even 10% slower, that RDNA 2 would have the performance edge on every tier, that is how close he expects the gap between the two to be

-Says the plan is currently for AMD to undercut Nvidia on pricing across the board, but may increase pricing on tiers where they have the performance edge, to price match Nvidia at those tiers.

-Says that he expects the 6700 and 3070 to outperform the RDNA 2 GPU's in Series X and PS5 by a pretty large margin.

-Doesn't know if AMD has a competitor for RTX IO, but wouldn't be surprised if they do, since they helped Sony and MS with their decompression tech. 

Remember to get your grains of salt, folks.

Basically the leak doesn't shift the goalposts at all for AMD; it's basically what we saw with RDNA vs Turing.



--::{PC Gaming Master Race}::--