
Nvidia Ampere announced, 3090, 3080, 3070 coming later this year

Did they use the term DLSS 3.0, or is that upgrade available to current 20-series cards as well?



numberwang said:

Did they use the term DLSS 3.0, or is that upgrade available to current 20-series cards as well?

As far as I can tell, it's available to 2000 series cards as well.

There is no 3.0 as of yet.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

zero129 said:
Wow, wasn't expecting this! 3070 it is. Hmm, but then $200 more would get me a 3080 :-/ ... Damnit!

Thus the cycle begins



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

eva01beserk said:
I guess I'm not the only one who saw a disparity between the TFLOP numbers advertised and the performance gains advertised. Did NVIDIA's cards become less efficient this gen? Does it have anything to do with TSMC abandoning them and the poorer quality of Samsung's process?

Does anybody know the size of these chips in comparison to Turing? Just some rumors I heard that they are going with a huge chip to get more performance. Judging by the wattage needed, it seems that might be the case.

I don't know, guys, but the rumors seem sorta accurate, and AMD seems set to give NVIDIA a run for its money. While I don't think they could possibly match the 3090, AMD did claim 2x the 5700 XT, which would be around the 3080. If they price it right, now that Nvidia went first, I see a big blowout in the future.

I've looked into it a bit. The big problem here is Nvidia using misleading numbers, which are technically true but have no bearing on real-world performance.

Ampere is using a new kind of shader that is able to execute two instructions per clock, similar to hyperthreading in CPUs. However, it's of course less efficient in real-world applications than having two full shaders. Nvidia now treats each of those new shaders as two shaders, and derives the advertised FLOPS from that doubled count. This puts us in a bit of a predicament, because the shader count and FLOPS are no longer comparable to Nvidia's own previous cards.

So now you have two ways of looking at it. If you take the logical cores and theoretical FLOPS at face value, you could say that the new shaders are less efficient compared to Turing, which would be true. But that kinda devalues the engineering that has actually been done. I'd rather look at it differently: halve the proposed shader count and FLOPS, and say that while they haven't increased much over Turing, they have massively increased in efficiency.
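To put rough numbers on those two readings, here's a quick back-of-the-envelope sketch. The shader counts and boost clocks are the published 2080 Ti and 3080 specs; the little helper function is just my own illustration of how the advertised figure is derived.

```python
# Advertised FP32 TFLOPS = shaders x 2 ops per clock (fused multiply-add) x boost clock
def tflops(shaders, boost_ghz):
    return shaders * 2 * boost_ghz / 1000

print(tflops(4352, 1.545))      # 2080 Ti (Turing): ~13.4 TFLOPS
print(tflops(8704, 1.71))       # 3080 at face value, dual-issue shaders counted twice: ~29.8 TFLOPS
print(tflops(8704 // 2, 1.71))  # 3080 with the shader count halved: ~14.9 TFLOPS
```

Either way you slice it, the advertised TFLOPS aren't directly comparable to Turing's, which is the point.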

As for the die size, we have hard numbers on that. To my personal surprise, the GA102 used for the 3080 and 3090 is actually slightly smaller than the TU102 that was used for the 2080 Ti. At the same time it is almost twice as densely packed, which explains the massive performance increase.

TU102 - 754 mm², 24.7M/mm² (2080 Ti)
TU104 - 545 mm², 25.0M/mm² (2080)
GA102 - 627 mm², 44.7M/mm² (3090, 3080)
GA104 - 450 mm², 40.0M/mm² (3070)
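For what it's worth, the density figures are just transistor count divided by die area. A tiny sketch using the commonly cited transistor counts (treat them as approximate; the Ampere die areas listed above were announcement-era figures, so the results won't line up exactly with the list):

```python
# Density in millions of transistors per mm^2 = transistors / die area
chips = {
    # name: (transistors in billions, die area in mm^2) - roughly as reported
    "TU102": (18.6, 754),
    "TU104": (13.6, 545),
    "GA102": (28.3, 627),
    "GA104": (17.4, 450),
}

for name, (transistors_bn, area_mm2) in chips.items():
    density = transistors_bn * 1000 / area_mm2  # M/mm^2
    print(f"{name}: {density:.1f}M/mm^2")
```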



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

eva01beserk said:

You clearly didn't even read the slides, it seems, or even bother to watch the Digital Foundry video. They clearly said the 3080 is around 80% faster than the 2080 in like 5 different games. How on earth did you go on to say that it's then going to be twice the performance of the 2080 Ti? Nvidia claimed 2x the performance; Digital Foundry said 1.8x the performance. This is why I say the 3070 won't be as powerful as a 2080 Ti.

Yes, Nvidia FLOPS have always been better, which is why I tried to correlate performance between Nvidia's own cards, and you brought up AMD. But if you compare TFLOP to TFLOP from Turing to Ampere, performance is lost, and that's what I was arguing from the very beginning.

Yes, I saw I was wrong about the price, then I edited and corrected it. But you were wrong also, as you claimed it was $500 and are trying to pretend you always said $450. And at the same time you're not addressing that the $450 5700 XT beat the $500 RTX 2070 in performance; not only that, it got pretty close to the $700 RTX 2080. I would say, for not having hardware-based ray tracing, that is very good.

And no, Digital Foundry did not back what Nvidia said; they were 20% below, and I would argue that's too big a gap to be dismissed as margin of error.

There is no real info on the 3070's real-world performance. We have Digital Foundry comparing the 3080 to the 2080. From there I just calculated from the paper specs: on paper, the 3080 is 33.33% better than a 3070, and the comparison Digital Foundry made demonstrated the 3080 is around 80% better than a 2080. I know it's not how it should be done, but we have no other data, so I did a quick rule of three to assess that the 3070 is 20% better than a 2080. But the problem is that the 2080 Ti is at least 30% better than a 2080. That's why I'm giving it a margin of error and saying that it will at most match the 2080 Ti.

I did read the slides and watch the video; that's how I know that nothing really backs anything you've tried to say. There were times in the DF video where the performance gain jumped to around 96%; the 80% is an overall average, and 1.8x at 4K with ray tracing and everything at ultra falls into the region of twice the performance. That's how such numbers work.

Your TFLOP part again doesn't make sense, because what has been shown contradicts you. If performance is lost, how are games running much better with what you claim is a smaller jump? How is it that DF are saying the new cards allow things to be done while running at 4K with ray tracing at 60fps? That indicates quite the opposite of what you're arguing. And yes, AMD would be brought up, because you're using this argument to make their case.

The 5700 XT doesn't beat the 2070; the latter is slightly better in performance, and the former isn't close to the 2080 Ti. Where did you even get that from? The 2080 Ti is 35% better than the 5700 XT, and that's without features like ray tracing; even the 2080 is 18% better in performance, and the 2080 Super, which replaced the 2080, is 25% better. These numbers are from the benchmark comparisons below:

https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2070-vs-AMD-RX-5700-XT/4029vs4045
https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-Ti-vs-AMD-RX-5700-XT/4027vs4045
https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-vs-AMD-RX-5700-XT/4026vs4045
https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080S-Super-vs-AMD-RX-5700-XT/4050vs4045

As for the last part, I don't even know who taught you math or what kind of math you're trying to apply, because your own logic contradicts you here. You say that on paper the 3080 is 30% better and gets 80% better performance, which not only contradicts your earlier notion of less performance per FLOP, but then you go on to say that the 3070 being 20% better will only match the 2080 Ti in real performance, after calculating that a 30% paper lead turned into an 80% performance increase on the 3080? Under your own math, 20% better specs than the standard 2080 would mean roughly a 50-60% increase in performance. The 2080 Ti has a 15% lead over the standard 2080, which means under your own math the 3070 would surpass the 2080 Ti, as they've said.

https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-Ti-vs-Nvidia-RTX-2080/4027vs4026
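If I'm reading that proportional-scaling argument right, here's a quick sketch of the arithmetic; every input below is a figure quoted in the posts above or in the UserBenchmark links, so treat the output as an illustration of the argument rather than a real-world prediction.

```python
# "Under your own math": scale the paper lead by the same ratio observed on the 3080.
paper_lead_3080 = 0.30   # on-paper figure for the 3080 over the 2080, from the post above
real_lead_3080 = 0.80    # Digital Foundry's averaged result

scale = real_lead_3080 / paper_lead_3080          # ~2.67x paper-to-real multiplier

paper_lead_3070 = 0.20   # the 20% figure being argued over, applied to the 2080
est_real_lead_3070 = paper_lead_3070 * scale      # ~0.53, i.e. roughly 50-60%

lead_2080ti = 0.15       # 2080 Ti over the 2080, per the UserBenchmark comparison

print(f"Estimated 3070 lead over a 2080: {est_real_lead_3070:.0%}")
print("Surpasses the 2080 Ti?", est_real_lead_3070 > lead_2080ti)
```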

Last edited by Wyrdness - on 02 September 2020


I mean, when the price of the GPU alone is likely $300 more than the XSX/PS5 in their entirety, the power difference should come as no surprise. What I'm slowly realising, though, is how unprepared it feels for Sony/MS to go into this generation with no built-in image reconstruction features. It could be that neither wants to speak about it yet because it sounds like a power concession, but that's unlikely.



Otter said:

I mean, when the price of the GPU alone is likely $300 more than the XSX/PS5 in their entirety, the power difference should come as no surprise. What I'm slowly realising, though, is how unprepared it feels for Sony/MS to go into this generation with no built-in image reconstruction features. It could be that neither wants to speak about it yet because it sounds like a power concession, but that's unlikely.

I think that has less to do with the willingness of the console manufacturers and more with the ability of AMD. I'm sure they would've loved a feature similar to DLSS, but I doubt AMD has anything to offer on that front. Which is kind of a shame, because DLSS would be absolutely perfect for consoles, which already struggle with performance and image clarity. Of course they could've gone with Nvidia, but that would've been a real mess technologically, economically, and politically. Less so for Xbox, but still.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

vivster said:
Otter said:

I mean, when the price of the GPU alone is likely $300 more than the XSX/PS5 in their entirety, the power difference should come as no surprise. What I'm slowly realising, though, is how unprepared it feels for Sony/MS to go into this generation with no built-in image reconstruction features. It could be that neither wants to speak about it yet because it sounds like a power concession, but that's unlikely.

I think that has less to do with the willingness of the console manufacturers and more with the ability of AMD. I'm sure they would've loved a feature similar to DLSS, but I doubt AMD has anything to offer on that front. Which is kind of a shame, because DLSS would be absolutely perfect for consoles, which already struggle with performance and image clarity. Of course they could've gone with Nvidia, but that would've been a real mess technologically, economically, and politically. Less so for Xbox, but still.

Thus, the next Nintendo console will very likely run DLSS if they stick with Nvidia at this point.



Intel Core i7 8700K | 32 GB DDR4 3200 | ROG STRIX Z370-F Gaming | RTX 3090 FE | Crappy Monitor | HTC Vive Pro :3

Peh said:
vivster said:

I think that has less to do with the willingness of the console manufacturers and more with the ability of AMD. I'm sure they would've loved a feature similar to DLSS, but I doubt AMD has anything to offer on that front. Which is kind of a shame, because DLSS would be absolutely perfect for consoles, which already struggle with performance and image clarity. Of course they could've gone with Nvidia, but that would've been a real mess technologically, economically, and politically. Less so for Xbox, but still.

Thus, the next Nintendo console will very likely run DLSS if they stick with Nvidia at this point.

I could actually see myself buying one if that's the case.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

So, I'm wondering whether the 3090 is simply a "new" Titan-type GPU, or if it's meant to replace the Ti version of the x80 cards. Will a 3080 Ti show up in 6-12 months? I'm getting a new GPU this gen for sure, but if there's a 3080 Ti on the horizon, I'll hold off and get everything I need all at once when it arrives (I need more RAM, a bigger PSU, and a 4K monitor).