
Nvidia Ampere announced, 3090, 3080, 3070 coming later this year

Mummelmann said:
So, I'm wondering whether the 3090 is simply a "new" Titan type GPU, or is it meant to replace the Ti version of --80? Will a 3080 Ti show up in 6-12 months? I'm getting a new GPU this gen for sure, but if there's a 3080 Ti on the horizon, I'll hold off and get everything I need all at once when it arrives (need more RAM, bigger PSU and a 4K monitor).

It is the replacement for the Titan. I fully expect a 3080 Ti within the next 12 months that is about 5% below the 3090 but only costs around $1000.

Which is why I feel so bad for most likely splurging on the 3090 very soon.




vivster said:
Mummelmann said:
So, I'm wondering whether the 3090 is simply a "new" Titan type GPU, or is it meant to replace the Ti version of --80? Will a 3080 Ti show up in 6-12 months? I'm getting a new GPU this gen for sure, but if there's a 3080 Ti on the horizon, I'll hold off and get everything I need all at once when it arrives (need more RAM, bigger PSU and a 4K monitor).

It is the replacement for the Titan. I fully expect a 3080 Ti within the next 12 months that is about 5% below the 3090 but only costs around $1000.

Which is why I feel so bad for most likely splurging on the 3090 very soon.

Yeah, I figured this was likely; it was the case with my card (I still have my 980 Ti). I think I'll hold off then; perhaps prices on RAM and displays will drop some more as well. Corona has wreaked havoc on hardware prices here right now. In my opinion, a Ti version is the best choice: it has a lot more punch than the basic cards but costs much less than the big kahuna that sits 40-50% or more above it in price.



Kyuu said:
Consoles going AMD for graphics is feeling more and more like a big mistake, backwards compatibility be damned. Between DLSS/Tensor Core magic, brute force, and value... low-end/affordable Nvidia GPUs are looking to crush the consoles in no time. This is especially bad for Xbox.

Consoles are gonna be fine. Don't forget that with consoles it's never about raw performance, just the best possible performance/cost ratio. From an economic standpoint, going with Nvidia this gen would be a terrible idea. They have a good partnership with AMD, and AMD will at some point catch up enough with Nvidia that even for the generation after this one it will be better to go with AMD.

Let's not pretend that performance and feature-rich hardware are of any concern to anyone who buys a console.




Trumpstyle said:
Not good enough, RDNA2 will beat it.

Lol, I don't think so.



vivster said:
eva01beserk said:
I guess I'm not the only one who saw a disparity between the advertised TFLOP numbers and the advertised performance gains. Did Nvidia's cards become less efficient this gen? Does it have anything to do with TSMC abandoning them and the poorer quality of Samsung's process?

Does anybody know the size of these chips in comparison to Turing? I heard some rumors that they are going with a huge chip to get more performance. Judging by the wattage needed, it seems that might be the case.

I don't know, guys, but the rumors seem sorta accurate, and AMD seems set to give Nvidia a run for its money. While I don't think they could possibly match the 3090, AMD did claim 2x the 5700 XT, which would be around the 3080. If they price it right, now that Nvidia went first, I see a big blowout in the future.

I've looked into it a bit. The big problem here is Nvidia using misleading numbers, which are technically true but have little bearing on real-world applications.

Ampere uses a new kind of shader that can execute two instructions per clock, similar to hyperthreading in CPUs. However, it's of course less efficient in real-world applications than having two full shaders. Nvidia now proceeds to count each of those new shaders as two shaders, from which they also derive the FLOPS. This puts us in a bit of a predicament, because the shader count and FLOPS are no longer comparable to Nvidia's own previous cards.

So now you have two ways of looking at it. If you take the logical cores and theoretical FLOPS at face value, you could say the new shaders are less efficient compared to Turing, which would be true. But that kinda devalues the engineering that has actually been done. I'd rather look at it differently: just halve the claimed shader count and FLOPS and say that while they haven't increased much over Turing, they have massively increased in efficiency.
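As a rough back-of-envelope sketch of what that double counting does to the headline TFLOPS figure (the core count and boost clock below are assumptions for illustration, not figures taken from this thread):

```python
# Rough sketch: how counting each dual-issue Ampere shader as two "CUDA cores"
# doubles the headline FP32 TFLOPS figure. The core count and boost clock are
# assumptions for illustration, not figures taken from this thread.

def tflops(cores: int, boost_ghz: float) -> float:
    # theoretical FP32 TFLOPS: cores * 2 FLOPS per clock (FMA) * clock in GHz
    return cores * 2 * boost_ghz / 1000.0

marketing_cores = 8704          # assumed 3080 core count, dual-issue units counted twice
physical_units = marketing_cores // 2
boost_ghz = 1.71                # assumed boost clock

print(f"Marketing view: {tflops(marketing_cores, boost_ghz):.1f} TFLOPS")
print(f"Halved view:    {tflops(physical_units, boost_ghz):.1f} TFLOPS")
```

With those assumed numbers the "marketing" figure lands around 30 TFLOPS and the halved figure around 15 TFLOPS, which is the gap being discussed here.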

As for the die size, we have hard numbers on that. To my personal surprise, the GA102 used for the 3080 and 3090 is actually slightly smaller than the TU102 that was used for the 2080 Ti. At the same time it is almost twice as densely packed, which explains the massive performance increase.

TU102 - 754 mm², 24.7M transistors/mm² (2080 Ti)
TU104 - 545 mm², 25.0M transistors/mm² (2080)
GA102 - 627 mm², 44.7M transistors/mm² (3090, 3080)
GA104 - 450 mm², 40.0M transistors/mm² (3070)
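For what it's worth, multiplying the quoted areas by the quoted densities gives the rough implied transistor count per die; a minimal sketch, assuming the figures listed above are accurate:

```python
# Implied transistor counts from the die area and density figures quoted above.
dies = {
    "TU102 (2080 Ti)":   (754, 24.7),  # (area in mm², density in M transistors/mm²)
    "TU104 (2080)":      (545, 25.0),
    "GA102 (3090/3080)": (627, 44.7),
    "GA104 (3070)":      (450, 40.0),
}

for name, (area_mm2, density_m_per_mm2) in dies.items():
    transistors_billion = area_mm2 * density_m_per_mm2 / 1000.0  # millions -> billions
    print(f"{name}: ~{transistors_billion:.1f}B transistors")
```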

I looked into it a bit too, but I didn't think of it like hyperthreading. If so, we would need to halve it to ~15 TF for the 3080, compared to the ~10 TF of the 2080. With the roughly 80% performance gain shown, we can say Nvidia gained about 20% real-world performance per flop, or per shader (see the sketch below). Can't say I know much of their history, but it sounds like a big improvement. I can understand why they would advertise the big number, though :p

The die size is impressive. But I guess we still have to wait for AMD to compare.
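A quick sketch of that per-flop estimate, reusing the rounded figures from the posts above (~15 TF for the halved 3080 number, ~10 TF for the 2080, and the roughly 1.8x measured uplift), so treat the output as ballpark only:

```python
# Ballpark per-flop gain implied by the numbers in the posts above:
# ~15 TF (halved 3080 figure) vs ~10 TF (2080), with ~1.8x measured performance.
tf_3080_halved = 15.0
tf_2080 = 10.0
measured_uplift = 1.8            # Digital Foundry's roughly 80% average gain

raw_flop_ratio = tf_3080_halved / tf_2080           # 1.5x from raw throughput alone
per_flop_gain = measured_uplift / raw_flop_ratio    # what's left over per flop/shader

print(f"Raw FLOP ratio: {raw_flop_ratio:.2f}x")
print(f"Per-flop gain:  {per_flop_gain:.2f}x (~{(per_flop_gain - 1) * 100:.0f}%)")
```

With those rounded inputs the per-flop gain comes out to about 1.2x, i.e. the roughly 20% figure mentioned above.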




Wyrdness said:
eva01beserk said:

You clearly didn't even read the slides, it seems, or even bother to watch the Digital Foundry video. They clearly said the 3080 is around 80% faster than the 2080 in like 5 different games. How on earth did you go on to say that it's then going to be twice the performance of the 2080 Ti? Nvidia claimed 2x the performance; Digital Foundry showed 1.8x. This is why I say the 3070 won't be as powerful as a 2080 Ti.

Yes, Nvidia flops have always been better, which is why I tried to correlate performance between Nvidia's own cards, and you brought up AMD. But if you compare TFLOP to TFLOP from Turing to Ampere, performance is lost, and that's what I was arguing from the very beginning.

Yes, I saw I was wrong about the price, then I edited and corrected it. But you were wrong also, as you claimed it was $500 and are trying to pretend you always said $450. And at the same time you're not addressing that the $450 5700 XT beat the $500 RTX 2070 in performance; not only that, it got pretty close to the $700 RTX 2080. I would say that for not having hardware-based ray tracing, that is very good.

And no, Digital Foundry did not back what Nvidia said; they were 20% below, and I would argue that's too big a gap to be dismissed as margin of error.

There is no real info on the 3070's real-world performance. We have Digital Foundry comparing the 3080 to the 2080. From there I just calculated from the paper specs: on paper, the 3080 is 33.33% better than a 3070, and the comparison Digital Foundry made showed the 3080 is around 80% better than a 2080. I know it's not how it should be done, but we have no other data, so I did a quick rule of three to assess that the 3070 is about 20% better than a 2080. But the problem is that the 2080 Ti is at least 30% better than a 2080; that's why I'm giving it a margin of error and saying it will at most match the 2080 Ti.

I did read the slides and watch the video; that's how I know that nothing really backs anything you've tried to say, because there were times in the DF video where the performance jump was more like 96%. The 80% is an overall average, and 1.8x in 4K with ray tracing and everything at ultra falls into the region of twice the performance; that's how such numbers work.

Your TFLOP part again doesn't make sense, because what is shown contradicts you: if performance is lost, how are games running much better with what you claim is a smaller jump? How is it that DF are saying the new cards allow things to be done while running 4K with ray tracing at 60fps? That indicates quite the opposite of what you're arguing, and yes, AMD would be brought up, because you're using this argument to make their case.

The 5700 XT doesn't beat the 2070; the latter is slightly better at roughly the same performance level, and the former isn't close to the 2080 Ti. Where did you even get that from? The 2080 Ti is 35% better than the 5700 XT, and that's without features like ray tracing; even the 2080 is 18% better in performance, and the 2080 Super, which replaced the 2080, is 25% better. These numbers are from the benchmarks here:

https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2070-vs-AMD-RX-5700-XT/4029vs4045
https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-Ti-vs-AMD-RX-5700-XT/4027vs4045
https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-vs-AMD-RX-5700-XT/4026vs4045
https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080S-Super-vs-AMD-RX-5700-XT/4050vs4045

As for the last part, I don't even know who taught you math or what kind of math you're trying to apply, because your own logic contradicts you here. You say that on paper the 3080 is 30% better and gets 80% better performance, which not only contradicts your earlier notion of less performance per flop, but you then go on to say that the 3070 being 20% better will only match the 2080 Ti in real performance, after calculating that a 30% lead on paper translates to an 80% performance increase for the 3080? Under your own math, specs 20% better than the standard 2080 would be roughly a 50-60% increase in performance, and the 2080 Ti has a 15% increase over the standard 2080; this means, under your own math, the 3070 would surpass the 2080 Ti, as they've said.

https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-Ti-vs-Nvidia-RTX-2080/4027vs4026

I see you're trying to play some kind of gotcha game and have twisted my words in a ridiculous way. That's fine.

Let me just leave you with this real-world comparison of the 2070 Super vs the 5700 XT.

Mind you, this is the Super.




eva01beserk said:

I see you're trying to play some kind of gotcha game and have twisted my words in a ridiculous way. That's fine.

Let me just leave you with this real-world comparison of the 2070 Super vs the 5700 XT.

Mind you, this is the Super.

It isn't twisting your words; it's applying your own logic and stance to the whole argument you're presenting, and because your stance hasn't been consistent, that's why the math part looks bizarre. Either way, it's you who brought that up, so it's down to you. The video effectively just proved my point in my last post :/ I'm not sure you're watching the videos you're referencing at this point, tbh. In fact, the video reinforced it with the 2070 Super comparison: in some cases, like Control, it had a 16% performance increase over the 5700 XT in real-world applications, and when DLSS is turned on that gap increases. In fact, with DLSS even the 2060 matched the 5700 XT in that video, while the other 20-series cards beat it out.



These new cards finally mean that maxed-out 4K 60fps across the board is now possible, including ray tracing, thanks to DLSS.

I reallllllly wanna see, when the 3060 is announced, how much better it is than the new consoles relative to its price.




Wyrdness said:
eva01beserk said:

I see you're trying to play some kind of gotcha game and have twisted my words in a ridiculous way. That's fine.

Let me just leave you with this real-world comparison of the 2070 Super vs the 5700 XT.

Mind you, this is the Super.

It isn't twisting your words; it's applying your own logic and stance to the whole argument you're presenting, and because your stance hasn't been consistent, that's why the math part looks bizarre. Either way, it's you who brought that up, so it's down to you. The video effectively just proved my point in my last post :/ I'm not sure you're watching the videos you're referencing at this point, tbh. In fact, the video reinforced it with the 2070 Super comparison: in some cases, like Control, it had a 16% performance increase over the 5700 XT in real-world applications, and when DLSS is turned on that gap increases. In fact, with DLSS even the 2060 matched the 5700 XT in that video, while the other 20-series cards beat it out.

So it's not just my words you are twisting, but the videos themselves. At least now I know you are watching them; you really had to, to twist them in that manner.




ArchangelMadzz said:
These new cards finally mean that maxed-out 4K 60fps across the board is now possible, including ray tracing, thanks to DLSS.

I reallllllly wanna see, when the 3060 is announced, how much better it is than the new consoles relative to its price.

I think some are saying the 3080 can do 4K60 with RT without the need for DLSS; you can probably push 120fps with DLSS. At least, going by the Digital Foundry video and factoring in previously known 2080 performance.


