
Nvidia Ampere announced, 3090, 3080, 3070 coming later this year

Pemalite said:
eva01beserk said:

@Pemalite   
I've got a question, if you don't mind.
What would bottleneck a GPU more when talking VRAM: its total bandwidth, or the amount of VRAM?
Obviously we want faster and more, but which to prioritize?

You need to balance the two.
If you don't have enough graphics RAM, the GPU will be requesting data from system memory, and if the data isn't in system memory it will request it from the SSD or hard drive, with a corresponding hit to performance at each step. The GeForce GTX 1060 with 3GB of memory is the perfect example of this.

But if you have tons of slow memory, then you are holding back the fillrate; the GeForce GT 1030 DDR4 is the perfect example of this.

Obviously, the more bandwidth or the more memory you provide, the higher the bill of materials.
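To put rough numbers on both failure modes (the card specs are public; the framing of the snippet is just illustrative):

```python
# Two ways VRAM can bottleneck a GPU, with rough public specs.

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s = bus width in bytes x data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Too slow: GT 1030 DDR4 vs GT 1030 GDDR5 (same GPU, same 64-bit bus).
print(bandwidth_gbs(64, 2.1))  # DDR4:  ~16.8 GB/s -> starves the fillrate
print(bandwidth_gbs(64, 6.0))  # GDDR5: ~48.0 GB/s -> roughly 3x the feed

# Too small: the GTX 1060 3GB has plenty of bandwidth (192 GB/s), but once
# a game's working set exceeds 3 GB, data spills over PCIe 3.0 x16
# (~15.8 GB/s) to system RAM -- an order of magnitude slower than VRAM.
```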

eva01beserk said:

I was mainly asking because the 3070 and 3080 have the same amount but different bandwidths. So I'm not sure if the 3080 would reach some kind of limit in some cases where the 3070 would not, as it would get there anyway.

The 3070 has 8GB @ 448GB/s of bandwidth.
The 3080 has 10GB @ 760GB/s of bandwidth.

There is one thing we need to keep in mind here though... The 3080 has 47.8% more functional units than the 3070 (8704 vs. 5888 CUDA cores), so the 3070's bandwidth and memory needs are correspondingly lower.

Generally, the higher the resolution you game at, the more video memory and bandwidth you want as well, so you can keep all that data stored locally on the GPU.
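As a rough sanity check on that balance, using the launch specs (the FP32 figures are Nvidia's quoted peaks):

```python
# Bytes of memory bandwidth available per FP32 operation, per card.
cards = {
    "RTX 3070": {"tflops": 20.3, "bandwidth_gbs": 448},
    "RTX 3080": {"tflops": 29.8, "bandwidth_gbs": 760},
}

for name, c in cards.items():
    ratio = c["bandwidth_gbs"] / (c["tflops"] * 1000)  # GB/s per GFLOP/s
    print(f"{name}: {ratio:.4f} bytes/FLOP")
# RTX 3070: ~0.0221 bytes/FLOP
# RTX 3080: ~0.0255 bytes/FLOP -> the two cards are balanced similarly.
```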

Well, I got the VRAM amount of the 3080 wrong, so that blows my question out of the water.

But that helps me understand better. Since I only ever saw the amount as the marketing thing, I always thought you could just throw more at it to compensate, but they hit different bottlenecks.

Thanks anyway, and @CGI-Quality: not a PC gamer, it was just for personal knowledge.



It takes genuine talent to see greatness in yourself despite your absence of genuine talent.


In the Q&A they did on Reddit about the 3000 series, they mentioned that DLSS 2.1 added support for VR. This is actually the part that got me most excited, if it's taken advantage of and we see higher-resolution VR headsets.

This is my personal taste, so other people's may vary, but here's what I would like to see in the different categories:

E-sports - High refresh rate, resolution less important

Single-player experiences - 4K, HDR and other eye candy like ray tracing, etc., but most of the time I'm fine with 60Hz on the refresh rate

VR - Greater than 4K and high refresh rates, and I would give up some eye candy for those two things, because you're basically looking at a screen through a magnifying glass, so not noticing pixels requires a very high resolution, and because of the nature of VR, not having a minimum of 90Hz becomes an issue. Higher than 90 would be better.

If DLSS allows that high resolution with a high refresh rate while still getting some of the eye candy, that would be awesome for VR.
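Some back-of-the-envelope pixel arithmetic shows why VR is the outlier here (the per-eye resolution below is just an assumed example):

```python
# Pixels per second the GPU must shade for each target (illustrative).
targets = {
    "E-sports, 1080p @ 240Hz":  (1920 * 1080, 240),
    "Single player, 4K @ 60Hz": (3840 * 2160, 60),
    # VR renders two eyes; 2160x2160 per eye is an assumed example, and
    # real headsets render above panel resolution to offset lens warp.
    "VR, 2160x2160 x2 @ 90Hz":  (2 * 2160 * 2160, 90),
}

for name, (pixels, hz) in targets.items():
    print(f"{name}: {pixels * hz / 1e9:.2f} Gpix/s")
# ~0.50, ~0.50, and ~0.84 Gpix/s -- and "greater than 4K" per eye at
# 90+ Hz climbs far beyond anything a flat screen asks for.
```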



Cyran said:

In the Q&A they did on Reddit about the 3000 series, they mentioned that DLSS 2.1 added support for VR. This is actually the part that got me most excited, if it's taken advantage of and we see higher-resolution VR headsets.

This is my personal taste, so other people's may vary, but here's what I would like to see in the different categories:

E-sports - High refresh rate, resolution less important

Single-player experiences - 4K, HDR and other eye candy like ray tracing, etc., but most of the time I'm fine with 60Hz on the refresh rate

VR - Greater than 4K and high refresh rates, and I would give up some eye candy for those two things, because you're basically looking at a screen through a magnifying glass, so not noticing pixels requires a very high resolution, and because of the nature of VR, not having a minimum of 90Hz becomes an issue. Higher than 90 would be better.

If DLSS allows that high resolution with a high refresh rate while still getting some of the eye candy, that would be awesome for VR.

The problem is that applications have to natively support DLSS. To fix this issue Nvidia actually wanted to introduce DLSS 3, but they haven't done so yet, and considering how quiet they've been about it, I doubt we'll see it anytime soon.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

CGI-Quality said:

I'll also explain the MASSIVE jump in CUDA cores. NVIDIA was looking to greatly improve the Ampere SM (streaming multiprocessor) over Turing. This is in FP32 (single-precision floating-point format/operations). It is also where the theoretical peak (teraflop count) is measured.

One new datapath includes 16 FP32 CUDA cores capable of 16 FP32 operations per clock. The other? 16 FP32 CUDA cores and 16 INT32 cores (for 32-bit integer operations). The result of this new design: each Ampere partition can execute either 32 FP32 operations per clock, or 16 FP32 and 16 INT32 operations per clock (however you choose to split it up). Combined, the four partitions can achieve 128 single-precision floating-point operations per clock, which DOUBLES the FP32 rate of the Turing streaming multiprocessor (64 FP32 and 64 INT32 operations per clock). In less scientific terms, Ampere's SM has 128 CUDA cores vs Turing's 64. This is..... a rather big deal!

Ultimately, when you double the processing speed (and double the datapaths as a necessity of that), it helps many more things on the card.
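To picture the scheme described above, here's a toy model of one partition's per-clock issue options (my own sketch, not anything from Nvidia):

```python
# Per-clock issue for one Ampere SM partition, per the description above.
# Datapath A: 16 FP32 cores. Datapath B: 16 cores running FP32 OR INT32.
def ampere_partition(int32_ops: int) -> dict:
    b_int = min(int32_ops, 16)          # INT work occupies datapath B
    return {"fp32": 16 + (16 - b_int), "int32": b_int}

print(ampere_partition(0))    # {'fp32': 32, 'int32': 0}
print(ampere_partition(16))   # {'fp32': 16, 'int32': 16}
# x4 partitions per SM: 128 FP32/clk pure-FP, or 64 FP32 + 64 INT32/clk
# mixed -- exactly the "doubled FP32 rate vs Turing" described above.
```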

I feel it's more like a cop-out or a bad compromise. Turing got it right by having dedicated paths for INT and FP loads. Ampere is basically just a cheap way to increase FP cores without sacrificing too much die space to INT cores. That leads to less efficient cores. For example, if you take the worst-case scenario of always having loads of 64 FP32 and 64 INT32 on every SM, you'd have the exact same per-cycle performance as Turing. Basically, the only reason we see big performance improvements at all is that games generally have higher loads of FP32 than INT32 (and of course the increased clocks and SM count).
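To put numbers on that worst case (a toy model that assumes the instruction mix alone decides the split):

```python
# Effective FP32 throughput per SM per clock vs the INT32 instruction
# share, for Turing (dedicated paths) and Ampere (shared second path).
def turing_fp32(int_share: float) -> float:
    return 64.0  # 64 FP32 + 64 INT32 dedicated; FP rate is fixed

def ampere_fp32(int_share: float) -> float:
    # 64 dedicated FP32 cores plus 64 shared cores that INT displaces;
    # valid for int_share <= 0.5 (beyond that the INT path saturates).
    return 128.0 * (1.0 - int_share)

for share in (0.0, 0.25, 0.35, 0.5):
    print(f"INT share {share:.0%}: Turing {turing_fp32(share):.0f}, "
          f"Ampere {ampere_fp32(share):.0f} FP32/clk")
# At a 50% INT mix Ampere falls to Turing's 64 FP32/clk (the worst case
# above); Nvidia's Turing whitepaper cited ~36 INT ops per 100 FP ops in
# games (~26% share), so in practice Ampere sits well above that floor.
```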

I'm very interested in how they'll improve on that with Hopper.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

vivster said:
Cyran said:

In the Q&A they did on Reddit about the 3000 series, they mentioned that DLSS 2.1 added support for VR. This is actually the part that got me most excited, if it's taken advantage of and we see higher-resolution VR headsets.

This is my personal taste, so other people's may vary, but here's what I would like to see in the different categories:

E-sports - High refresh rate, resolution less important

Single-player experiences - 4K, HDR and other eye candy like ray tracing, etc., but most of the time I'm fine with 60Hz on the refresh rate

VR - Greater than 4K and high refresh rates, and I would give up some eye candy for those two things, because you're basically looking at a screen through a magnifying glass, so not noticing pixels requires a very high resolution, and because of the nature of VR, not having a minimum of 90Hz becomes an issue. Higher than 90 would be better.

If DLSS allows that high resolution with a high refresh rate while still getting some of the eye candy, that would be awesome for VR.

The problem is that applications have to natively support DLSS. To fix this issue Nvidia actually wanted to introduce DLSS 3, but they haven't done so yet, and considering how quiet they've been about it, I doubt we'll see it anytime soon.

I agree, but at least it's a path forward where at some point we could see some truly great leaps in VR, because if we're being truthful, without something like DLSS we're not going to have the GPU power to hit the ideal resolution + refresh rate targets in VR for multiple more generations of GPUs.




As a 1080 Ti owner the 3080 does tempt me, but I'll probably wait till the 4000 series before upgrading, and upgrade my CPU as well. After the lackluster 2000 series it feels nice to see a real leap again, at better prices than last time.



The 3080 is $700 in the US, and over $1,600 here in Israel :(



Norion said:
As a 1080 Ti owner the 3080 does tempt me, but I'll probably wait till the 4000 series before upgrading, and upgrade my CPU as well. After the lackluster 2000 series it feels nice to see a real leap again, at better prices than last time.

Now that you mention that CPU upgrade, it makes me doubt the 3070 even more, since it has less memory bandwidth and less total memory than the 2080 Ti. If you're not upgrading from a PC that already has PCIe 4.0 support, you're not going to get more performance than a 2080 Ti. It might be better to jump to a 2080 Ti instead, since it will probably cost about as much as a 3070. Or you can have a lot of system RAM to compensate, but I'm thinking that those aiming at the 3070 probably don't have it.

Granted, if you have the means, go for anything, but those starting with an old build are probably better off with a 2080 Ti, or you're going to be spending more than just $500 for a 3070 alone.
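For scale, here's the bandwidth gap that any spill-over to system RAM has to cross (peak theoretical figures):

```python
# Peak theoretical bandwidth of each link a texture fetch might cross.
links_gbs = {
    "RTX 3070 VRAM (GDDR6, 256-bit)":    448.0,
    "RTX 2080 Ti VRAM (GDDR6, 352-bit)": 616.0,
    "PCIe 4.0 x16":                       31.5,
    "PCIe 3.0 x16":                       15.75,
}

for name, bw in links_gbs.items():
    print(f"{name}: {bw} GB/s")
# Even PCIe 4.0 is ~14x slower than the 3070's local VRAM, so system RAM
# is a poor substitute for the 3 GB the 3070 gives up vs the 2080 Ti.
```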



It takes genuine talent to see greatness in yourself despite your absence of genuine talent.

Will have to wait for more details, but it's seeming very likely that I'll get an RTX 3060 once it's announced and comes out. It ought to be a huge upgrade from the GTX 770 I got back in 2014... That said, I'll probably wait until next summer and get a really fast SSD then, and there's a non-zero chance it would make sense to wait until roughly then before upgrading my graphics card as well. Still, I kind of doubt Nvidia is going to release anything more suitable by then, and it doesn't seem like AMD could possibly have a reasonable answer to Ampere, so AMD might also be out for this round.



Zkuq said:
Will have to wait for more details, but it's seeming very likely that I'll get an RTX 3060 once it's announced and comes out. It ought to be a huge upgrade from the GTX 770 I got back in 2014... That said, I'll probably wait until next summer and get a really fast SSD then, and there's a non-zero chance it would make sense to wait until roughly then before upgrading my graphics card as well. Still, I kind of doubt Nvidia is going to release anything more suitable by then, and it doesn't seem like AMD could possibly have a reasonable answer to Ampere, so AMD might also be out for this round.

https://mobile.twitter.com/RedGamingTech/status/1301884380747624449