
Forums - PC - Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

Personally I am expecting AMD's RDNA2 to see the smallest real-world gains compared to Nvidia's Ampere. The reason is there isn't much of a node shrink with RDNA2, just an architectural change. Meanwhile Nvidia is going to get both a node shrink + an architectural change. I wouldn't be surprised if AMD's RDNA 2 pricing is close to Ampere's as well, since they couldn't massively undercut Nvidia with RDNA 1 even without tensor cores or ray tracing, so they certainly can't do it now that they've added ray tracing.

Also, this is AMD's first attempt at ray tracing vs Nvidia's second generation of ray tracing. Hopefully their performance is good, but I have lots of doubts that it can even keep pace with the 3080.

Edit: Removed a sentence

Last edited by Jizz_Beard_thePirate - on 20 June 2020

                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

JEMC said:
Bofferbrauer2 said:

Don't be so sure about that.

If the 50% efficiency improvement of RDNA2 over Navi/RDNA holds true, then competing with the 3090 shouldn't be out of the question. Even with just 64 CUs at the clock speeds of the 5700XT, the 2080Ti would be beaten by a country mile due to the latter's low clock speeds. If NVidia doesn't clock the 3080 and 3090 at least 200 MHz higher than the 2080Ti, then Big Navi could spell major trouble for NVidia.

You mention the 50% improvement from AMD which, and I can't stress this enough, is just a target goal that they may or may not achieve; we won't know until launch. But aren't you forgetting that Ampere will also bring performance improvements? Don't you remember the rumors from earlier this year that talked about a 40% improvement?

AMD won't be the only one to improve here, and during the last gens we've seen how the newer xx70 card performed like the xx80Ti from the previous gen (970 ∼ 780Ti, 1070 ∼ 980Ti; the 2070 failed that because of the focus on ray tracing, tho the Super variant matched the performance of the 1080Ti).

If that pattern holds again this gen, then even if it ends up having the same shader/core count as yesterday's rumor claims, the 3080 will be much faster than the 2080Ti thanks to the improvements of the new architecture and the jump to 7nm, which will allow it to clock much higher.

Don't worry, I didn't forget it - in fact, I hope NVidia delivers on those improvements. But my fear was fueled more by the TGP of 350W for the 3090, which simply doesn't sound like the performance-per-watt improvement is really that big unless the clock speed also went up quite a bit.

Captain_Yuri said:

Personally I am expecting AMD's RDNA2 to see the smallest real-world gains compared to Nvidia's Ampere. The reason is there isn't much of a node shrink with RDNA2, just an architectural change which, for all we know, is just adding in ray tracing. Meanwhile Nvidia is going to get both a node shrink + an architectural change. I wouldn't be surprised if AMD's RDNA 2 pricing is close to Ampere's as well, since they couldn't massively undercut Nvidia with RDNA 1 even without tensor cores or ray tracing, so they certainly can't do it now that they've added ray tracing.

Also, this is AMD's first attempt at ray tracing vs Nvidia's second generation of ray tracing. Hopefully their performance is good, but I have lots of doubts that it can even keep pace with the 3080.

WTF? AMD has already denied that rumor and said it's an architectural redesign to make it consume much less power for the same performance. That would be like me saying that Ampere is just another rehash of Maxwell with ray tracing slightly more firmly bolted on than in Turing.

Last edited by Bofferbrauer2 - on 20 June 2020

Bofferbrauer2 said:
JEMC said:

You mention the 50% improvement from AMD which, and I can't stress this enough, is just a target goal that they may or may not achieve; we won't know until launch. But aren't you forgetting that Ampere will also bring performance improvements? Don't you remember the rumors from earlier this year that talked about a 40% improvement?

AMD won't be the only one to improve here, and during the last gens we've seen how the newer xx70 card performed like the xx80Ti from the previous gen (970 ∼ 780Ti, 1070 ∼ 980Ti; the 2070 failed that because of the focus on ray tracing, tho the Super variant matched the performance of the 1080Ti).

If that pattern holds again this gen, then even if it ends up having the same shader/core count as yesterday's rumor claims, the 3080 will be much faster than the 2080Ti thanks to the improvements of the new architecture and the jump to 7nm, which will allow it to clock much higher.

Don't worry, I didn't forget it - in fact, I hope NVidia delivers on those improvements. But my fear was fueled more by the TGP of 350W for the 3090, which simply doesn't sound like the performance-per-watt improvement is really that big unless the clock speed also went up quite a bit.

Those rumors with the 350W figure also mention the use of GDDR6X and, as has been said in this thread several times, that type of memory hasn't been announced yet. Who's going to supply them with a memory that hasn't even been designed yet?

I'd take that rumor with a grain of salt, starting with the name. All the xx90 cards were dual-GPU parts, and the -Ti suffix has brand value. While it's not impossible, I'm having a hard time believing that they'll change all that without a good reason.



Please excuse my bad English.

Former gaming PC: i5-4670k@stock (for now), 16GB RAM 1600 MHz and a GTX 1070

Current gaming PC: R5-7600, 32GB RAM 6000MT/s (CL30) and a RX 9060XT 16GB

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Bofferbrauer2 said:

Captain_Yuri said:

Personally I am expecting AMD's RDNA2 to see the smallest real-world gains compared to Nvidia's Ampere. The reason is there isn't much of a node shrink with RDNA2, just an architectural change which, for all we know, is just adding in ray tracing. Meanwhile Nvidia is going to get both a node shrink + an architectural change. I wouldn't be surprised if AMD's RDNA 2 pricing is close to Ampere's as well, since they couldn't massively undercut Nvidia with RDNA 1 even without tensor cores or ray tracing, so they certainly can't do it now that they've added ray tracing.

Also, this is AMD's first attempt at ray tracing vs Nvidia's second generation of ray tracing. Hopefully their performance is good, but I have lots of doubts that it can even keep pace with the 3080.

WTF? AMD has already denied that rumor and said it's an architectural redesign to make it consume much less power for the same performance. That would be like me saying that Ampere is just another rehash of Maxwell with ray tracing slightly more firmly bolted on than in Turing.

Alright maybe I was a bit too presumptuous with that sentence loll

Last edited by Jizz_Beard_thePirate - on 20 June 2020

                  


At least the dust won't let you see the dead animals inside.

And that's also the reason my current PC has positive airflow.




nVidia will definitely have the edge; they are a few years ahead of AMD in architectural efficiency at this point.

AMD has managed to keep pace with nVidia by adopting die shrinks far more aggressively... However, that advantage disappears with Ampere; nVidia has a ton of transistor room to play with at 7nm.




www.youtube.com/@Pemalite

Putting my plan to buy a new laptop on hold while trying to fix my current one. Already ran into the first stupid thing. There is a Lenovo Diagnostics tool you can run from USB to check your components. The .exe you have to run to create the medium won't install because it only runs ON Lenovo devices. So to create the tool to fix your broken Lenovo device, you need a working Lenovo device.

Looks like I'll have to go back to trusty Hiren's BootCD.

edit: Well, this isn't good news. It shows a hardware error on the NVMe controller.

edit: gave up on fixing it. It's way too fiddly for my gorilla hands and I'm more likely to break it than to fix it. To get to the SSD you have to remove 22 tiny screws and a dozen tiny cables just to be able to remove the mainboard. I'm currently looking for a laptop again.

Last edited by vivster - on 21 June 2020

If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

Dang, looks like I found my perfect monitor: the LG 48" CX OLED TV, 4K (120Hz over HDMI 2.1). May be a bit too big, but should be fine with a custom ultrawide (1600p?) resolution. A bit pricey though at £1500.

Last edited by hinch - on 21 June 2020

hinch said:

Dang, looks like I found my perfect monitor: the LG 48" CX OLED TV, 4K (120Hz over HDMI 2.1). May be a bit too big, but should be fine with a custom ultrawide (1600p?) resolution. A bit pricey though at £1500.

Get it and tell me how good it is for gaming. I plan to get the 65" for both of my PCs.


