hinch said:
Captain_Yuri said:

I wouldn't say a 6700 XT is aging better than a 3070 in a lot of current titles unless you have proof. Twitter drama has certainly been hammering the VRAM limitations, but you need to be sure they aren't just trying to mislead people, so we need facts to back it up.

If we look at various recent titles, the 6700 XT is in fact not aging better than the 3070/3070 Ti:

https://www.techpowerup.com/review/hogwarts-legacy-benchmark-test-performance-analysis/6.html

https://www.techpowerup.com/review/forspoken-benchmark-test-performance-analysis/5.html

https://www.techpowerup.com/review/the-callisto-protocol-benchmark-test-performance-analysis/5.html

You can see that while VRAM usage is high, the 3070 is in fact not performing worse than the 6700 XT. A lot of the drama I've seen surrounding the VRAM issue comes up when you enable ray tracing: some of these VRAM-limited GPUs don't perform as well as they should, which is true. But that doesn't mean RDNA 2 performs better in RT-enabled titles either, since RT performance on RDNA 2 is generally poor.

I do think a 7900 XT at the same price as a 4070 Ti is the better buy, though I'd wait and see whether Nvidia responds with price drops as well.

I think "aging better" isn't the right phrase here. What I'm saying is that having more VRAM is better than running out and taking huge performance penalties. With newer titles using higher-quality textures, 8GB isn't going to cut it at 1440p. It was barely scraping the minimum requirements in 2020, and the consoles have way, way more memory. We had cross-gen games like RE7 going over that on high-quality textures, and it looks like the bar is only going to get higher as newer engines and games arrive.
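To put rough numbers on it, here's a quick back-of-envelope in Python. The per-texel and mip-overhead figures are standard for BC7-compressed textures, but the texture count is purely an illustrative assumption, not taken from any real game:

```python
# Rough back-of-envelope: VRAM cost of high-quality textures.
# Assumptions (illustrative, not from any specific game):
#   - BC7-compressed color textures: 1 byte per texel
#   - a full mip chain adds roughly 1/3 on top of the base level

def texture_mib(width: int, height: int, bytes_per_texel: float = 1.0,
                mip_overhead: float = 1.0 / 3.0) -> float:
    """Approximate size of one compressed texture with mips, in MiB."""
    base = width * height * bytes_per_texel
    return base * (1.0 + mip_overhead) / (1024 ** 2)

# One 4K (4096x4096) BC7 texture with mips:
print(f"single 4K texture: {texture_mib(4096, 4096):.1f} MiB")  # ~21.3 MiB

# A hypothetical scene streaming ~250 such textures:
scene = 250 * texture_mib(4096, 4096)
print(f"~250 textures: {scene / 1024:.1f} GiB")  # ~5.2 GiB
# ...and that's before the framebuffer, geometry, shadow maps and RT
# acceleration structures claim their share of an 8 GiB card.
```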

What I'm saying is that you don't want to be close to the edge with VRAM, because once you're at the limit and go over, you have to reduce settings to keep things playable. Granted, it's not a magic bullet for performance, but having more is better than running out and dropping to single-digit FPS, or taking massive frame drops, for lack of memory.
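If anyone wants to see how close they actually are while playing, here's a minimal sketch using NVML through the nvidia-ml-py package (pynvml). Note it reports total allocation on the card, not what one game strictly "needs", and the 90% threshold is just an arbitrary number for illustration:

```python
# Minimal VRAM headroom check via NVML (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)    # values in bytes
    used_gib = mem.used / 1024**3
    total_gib = mem.total / 1024**3
    print(f"VRAM: {used_gib:.1f} / {total_gib:.1f} GiB in use")
    # 90% is an arbitrary illustrative threshold, not a hard rule:
    if mem.used / mem.total > 0.90:
        print("Close to the edge -- expect stutter or settings cuts.")
finally:
    pynvml.nvmlShutdown()
```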

Another one is the 3080 with its 10GB, which was and still is low for its performance tier, and is already problematic in a few select titles at higher quality settings and resolutions.

What we're saying is that Nvidia have skimped on VRAM for years on the lower end of the stack to push people toward the higher end. It's market segmentation, but it also looks a lot like planned obsolescence for mid-range buyers: offering as close to the bare minimum for each tier as possible, every generation.

I do agree with what you're saying about VRAM, and Nvidia has done this pretty much every generation other than Pascal. But the point I'm trying to make is that going with Radeon just because of VRAM isn't the answer either, because they have a lot of other issues that people aren't mentioning.

I'm not saying Nvidia's relatively low VRAM is okay or anything of that sort, because yes, as you said, it is effectively planned obsolescence. But moving to Radeon's camp, where the experience is bad in different ways, isn't a good option either, imo.

We also don't know what next-gen engines will do, as we have yet to see any games built on next-gen tech. Things like Sampler Feedback and DirectStorage for GPUs are supposed to significantly lower VRAM usage. There's also a chance that devs won't use them, or will use them and spend the savings on even more enhancements; we won't know until games actually ship with that tech. That's not to say we should buy VRAM-limited GPUs and pin our hopes on it, but I also wouldn't buy Radeon and pin our hopes on them providing a good experience. It's better to wait, see whether Nvidia discounts the 4080s, and see what next-gen engines do on VRAM-limited GPUs.
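For anyone curious why Sampler Feedback is expected to help, here's a toy Python model of the residency idea behind it: keep only the mip levels a frame actually sampled resident and stream the rest on demand. The texture size and the "only sampled mip 2 and below" cut-off are made-up illustrative assumptions, and this obviously isn't the actual D3D12 API:

```python
# Toy model of the residency idea behind Sampler Feedback: instead of
# keeping a texture's full mip chain resident, keep only the mips the
# renderer actually sampled last frame. Purely illustrative numbers.

def mip_sizes_mib(base: int, bytes_per_texel: float = 1.0) -> list[float]:
    """Sizes of each mip level of a square texture, largest first, in MiB."""
    sizes = []
    while base >= 1:
        sizes.append(base * base * bytes_per_texel / 1024**2)
        base //= 2
    return sizes

mips = mip_sizes_mib(4096)   # 4K texture, BC7-ish at 1 byte/texel
full = sum(mips)             # whole chain resident

# Suppose feedback says a distant object only ever sampled mip 2 and
# smaller (1024x1024 on down) -- an assumption for illustration:
resident = sum(mips[2:])

print(f"full chain resident: {full:.1f} MiB")      # ~21.3 MiB
print(f"feedback-trimmed:    {resident:.1f} MiB")  # ~1.3 MiB
print(f"savings:             {100 * (1 - resident / full):.0f}%")
```

Whether real engines bank those savings or spend them on even more texture detail is exactly the open question above.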


PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850