JRPGfan said:
Azzanation said:

Well GDDR7 will be faster RAM, which will mean you won't need as much. But I do like the sound of 16 gigs over 12 gigs.

Yeah, I understand that making up the difference through power savings would take a while, but that could also lead to the Nvidia card running cooler, meaning it will most likely have a longer life span on top of being cheaper to run. It's still a neat advantage.

Faster RAM just affects memory bandwidth... not how much of it is needed (to play a game at certain resolutions).
This is why people say don't buy the 5070... because it has too little RAM on it.

GDDR7 means the RAM can run faster and use less power, earning Nvidia some performance (from memory bandwidth) and some power savings.
However, it costs more than GDDR6 (I think?).

So it comes with a trade-off... however, it doesn't mean you need a smaller RAM pool just because it moves faster.
It just means you suffer less of a performance hit at higher resolutions, where memory bandwidth plays a bigger role.
Ironically, at 4K the margins close in AMD's favor with these cards.
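To put rough numbers on the bandwidth-vs-capacity distinction: peak bandwidth is roughly bus width (in bytes) times the per-pin data rate, and it says nothing about how many gigabytes you have. The bus widths and data rates below are illustrative assumptions, not official specs for any particular card.

```python
# Rough sketch: memory bandwidth and memory capacity are independent specs.
# The bus widths and data rates used here are illustrative assumptions only.

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s = (bus width / 8 bytes) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# A 192-bit card with fast GDDR7 vs. a 256-bit card with slower GDDR6:
gddr7_card = {"capacity_gb": 12, "bandwidth_gbs": bandwidth_gbs(192, 28.0)}  # 672 GB/s
gddr6_card = {"capacity_gb": 16, "bandwidth_gbs": bandwidth_gbs(256, 18.0)}  # 576 GB/s

print(gddr7_card)  # more bandwidth, but only 12 GB to hold textures and buffers
print(gddr6_card)  # less bandwidth, but 16 GB of headroom before anything spills
```

So the faster card can move data quicker, but if a game wants 14 GB of assets, 12 GB is still 12 GB.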

This means you misunderstood what RAM is and does, which is fair enough... that is why I'm pointing this out for you in this post.

The "it runs cooler because of this" bit is also kinda faulty logic.
The cooling has to do with the power consumption/heat generated AND how good the cooler is.

Running at lower power, you can trade that advantage for a smaller, cheaper cooler = money saved = more profits.
Which is what Nvidia chose to do.

It's the same with GDDR7: by using faster, less power-hungry RAM, they can pair the chip with a narrower memory bus while still having good enough memory bandwidth. That lets them make a smaller, cheaper chip (the die is smaller by choice = cheaper = more profits), as the rough sketch below shows.
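Here's that bus-width trade-off in numbers: a narrower bus needs proportionally faster memory to reach the same bandwidth, and every memory controller you drop shrinks the die. The target bandwidth and bus widths are illustrative assumptions, not figures from any real product.

```python
# Sketch of the bus-width trade-off: halve the bus, and you need memory that runs
# twice as fast per pin to keep the same peak bandwidth. Numbers are illustrative
# assumptions, not taken from any specific GPU.

def required_data_rate(target_bw_gbs: float, bus_width_bits: int) -> float:
    """Per-pin data rate (Gbps) needed to hit a target bandwidth on a given bus."""
    return target_bw_gbs / (bus_width_bits / 8)

# Say an older design reached ~576 GB/s on a 256-bit GDDR6 bus.
target = 576.0
for bus_bits in (256, 192, 128):
    rate = required_data_rate(target, bus_bits)
    print(f"{bus_bits:3d}-bit bus needs {rate:.1f} Gbps memory to match {target:.0f} GB/s")
# 256-bit -> 18.0 Gbps, 192-bit -> 24.0 Gbps, 128-bit -> 36.0 Gbps
# Fewer memory controllers on the die = smaller, cheaper chip, as long as
# memory chips that fast actually exist and are affordable.
```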

Everything is just design choices with trade-offs, which Nvidia usually makes for the sake of profit margins.
Nvidia could choose not to gimp their cards with skimpy VRAM pools, but they don't, because it pushes people toward buying higher-priced cards and gives them better margins on the cards they do sell. It's all by design, that way.

"most likely have a longer life span ontop of being cheaper to run. It's still a neat advantage."

Just as there are "designed to fail" concepts, where hardware is designed to last until a bit after the warranty expires so they can sell you another when it breaks,
there is also something called "planned obsolescence," where a product loses performance and becomes slower and slower as time goes on.

This is something that plagues Nvidia cards more than AMD ones.

Typically some of this is due to AMD drivers improving more over time than Nvidia ones, which launch closer to perfection (compared to the possible performance you can squeeze out of the card)... however, some also think Nvidia actively and purposefully makes changes in drivers and such that hurt older cards, to force you to upgrade.

This is why there is the expression "...AMD ages like fine wine" when it comes to GPUs.
They give you more video card RAM (which typically means future-proofing), and AMD doesn't do any of this "planned obsolescence" stuff.

When a generation or two passes by and you "re-test" (benchmark) these old cards against one another, you typically see the AMD ones performing better against the Nvidia ones than they did at launch. I.e., AMD ages better in terms of performance.

The RX 6000 Series versus the RTX 3000 Series is a good example of this. Nvidia, as usual, skimped on VRAM. The RX 6000 cards from AMD (the 6700 and 6800 series in particular) pulled away from the RTX 3000 cards in newer games due to the Nvidia cards choking on a lack of VRAM. The 3070 and 3070 Ti versus the RX 6800 is the prime example: Nvidia went with 8GB of VRAM, AMD went with 16GB. When newer titles like the RE2 Remake dropped, the 70-class cards suffered. There are some talented modders in Brazil who doubled the VRAM on Nvidia cards, and it showed that the VRAM, or lack thereof, was the biggest factor inhibiting performance. You can see this to a lesser degree on rare occasions with the RTX 3060: the original version of that card has 12GB of VRAM, and there are cases where the 4060, which has a faster chip, lags behind because it's saddled with 8GB of VRAM.
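As a purely illustrative toy model (the spill penalty is a made-up assumption, not a measured figure), this is the mechanism people point to: once a game's working set exceeds VRAM, the overflow has to stream over the PCIe bus from system RAM, and the smaller-capacity card falls off a cliff even if its chip is faster.

```python
# Toy model of VRAM spillover -- the penalty factor is a made-up assumption,
# only meant to illustrate why an 8 GB card can fall behind a slower 12/16 GB one.

def effective_fps(base_fps: float, working_set_gb: float, vram_gb: float,
                  spill_penalty_per_gb: float = 0.5) -> float:
    """Crudely degrade frame rate by how many GB overflow VRAM into system RAM."""
    spill_gb = max(0.0, working_set_gb - vram_gb)
    return base_fps / (1.0 + spill_penalty_per_gb * spill_gb)

# A hypothetical game that wants ~11 GB of textures and buffers:
for vram in (8, 12, 16):
    print(f"{vram:2d} GB VRAM -> ~{effective_fps(100, 11.0, vram):.0f} fps")
# 8 GB -> ~40 fps, 12 GB -> ~100 fps, 16 GB -> ~100 fps
```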

I think the AMD "Fine Wine" effect has more to do with games being optimized to run on Nvidia hardware, so AMD's drivers have to try to make up for that fact.

Side note: the upcoming 60-class cards (5060 and 9060) are going to be wastes of sand if they are still stuck with 8GB of VRAM.