
AMD or NVIDIA: PC Upgrade

 

AMD or NVIDIA Build?

AMD: 13 (56.52%)
NVIDIA: 10 (43.48%)
Wait: 0 (0%)
Total: 23
LegitHyperbole said:
Azzanation said:

The talk is it might be possible on the PS5 Pro but unlikely on the base PS5 and Series X. The PS5 is RDNA 1.5 and the Series X is RDNA 2.0, while this tech is built around RDNA 4.0. The Pro is RDNA 2.0X, so it really depends on what Sony has modified to allow for FSR4; however, since the Pro has its own AI upscaler, it's probably a little pointless having both, and Sony would prefer to push their own tech.

Currently nothing is confirmed, just my opinion.

Damn. Well, at least things are looking up for 10th gen. I hate to say it, but it's time for an early refresh: scrap this disaster and move on.

Next gen, the PS6 and Series 2, will likely be focused around this as a reason to upgrade.
FSR4 looks much better than PSSR imo (even if PSSR itself looked better than FSR3).



Azzanation said:
Random_Matt said:

Whoever buys an RTX 5070 over the 9070 XT is a mentally challenged fanboy. The 12GB of VRAM makes the card DOA; Hardware Unboxed showed that marvellously.

So we hear, but the advantage of the 5070 Ti is its 12GB of GDDR7, while the 9070 XT has 16GB of GDDR6. Plus the 5070 Ti uses a lot less power to achieve similar results.

The type of memory doesn't matter at all... only the size (because you can run into situations where you're limited by the amount of RAM).

The 5070 Ti does use a lot less power.
If you care about using ~80 watts less when gaming, yes, that is a valid argument.
The thing is, the $600 for the AMD card vs the ~$1,300 current price for the Nvidia card means it will take a while before the power consumption difference earns itself back.

Like, how long would you need to play games for 80 watts to save you ~$700?
Years?
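(For anyone curious, the back-of-the-envelope math bears that out. A minimal sketch in Python; the $0.15/kWh electricity price and the 4 hours of gaming per day are my own illustrative assumptions, not figures from any review:)

    # Rough payback math for the ~80 W gaming power difference.
    watts_saved = 80          # 5070 Ti draws ~80 W less while gaming
    price_gap_usd = 700       # ~$1,300 Nvidia card vs ~$600 AMD card
    price_per_kwh = 0.15      # assumed electricity price, $/kWh

    savings_per_hour = (watts_saved / 1000) * price_per_kwh  # $0.012 per gaming hour
    hours_to_break_even = price_gap_usd / savings_per_hour

    print(f"{hours_to_break_even:,.0f} hours to break even")           # ~58,333 hours
    print(f"~{hours_to_break_even / (4 * 365):.0f} years at 4 h/day")  # ~40 years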



Oh, if NVIDIA floods the market with 5070 Tis at MSRP, the 9070 XT would be less appealing. Best Buy (US), Overclockers (UK), and many more only have one MSRP card, although Overclockers have over 1,000 Pulse models. Yes, you read that right, confirmed by the manager.



If all the cards were at MSRP there might be some interesting choices. The only one that would still make no sense at all is the 5070: same price as a 9070, worse in raster, and because of its low VRAM it even loses out a lot in RT. At that price you may as well skip both those cards and go up to a 9070 XT. Both the 5070 (more like a 5060 Ti) and the 9070 should be at least $50 USD cheaper to have a real market.



JRPGfan said:
Azzanation said:

So we hear, but the advantage of the 5070 Ti is its 12GB of GDDR7, while the 9070 XT has 16GB of GDDR6. Plus the 5070 Ti uses a lot less power to achieve similar results.

The type of memory doesn't matter at all... only the size (because you can run into situations where you're limited by the amount of RAM).

The 5070 Ti does use a lot less power.
If you care about using ~80 watts less when gaming, yes, that is a valid argument.
The thing is, the $600 for the AMD card vs the ~$1,300 current price for the Nvidia card means it will take a while before the power consumption difference earns itself back.

Like, how long would you need to play games for 80 watts to save you ~$700?
Years?

Well, GDDR7 is faster RAM, which will mean you won't need as much. But I do like the sound of 16GB over 12GB.

Yeah, I understand making up the power difference would take a while, but that could also lead to the Nvidia card running cooler, meaning it will most likely have a longer lifespan on top of being cheaper to run. It's still a neat advantage.



Azzanation said:
JRPGfan said:

The type of memory doesn't matter at all... only the size (because you can run into situations where you're limited by the amount of RAM).

The 5070 Ti does use a lot less power.
If you care about using ~80 watts less when gaming, yes, that is a valid argument.
The thing is, the $600 for the AMD card vs the ~$1,300 current price for the Nvidia card means it will take a while before the power consumption difference earns itself back.

Like, how long would you need to play games for 80 watts to save you ~$700?
Years?

Well, GDDR7 is faster RAM, which will mean you won't need as much. But I do like the sound of 16GB over 12GB.

Yeah, I understand making up the power difference would take a while, but that could also lead to the Nvidia card running cooler, meaning it will most likely have a longer lifespan on top of being cheaper to run. It's still a neat advantage.

GDDR7 is faster, yet it's on a 192-bit bus, unlike the 9070 XT, which has a 256-bit bus. But indeed, in the end the 5070 somewhat edges the 9070 XT in memory bandwidth, with 672 vs 640 GB/s.
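(Those figures fall straight out of bandwidth = bus width ÷ 8 × per-pin transfer rate. A quick sanity check, assuming the commonly quoted 28 Gbps GDDR7 and 20 Gbps GDDR6 module speeds:)

    # Peak memory bandwidth in GB/s = (bus width in bits / 8) * per-pin rate in Gbps.
    def bandwidth_gb_per_s(bus_bits: int, rate_gbps: float) -> float:
        return bus_bits / 8 * rate_gbps

    print(bandwidth_gb_per_s(192, 28))  # RTX 5070, GDDR7: 672.0 GB/s
    print(bandwidth_gb_per_s(256, 20))  # 9070 XT, GDDR6:  640.0 GB/s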

As for temperatures, they are about the same on the GPU, but the 9070 XT runs quite a bit hotter on VRAM: 85°C on average, vs sub-70°C for the 5070.



Azzanation said:
JRPGfan said:

The type of memory doesn't matter at all... only the size (because you can run into situations where you're limited by the amount of RAM).

The 5070 Ti does use a lot less power.
If you care about using ~80 watts less when gaming, yes, that is a valid argument.
The thing is, the $600 for the AMD card vs the ~$1,300 current price for the Nvidia card means it will take a while before the power consumption difference earns itself back.

Like, how long would you need to play games for 80 watts to save you ~$700?
Years?

Well, GDDR7 is faster RAM, which will mean you won't need as much. But I do like the sound of 16GB over 12GB.

Yeah, I understand making up the power difference would take a while, but that could also lead to the Nvidia card running cooler, meaning it will most likely have a longer lifespan on top of being cheaper to run. It's still a neat advantage.

Faster RAM just affects memory bandwidth... not how much of it is needed (to play a game at certain resolutions).
This is why people say don't buy the 5070... because it has too little RAM on it.

GDDR7 means the RAM can run faster and use less power (earning Nvidia some performance, from memory bandwidth, and some power savings).
However, it costs more than GDDR6 (I think?).

So it comes with a trade-off... however, it doesn't mean you need a smaller RAM pool (in size) because it moves faster.
It just means you suffer less at higher resolutions, where memory bandwidth plays a bigger role.
Ironically, at 4K the margins close in AMD's favor with these cards.
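(To put rough numbers on the capacity-vs-speed point: a minimal sketch, where the 4 bytes per pixel and the count of full-screen buffers are illustrative assumptions; real games allocate far more, mostly for textures. However fast the memory is, the amount needed scales with resolution:)

    # VRAM taken by a handful of full-screen RGBA8 render targets.
    # Faster memory moves this data more quickly, but the amount
    # required stays the same.
    BYTES_PER_PIXEL = 4   # RGBA8
    NUM_BUFFERS = 8       # assumed G-buffer + post-processing targets

    for name, w, h in [("1080p", 1920, 1080), ("1440p", 2560, 1440), ("4K", 3840, 2160)]:
        mib = w * h * BYTES_PER_PIXEL * NUM_BUFFERS / 2**20
        print(f"{name}: {mib:.0f} MiB")   # 4K needs 4x the 1080p amount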

This means you misunderstood what RAM is and does, which is fair enough... that is why I'm pointing it out for you in this post.

The "it runs cooler because of this" bit is also kinda faulty logic.
Cooling depends on the power consumption/heat generated AND how good the cooler is.

Running at less power, you can trade that advantage for a smaller, cheaper cooler = money saved = more profits.
Which is what NVIDIA chose to do.

This is the same with the GDDR7: by using faster, less power-hungry RAM, they can use a smaller memory bus on their chip while still having good enough memory bandwidth, thus allowing them to make a smaller, cheaper chip (the die is smaller as a consequence of this choice = cheaper = more profits).

Everything is just design choices with trade-offs, which Nvidia usually makes for profit margins' sake.
Nvidia could choose not to gimp their cards with small VRAM pools, but doesn't, to force people to buy higher-priced cards and to earn better margins on the cards it sells. It's all by design, that way.

"most likely have a longer life span ontop of being cheaper to run. It's still a neat advantage."

Just like there are "designed to fail" concepts, where hardware is built to last until a bit after the warranty runs out so they can sell you another when it breaks, there is something called "planned obsolescence," where a product loses performance and becomes slower and slower as time goes on.

This is something that plagues Nvidia cards more than AMD ones.

Typically some of this is due to AMD drivers improving more over time than Nvidia ones, which launch closer to perfection (compared to the possible performance you can squeeze out of the card)... however, some also think Nvidia actively and purposefully makes changes in drivers and such that hurt older cards, to force you to upgrade.

This is why there is the expression "...AMD ages like fine wine" when it comes to GPUs.
They give more VRAM (which typically means future-proofing), and AMD doesn't do any of this "planned obsolescence" stuff.

When a generation or two passes by and you re-test (benchmark) these old cards against one another, you typically see the AMD ones performing better against the Nvidia ones than they did at launch. I.e., AMD ages better in terms of performance.

Last edited by JRPGfan - on 06 March 2025

Azzanation said:
Random_Matt said:

Whoever buys an RTX 5070 over the 9070 XT is a mentally challenged fanboy. The 12GB of VRAM makes the card DOA; Hardware Unboxed showed that marvellously.

So we hear, but the advantage of the 5070 Ti is its 12GB of GDDR7, while the 9070 XT has 16GB of GDDR6. Plus the 5070 Ti uses a lot less power to achieve similar results.

Remember, the 5070 Ti has 16GB of GDDR7. The 5070 (non-Ti) has 12GB of VRAM.




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

LegitHyperbole said:
JRPGfan said:

https://www.youtube.com/watch?v=ptp5suRDdQQ

Linus seems to like it.

update:
https://www.youtube.com/watch?v=nzomNQaPFSk

Digital Foundry compared it to DLSS.

Alex says it beats the normal DLSS 4 CNN model, but loses to the DLSS 4 Transformer model.
Still, it's pretty clear this is a huge step up from the older FSR3 method AMD used before.
Machine learning really helped here.


Daniel Owen looks at FSR4 vs DLSS.
https://www.youtube.com/watch?v=EZU0_ZVZtOA


Seems to come to the same conclusion... in some things FSR4 is beating DLSS... it's a massive upgrade over FSR3.

Wow. Will we get FSR4 in PS5 games? Is it possible games already released could be patched?

FSR4 is built around the RDNA 4 architecture, and the PS5 is RDNA 2 based, with the Pro being what Cerny described as somewhere in between RDNA 2 and 3. So for the PS5 to get FSR4, it would require a complete replacement of the graphics architecture, which is a practical impossibility at this point. It would be a 10th-gen addition.

Probably why Sony opted for PSSR with the Pro.



You called down the thunder, now reap the whirlwind

AMD IS BACK BABYYYYYYYYYY