
Forums - PC - Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

Pemalite said:
Jizz_Beard_thePirate said:

AMD leaves the door open to FSR 4-like support on RDNA 3

https://videocardz.com/newz/amd-leaves-the-door-open-to-fsr-4-like-support-on-rdna-3

Remember kids, if Radeon is treating RDNA 3 users like shit, they will likely treat RDNA 4 users like shit when RDNA 5 comes out

Keep in mind that only RDNA4 supports FP8 to make optimal use of FSR4.
RDNA3 can use the INT8 path, but it comes with a performance/visual quality tradeoff.
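
To make that tradeoff concrete, here is a rough numpy sketch (purely illustrative; it has nothing to do with AMD's actual FSR4 code) of why a fixed-scale INT8 path tends to cost quality compared to FP8: a few large outliers blow up the INT8 scale for the whole tensor, while FP8 keeps roughly constant relative precision per value.

```python
# Illustrative only: per-tensor INT8 quantization vs a crude FP8 (E4M3)
# emulation on a weight tensor that contains a few large outliers.
import numpy as np

def quantize_int8(x):
    """Symmetric per-tensor INT8: a single scale shared by every element."""
    scale = np.abs(x).max() / 127.0
    return np.clip(np.round(x / scale), -127, 127) * scale

def quantize_fp8_e4m3(x):
    """Crude E4M3 emulation: 3 mantissa bits, exponent clamped to E4M3's range."""
    out = np.zeros_like(x)
    nz = x != 0
    e = np.clip(np.floor(np.log2(np.abs(x[nz]))), -6, 8)
    step = 2.0 ** (e - 3)                # 3 mantissa bits = 8 steps per octave
    out[nz] = np.clip(np.round(x[nz] / step) * step, -448, 448)
    return out

rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.01, 10_000)        # mostly small weights...
w[:8] *= 100.0                           # ...plus a few outliers

for name, quant in [("INT8", quantize_int8), ("FP8-E4M3", quantize_fp8_e4m3)]:
    print(f"{name}: mean abs error {np.abs(quant(w) - w).mean():.2e}")
# The outliers inflate the INT8 scale and crush the small weights, while FP8
# keeps relative precision, which is roughly why the INT8 path costs quality.
```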

I don't think it's about "treating RDNA3 users like shit" so much as there being genuine technical differences between the architectures. It's not like AMD made any promises that RDNA 1/2/3 would get the full-blown FSR4 experience, either.

I think what AMD needs to do is just make FSR open source and let the community handle it.

Sure, it comes with trade-offs, but let the users decide whether or not the trade-offs are worth it. From the reviews of the leaked version, it's a pretty big improvement in image quality even when used via the RDNA3 INT8 path, compared to FSR 3, which is currently the only official choice.

And sure, Radeon hasn't promised anything to anyone, so the logic of "you got what you paid for" certainly applies. But personally speaking, when a company like Nvidia gives 2060 users from 2019 the option to run the latest DLSS models while Radeon doesn't want to give those that spent $1000 on a 7900XTX the time of day, even though Radeon had plenty of marketing around the so-called "AI capabilities" of RDNA 3... Yea, it really shows how each company supports their user base, and it affects the reputation of each company.

Radeon should be working towards a "we are better than Nvidia" reputation in any way possible given their single-digit market share. But if they keep acting like they have in the past year, they absolutely deserve that market share imo.




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Zkuq said:
JEMC said:

The problem with your first point is that Sci-Fi movies have been giving aliens "voices" for decades, with plenty of them not being human at all. Prototyping could be an option, but then you find this from the time they launched The Finals: 

The Finals uses AI text-to-speech because it can produce lines 'in just a matter of hours rather than months', baffles actual voice actors

https://www.pcgamer.com/the-finals-uses-ai-text-to-speech-because-it-can-produce-lines-in-just-a-matter-of-hours-rather-than-months-baffles-actual-voice-actors/

Sounds like the skepticism was well-grounded then. I had already forgotten about The Finals AI voice thing.

Anyway, I meant voice acting that wouldn't have been done at all due to costs or whatnot, not e.g. alien voices. I think for a game like ARC Raiders, that's probably not much, because probably everything that isn't text in the game world gets voice acted, but I think it's still an interesting point. For example, a lot of older games have only partial voice acting, and some characters might not have any voice acting. AI could be used for voice acting in such instances without AI actually replacing people (if it's actually AI versus no voice, which is tricky to determine). Of course it's probably a thing mostly for indie games... but indie devs don't seem too excited about using AI regardless, so it's probably not a particularly relevant point in most cases.

What you're describing is what some mods already do. Many Bethesda games have mods that are like expansions, featuring new storylines, dialogue and voices that are usually made with AI.

Although some mods manage to do it with real human voices by asking fans to do it for free instead of hiring actors.



Please excuse my bad English.

Former gaming PC: i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Current gaming PC: R5-7600, 32GB RAM 6000MT/s (CL30) and a RX 9060XT 16GB

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Jizz_Beard_thePirate said:

Sure, it comes with trade-offs, but let the users decide whether or not the trade-offs are worth it. From the reviews of the leaked version, it's a pretty big improvement in image quality even when used via the RDNA3 INT8 path, compared to FSR 3, which is currently the only official choice.

And sure, Radeon hasn't promised anything to anyone, so the logic of "you got what you paid for" certainly applies. But personally speaking, when a company like Nvidia gives 2060 users from 2019 the option to run the latest DLSS models while Radeon doesn't want to give those that spent $1000 on a 7900XTX the time of day, even though Radeon had plenty of marketing around the so-called "AI capabilities" of RDNA 3... Yea, it really shows how each company supports their user base, and it affects the reputation of each company.

Radeon should be working towards a "we are better than Nvidia" reputation in any way possible given their single-digit market share. But if they keep acting like they have in the past year, they absolutely deserve that market share imo.

It does come with tangible benefits. But I would argue that you could just use XeSS anyway.

The difference with the 2060 is that... nVidia included tensor cores, dedicated hardware units for inference tasks, something AMD never did with GCN or RDNA 1/2/3 hardware.
Anything you run on GCN/RDNA 1-2-3 takes resources away from something else.

Still, they could run things on the FP16 hardware or the INT hardware, but if supporting that takes development resources away from RDNA4 or newer... Then they are just better off making it open source and letting the community manage it.
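
As a toy illustration of that contention point (every number below is invented; it is just the shape of the argument, not a model of any real GPU): with dedicated inference units the upscaler can overlap the rest of the frame, while on shared vector units its cost comes straight out of the frame budget.

```python
# Toy model with made-up numbers: where an ML upscaler's cost lands depending
# on whether inference runs on dedicated units or on the shared shader ALUs.
def frame_time_ms(shading_ms: float, upscale_ms: float, dedicated: bool) -> float:
    if dedicated:
        # Tensor-core style: inference can overlap shading, so the longer of
        # the two dominates (ignoring sync points, for simplicity).
        return max(shading_ms, upscale_ms)
    # Shared CUs: the same ALUs/schedulers/cache serve both workloads,
    # so the costs roughly add up.
    return shading_ms + upscale_ms

for label, dedicated in [("dedicated units", True), ("shared vector units", False)]:
    print(f"{label}: {frame_time_ms(8.0, 1.5, dedicated):.1f} ms/frame")
```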

RDNA4 though has been terrific, one of the best upgrades I have done in recent years from a price/performance/power perspective relative to the competition... And I don't even use the A.I upscaling junk. - The RDNA2 card I had prior was getting long in the tooth, but it had terrific support for the duration of its life.




www.youtube.com/@Pemalite

JEMC said:
Zkuq said:

Sounds like the skepticism was well-grounded then. I had already forgotten about The Finals AI voice thing.

Anyway, I meant voice acting that wouldn't have been done at all due to costs or whatnot, not e.g. alien voices. I think for a game like ARC Raiders, that's probably not much, because probably everything that isn't text in the game world gets voice acted, but I think it's still an interesting point. For example, a lot of older games have only partial voice acting, and some characters might not have any voice acting. AI could be used for voice acting in such instances without AI actually replacing people (if it's actually AI versus no voice, which is tricky to determine). Of course it's probably a thing mostly for indie games... but indie devs don't seem too excited about using AI regardless, so it's probably not a particularly relevant point in most cases.

What you're describing is what some mods already do. Many Bethesda games have mods that are like expansions, featuring new storylines, dialogue and voices that are usually made with AI.

Although some mods manage to do it with real human voices by asking fans to do it for free instead of hiring actors.

Unsurprising, although I don't generally use a lot of mods anymore. I've seen some full-AI dialogue mods, but I hadn't heard of mods voicing human-written lines with AI - still, like I said, not a huge surprise... Anyway, thanks for sharing!



Pemalite said:
Jizz_Beard_thePirate said:

Sure, it comes with trade-offs, but let the users decide whether or not the trade-offs are worth it. From the reviews of the leaked version, it's a pretty big improvement in image quality even when used via the RDNA3 INT8 path, compared to FSR 3, which is currently the only official choice.

And sure, Radeon hasn't promised anything to anyone, so the logic of "you got what you paid for" certainly applies. But personally speaking, when a company like Nvidia gives 2060 users from 2019 the option to run the latest DLSS models while Radeon doesn't want to give those that spent $1000 on a 7900XTX the time of day, even though Radeon had plenty of marketing around the so-called "AI capabilities" of RDNA 3... Yea, it really shows how each company supports their user base, and it affects the reputation of each company.

Radeon should be working towards a "we are better than Nvidia" reputation in any way possible given their single-digit market share. But if they keep acting like they have in the past year, they absolutely deserve that market share imo.

It does come with tangible benefits. But I would argue that you could just use XeSS anyway.

The difference with the 2060 is that... nVidia included tensor cores, dedicated hardware units for inference tasks, something AMD never did with GCN or RDNA 1/2/3 hardware.
Anything you run on GCN/RDNA 1-2-3 takes resources away from something else.

Still, they could run things on the FP16 hardware or the INT hardware, but if supporting that takes development resources away from RDNA4 or newer... Then they are just better off making it open source and letting the community manage it.

RDNA4 though has been terrific, one of the best upgrades I have done in recent years from a price/performance/power perspective relative to the competition... And I don't even use the A.I upscaling junk. - The RDNA2 card I had prior was getting long in the tooth, but it had terrific support for the duration of its life.

Yea, you could use XeSS, but why should that be the only option, or even an option in the first place, when we know FSR4 works? Did RDNA 3 users pay Intel when they spent $1000 on a 7900XTX, or $500 on a 7800XT? No. And in fact, if anything, XeSS shows that if Radeon actually tried, they could come up with a solution that works on RDNA 3 and even RDNA 2.

And yes, I understand that Turing had Tensor cores, but the point is that both Nvidia with Turing and Radeon with RDNA 3 made a big spiel about AI. The difference is that Nvidia actually showed Turing users that its AI spiel wasn't for nothing. And yes, it takes resources away from other things, but again, we know exactly how well it works: every review that tested FSR4 on RDNA 3 has said it's well worth the performance hit vs FSR3, because while FSR4 won't give you as many FPS as FSR3, it will still give you more performance than native and genuinely great image quality on top.

They can make multiple versions of FSR4: one that runs on INT8 and another that runs on FP8. If Intel can make multiple versions of XeSS, with one running on dp4a for everyone else and one on XMX for Intel's own hardware, then surely Radeon can make multiple versions of FSR4 for its own hardware, right? Surely it can't be too much to ask for Radeon to support its own products like Nvidia and Intel do?
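
The dispatch side of "multiple versions" is genuinely simple. Here is a hypothetical sketch of XeSS-style backend selection (every name below is invented for illustration; this is not AMD's or Intel's actual API):

```python
# Hypothetical capability-based backend selection, in the spirit of XeSS
# shipping both an XMX path and a dp4a fallback. All names are made up.
from dataclasses import dataclass

@dataclass
class GpuCaps:
    has_fp8: bool    # e.g. RDNA 4
    has_wmma: bool   # e.g. RDNA 3's in-CU matrix accelerators
    has_dp4a: bool   # packed INT8 dot product, very widely supported

def pick_backend(caps: GpuCaps) -> str:
    """Choose the best upscaler kernel the hardware can actually run."""
    if caps.has_fp8:
        return "fsr4-fp8"    # full quality, full speed
    if caps.has_wmma:
        return "fsr4-int8"   # same model, INT8 weights, some perf/IQ tradeoff
    if caps.has_dp4a:
        return "fsr4-dp4a"   # slower fallback, still an ML path
    return "fsr3-spatial"    # hand-tuned non-ML upscaler as a last resort

# e.g. an RDNA 3 card: no FP8, but WMMA and dp4a are available
print(pick_backend(GpuCaps(has_fp8=False, has_wmma=True, has_dp4a=True)))
```

The hard part is maintaining, tuning and QA-ing the extra kernels per game and per architecture, which is presumably where AMD's reluctance comes from.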

And when it comes to RDNA 2 getting terrific support... Like with what? Driver updates, which are the most basic thing any manufacturer can do these days? A basic upscaler that is disliked by every reviewer? A frame-gen feature with frame pacing issues that still haven't been fixed after 2+ years? And yea, RDNA 4 is getting great support now since it's their latest and greatest, but are they gonna treat RDNA 4 like RDNA 3 when RDNA 5 comes out?




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

JEMC said:
Zkuq said:

Sounds like the skepticism was well-grounded then. I had already forgotten about The Finals AI voice thing.

Anyway, I meant voice acting that wouldn't have been done at all due to costs or whatnot, not e.g. alien voices. I think for a game like ARC Raiders, that's probably not much, because probably everything that isn't text in the game world gets voice acted, but I think it's still an interesting point. For example, a lot of older games have only partial voice acting, and some characters might not have any voice acting. AI could be used for voice acting in such instances without AI actually replacing people (if it's actually AI versus no voice, which is tricky to determine). Of course it's probably a thing mostly for indie games... but indie devs don't seem too excited about using AI regardless, so it's probably not a particularly relevant point in most cases.

What you're describing is what some mods already do. Many Bethesda games have mods that are like expansions, featuring new storylines, dialogue and voices that are usually made with AI.

Although some mods manage to do it with real human voices by asking fans to do it for free instead of hiring actors.

Voice actors will license their voice libraries, or record a custom library for the project they are hired for - given that a lot of dialogue will be generated instead of scripted, this is the only way to have fully voiced games.



Pemalite said:
Jizz_Beard_thePirate said:

Sure, it comes with trade-offs, but let the users decide whether or not the trade-offs are worth it. From the reviews of the leaked version, it's a pretty big improvement in image quality even when used via the RDNA3 INT8 path, compared to FSR 3, which is currently the only official choice.

And sure, Radeon hasn't promised anything to anyone, so the logic of "you got what you paid for" certainly applies. But personally speaking, when a company like Nvidia gives 2060 users from 2019 the option to run the latest DLSS models while Radeon doesn't want to give those that spent $1000 on a 7900XTX the time of day, even though Radeon had plenty of marketing around the so-called "AI capabilities" of RDNA 3... Yea, it really shows how each company supports their user base, and it affects the reputation of each company.

Radeon should be working towards a "we are better than Nvidia" reputation in any way possible given their single-digit market share. But if they keep acting like they have in the past year, they absolutely deserve that market share imo.


The difference with the 2060 is that... nVidia included tensor cores, dedicated hardware units for inference tasks, something AMD never did with GCN or RDNA 1/2/3 hardware.
Anything you run on GCN/RDNA 1-2-3 takes resources away from something else.

A little correction: while it's true RDNA 2 did not have anything like Tensor cores, RDNA 3 did add Matrix cores, which are AMD's version of Tensor cores. The tensor cores in Nvidia's 20 and 30 series don't support FP8 either, yet Nvidia still gives those users the option to use DLSS 4.5. There is a trade-off, about a 30% performance drop, since they lack native FP8 support. There's no reason AMD couldn't have given RDNA 3 users the option of that trade-off like Nvidia did, as Nvidia's 20/30 series was in the same position hardware-wise as RDNA 3.



Cyran said:
Pemalite said:


The difference with the 2060 is that... nVidia included tensor cores, dedicated hardware units for inference tasks, something AMD never did with GCN or RDNA 1/2/3 hardware.
Anything you run on GCN/RDNA 1-2-3 takes resources away from something else.

A little correction: while it's true RDNA 2 did not have anything like Tensor cores, RDNA 3 did add Matrix cores, which are AMD's version of Tensor cores. The tensor cores in Nvidia's 20 and 30 series don't support FP8 either, yet Nvidia still gives those users the option to use DLSS 4.5. There is a trade-off, about a 30% performance drop, since they lack native FP8 support. There's no reason AMD couldn't have given RDNA 3 users the option of that trade-off like Nvidia did, as Nvidia's 20/30 series was in the same position hardware-wise as RDNA 3.

I will have to correct your correction.

RDNA 3 did not have dedicated Matrix "cores" like CDNA in AMD's Instinct line; AMD included Matrix Accelerators within its Compute Units that work alongside the vector units, essentially borrowing the resources that would otherwise be used by the FP16 units.
There is still the issue of contention within the CUs, as the schedulers and cache are needed for other tasks; it's a shared-resource approach.
However, it still doesn't support FP8... And that's the big issue.

RDNA 2 could also do matrix multiplications in its vector units in order to support machine learning.

...And that is partly the issue here: RDNA1 couldn't do it at all, and RDNA2, RDNA3 and RDNA4 all handle matrix work differently. That's AMD's own fault, but RDNA4 has at least aligned itself with the general industry trend.
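
For a sense of what those in-CU accelerators actually compute: RDNA 3 exposes wave-level matrix instructions (the published RDNA 3 ISA lists WMMA ops such as V_WMMA_F32_16X16X16_F16, plus INT8 variants) that multiply small FP16 tiles with FP32 accumulation. A numpy stand-in for one such tile step, purely as a sketch:

```python
# Sketch: the math performed by one WMMA-style tile operation, FP16 inputs
# with FP32 accumulation, emulated in numpy (not real ISA code).
import numpy as np

def wmma_f32_16x16x16_f16(a, b, c):
    """One tile step: C += A @ B, with A/B in FP16 and C accumulated in FP32."""
    assert a.shape == b.shape == c.shape == (16, 16)
    return c + a.astype(np.float32) @ b.astype(np.float32)

rng = np.random.default_rng(1)
a = rng.standard_normal((16, 16)).astype(np.float16)
b = rng.standard_normal((16, 16)).astype(np.float16)
c = wmma_f32_16x16x16_f16(a, b, np.zeros((16, 16), dtype=np.float32))
print(c.dtype, c.shape)  # float32 (16, 16)
```

Since that instruction family includes INT8 variants, an INT8 FSR4 path on RDNA 3 is at least plausible at the ISA level; FP8 inputs are what is genuinely missing.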

And I agree that AMD could support FSR4 on older Radeon architectures by bifurcating the upscaler. But if we are talking about what's best for the consumer, the best approach would be for AMD to make FSR4 open source and let the community manage it, so AMD can focus all its resources on FSR5.

In the end though, I have never bought a GPU with the idea that I am going to get some "new" feature years down the line; it's simply unrealistic. You judge the hardware on how it presents at release and at the time of purchase.

I.E. when S3 promised to provide TnL on its Savage and never did, as the TnL unit was buggy.




www.youtube.com/@Pemalite

Pemalite said:
Cyran said:

A little correction: while it's true RDNA 2 did not have anything like Tensor cores, RDNA 3 did add Matrix cores, which are AMD's version of Tensor cores. The tensor cores in Nvidia's 20 and 30 series don't support FP8 either, yet Nvidia still gives those users the option to use DLSS 4.5. There is a trade-off, about a 30% performance drop, since they lack native FP8 support. There's no reason AMD couldn't have given RDNA 3 users the option of that trade-off like Nvidia did, as Nvidia's 20/30 series was in the same position hardware-wise as RDNA 3.

I will have to correct your correction.

RDNA 3 did not have dedicated Matrix "cores" like CDNA in AMD's Instinct line; AMD included Matrix Accelerators within its Compute Units that work alongside the vector units, essentially borrowing the resources that would otherwise be used by the FP16 units.
There is still the issue of contention within the CUs, as the schedulers and cache are needed for other tasks; it's a shared-resource approach.
However, it still doesn't support FP8... And that's the big issue.

RDNA 2 could also do matrix multiplications in its vector units in order to support machine learning.

...And that is partly the issue here: RDNA1 couldn't do it at all, and RDNA2, RDNA3 and RDNA4 all handle matrix work differently. That's AMD's own fault, but RDNA4 has at least aligned itself with the general industry trend.

And I agree that AMD could support FSR4 on older Radeon architectures by bifurcating the upscaler. But if we are talking about what's best for the consumer, the best approach would be for AMD to make FSR4 open source and let the community manage it, so AMD can focus all its resources on FSR5.

In the end though, I have never bought a GPU with the idea that I am going to get some "new" feature years down the line; it's simply unrealistic. You judge the hardware on how it presents at release and at the time of purchase.

I.E. when S3 promised to provide TnL on its Savage and never did, as the TnL unit was buggy.

Fair enough, they added AI hardware acceleration within the CU instead of separate cores like they did in CDNA 3 and CDNA/RDNA 4. Personally, having some kind of long-term support matters to me, as I tend to keep a GPU for a number of years.

Consider AMD's recent history with GPUs, and the fact that RDNA 5 is going to be a major architecture change according to many rumors. Add in that I'm guessing integrated GPUs are going to skip RDNA 4 and go straight to RDNA 5 - a guess, but based on the roadmaps I've seen I think it's a good one, and it's not like they haven't skipped generations on their integrated GPUs before. On top of that, the PS6 and the next Xbox will be RDNA 5. I personally would be very worried that as soon as RDNA 5 is out, AMD is going to basically abandon RDNA 4 and earlier when it comes to driver optimizations and new features.

Shit happens, like the S3 example, but that's the difference between not being able to support TnL and choosing not to. I'm fine with an architecture shift keeping past GPUs from getting some new feature once in a while, but not every generation.

I'm hoping RDNA 5 will be AMD's GPU Zen moment, which will push them to support RDNA 5 and future GPUs more like Nvidia does. On paper, from all the rumors, RDNA 5 is looking very good, but personally I'm not taking a chance on an AMD GPU till RDNA 5 is out. Depending on how they handle RDNA 5, I would be willing to go AMD over Nvidia.

I feel the same way about Intel with CPUs: I'm not touching anything from them till Nova Lake (desktop). Depending on how they handle Nova Lake, they could get me back.



Cyran said:

Fair enough, they added AI hardware acceleration within the CU instead of separate cores like they did in CDNA 3 and CDNA/RDNA 4. Personally, having some kind of long-term support matters to me, as I tend to keep a GPU for a number of years.

The majority of PC gamers do tend to hold onto their hardware for a number of years these days, as there is very little incentive to upgrade as often.

Go back to the early Geforce/Radeon days and we would go from a TnL pipeline one year, to SM1.1 shaders the next, to SM1.4 shaders the year after, and then onto SM2.0 shader hardware the year after that, with an easy doubling of performance year on year... So there was incentive to upgrade on a semi-regular basis.

I went from the Radeon RX 580 in 2017 to the Radeon RX 6600XT in 2021 which kept me going until I got the Radeon RX 9060XT 16GB in 2025.
So I am churning over a mid-range GPU every 4 years instead of a high-end GPU every year now... And I am doubling performance with each upgrade.

But I was under no illusion that new features would be back-ported onto older hardware.

Again, I am not against new features being rolled out to older hardware, but as someone who owns *new* hardware, I want that to be the focus; my dollar is just as important as yours.

And the perfect solution to all this drama is to simply open source FSR. Then it can be applied to all platforms (Steam OS, Linux, Android, etc.) and not be tied to AMD's driver platform.

The argument shouldn't be about what AMD should or shouldn't do for older/newer hardware, it should be what is best for the consumer.

Cyran said:

Consider AMD's recent history with GPUs, and the fact that RDNA 5 is going to be a major architecture change according to many rumors. Add in that I'm guessing integrated GPUs are going to skip RDNA 4 and go straight to RDNA 5 - a guess, but based on the roadmaps I've seen I think it's a good one, and it's not like they haven't skipped generations on their integrated GPUs before. On top of that, the PS6 and the next Xbox will be RDNA 5. I personally would be very worried that as soon as RDNA 5 is out, AMD is going to basically abandon RDNA 4 and earlier when it comes to driver optimizations and new features.

AMD has had to rapidly catch up to nVidia, who has been pushing CUDA for almost 20 years now, so they were caught off-guard. AMD has also typically not been as proficient at "side features", where nVidia has been happy to invest silicon and chase maximum performance.
I.E. Tessellation, Ray Tracing, Machine Learning, etc.

As for what is coming in the future...

RDNA stops at RDNA4. There will be no RDNA5.

AMD has already revealed they are rolling CDNA and RDNA into a new "Unified" architecture dubbed UDNA.
https://www.tomshardware.com/pc-components/cpus/amd-announces-unified-udna-gpu-architecture-bringing-rdna-and-cdna-together-to-take-on-nvidias-cuda-ecosystem

So yes, there will be a shift in architecture once again. And I am okay with that; I have the features and performance I wanted, expected, and paid for out of RDNA4.

Consoles are often a blend of new/old architectures and features and thus aren't representative of the PC's release cadence... So speculation on the Xbox Series X2 and Playstation 6 is redundant until we get a proper idea of which side of the technology fence they fall on.

Cyran said:

Shit happens, like the S3 example, but that's the difference between not being able to support TnL and choosing not to. I'm fine with an architecture shift keeping past GPUs from getting some new feature once in a while, but not every generation.

You are missing the point. 
The difference here is that S3 promised it and didn't deliver it.

Or... the other example is when nVidia had 32-bit PhysX support and took it away, only to bring it back after significant backlash.

The difference here is that FSR4 was never promised for RDNA 1/2/3 hardware.
...Demanding we receive FSR4 on older RDNA hardware is just entitlement.

Cyran said:

I'm hoping RDNA 5 will be AMD's GPU Zen moment, which will push them to support RDNA 5 and future GPUs more like Nvidia does. On paper, from all the rumors, RDNA 5 is looking very good, but personally I'm not taking a chance on an AMD GPU till RDNA 5 is out. Depending on how they handle RDNA 5, I would be willing to go AMD over Nvidia.

I feel the same way about Intel with CPUs: I'm not touching anything from them till Nova Lake (desktop). Depending on how they handle Nova Lake, they could get me back.

RDNA5 isn't happening... any rumors about RDNA5 would thus be false.

RDNA4 is arguably AMD's "Zen" moment as it brings AMD's hardware feature set up to parity with nVidia.

People forget that the first generation Zen was still behind Intel, but it offered a "good enough" experience at the right price... And that was the key. Price.
It wasn't until the Ryzen 3000 series that AMD actually started to gain credibility back in the CPU space.

AMD has, for all intents and purposes, abandoned the high-end enthusiast GPU market for the foreseeable future.

Intel's integrated graphics are actually set to be ahead of AMD's mobile RDNA2/3 parts this coming refresh, so that should make things interesting, especially in the handheld space given how good XeSS actually is.

www.youtube.com/@Pemalite