haxxiy said:
I suppose that's what AMD gets for not prioritizing tensor cores and deep learning the way Nvidia has for almost 10 years now. It's not just consumer GPUs, mind — their entire HPC line made a wrong bet on FP64 and is now pivoting hard to minifloats too. I'd keep a close watch on neural rendering features now; depending on how many TOPS they need and which format they use, this could obsolete older GPUs much faster than you'd expect from raster performance alone.
I feel like this is more of a management/leadership issue than a hardware issue. We know FSR4 can run on RDNA 3 using INT8 with great success. If unpaid modders can get this going with OptiScaler, then surely a multi-billion dollar company that Radeon fans are giving their money to can do the same. Especially when you consider that OptiScaler is only able to do this thanks to an FSR4 build being leaked by someone at Radeon — which means they were already working on it, or at least had a build of some kind, but chose not to support RDNA 3.
It is also pretty nuts that, two years on, they still haven't fixed the frame pacing issues with FSR frame generation that HUB's video showed. The only workaround is to set a frame rate limiter lol. It doesn't matter if FSR FG's visual quality is similar to DLSS FG's if the feature is unusable without locking to a fixed frame rate, in an era where every monitor has VRR.
Personally, it feels like if you're a tech person who keeps up with the news, you'd need some crazy level of mental gymnastics to pick Radeon for your next GPU purchase, unless Nvidia goofs up royally like with the 5070 or there's a huge price gap. Because at this point, what do you gain by going red team other than saving a buck? Just better Linux support?
PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850
