Pemalite said:
Captain_Yuri said:

AMD does innovate and they have innovated in the past, no doubt. Back in the day, I bought Radeon products like the 4870 and 6870 because Radeon innovated. I still buy Ryzen products because AMD continues to innovate in the CPU landscape. But the issue is that, as a whole, most reviewers out there would largely agree that the recent innovations Nvidia has brought to the table far outdo anything Radeon has recently brought. In my view that is simply the reality, because of how far ahead Nvidia is in its feature set; Nvidia and Radeon are not equal to each other.

I don't disagree that nVidia has some solid features, but let's be real here... It's not like games look ugly or are unplayable on Radeon... They are just icing on top of the cake.

They aren't unplayable or ugly; games just largely look the way they do on consoles, since the PS5/Series X can crank out close-to-ultra raster settings and have the same upscaling tech. With Nvidia, you simply get better visuals if you have a good enough RT card, significantly better upscaling with DLSS, and significantly lower latency with Reflex.

Pemalite said:
Captain_Yuri said:

If I am buying a GPU this generation, I am not going to look at what Radeon did 10 years ago. I am going to look at what Nvidia and Radeon are doing right now.

The point I was trying to convey is that everyone in the industry innovates; nVidia just innovates far more rapidly because they have the R&D budget... And it locks people into their ecosystem. It comes down to money in the end.

For example, let's take G-Sync: amazing technology, but you *HAD* to use an nVidia graphics card and you *HAD* to use an nVidia G-Sync display, which meant all future GPU purchases for that system were likely to be Geforce to retain use of the variable refresh baked into your monitor.

AMD, however, invented Freesync, which went open source... which then got adopted by consoles, phones, tablets, and pretty much every display today without needing proprietary hardware. That was a technology far more impactful and positive for the entire industry than nVidia's G-Sync.

nVidia will also use its proprietary technologies to hamper its competitors as well... For example, HairWorks leveraged nVidia's PolyMorph engines to the nth degree: in games like The Witcher 3 it pushed 64x tessellation factors, which absolutely crippled AMD hardware (until AMD introduced primitive discard with Polaris) despite there being no visual difference over 16x factors. It made nVidia look good and AMD look bad.
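To put rough numbers on why that setting hurt so much: tessellated triangle output grows roughly with the square of the tessellation factor, so 64x versus 16x is on the order of 16 times the geometry for hair that already looked identical. The little C++ sketch below is only that back-of-the-envelope arithmetic; the "roughly factor squared" scaling is a simplifying assumption for illustration, not an exact model of the hardware tessellator or HairWorks' isoline setup.

```cpp
// Back-of-the-envelope arithmetic only. Assumption: triangle output scales
// roughly with the square of the tessellation factor; the real tessellator
// and HairWorks' isoline domain differ in detail.
#include <cstdio>

int main() {
    const double hairworksFactor  = 64.0; // factor pushed in The Witcher 3
    const double sufficientFactor = 16.0; // visually indistinguishable cap

    const double hairworksTris  = hairworksFactor * hairworksFactor;
    const double sufficientTris = sufficientFactor * sufficientFactor;

    std::printf("~%.0f vs ~%.0f triangles per patch: ~%.0fx the geometry for no visual gain\n",
                hairworksTris, sufficientTris, hairworksTris / sufficientTris);
    return 0;
}
```

Which is also why simply capping the factor in the driver's tessellation override, as AMD later exposed, recovered so much performance at no visible cost.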

Yes DLSS is better than FSR. There is no disputing that.

But that doesn't make FSR useless. It definitely has a place, and it will benefit integrated graphics, with their limited fillrates, more than higher-end parts. Plus it's not locked down to a certain piece of hardware or even platform; it's part of GPUOpen, so 10 years from now it will still be able to receive community support and improvements. What will the status of DLSS be?


Also keep in mind that FSR does not use AI to improve its visuals; they aren't even comparable.

AMD does open source because they have to, given their low market share. If they locked FSR or Freesync to Radeon hardware, almost no one would implement them without getting heaps of money. And yes, Nvidia does implement locked features or pay devs to utilize their GPUs to get a leg up on the competition, but so does AMD to a degree, such as with Far Cry 6 and its copious VRAM requirements even though it looks like a last-gen title.

FSR has its place, but it's not what I'd consider a selling point the way DLSS is, which is the main issue. People looking to buy a new GPU are going to watch the reviews and see that everyone says DLSS is significantly better than FSR; that's not going to sway them to buy Radeon. Especially those with 1440p or lower-resolution monitors because, again, reviews say it doesn't look very good compared to native/DLSS.

And it doesn't matter what happens 10 years from now. The PS6 and Xbox Series X2 will come out, Nvidia will invent new features that lock people into their platform, and it's a rinse-and-repeat cycle. That is, unless Radeon or someone else does something. And Intel just may be that company. They already came out with XeSS, which is also an AI-based upscaler, they have RT performance similar to Nvidia's, and so on. That's the issue with Radeon: they could fall behind even Intel if they don't start catching up (assuming Intel even sticks around).

Pemalite said:
Captain_Yuri said:

Reviewers say that Nvidia has a significant ray tracing performance lead. Reviewers say that DLSS is significantly better than FSR. Reflex is something that Radeon does not have an answer to. The raster performance is similar when comparing the same performance tier.

nVidia has better ray tracing performance, no one is arguing otherwise... provided you don't run out of VRAM, that is. Same goes for raster... nVidia does fine until it runs out of VRAM.

To get the most out of Reflex you need a compatible Reflex monitor and a Reflex mouse.
...And then you need a Reflex-compatible game.
https://www.nvidia.com/en-au/geforce/technologies/reflex/supported-products/

Not exactly a big list to even call it a "vital" feature yet.

I think it would be better to wait for an industry-adopted solution; otherwise users will suffer another G-Sync scenario.

Yea, VRAM has always been an issue with Nvidia; no changing that until you pay the big bucks.

Actually, that's not how Reflex works at all. Reflex works with any Nvidia GPU going back to the GTX 900 series, you can enable it in any game that has Reflex, and it does not require any external hardware like a monitor or mouse to use. All the monitor/mouse does is allow you to measure system latency if you want those stats; they are not required to enable Reflex or use it to its full potential.

I know this because I use Reflex in MW2 and Overwatch, I do not have a Reflex-enabled monitor or mouse, and it's easy to notice the difference. You can do it too if you play Reflex-enabled games and see for yourself. And anyone that uses Reflex will absolutely call it a vital feature.
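For anyone curious why no special hardware is involved: in a Reflex-enabled game, the feature is essentially the engine calling NVIDIA's NVAPI low-latency path every frame, so the monitor and mouse never enter the picture. Below is a rough sketch of what that looks like; the struct fields and exact parameters are from memory of the public NVAPI headers, so treat them as assumptions and check nvapi.h / the Reflex SDK docs rather than copying this verbatim.

```cpp
// Rough sketch of how a game turns Reflex on via NVAPI (assumption: exact
// struct fields/versions may differ from the shipping headers). Note there is
// no monitor or mouse handshake anywhere; it is purely a driver-side hint.
#include <nvapi.h>

bool EnableReflex(IUnknown* d3dDevice, bool boost)
{
    if (NvAPI_Initialize() != NVAPI_OK)
        return false;

    NV_SET_SLEEP_MODE_PARAMS params = {};
    params.version           = NV_SET_SLEEP_MODE_PARAMS_VER;
    params.bLowLatencyMode   = 1;              // "Reflex On"
    params.bLowLatencyBoost  = boost ? 1 : 0;  // "Reflex On + Boost"
    params.minimumIntervalUs = 0;              // no extra frame-rate cap

    return NvAPI_D3D_SetSleepMode(d3dDevice, &params) == NVAPI_OK;
}

// The game then calls NvAPI_D3D_Sleep(d3dDevice) once per frame, just before
// sampling input, which keeps the CPU from queuing frames ahead of the GPU;
// that render queue is the main latency Reflex removes.
```

The latency-measurement side (the Reflex Analyzer built into certain monitors and mice) is a separate, optional thing, which is exactly the distinction above: the hardware is for measuring latency, not for reducing it.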

Pemalite said:
Captain_Yuri said:

The raster performance is similar when comparing the same performance tier. Yuzu (and other emulation) developers post issues with Radeon drivers almost every month. Nvidia does have their own issues, but far fewer. Just look at last month's Yuzu progress report and skip to the driver section:

https://yuzu-emu.org/entry/yuzu-progress-report-mar-2023/

Yuzu also had a lot of positive things to say about AMD's drivers and their support.

nVidia tends to have higher CPU usage than AMD with its drivers... And we can't forget the recent CPU usage spiking issue with the 531.18 drivers, which introduced a heap of bugs and issues.
https://www.anandtech.com/show/18758/nvidia-releases-hotfix-for-geforce-driver-to-resolve-cpu-spikes

But I digress. I currently game on both AMD and nVidia hardware; I am not biased toward or preferential to any particular brand, and they both have their quirks... I run multiple monitors, and on Geforce you are unable to have all three displays sleep and wake, something that has not been an issue with AMD.

Otherwise I generally don't have crashes or other issues with either company, but I also don't run the latest drivers; I only update when I need a new feature or bug fix.

And yea, both Radeon and Nvidia can have wonky drivers, but most people agree that Nvidia continues to be better when you look at factual data, like reports from emulation devs and such. Not to mention the 900 series is still receiving driver support while Radeon has abandoned the RX 300 series GPUs.

Pemalite said:

Captain_Yuri said:

So it's like, cool, so what has Radeon done recently that's better than Nvidia? Well, you get more VRAM for the same price and their last-gen cards are cheaper. Great, anything else? I am not saying it's bad to buy Radeon, and if you are on a budget, RDNA 2 is absolutely the right buy for sub-$550. But in terms of recent PC innovation that matters to those buying now? Radeon just feels quite lacking.

Chiplets are new.

Yea, and they are pretty useless to anyone other than Radeon's margins. At least when Ryzen did chiplets, they had double the cores vs Intel for the same price. When Radeon does chiplets, it's power hungry, less efficient, provides the same raster performance as monolithic designs, and costs as much as Nvidia products, with basically no advantage over Nvidia other than more VRAM. In the long term that could end up being different, but that is a long way away and Nvidia is no Intel.
