Pemalite said:
Captain_Yuri said:
They aren't unplayable/ugly, they are largely just what they look like on consoles, since the PS5/Series X can crank out close-to-ultra raster settings and have the same upscaling tech. With Nvidia, you simply get better visuals if you have a good enough RT card, significantly better upscaling with DLSS and significantly lower latency with Reflex.
|
Nowhere near it. Radeon on PC allows you to run with Ultra settings, unlike the PlayStation 5/Series X. Many Series X/PlayStation 5 games run with mostly high settings. - I do own every platform.
You can also super sample on a Radeon+PC (rough sketch of the idea after this quote), which is difficult to do on console as you don't have control over resolution... Nor do consoles generally run native resolution anyway.
You aren't getting a different experience running with nVidia unless you start leveraging DLSS... You will obviously get higher framerates.
But otherwise, it's not that different. (Again, I own both AMD and nVidia hardware.)
|
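Aside for context: "super sampling" here just means rendering the frame at a multiple of the display resolution and averaging each block of samples down to one output pixel. A minimal Python sketch of the idea (ordered-grid SSAA with a box filter); the function name and resolutions are illustrative, not any vendor's API:

```python
import numpy as np

def supersample_downscale(hires: np.ndarray, factor: int) -> np.ndarray:
    """Box-filter downscale: average each factor x factor block of
    rendered samples into a single display pixel (ordered-grid SSAA)."""
    h, w, c = hires.shape
    assert h % factor == 0 and w % factor == 0
    return hires.reshape(h // factor, factor,
                         w // factor, factor, c).mean(axis=(1, 3))

# e.g. render internally at 4K, then average 2x2 blocks down to 1080p
frame_4k = np.random.rand(2160, 3840, 3)      # stand-in for a rendered frame
frame_1080p = supersample_downscale(frame_4k, 2)
print(frame_1080p.shape)                      # (1080, 1920, 3)
```

The catch is that the GPU has to render and hold four times the pixels, which is why consoles, with fixed performance budgets and no resolution control, rarely offer it.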
Except DF has done plenty of comparisons that say otherwise: there's little difference until you turn on ray tracing, which is what really starts showing meaningful visual differences. I myself own a PS5, Switch, Steam Deck and a PC. There might be some minor visual enhancements at ultra, but ray tracing shows significant visual improvements unless it's a bad implementation like RT shadows.
And depending on the GPU + resolution + game, you can certainly run ray tracing at native resolution and get good FPS.
Open source is factually better from an industry adoption and support perspective, which means it's better for the consumer.
AMD has historically always leveraged open-source approaches... I know you don't care what happened 10+ years ago, as you have alluded to, but even back then AMD was leveraging open source to build compute into the Radeon x1900XT back in 2005-2007...
And that was when AMD and nVidia had almost 50/50 market share. So your argument doesn't hold any weight anyway. |
Open source is better for industry adoption and better for the consumer, but it only matters if the technology is good, which in the case of FSR it really isn't, as proven by many outlets already. I am sure there will eventually be some AI-based upscaler that is open, maybe even XeSS, that ends up replacing DLSS and gets widely adopted by the industry. Who knows, maybe FSR 3.0 or 4.0 will be exactly that. But FSR in its current form really isn't making many inroads.
You mean the ATI Radeon x1900XT? Do you see any ATI logo anymore? Cause I sure don't. We know management and the entire company have changed quite a bit, and so has the landscape.
FSR isn't meant to compete with DLSS; it's not using AI upscaling. It's using the Lanczos algorithm (a toy sketch follows after this quote), which makes it hardware agnostic.
If image quality is your concern, then I wouldn't use DLSS or FSR, I would use super sampling... But nVidia GPUs likely don't have the VRAM for that anyway. |
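For the curious: a toy 1D sketch of Lanczos resampling, the filter family the original (spatial) FSR is derived from. This is illustrative Python, not AMD's actual shader code, and the helper names are made up:

```python
import numpy as np

def lanczos_kernel(x: np.ndarray, a: int = 2) -> np.ndarray:
    """Lanczos window: sinc(x) * sinc(x/a) for |x| < a, else 0."""
    return np.where(np.abs(x) < a, np.sinc(x) * np.sinc(x / a), 0.0)

def lanczos_resample_1d(samples: np.ndarray, scale: float, a: int = 2) -> np.ndarray:
    """Resample one row of pixels to a higher resolution with Lanczos weights."""
    n_out = int(len(samples) * scale)
    out = np.empty(n_out)
    for i in range(n_out):
        x = i / scale                               # output pixel in source space
        taps = np.arange(int(np.floor(x)) - a + 1,
                         int(np.floor(x)) + a + 1)  # the 2a nearest source pixels
        w = lanczos_kernel(taps - x, a)
        idx = np.clip(taps, 0, len(samples) - 1)    # clamp at the image edges
        out[i] = np.dot(w, samples[idx]) / w.sum()  # normalized weighted sum
    return out

row = np.array([0.0, 0.2, 0.9, 0.4, 0.1])
print(lanczos_resample_1d(row, scale=2.0))          # 10 upscaled values
```

Because the weights are a fixed function of pixel position rather than a trained network, this runs on any GPU, which is the hardware-agnostic point being made above.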
That's also the point. FSR is inferior because it doesn't use AI upscaling. But AMD positions FSR 2.0 as a competitor to DLSS. It doesn't matter that they use completely different methods, because those who are buying a new GPU now don't care. They will look at DLSS being superior and choose Nvidia.
Sorry, what are your specs again? Cause you do know Nvidia has 3090s, 4080s and 4090s, right?
Doesn't need to be that way. You are already paying "big bucks" and getting less on that front. |
Until someone can be an actual competitor instead of releasing products with half-baked features, people will continue to pay the big bucks, because that's how the market is. People are choosing 3060s over 6700XTs even though they are the same price, with a wide gap in performance in favor of the 6700XT. I wouldn't recommend getting a 3060 over a 6700XT any day, but people do, because there is no alternative to Nvidia's feature set.
To use Reflex to its fullest extent (i.e. all the proprietary features you clamor for that set nVidia apart from AMD), you need proprietary hardware; you have already elaborated on how this is a key selling point of being an nVidia owner.
The GTX 900 series doesn't support Reflex in all titles either.
Keep in mind that AMD also has Radeon Anti-Lag, which reduces input latency as well. |
If by proprietary hardware you mean an Nvidia GPU, then yes... But other than that, you don't. And I do like proprietary features when they are the only thing that's available. Why would I pay over $500 for half-baked features?
"The GTX 900 series doesn't support Reflex in all titles either."
Source?
"Keep in mind that AMD also has Radeon Anti-Lag which reduces input latency as well."
Not even remotely the same thing. Do some research, yeah?
You can't use that chestnut.
Remember, nVidia abandoned support for Fermi in 2018... But it was a GPU architecture they were still re-releasing (GeForce GT 730) even in 2014.
Heck... It even got re-released in 2021.
That's right, an unsupported GPU re-released in 2021. https://www.digitaltrends.com/computing/msi-gt-730-re-release-gpu-shortage/
...But sure, let's paint AMD as the only bad guy in this game. |
I sure can, because AMD ended driver support for the HD 6000 series quite early: it came out on October 22, 2010, whereas Fermi came out in April 2010, yet AMD stopped supporting the HD 6000 series in November 2015.
https://www.techspot.com/news/62913-amd-ends-driver-support-hd-5000-6000-series.html
So yes, they are still the bad guy in this game. And let's be honest, no one cares about the GT 730 when Radeon has in the past sacked entire generations of GPUs, which cost a lot more money, after less than 5 years of driver support.
You do realise that the Fabric between chiplets actually consumes additional power?
The reason Ryzen uses less power and offers more performance than Intel isn't actually the chiplet design itself, it's the architecture.
If AMD made Ryzen monolithic, ditched the fabric and built it on a leading-edge manufacturing process, it would actually consume less energy.
The purpose of chiplets is simply cost... You can make more functional chips per wafer (a back-of-the-envelope yield sketch follows after this quote)... And considering AMD's biggest advantage, other than more RAM, is cost, it's good they are leveraging that, because GPU prices have spiralled out of control.
One of the big factors was unprecedented demand due to crypto and COVID, so AMD and nVidia could price GPUs however they wanted and they would still sell. That's now changed, so price is going to be a key selling point going forward as the USA potentially heads into recession.
|
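The "more functional chips per wafer" point is easy to see with the standard Poisson yield model: the fraction of defect-free dice falls off exponentially with die area, so splitting one big die into smaller chiplets raises yield. A back-of-the-envelope Python sketch; the defect density and die sizes are made-up illustrative numbers, not AMD's:

```python
import math

WAFER_AREA = math.pi * (300 / 2) ** 2  # 300 mm wafer in mm^2, ignoring edge loss
DEFECT_DENSITY = 0.001                 # defects per mm^2 -- illustrative guess

def good_dice_per_wafer(die_area_mm2: float) -> float:
    """Poisson yield: the defect-free fraction is exp(-area * defect_density)."""
    dice = WAFER_AREA / die_area_mm2
    return dice * math.exp(-die_area_mm2 * DEFECT_DENSITY)

# one 600 mm^2 monolithic die vs the same silicon as two 300 mm^2 chiplets
mono = good_dice_per_wafer(600)          # good monolithic chips per wafer
chiplet = good_dice_per_wafer(300) / 2   # good two-chiplet packages per wafer
print(f"monolithic: {mono:.0f}/wafer, chiplet-based: {chiplet:.0f}/wafer")
```

With these toy numbers the chiplet split nets roughly 87 usable parts per wafer versus about 65 monolithic ones, before packaging costs and the fabric's power overhead (mentioned above) claw some of that back.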
I am aware of the reasoning, but the point is that it really doesn't matter all that much to the buyer. When people saw Ryzen giving 8 cores and 16 threads for the same price as a quad-core CPU from Intel, that was truly an eye-opening moment. But there is nothing like that with RDNA 3. And considering the 4090 alone has more market share according to the Steam Survey (granted, it should be taken with a grain of salt) than the high-end RX 6000 series, that really says it all.
https://www.pcgamer.com/seriously-where-did-you-lot-get-the-money-for-all-those-rtx-4090s/
At the end of the day, we can keep going back and forth and it doesn't really matter, cause I am not going to change your opinion and vice versa. What matters is the health of the industry. Nvidia is ruining PC gaming because of its prices, and people are willing to pay those prices because there is no alternative, regardless of what people think, as proven by JPR and everyone else. Imo the only way out is for a competitor to come out with Nvidia features like DLSS and Nvidia's RT performance, but for a significantly cheaper price. I hope Intel can do it, cause AMD feels like they don't want to go that route. They would rather be a follower, let Nvidia set the prices, and sell half-baked features for less money and more VRAM to those who dislike Nvidia products. Whereas Intel at least feels like maybe there could be something there. Happy to be proven wrong though.