
Forums - PC Discussion - Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

Welp, launch 4070s are seemingly sitting on shelves. Already well under MSRP in some places in the UK.

Available to buy for £550. It's a Palit though, mind. https://www.novatech.co.uk/products/palit-nvidia-geforce-rtx-4070-dual-12gb-gddr6x-graphics-card/ned4070019k9-1047d.html

Bodes well for people who are looking for GPUs rn. Let's hope the 7800 XT releases soon to make this and other GPUs drop in price faster.



Chazore said:
Captain_Yuri said:

Well, I wouldn't count the chickens before they hatch just yet, as AMD hasn't said whether or not FSR 3 will be coming to their previous-gen GPUs. They said they are trying to make it happen, but they haven't actually said if it will, or what issues it will have if it does. Plus, considering DLSS 2 is better to significantly better than FSR 2 in every area, and AMD doesn't have an alternative to Reflex to take the latency down without reducing image quality, even if FSR 3 is available on older GPUs, it could be something you may not want to use, similar to how using FSR 2 at 1440p is horrid compared to using DLSS 2.

But yea, DLSS 3 not coming to Ampere/Turing is pretty lame, but I think the reasoning behind it makes sense: while DLSS 3 does give you a framerate "boost," it comes at a big latency hit even on a 4090. So if Ampere/Turing isn't able to calculate the added frames fast enough, or they don't look as good as they should, then it's basically pointless.

Come now, Yuri, you're giving Nvidia the benefit of the doubt in the same breath as "wait till AMD does an Nvidia", there.

Besides, Nvidia already has the jacked-up prices; they've already shown their hand. DLSS 3 is also locked behind the 4000 series, and we know that FSR is open and still not holding a candle to DLSS, but at least it's open and I can use it, while I cannot use DLSS 3.0, because Nvidia wants me to sell my kidneys.

I know you're likely to say you aren't at all influenced, but we're human, my man; we are absolutely influenced in some way by a bit of bias, be it by the products we own or the services we partake in. I own my 1080 Ti to this day, but because I don't own a 4000 series, I will wholly admit, yes, I am salty. But at the same time, you own a 4090, and you're kinda pulling my leg a bit with the "well, AMD on the other hand".


C'mon broskie, you gotta admit you're getting a tad flustered by that GPU of yours, just a smidge. Yes, both sides have done good and bad, but right here, right now, Nvidia is doing bad. The only good thing is the tech, the band-aid we absolutely need, because no one has figured out how to not need a band-aid to use RT yet (which I still don't like to count as a bonus at all, because we shouldn't have to rely on DLSS to do RT to begin with; it's like a stopgap in the tech cycle).

We better hope that Nvidia doesn't screw the 4000 series with DLSS 4.0 in the future, because that again would be bad, no matter the magical excuses made for something that isn't even a decade old.

Well, I am not saying it's a good thing that Nvidia has locked DLSS 3 behind the RTX 4000 series, but I do think the technical reason makes sense, because many tech experts have done benchmarks and testing and found that even on a 4090, DLSS 3 has a big latency penalty. So it's not like I am believing only Nvidia's words; it is backed by those that tested the actual technology.
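To put rough numbers on that latency hit, here's a simplified model, assuming frame interpolation has to hold the newest rendered frame back about one frame time before the generated in-between frame can be shown (Nvidia's actual pipeline is more involved than this):

```python
def frame_time_ms(fps: float) -> float:
    """Time between frames at a given framerate, in milliseconds."""
    return 1000.0 / fps

def interp_latency_penalty_ms(base_fps: float) -> float:
    """Rough extra latency from holding back one rendered frame so an
    interpolated frame can be displayed between two real ones."""
    return frame_time_ms(base_fps)

# At a 60 fps base framerate, interpolation doubles the displayed
# framerate to ~120 fps, but still adds roughly one 16.7 ms frame
# of latency on top of the normal render pipeline.
print(round(interp_latency_penalty_ms(60), 1))  # 16.7
```

Which is also why the penalty gets worse the lower your base framerate is: at a 30 fps base it's roughly 33 ms, on top of an already sluggish feel.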

And sure, FSR is available on older GPUs, but whether or not you would actually want to use it is another question entirely. Just because a technology is available to you doesn't mean it's worth using. Because of the issues with image stability when using FSR at anything lower than 4K, many would argue it's simply better to reduce settings than to use it. Remember how people used to laugh at consoles for using upscaling? They used to do that because of how bad the upscaling tech was. Only when Nvidia came out with DLSS 2 did upscaling become acceptable. FSR, while better than something like checkerboard rendering, is still not very good, as shown by DF, HUB, etc., unless you are doing 4K. So while FSR might be open and available, that doesn't make it worth using. And while DLSS 3 is locked to RTX 4000, DLSS 2 and Reflex, which are part of DLSS 3, aren't. So any game that gets DLSS 3 also gets DLSS 2 and Reflex for RTX Ampere/Turing. Nvidia could have made DLSS 3 a separate thing altogether by doing the ultimate dickish move, but they didn't.

I do favor Nvidia over Radeon because they bring innovation to the market. They are a terrible and greedy company, no doubt, but unlike Radeon, who is also terrible and greedy, Nvidia is actually pushing PC gaming forward with new and innovative tech. It's also why I like Intel, because they are trying to do what Nvidia is doing but at a much cheaper price. And it's not like Nvidia is only good at ray tracing; depending on the GPU, Nvidia is competitive in, if not faster at, raster as well. But it does depend on a person's priorities. Do you want a console experience but faster, or do you want a more innovative experience? Are you on a budget, or are you willing to pay a premium for that experience?

I am sure Nvidia will eventually lock the RTX 4000 series out of something in the future as well. It's Nvidia, lol. But the fact is that there is no real alternative if you want the PC experience. Radeon is too far behind in the tech that makes PC gaming worth investing a lot of money into. Intel could eventually be the cheap alternative that we are looking for, but their driver department is still years away from being recommendable. But if you are on a budget, then Radeon is the easy option. Can't beat the value of a $500 6800 XT, really.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:

And sure, FSR is available on older GPUs, but whether or not you would actually want to use it is another question entirely. Just because a technology is available to you doesn't mean it's worth using. Because of the issues with image stability when using FSR at anything lower than 4K, many would argue it's simply better to reduce settings than to use it. Remember how people used to laugh at consoles for using upscaling? They used to do that because of how bad the upscaling tech was. Only when Nvidia came out with DLSS 2 did upscaling become acceptable. FSR, while better than something like checkerboard rendering, is still not very good, as shown by DF, HUB, etc., unless you are doing 4K. So while FSR might be open and available, that doesn't make it worth using. And while DLSS 3 is locked to RTX 4000, DLSS 2 and Reflex, which are part of DLSS 3, aren't. So any game that gets DLSS 3 also gets DLSS 2 and Reflex for RTX Ampere/Turing. Nvidia could have made DLSS 3 a separate thing altogether by doing the ultimate dickish move, but they didn't.

I would argue technologies like FSR and DLSS are actually *more* important for older and less capable graphics cards.

It's a performance/image enhancement and older hardware would absolutely benefit from it more in regards to being able to continue playing the latest titles with acceptable image quality and performance.

Captain_Yuri said:

I do favor Nvidia over Radeon because they bring innovation to the market. They are a terrible and greedy company, no doubt, but unlike Radeon, who is also terrible and greedy, Nvidia is actually pushing PC gaming forward with new and innovative tech. It's also why I like Intel, because they are trying to do what Nvidia is doing but at a much cheaper price. And it's not like Nvidia is only good at ray tracing; depending on the GPU, Nvidia is competitive in, if not faster at, raster as well. But it does depend on a person's priorities. Do you want a console experience but faster, or do you want a more innovative experience? Are you on a budget, or are you willing to pay a premium for that experience?

Every company innovates; whether those innovations are important is another matter entirely... Many innovations aren't consumer-facing middleware, either... AMD has pushed integrated graphics capabilities forward rather significantly, for instance.

AMD beat nVidia with Tessellation by almost a decade, TressFX got rolled into GPUOpen and is still being developed... Mostly because it's open source and not beholden to AMD developing it.

GameWorks' HairWorks, meanwhile, has fallen by the wayside.

Things like Mantle getting rolled into Vulkan was another innovation brought to the PC market, which then influenced Microsoft's DirectX 12 to be more efficient by bringing low-level improvements.

I think it's a big disservice to the entire industry when people assert that any company doesn't innovate... Even the long dead S3 Graphics brought innovative technologies that nVidia and AMD use today, like texture compression.

I don't find either AMD or nVidia better than the other; they both have their pros and cons... And I will weigh them both every time I make a purchase.
I.e. I generally go Radeon on desktop for price/performance reasons... But I often go nVidia with notebooks because of how seamless Optimus generally is.



--::{PC Gaming Master Race}::--

Pemalite said:

I would argue technologies like FSR and DLSS are actually *more* important for older and less capable graphics cards.

It's a performance/image enhancement and older hardware would absolutely benefit from it more in regards to being able to continue playing the latest titles with acceptable image quality and performance.

I do agree that FSR and DLSS are more important for older GPUs/less capable cards, but the point I was making is that, based on various tech reviews like DF/HUB, there is a stark difference between using FSR and DLSS at resolutions lower than 4K. And the lower you go, the worse FSR gets. The main reason is that FSR's image stability is significantly worse than DLSS at lower resolutions, hence why many people would recommend turning down settings if possible rather than using FSR.
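The resolution point is easy to quantify: upscalers render internally at a fraction of the output resolution, so the lower your output, the less source data the algorithm has to work with. A quick sketch using FSR 2's published per-axis scale factors (DLSS 2's quality modes use similar ratios; treat the exact numbers as my reading of the docs):

```python
def internal_res(out_w: int, out_h: int, scale: float) -> tuple:
    """Internal render resolution for a given per-axis upscale factor."""
    return int(out_w / scale), int(out_h / scale)

# FSR 2 per-axis scale factors per AMD's documentation.
QUALITY, BALANCED, PERFORMANCE = 1.5, 1.7, 2.0

print(internal_res(3840, 2160, QUALITY))  # (2560, 1440) -- 4K Quality
print(internal_res(2560, 1440, QUALITY))  # (1706, 960)  -- 1440p Quality
```

So "1440p Quality" is really upscaling from ~960p, which is why image stability problems that are invisible at 4K become obvious at lower output resolutions.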

Pemalite said:
Captain_Yuri said:

I do favor Nvidia over Radeon because they bring innovation to the market. They are a terrible and greedy company, no doubt, but unlike Radeon, who is also terrible and greedy, Nvidia is actually pushing PC gaming forward with new and innovative tech. It's also why I like Intel, because they are trying to do what Nvidia is doing but at a much cheaper price. And it's not like Nvidia is only good at ray tracing; depending on the GPU, Nvidia is competitive in, if not faster at, raster as well. But it does depend on a person's priorities. Do you want a console experience but faster, or do you want a more innovative experience? Are you on a budget, or are you willing to pay a premium for that experience?

Every company innovates; whether those innovations are important is another matter entirely... Many innovations aren't consumer-facing middleware, either... AMD has pushed integrated graphics capabilities forward rather significantly, for instance.

AMD beat nVidia with Tessellation by almost a decade, TressFX got rolled into GPUOpen and is still being developed... Mostly because it's open source and not beholden to AMD developing it.

GameWorks' HairWorks, meanwhile, has fallen by the wayside.

Things like Mantle getting rolled into Vulkan was another innovation brought to the PC market, which then influenced Microsoft's DirectX 12 to be more efficient by bringing low-level improvements.

I think it's a big disservice to the entire industry when people assert that any company doesn't innovate... Even the long dead S3 Graphics brought innovative technologies that nVidia and AMD use today, like texture compression.

I don't find either AMD or nVidia better than the other; they both have their pros and cons... And I will weigh them both every time I make a purchase.
I.e. I generally go Radeon on desktop for price/performance reasons... But I often go nVidia with notebooks because of how seamless Optimus generally is.

AMD does innovate, and they have innovated in the past, no doubt. Back in the day, I did buy Radeon products like the 4870 and 6870 because Radeon innovated. I still buy Ryzen products because AMD continues to innovate in the CPU landscape. But the issue is that, as a whole, most reviewers out there would largely agree that the recent innovations Nvidia has brought to the table far outdo anything that Radeon has recently brought. And in my view that is correct, because Nvidia and Radeon are simply not equal to each other, given how far ahead Nvidia is in their feature set.

If I am buying a GPU this generation, I am not going to look at what Radeon did 10 years ago. I am going to look at what Nvidia and Radeon are doing right now. Reviewers say that Nvidia has a significant ray tracing performance lead. Reviewers say that DLSS is significantly better than FSR. Reflex is something that Radeon does not have an answer to. The raster performance is similar when comparing performance tiers. Yuzu (and other emulation) developers post issues with Radeon drivers almost every month. Nvidia does have their own issues, but a lot fewer. Just look at last month's Yuzu progress report and skip to the driver section:

https://yuzu-emu.org/entry/yuzu-progress-report-mar-2023/

So it's like, cool, so what has Radeon done recently that's better than Nvidia? Well, you get more VRAM for the same price, and their last-gen cards are cheaper. Great, anything else? I am not saying it's bad to buy Radeon, and if you are on a budget, RDNA 2 is absolutely the right buy for sub-$550. But in terms of recent PC innovation that matters to those buying now? Radeon just feels quite lacking.



                  


Captain_Yuri said:

AMD does innovate, and they have innovated in the past, no doubt. Back in the day, I did buy Radeon products like the 4870 and 6870 because Radeon innovated. I still buy Ryzen products because AMD continues to innovate in the CPU landscape. But the issue is that, as a whole, most reviewers out there would largely agree that the recent innovations Nvidia has brought to the table far outdo anything that Radeon has recently brought. And in my view that is correct, because Nvidia and Radeon are simply not equal to each other, given how far ahead Nvidia is in their feature set.

I don't disagree that nVidia has some solid features, but let's be real here... It's not like games look ugly or are unplayable on Radeon... They are just icing on top of the cake.

Captain_Yuri said:

If I am buying a GPU this generation, I am not going to look at what Radeon did 10 years ago. I am going to look at what Nvidia and Radeon are doing right now.

The point I was trying to convey is that everyone innovates in the industry, nVidia just innovates far more rapidly because they have the R&D budget... And it locks people into their ecosystem. Comes down to money in the end.

For example, let's take G-Sync: amazing technology, but you *HAD* to use an nVidia graphics card and you *HAD* to use an nVidia G-Sync display, which meant all future GPU purchases for that system were likely to be GeForce to retain use of the variable refresh baked into your monitor.

AMD, however, invented FreeSync, which went open source... which then got adopted by consoles, phones, tablets and pretty much every display today, and didn't need proprietary hardware. That was a technology far more impactful and positive for the entire industry than nVidia's G-Sync.

nVidia will also use its proprietary technologies to hamper its competitors... For example, HairWorks leveraged nVidia's PolyMorph engines to the nth degree, so that in games like The Witcher 3 they pushed 64x tessellation factors, which absolutely crippled AMD hardware (until AMD introduced primitive discard with Polaris), despite the fact that there was no visual difference over 16x factors. It made nVidia look good and AMD look bad.

Yes DLSS is better than FSR. There is no disputing that.

But that doesn't make FSR useless; it definitely has a place and will benefit integrated graphics, with their limited fillrates, more than higher-end parts. - Plus it's not locked down to a certain piece of hardware or even platform; it's part of GPUOpen, so 10 years from now it will receive community support and improvements. - What will the status of DLSS be?


Also keep in mind that FSR does not use AI to improve its visuals; they aren't even comparable.


Captain_Yuri said:

Reviewers say that Nvidia has a significant ray tracing performance lead. Reviewers say that DLSS is significantly better than FSR. Reflex is something that Radeon does not have an answer to. The raster performance is similar when comparing performance tiers.

nVidia has better ray tracing performance, no one is arguing otherwise... provided you don't run out of VRAM, that is. Same goes for raster... nVidia does fine, until it runs out of VRAM.

To get the most out of Reflex you need a compatible Reflex monitor and a Reflex mouse.
...And then you need a Reflex compatible game.
https://www.nvidia.com/en-au/geforce/technologies/reflex/supported-products/

Not exactly a big list to even call it a "vital" feature yet.

I think it would be better to wait for an industry-adopted solution, otherwise users will suffer another G-Sync scenario.

Captain_Yuri said:

The raster performance is similar when comparing performance tiers. Yuzu (and other emulation) developers post issues with Radeon drivers almost every month. Nvidia does have their own issues, but a lot fewer. Just look at last month's Yuzu progress report and skip to the driver section:

https://yuzu-emu.org/entry/yuzu-progress-report-mar-2023/

Yuzu also had a lot of positives about AMD's drivers and their support.

nVidia tends to have higher CPU usage than AMD with its drivers... And we can't forget the recent CPU usage spiking issue with the 531.18 drivers, which introduced a heap of bugs and issues.
https://www.anandtech.com/show/18758/nvidia-releases-hotfix-for-geforce-driver-to-resolve-cpu-spikes

But I digress. I game on both AMD and nVidia hardware currently; I am not biased or preferential to any particular brand, and they both have their quirks... I run multi-monitor, and on GeForce you are unable to have all three displays sleep and wake, something that has not been an issue with AMD.

Otherwise I generally don't have crashes or other issues with either company, but I also don't run with the latest drivers, I only update when I need a new feature or bug fix.

Captain_Yuri said:

So it's like, cool, so what has Radeon done recently that's better than Nvidia? Well, you get more VRAM for the same price, and their last-gen cards are cheaper. Great, anything else? I am not saying it's bad to buy Radeon, and if you are on a budget, RDNA 2 is absolutely the right buy for sub-$550. But in terms of recent PC innovation that matters to those buying now? Radeon just feels quite lacking.

Chiplets are new.

Last edited by Pemalite - on 19 April 2023


Pemalite said:
Captain_Yuri said:

AMD does innovate, and they have innovated in the past, no doubt. Back in the day, I did buy Radeon products like the 4870 and 6870 because Radeon innovated. I still buy Ryzen products because AMD continues to innovate in the CPU landscape. But the issue is that, as a whole, most reviewers out there would largely agree that the recent innovations Nvidia has brought to the table far outdo anything that Radeon has recently brought. And in my view that is correct, because Nvidia and Radeon are simply not equal to each other, given how far ahead Nvidia is in their feature set.

I don't disagree that nVidia has some solid features, but let's be real here... It's not like games look ugly or are unplayable on Radeon... They are just icing on top of the cake.

They aren't unplayable/ugly; they are just largely what games look like on consoles, since the PS5/Series X can crank out close-to-ultra raster settings and have the same upscaling tech. With Nvidia, you simply get better visuals if you have a good enough RT card, significantly better upscaling with DLSS, and significantly lower latency with Reflex.

Pemalite said:
Captain_Yuri said:

If I am buying a GPU this generation, I am not going to look at what Radeon did 10 years ago. I am going to look at what Nvidia and Radeon are doing right now.

The point I was trying to convey is that everyone innovates in the industry, nVidia just innovates far more rapidly because they have the R&D budget... And it locks people into their ecosystem. Comes down to money in the end.

For example, let's take G-Sync: amazing technology, but you *HAD* to use an nVidia graphics card and you *HAD* to use an nVidia G-Sync display, which meant all future GPU purchases for that system were likely to be GeForce to retain use of the variable refresh baked into your monitor.

AMD, however, invented FreeSync, which went open source... which then got adopted by consoles, phones, tablets and pretty much every display today, and didn't need proprietary hardware. That was a technology far more impactful and positive for the entire industry than nVidia's G-Sync.

nVidia will also use its proprietary technologies to hamper its competitors... For example, HairWorks leveraged nVidia's PolyMorph engines to the nth degree, so that in games like The Witcher 3 they pushed 64x tessellation factors, which absolutely crippled AMD hardware (until AMD introduced primitive discard with Polaris), despite the fact that there was no visual difference over 16x factors. It made nVidia look good and AMD look bad.

Yes DLSS is better than FSR. There is no disputing that.

But that doesn't make FSR useless; it definitely has a place and will benefit integrated graphics, with their limited fillrates, more than higher-end parts. - Plus it's not locked down to a certain piece of hardware or even platform; it's part of GPUOpen, so 10 years from now it will receive community support and improvements. - What will the status of DLSS be?


Also keep in mind that FSR does not use AI to improve its visuals; they aren't even comparable.

AMD does open source because they have to, due to their low market share. If they locked FSR or FreeSync to Radeon hardware, almost no one would implement them without getting heaps of money. And yes, Nvidia does implement locked features or pay devs to utilize their GPUs to get a leg up on the competition, but so does AMD to a degree, such as with Far Cry 6 and its copious amounts of VRAM even though it looks like a last-gen title.

FSR has its place, but it's not what I'd consider a selling point like DLSS is, which is the main issue. People that are looking to buy a new GPU are gonna watch the reviews and see that everyone says DLSS is significantly better than FSR; it's not going to sway people to buy Radeon. Especially those with 1440p or lower monitors, because again, reviews say it looks not very good compared to native/DLSS.

And it doesn't matter what happens 10 years from now. The PS6 and Xbox Series X2 will come out, Nvidia will invent new features that lock people into their platform, and it's a rinse-and-repeat cycle. That is, unless Radeon or someone else does something. And Intel just may be that company. They already came out with XeSS, which is also an AI-based upscaler. They have RT performance similar to Nvidia and so on. That's the issue with Radeon: they could fall behind even Intel if they don't start catching up (assuming Intel even sticks around).

Pemalite said:
Captain_Yuri said:

Reviewers say that Nvidia has a significant ray tracing performance lead. Reviewers say that DLSS is significantly better than FSR. Reflex is something that Radeon does not have an answer to. The raster performance is similar when comparing performance tiers.

nVidia has better ray tracing performance, no one is arguing otherwise... provided you don't run out of VRAM, that is. Same goes for raster... nVidia does fine, until it runs out of VRAM.

To get the most out of Reflex you need a compatible Reflex monitor and a Reflex mouse.
...And then you need a Reflex compatible game.
https://www.nvidia.com/en-au/geforce/technologies/reflex/supported-products/

Not exactly a big list to even call it a "vital" feature yet.

I think it would be better to wait for an industry-adopted solution, otherwise users will suffer another G-Sync scenario.

Yea, VRAM has always been an issue with Nvidia; no changing that until you pay the big bucks.

Actually, that's not how Reflex works at all. Reflex works with any Nvidia GPU going back to the GTX 900 series, you can enable it in any game that has Reflex, and it does not require any external hardware like a monitor/mouse to use. All the monitor/mouse does is allow you to measure the system latency if you want those stats; they are not required to enable or use Reflex to its full potential.

I know this because I use Reflex in MW2 and Overwatch, I do not have a Reflex-enabled monitor or mouse, and it's easy to notice the difference. You can try it too if you play Reflex-enabled games and see for yourself. And anyone that uses Reflex will absolutely call it a vital feature.

Pemalite said:
Captain_Yuri said:

The raster performance is similar when comparing performance tiers. Yuzu (and other emulation) developers post issues with Radeon drivers almost every month. Nvidia does have their own issues, but a lot fewer. Just look at last month's Yuzu progress report and skip to the driver section:

https://yuzu-emu.org/entry/yuzu-progress-report-mar-2023/

Yuzu also had a lot of positives about AMD's drivers and their support.

nVidia tends to have higher CPU usage than AMD with its drivers... And we can't forget the recent CPU usage spiking issue with the 531.18 drivers, which introduced a heap of bugs and issues.
https://www.anandtech.com/show/18758/nvidia-releases-hotfix-for-geforce-driver-to-resolve-cpu-spikes

But I digress. I game on both AMD and nVidia hardware currently; I am not biased or preferential to any particular brand, and they both have their quirks... I run multi-monitor, and on GeForce you are unable to have all three displays sleep and wake, something that has not been an issue with AMD.

Otherwise I generally don't have crashes or other issues with either company, but I also don't run with the latest drivers, I only update when I need a new feature or bug fix.

And yea, both Radeon and Nvidia can have wonky drivers, but most people agree that Nvidia continues to be better when you look at factual data like reports from emulation devs and such. Not to mention the 900 series is still receiving continued driver support, while Radeon has abandoned the RX 300 series GPUs.

Pemalite said:

Captain_Yuri said:

So it's like, cool, so what has Radeon done recently that's better than Nvidia? Well, you get more VRAM for the same price, and their last-gen cards are cheaper. Great, anything else? I am not saying it's bad to buy Radeon, and if you are on a budget, RDNA 2 is absolutely the right buy for sub-$550. But in terms of recent PC innovation that matters to those buying now? Radeon just feels quite lacking.

Chiplets are new.

Yea, and they're pretty useless to anyone other than Radeon's margins. At least when Ryzen did chiplets, they had double the cores vs Intel for the same price. When Radeon does chiplets, it's power hungry, less efficient, provides the same raster performance as monolithic, and costs as much as Nvidia products, with basically no advantages over Nvidia other than more VRAM. In the long term that could end up being different, but that is a long way away, and Nvidia is no Intel.

Last edited by Jizz_Beard_thePirate - on 19 April 2023

                  


Captain_Yuri said:
Pemalite said:

I don't disagree that nVidia has some solid features, but let's be real here... It's not like games look ugly or are unplayable on Radeon... They are just icing on top of the cake.

They aren't unplayable/ugly; they are just largely what games look like on consoles, since the PS5/Series X can crank out close-to-ultra raster settings and have the same upscaling tech. With Nvidia, you simply get better visuals if you have a good enough RT card, significantly better upscaling with DLSS, and significantly lower latency with Reflex.

Nowhere near it. Radeon on PC allows you to run with ultra settings, unlike the PlayStation 5/Series X.
Many Series X/PlayStation 5 games run with mostly high settings. - I do own every platform.

You can also Super Sample on a Radeon+PC, which is difficult to do on console as you don't have control over resolution... Nor do consoles generally run native resolution anyway.

You aren't getting a different experience running with nVidia unless you start leveraging DLSS... You will obviously get higher framerates of course.

But otherwise, it's not that different. (Again, I own both AMD and nVidia hardware.)

Captain_Yuri said:

AMD does open source because they have to, due to their low market share. If they locked FSR or FreeSync to Radeon hardware, almost no one would implement them without getting heaps of money. And yes, Nvidia does implement locked features or pay devs to utilize their GPUs to get a leg up on the competition, but so does AMD to a degree, such as with Far Cry 6 and its copious amounts of VRAM even though it looks like a last-gen title.

Open source is factually better from an industry adoption and support perspective, which means it's better for the consumer.

AMD has historically always leveraged open-source approaches... I know you don't care about what happened 10+ years ago, as you have alluded to, but even back then AMD was leveraging open source to build compute into the Radeon X1900 XT back in 2005-2007...

And that was when AMD and nVidia had almost 50/50 marketshare.
So your argument doesn't hold any weight anyway.




Captain_Yuri said:

FSR has its place, but it's not what I'd consider a selling point like DLSS is, which is the main issue. People that are looking to buy a new GPU are gonna watch the reviews and see that everyone says DLSS is significantly better than FSR; it's not going to sway people to buy Radeon. Especially those with 1440p or lower monitors, because again, reviews say it looks not very good compared to native/DLSS.

FSR isn't meant to compete with DLSS; it's not using AI upscaling.
It's using the Lanczos algorithm, which makes it hardware agnostic.
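For the curious, the Lanczos-2 kernel that FSR 1's upscaler is derived from is just a windowed sinc. A minimal sketch (illustrative only; AMD's actual shader uses a modified polynomial approximation of this, per their GPUOpen docs):

```python
import math

def lanczos2(x: float) -> float:
    """Lanczos-2 reconstruction kernel: sinc(x) * sinc(x/2) for |x| < 2, else 0."""
    if x == 0.0:
        return 1.0
    if abs(x) >= 2.0:
        return 0.0
    px = math.pi * x
    # Product of two normalized sinc functions.
    return (math.sin(px) / px) * (math.sin(px / 2.0) / (px / 2.0))
```

The kernel is 1 at the sample point and 0 at every other integer offset, which is why a Lanczos filter passes existing pixels through unchanged and only has to reconstruct the in-between ones - no training data or inference hardware required.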

If image quality is your concern, then I wouldn't use DLSS or FSR; I would use supersampling... But nVidia GPUs likely don't have the VRAM for that anyway.

Captain_Yuri said:

And it doesn't matter what happens 10 years from now. The PS6 and Xbox Series X2 will come out, Nvidia will invent new features that lock people into their platform, and it's a rinse-and-repeat cycle. That is, unless Radeon or someone else does something. And Intel just may be that company. They already came out with XeSS, which is also an AI-based upscaler. They have RT performance similar to Nvidia and so on. That's the issue with Radeon: they could fall behind even Intel if they don't start catching up (assuming Intel even sticks around).

The big issue with Intel is their drivers. And they tend to abandon projects extremely quickly.
Intel is very bad at making consumer-facing software; their commercial/enterprise stuff (i.e. compilers) tends to be extremely solid, however.

Do they have potential? Absolutely.

And I would like Intel to apply pressure to AMD and nVidia, that would be great for all of us.

Captain_Yuri said:

Yea vram has always been an issue with Nvidia, no changing that until you pay the big bucks.

Doesn't need to be that way. You are already paying "big bucks" and getting less on that front.

Captain_Yuri said:

Actually, that's not how Reflex works at all. Reflex works with any Nvidia GPU going back to the GTX 900 series, and you can enable it in any game that has Reflex; it does not require any external hardware like a monitor/mouse to use. All the monitor/mouse does is let you measure system latency if you want those stats; they are not required to enable or use Reflex to its full potential.

To use Reflex to its fullest extent (i.e. all the proprietary features you clamor for that set nVidia apart from AMD), you need proprietary hardware; you have already elaborated on how this is a big key selling point of being an nVidia owner.

The GTX 900 series doesn't support Reflex in all titles either.

Keep in mind that AMD also has Radeon Anti-Lag which reduces input latency as well.

Captain_Yuri said:

Not to mention the 900 series is still receiving continued driver support while Radeon has abandoned RX300 series GPUs.

You can't use that chestnut.

Remember, nVidia abandoned support for Fermi in 2018... but it was a GPU architecture they were still re-releasing (GeForce GT 730) as late as 2014.

Heck... It even got re-released in 2021.

That's right, an unsupported GPU re-released in 2021.
https://www.digitaltrends.com/computing/msi-gt-730-re-release-gpu-shortage/

...But sure, let's paint AMD as the only bad guy in this game.

Captain_Yuri said:

Yea, and they are pretty useless to anyone other than Radeon's margins. At least when Ryzen did chiplets, they had double the cores vs Intel for the same price. When Radeon does chiplets, it's power hungry, less efficient, provides the same Raster performance as Mono and costs as much as Nvidia products, with basically no advantages over Nvidia other than more vram. In the long term that could end up being different, but that is a long way away, and Nvidia is no Intel.

You do realise that the Fabric between chiplets actually consumes additional power?

The only reason Ryzen uses less power and offers more performance than Intel isn't the chiplet design itself; it's the architecture.

If AMD made Ryzen monolithic and ditched the fabric and made it on a leading-edge manufacturing process, it would actually consume less energy.

The purpose of chiplets is simply cost... you can make more functional chips per wafer. And considering AMD's biggest advantage other than more RAM... is cost, it's good they are leveraging that, because GPU costs have spiralled out of control.
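To make the "more functional chips per wafer" point concrete, the standard Poisson yield model shows why smaller dies win. The defect density and die sizes below are purely illustrative assumptions, not AMD's or TSMC's actual numbers:

```python
import math

def die_yield(area_mm2: float, defects_per_mm2: float = 0.001) -> float:
    """Poisson yield model: fraction of defect-free dies, Y = e^(-D * A).
    D = 0.001 defects/mm^2 (0.1 per cm^2) is an assumed, illustrative value."""
    return math.exp(-defects_per_mm2 * area_mm2)

# Hypothetical 600 mm^2 monolithic GPU vs. 100 mm^2 chiplets on the same node:
mono = die_yield(600)      # ~55% of the big dies come out defect-free
chiplet = die_yield(100)   # ~90% of each small chiplet is defect-free
```

A single defect kills an entire 600 mm^2 die but only one small chiplet, so the same wafer produces far more sellable silicon — that's the cost lever chiplets pull.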

One of the big factors was unprecedented demand due to Crypto and COVID, so AMD and nVidia could price GPUs however they wanted and they would still sell. That's now changed, so price is going to be a key selling point going forward as the USA potentially heads into recession.



--::{PC Gaming Master Race}::--

Pemalite said:
Captain_Yuri said:

They aren't unplayable/ugly, they are just largely what they look like on consoles since PS5/Series X can crank out close to ultra Raster settings and have the same upscaling tech. With Nvidia, you simply get better visuals if you have a good enough RT card, significantly better upscaling with DLSS and significantly lower latency with Reflex.

Nowhere near it. Radeon on PC allows you to run Ultra settings, unlike the Playstation 5/Series X.
Many Series X/Playstation 5 games run with mostly High settings. (I do own every platform.)

You can also Super Sample on a Radeon+PC, which is difficult to do on console as you don't have control over resolution... Nor do consoles generally run native resolution anyway.

You aren't getting a different experience running with nVidia unless you start leveraging DLSS... You will obviously get higher framerates of course.

But otherwise, it's not that different. (Again, I own both AMD and nVidia hardware.)

Except DF has done plenty of comparisons that say otherwise until you turn on Ray Tracing, which is what really starts showing meaningful visual differences. I myself own a PS5, Switch, Steam Deck and a PC. There might be some minor visual enhancements at ultra, but Ray Tracing shows significant visual improvements unless it's a bad implementation like RT shadows.

And depending on the GPU + resolution + game, you can certainly run Ray Tracing at Native and get good FPS.

Open Source is factually better from an industry adoption and support perspective, which means it's better for the consumer.

AMD has historically always leveraged open-source approaches. I know you don't care about what happened 10+ years ago, as you've alluded to, but even back then AMD was leveraging open source to build compute into the Radeon x1900XT back in 2005-2007...

And that was when AMD and nVidia had almost 50/50 marketshare.
So your argument doesn't hold any weight anyway.

Open source is better for industry adoption and better for the consumer, but it only matters if the technology is good, which in the case of FSR it really isn't, as proven by many outlets already. I am sure there will eventually be some AI-based upscaler that is open, maybe even XeSS, that will end up replacing DLSS and get widely adopted by the industry. Who knows, maybe FSR 4.0 will be exactly that, or FSR 3.0. But FSR in its current form really isn't making many inroads.

You mean the ATI Radeon x1900XT? Do you see any ATI logo anymore? Cause I sure don't. We know management and the entire company have changed quite a bit, and so has the landscape.

FSR isn't meant to compete with DLSS; it's not using A.I. upscaling.
It's using the Lanczos algorithm, which makes it hardware agnostic.

If image quality is your concern, then I wouldn't use DLSS or FSR, I would use Super Sampling... But nVidia GPUs likely don't have the VRAM for that anyway.

That's also the point. FSR is inferior because it doesn't use AI upscaling, but AMD positions FSR 2.0 as a competitor to DLSS. It doesn't matter if they use completely different methods, because those who are buying a new GPU now don't care. They will look at DLSS being superior and choose Nvidia.

Sorry, what are your specs again? Cause you do know Nvidia has 3090s and 4090s and 4080s, right?

Doesn't need to be that way. You are already paying "big bucks" and getting less on that front.

Until someone can be an actual competitor instead of releasing products with half-baked features, people will continue to pay the big bucks, because that's how the market is. People are choosing 3060s over 6700 XTs even though they are the same price with a wide gap in performance in favor of the 6700 XT. I wouldn't recommend getting a 3060 over a 6700 XT any day, but they are, because there is no alternative to Nvidia's feature sets.

To use Reflex to its fullest extent (i.e. all the proprietary features you clamor for that set nVidia apart from AMD), you need proprietary hardware; you have already elaborated on how this is a big key selling point of being an nVidia owner.

The GTX 900 series doesn't support Reflex in all titles either.

Keep in mind that AMD also has Radeon Anti-Lag which reduces input latency as well.

If by proprietary hardware you mean an Nvidia GPU, then yes... but other than that, you don't. And I do like proprietary features when they are the only thing that's available. Why would I pay over $500 for half-baked features?

"The GTX 900 series doesn't support Reflex in all titles either."

Source?

"Keep in mind that AMD also has Radeon Anti-Lag which reduces input latency as well."

Not even remotely the same thing. Do some research, yea?

You can't use that chestnut.

Remember, nVidia abandoned support for Fermi in 2018... but it was a GPU architecture they were still re-releasing (GeForce GT 730) as late as 2014.

Heck... It even got re-released in 2021.

That's right, an unsupported GPU re-released in 2021.
https://www.digitaltrends.com/computing/msi-gt-730-re-release-gpu-shortage/

...But sure, let's paint AMD as the only bad guy in this game.

I sure can, cause AMD ended driver support for the HD 6000 series quite early. It came out on October 22, 2010, whereas Fermi came out in April 2010, yet AMD stopped supporting the HD 6000 series in November 2015.

https://www.techspot.com/news/62913-amd-ends-driver-support-hd-5000-6000-series.html

So yes, they are still the bad guy in this game. And let's be honest, no one cares about the GT 730 when Radeon has sacked entire generations of GPUs, which cost a lot more money, with less than 5 years of driver support in the past.

You do realise that the Fabric between chiplets actually consumes additional power?

The only reason Ryzen uses less power and offers more performance than Intel isn't the chiplet design itself; it's the architecture.

If AMD made Ryzen monolithic and ditched the fabric and made it on a leading-edge manufacturing process, it would actually consume less energy.

The purpose of chiplets is simply cost... you can make more functional chips per wafer. And considering AMD's biggest advantage other than more RAM... is cost, it's good they are leveraging that, because GPU costs have spiralled out of control.

One of the big factors was unprecedented demand due to Crypto and COVID, so AMD and nVidia could price GPUs however they wanted and they would still sell. That's now changed, so price is going to be a key selling point going forward as the USA potentially heads into recession.

I am aware of the reasoning, but it really doesn't matter all that much to the buyer, is the point. When people saw Ryzen giving 8 cores and 16 threads for the same price as a quad-core CPU from Intel, that was truly an eye-opening moment. But there is nothing like that with RDNA 3. And considering the 4090 alone has more market share according to the Steam Survey (granted, should be taken with salt) than the high-end RX 6000 series, that really says it all.

https://www.pcgamer.com/seriously-where-did-you-lot-get-the-money-for-all-those-rtx-4090s/

At the end of the day, we can keep going back and forth and it doesn't really matter, cause I am not going to change your opinion and vice versa. What matters is the health of the industry. Nvidia is ruining PC gaming because of its prices, and people are willing to pay those prices because there is no alternative, regardless of what people think, as proven by JPR and everyone else. Imo the only way is for a competitor to come out with Nvidia features like DLSS and Nvidia's RT performance, but for a significantly cheaper price. I hope that Intel can do it, cause AMD feels like they don't want to go that route. They would rather be a follower, let Nvidia set the prices, and sell half-baked features for less money and more vram to those who dislike Nvidia products. Whereas Intel at least feels like maybe there could be something there. Happy to be proven wrong though.

Last edited by Jizz_Beard_thePirate - on 19 April 2023

                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:

Imo the only way is for a competitor to come out with Nvidia features like DLSS and Nvidia's RT performance, but for a significantly cheaper price. I hope that Intel can do it, cause AMD feels like they don't want to go that route. They would rather be a follower, let Nvidia set the prices, and sell half-baked features for less money and more vram to those who dislike Nvidia products. Whereas Intel at least feels like maybe there could be something there. Happy to be proven wrong though.

But no one's going to do that, though.

Like, that's the biggest issue going on with the market atm. AMD doesn't have Nvidia's capital to just magically whip out the exact same features and then lower the price, and we know Intel isn't entirely consumer friendly either. They are still playing the biggest game of catch-up, so I don't expect them to ever reach Nvidia's current stage, let alone where Nvidia will be 5 years from now, when they will undoubtedly be ahead again.

Nvidia knows this; that's why they are still playing with closed-off tech, jacking up prices and spamming their own line of cards within a singular brand, because they know at some point ppl are going to buy them. And that's another big issue: ppl lack spines and patience.

So really, when you think about it, there can be no "only way", because that way is simply not possible, logically speaking. One of Nvidia's competitors just doesn't have the capital and engineers needed and, as you said, seemingly doesn't want to go that route. Then there's Intel, who is taking an absolute eternity in tech years to catch up, and they also aren't fully consumer friendly. So when you stand back and look at the bigger picture, there isn't anyone left. No one is going to appear out of thin air to challenge Nvidia tenfold and beyond what AMD/Intel are doing (it's really not possible, unless MS decides to, but would you trust MS after Windows 11 and how they treat Xbox? I wouldn't, fuck naw, so who else is left?).

You know how we have Timmy Tencent over in the corner bitching about Valve having a monopoly? Well, that's what it actually looks like when Nvidia is the one who has it, not Valve.

Last edited by Chazore - on 19 April 2023

Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

His comment on Nvidia fudging the numbers for users enabling RT sounds like something Nvidia would do, tbh.

I've seen a number of people complain about RT perf on PC via the Steam forums, and even on the Optimized reddit I see suggestions for cranking down RT values.

