
Forums - PC Discussion - Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

Zippy6 said:
Captain_Yuri said:

Ultra vs. High Settings in PC Games

https://www.techspot.com/article/2338-ultra-vs-high-settings/

The gist of it is that most people shouldn't be using Ultra settings due to the diminishing returns for the performance hit that you take. Of course, they don't factor in Ray Tracing or DLSS, which imo is different. Ray tracing in games where it's properly implemented, combined with DLSS, will give you a pretty noticeable visual upgrade while maintaining decent performance. If you turn on Ray Tracing in AMD-sponsored games like RE or Far Cry 6, then yea, it's useless and you really need a magnifying glass to see the difference. But with something like Cyberpunk? Oof. Looks incredible! And of course, if you play something like Spider-Man or Ratchet & Clank on PS5 with RT on, it can show you what console optimization can do with RT.

I never use Ultra settings anymore. I can't tell the difference between Ultra and High in most games, and going from High to Ultra usually halves your fps: the biggest impact on performance for the least impact on visuals. What I do is run High settings and then crank the resolution as high as I can get it without dropping frames.

Yea, there's a pretty big diminishing-returns barrier going from High to Ultra, as most games don't really push visuals far beyond what consoles are capable of. I do crank everything to Ultra because of my monitor's 3440x1440 resolution and 120Hz refresh rate; a 3080 can push well above that in most games, so it's like, well, I may as well crank everything to max. But there are certainly some games where I'd rather turn it down a notch to get a better frame rate, or use DLSS if possible.
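For context on how demanding that monitor actually is, here's a quick back-of-the-envelope pixel-throughput comparison (plain arithmetic, not from the article; the function name is mine):

```python
def pixels_per_second(width: int, height: int, hz: int) -> int:
    """Raw pixel throughput a GPU must sustain at a given resolution and refresh rate."""
    return width * height * hz

uwqhd_120 = pixels_per_second(3440, 1440, 120)  # the 3440x1440 @ 120Hz monitor above
uhd_60 = pixels_per_second(3840, 2160, 60)      # 4K @ 60Hz, for comparison

print(uwqhd_120)  # 594432000
print(uhd_60)     # 497664000
# 3440x1440 @ 120Hz pushes roughly 19% more pixels per second than 4K @ 60Hz,
# which is why "crank everything to max" still needs a 3080-class card there.
```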



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:

Ultra vs. High Settings in PC Games

https://www.techspot.com/article/2338-ultra-vs-high-settings/

The gist of it is that most people shouldn't be using Ultra settings due to the diminishing returns for the performance hit that you take. Of course, they don't factor in Ray Tracing or DLSS, which imo is different. Ray tracing in games where it's properly implemented, combined with DLSS, will give you a pretty noticeable visual upgrade while maintaining decent performance. If you turn on Ray Tracing in AMD-sponsored games like RE or Far Cry 6, then yea, it's useless and you really need a magnifying glass to see the difference. But with something like Cyberpunk? Oof. Looks incredible! And of course, if you play something like Spider-Man or Ratchet & Clank on PS5 with RT on, it can show you what console optimization can do with RT.

Tbh I've been running High for a lot of AAA games these days, because there really isn't that much difference between Ultra and High. I just bought and tried out Days Gone, and if you follow Alex's "optimised" settings from DF, you only change two options, but you gain more perf and the game hardly looks any different.

I do wish games were made with Ultra settings in mind, to really show us a difference, but those days are long gone and probably won't come back for a long time (talking Crysis Low to Crysis Ultra style differences, and don't anyone @Me with the shoddy Crysis remasters that are still single-threaded either).

Also, dunno why the author thought it wise to include any Ubisoft games, when they hardly look that much different from the console versions and don't tend to perform as well on PC, so the metrics for changing from Very High to High show minimal perf gains.

Include Red Dead and other games, but I'd leave out the Ubi games until they get their shit together.

Last edited by Chazore - on 07 October 2021

Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

Talking about single-threaded games, I've seen a couple of Far Cry 6 reviews, and the game seems to scale well with multi-core CPUs. I don't know what settings the guy who had problems with it was using, but his claims don't seem to hold up.

From Guru3D's analysis (page 7):

Processor usage

Looking at threaded behavior, this game works really well with any six-core-and-upward processor. But yeah, six to eight cores for best results (that's twelve to sixteen threads).

The game definitely likes and utilizes multi-core processors and thus threads. However, we can't say that we're stressing the CPU all that much. Here we used an RTX 3080 at 2560x1440 to push framerates (DXR disabled).
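The cores-to-threads figures in that quote assume SMT, i.e. two hardware threads per physical core on modern desktop CPUs; a minimal sketch of the arithmetic (the function name is mine, for illustration):

```python
import os

def logical_threads(physical_cores: int, threads_per_core: int = 2) -> int:
    """Logical (SMT) threads exposed by a CPU; most current desktop chips run 2 threads per core."""
    return physical_cores * threads_per_core

# The quote's "six to eight cores ... is twelve to sixteen threads":
print(logical_threads(6), logical_threads(8))  # 12 16

# Note: os.cpu_count() reports logical threads, not physical cores, so on an
# SMT-enabled CPU it already shows the doubled figure.
print(os.cpu_count())
```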



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Chazore said:
Captain_Yuri said:

Ultra vs. High Settings in PC Games

https://www.techspot.com/article/2338-ultra-vs-high-settings/

The gist of it is that most people shouldn't be using Ultra settings due to the diminishing returns for the performance hit that you take. Of course, they don't factor in Ray Tracing or DLSS, which imo is different. Ray tracing in games where it's properly implemented, combined with DLSS, will give you a pretty noticeable visual upgrade while maintaining decent performance. If you turn on Ray Tracing in AMD-sponsored games like RE or Far Cry 6, then yea, it's useless and you really need a magnifying glass to see the difference. But with something like Cyberpunk? Oof. Looks incredible! And of course, if you play something like Spider-Man or Ratchet & Clank on PS5 with RT on, it can show you what console optimization can do with RT.

Tbh I've been running High for a lot of AAA games these days, because there really isn't that much difference between Ultra and High. I just bought and tried out Days Gone, and if you follow Alex's "optimised" settings from DF, you only change two options, but you gain more perf and the game hardly looks any different.

I do wish games were made with Ultra settings in mind, to really show us a difference, but those days are long gone and probably won't come back for a long time (talking Crysis Low to Crysis Ultra style differences, and don't anyone @Me with the shoddy Crysis remasters that are still single-threaded either).

Also, dunno why the author thought it wise to include any Ubisoft games, when they hardly look that much different from the console versions and don't tend to perform as well on PC, so the metrics for changing from Very High to High show minimal perf gains.

Include Red Dead and other games, but I'd leave out the Ubi games until they get their shit together.

Yea, DF does a pretty good job at giving you the best visuals/performance settings.

I think the ones where I've seen a noticeable difference are Cyberpunk and Control, where there's a pretty big gap between non-Ray-Tracing and Ray Tracing settings.

There are other games that do have noticeable differences, but they aren't as stark as Cyberpunk/Control at max settings imo. But yea, nothing will be as glorious as the Crysis days. That game was incredible for its PC-killing power.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

That face looking fucked ain't just RT off.



 "I think people should define the word crap" - Kirby007

Join the Prediction League http://www.vgchartz.com/predictions

Instead of seeking to convince others, we can be open to changing our own minds, and seek out information that contradicts our own steadfast point of view. Maybe it’ll turn out that those who disagree with you actually have a solid grasp of the facts. There’s a slight possibility that, after all, you’re the one who’s wrong.

Captain_Yuri said:

Yea, DF does a pretty good job at giving you the best visuals/performance settings.

I think the ones where I've seen a noticeable difference are Cyberpunk and Control, where there's a pretty big gap between non-Ray-Tracing and Ray Tracing settings.

There are other games that do have noticeable differences, but they aren't as stark as Cyberpunk/Control at max settings imo. But yea, nothing will be as glorious as the Crysis days. That game was incredible for its PC-killing power.

Holy shit that's horrifying to look at lol.

Yeah, 2077 and Control are just the kind of games made with RT in mind, and as a result the difference stands out a lot when it's on or off.

Also, a comment on that article stood out to me:

"The dilemma is that the more we spend on a graphics card, the more we expect to play at the latest and greatest Very High or Ultra setting. With graphics card prices through the roof, the desire to play at max settings will be even higher. (Who wants to spend $500+ and not at least give Very High or Ultra a run?)

We just need graphics card prices to go down so we can be happy with playing “only” at High settings"

I believe that's one important factor the author never thought to address, because to me it presents a visible issue that's been doing the rounds for years now. GPU prices for the beefier cards have been going up, but we're still seeing them not being utilised as often (besides cranking up the res, but afaik we don't all sport 4K monitors, let alone 8K, and I don't think 4K-8K is worth it over artists providing us with higher-fidelity assets in their games).

The beefier cards should either come down in price (fat chance) or devs should utilise them more, because I think this is only going to pile up as time goes on, to the point where the higher-end GPU market is just going to be made fun of by corps, customers and all.

Like, I'm fine with going down to High settings, I've got a 1080 Ti and play at 1440p, but if High ain't cutting it either and there still isn't that much difference visually, I'm going to want to question the devs on wtf they're playing at.

Last edited by Chazore - on 07 October 2021

Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

Chazore said:
Captain_Yuri said:

Ultra vs. High Settings in PC Games

https://www.techspot.com/article/2338-ultra-vs-high-settings/

The gist of it is that most people shouldn't be using Ultra settings due to the diminishing returns for the performance hit that you take. Of course, they don't factor in Ray Tracing or DLSS, which imo is different. Ray tracing in games where it's properly implemented, combined with DLSS, will give you a pretty noticeable visual upgrade while maintaining decent performance. If you turn on Ray Tracing in AMD-sponsored games like RE or Far Cry 6, then yea, it's useless and you really need a magnifying glass to see the difference. But with something like Cyberpunk? Oof. Looks incredible! And of course, if you play something like Spider-Man or Ratchet & Clank on PS5 with RT on, it can show you what console optimization can do with RT.

Tbh I've been running High for a lot of AAA games these days, because there really isn't that much difference between Ultra and High. I just bought and tried out Days Gone, and if you follow Alex's "optimised" settings from DF, you only change two options, but you gain more perf and the game hardly looks any different.

I do wish games were made with Ultra settings in mind, to really show us a difference, but those days are long gone and probably won't come back for a long time (talking Crysis Low to Crysis Ultra style differences, and don't anyone @Me with the shoddy Crysis remasters that are still single-threaded either).

Also, dunno why the author thought it wise to include any Ubisoft games, when they hardly look that much different from the console versions and don't tend to perform as well on PC, so the metrics for changing from Very High to High show minimal perf gains.

Include Red Dead and other games, but I'd leave out the Ubi games until they get their shit together.

Same. I've pretty much been running most modern titles on High, and it's nigh indistinguishable from Ultra in terms of IQ. I feel like you'd probably need a 4K display to see the small differences in UQ textures, but even then, unless there's an abundance of extra horsepower, it's not really worth it. And even so, I'd opt for High for a smoother and more responsive gameplay experience. Ambient occlusion is one of the more demanding settings; you can dial that back with negligible difference. Shadows too.

The great thing I love about PC is that there are a lot of options you can pick and tweak. Something you can't do on consoles.

There are some outlier games that push the boundaries and aren't constrained by the limitations of consoles. Star Citizen being one, if you can call it a game. The specs call for 16GB of RAM (32GB recommended), an SSD, and relatively beefy hardware to run well on max settings.

Last edited by hinch - on 07 October 2021

PC building sim free on Epic. Nice, my dreams can be memes now.



green_sky said:

PC building sim free on Epic. Nice, my dreams can be memes now.

Yeah, I think I'll get it, to build dream machines that I'll never have and, let's be honest, because making a game about building PCs seems like a silly joke that got out of hand.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

JEMC said:

Talking about single-threaded games, I've seen a couple of Far Cry 6 reviews, and the game seems to scale well with multi-core CPUs. I don't know what settings the guy who had problems with it was using, but his claims don't seem to hold up.

From Guru3D's analysis (page 7):

Processor usage

Looking at threaded behavior, this game works really well with any six-core-and-upward processor. But yeah, six to eight cores for best results (that's twelve to sixteen threads). The game definitely likes and utilizes multi-core processors and thus threads. However, we can't say that we're stressing the CPU all that much. Here we used an RTX 3080 at 2560x1440 to push framerates (DXR disabled).

I do have Far Cry 6 for free thanks to the AMD promo when I got my 5950X, but man, I can't bring myself to play it. Other than needing to install Uplay, it looks so damn bland. Plus it has one of the worst Ray Tracing implementations I have ever seen. It looks like a blurry mess.

Chazore said:
Captain_Yuri said:

Yea, DF does a pretty good job at giving you the best visuals/performance settings.

I think the ones where I've seen a noticeable difference are Cyberpunk and Control, where there's a pretty big gap between non-Ray-Tracing and Ray Tracing settings.

There are other games that do have noticeable differences, but they aren't as stark as Cyberpunk/Control at max settings imo. But yea, nothing will be as glorious as the Crysis days. That game was incredible for its PC-killing power.

Holy shit that's horrifying to look at lol.

Yeah, 2077 and Control are just the kind of games made with RT in mind, and as a result the difference stands out a lot when it's on or off.

Also, a comment on that article stood out to me:

"The dilemma is that the more we spend on a graphics card, the more we expect to play at the latest and greatest Very High or Ultra setting. With graphics card prices through the roof, the desire to play at max settings will be even higher. (Who wants to spend $500+ and not at least give Very High or Ultra a run?)

We just need graphics card prices to go down so we can be happy with playing “only” at High settings"

I believe that's one important factor the author never thought to address, because to me it presents a visible issue that's been doing the rounds for years now. GPU prices for the beefier cards have been going up, but we're still seeing them not being utilised as often (besides cranking up the res, but afaik we don't all sport 4K monitors, let alone 8K, and I don't think 4K-8K is worth it over artists providing us with higher-fidelity assets in their games).

The beefier cards should either come down in price (fat chance) or devs should utilise them more, because I think this is only going to pile up as time goes on, to the point where the higher-end GPU market is just going to be made fun of by corps, customers and all.

Like, I'm fine with going down to High settings, I've got a 1080 Ti and play at 1440p, but if High ain't cutting it either and there still isn't that much difference visually, I'm going to want to question the devs on wtf they're playing at.

Yea, pretty much. It's funny that when I got my Strix 3080 for MSRP, my friends were saying I overpaid for pretty little gain. One year later, it feels like I won a lottery ticket. Of course, as a person who upgrades his GPU every generation (with the exception of Turing, cause that was a shat deal going from a 1080), Lovelace (and maybe RDNA3) will really test my luck.

But it really does feel like PC gaming is heading towards a dark time if we can't keep crypto mining under control. I remember when we used to lul at pre-builts for being overpriced compared to DIY... Now pre-builts are luling at DIY for being heavily overpriced due to GPU prices. It's insane when Alienware of all companies can give you a better deal than going out and building a PC yourself.

I suppose the silver lining is that you do have some companies like Newegg selling reasonably priced pre-builts with off-the-shelf parts. It's just that in the future, instead of upgrading just the GPU, you may be better off selling your entire PC and only keeping the SSD. If it comes to it, I might just buy a prebuilt from Newegg with a 4080/4090, swap the GPUs, and resell the prebuilt with my 3080.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850