JEMC said:
Talking about single threaded games, I've seen a couple of Far Cry 6 reviews and the game seems to scale well with multi-core CPUs. I don't know what settings the guy who had problems with it was using, but they don't seem to be representative. From Guru3D's analysis (page 7):

Processor usage
Looking at threaded behavior, this game works really well with any six-core and upward processor. But yeah, six to eight cores for best results (that is, twelve to sixteen threads). The game definitely likes and utilizes multi-core processors and thus threads. However, we can't say that we're stressing the CPU heaps. Here we used an RTX 3080 at 2560x1440 to push framerates (DXR disabled).
I do have Far Cry 6 for free because of the AMD promo when I got my 5950X, but man, I can't bring myself to play it. Other than needing to install Uplay, it looks so damn bland. Plus it's one of the worst ray tracing games I have ever seen. It looks like a blurry mess.


Chazore said:
Holy shit, that's horrifying to look at lol. Yeah, 2077 and Control are just those kinds of games made with RT in mind, and as a result they stand out a lot when it's on or off. Also, a comment on that article stood out to me: "The dilemma is the more we spend on a graphics card, the more we expect to play it at the latest and greatest Very High or Ultra setting. With graphics card prices through the roof, the desire to play at max settings will be even higher. (Who wants to spend $500+ and not at least give Very High or Ultra a run?)" I believe that is one important factor the author never thought to address, because to me it presents a visible issue that's been doing the rounds for years now. GPU prices for the beefier cards have been going up, but we're still seeing them not being utilised as often (besides cranking up the res, but afaik we don't all sport 4K monitors, let alone 8K, and I think 4K-8K isn't worth it over an artist providing us with higher fidelity assets in their games). The beefier cards should either come down in price (fat chance) or devs should utilise them more, because I think this is only going to pile up as time goes on, to the point where the higher end GPU market is just going to be made fun of, corps, customers and all. Like, I'm fine with going down to High settings, I've got a 1080 Ti and play at 1440p, but if High ain't cutting it either and there still isn't that much difference visually, I'm going to want to question the devs on wtf they are playing at.
Yea, pretty much. It's funny that when I got my Strix 3080 for MSRP, my friends were saying I overpaid for a 3080 as it offered pretty little gain. One year later, it feels like I won a lottery ticket. Of course, as a person who upgrades his GPU every generation (with the exception of Turing, 'cause that was a shat deal coming from a 1080), Lovelace (and maybe RDNA3) will really test my luck.
But it really does feel like PC gaming is heading towards a dark time if we can't keep crypto mining under control. I remember when we used to lul at pre-builts for being overpriced compared to DIY... Now pre-builts are luling at DIY for being heavily overpriced due to GPU prices. It's insane when Alienware, of all companies, can give you a better deal than going out and building a PC yourself.
I suppose the silver lining is that you do have some companies like Newegg selling reasonably priced pre-builts with off-the-shelf parts. It's just that in the future, instead of upgrading just the GPU... you may be better off selling your entire PC and just swapping the SSD. If it comes to it, I might just buy a pre-built from Newegg with a 4080/4090, swap the GPUs, and resell the pre-built with my 3080.

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

