Pemalite said:
Captain_Yuri said:

Yes, most of the video is about the GPU, because the video isn't trying to prove that the CPU is the bottleneck for ray tracing — but there is a section where DF shows the CPU bottleneck. I also never said it was only CPU bound; I have no idea where you got that idea. The post you are quoting says: "Yes, ray tracing is primarily GPU bound, but DF has noticed that many aspects are in fact CPU bound, to the point where a 3600 can't hold 60 fps." Ray tracing is primarily GPU heavy, but as DF says in this video and many others, certain RT settings heavily affect CPU utilization as well, because of the BVH building among other reasons. When you enable the "RT Object Range" setting, the CPU now has to calculate the ray-traced reflections of AI characters and such on top of its normal work. Hence why it's a ray tracing setting: without it, the CPU doesn't need to do those calculations.
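As a rough illustration of why BVH building lands on the CPU, here's a minimal, hypothetical top-down median-split BVH build in Python — not any engine's actual code, just a sketch of the kind of per-frame work involved:

```python
# Toy top-down BVH build over axis-aligned bounding boxes (AABBs).
# Illustrative only -- real engines build/refit BVHs in optimized
# native code, but that work still runs on CPU cores every frame.

def build_bvh(boxes):
    """boxes: list of ((min_x, min_y, min_z), (max_x, max_y, max_z)).
    Returns a nested dict of internal nodes and leaves."""
    if len(boxes) <= 2:                              # leaf threshold
        return {"leaf": boxes}
    # Compute box centroids and pick the axis they spread along most.
    centroids = [tuple((lo[i] + hi[i]) / 2 for i in range(3)) for lo, hi in boxes]
    spans = [max(c[i] for c in centroids) - min(c[i] for c in centroids)
             for i in range(3)]
    axis = spans.index(max(spans))
    # Median split: sort by centroid on that axis, halve the list, recurse.
    order = sorted(range(len(boxes)), key=lambda j: centroids[j][axis])
    mid = len(order) // 2
    left = [boxes[j] for j in order[:mid]]
    right = [boxes[j] for j in order[mid:]]
    return {"left": build_bvh(left), "right": build_bvh(right)}
```

Widening "RT Object Range" pulls more objects into this structure, so the sorting and partitioning cost grows with object count — which is part of why that setting hits the CPU rather than the GPU.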

And yes, a higher core count will help, but as DF said, with newer CPUs you can turn the setting up to max. You could also do your own research and see how it scales with core count versus single-threaded performance:

Keep in mind that a 3900/3950X isn't the same performance as the 3600. They have double the cores, higher clocks, more cache.

So whilst a 3600 may struggle with maintaining 60fps in SOME instances, the same doesn't hold true for the higher tier parts.

And we cannot forget that Spiderman is only one implementation of Ray Tracing, there is always an exception to the norm... Having a lack of scaling between a 3600 and 3900 and even the 5800X3D just shows how GPU bound and not CPU bound that game is.

Keep in mind that going from 61 fps on the 3600XT to 73.5 fps on the Ryzen 5800X3D really isn't making your case.
Is it worth buying a new motherboard for an extra ~12 fps? Or would you be better off spending that money on a higher-tier ray-tracing-capable GPU, which will give you a larger return on your investment in framerates in pretty much all games?

Captain_Yuri said:

Sure, but in the scenario we are talking about, the 5800X3D is the obvious answer, because otherwise he may need to upgrade his RAM, among other issues, if he goes with Ryzen 3000.

You don't have to upgrade your RAM.

No one is discrediting the benefit of the increased cache, but there are games whose working sets still won't fit inside that cache pool... So there would be some "rare" circumstances where having dual-rank RAM AND 3800 MHz AND CL14-14-14-14 timings would still be a big benefit.

"Keep in mind that a 3900/3950X isn't the same performance as the 3600. They have double the cores, higher clocks, more cache.

So whilst a 3600 may struggle with maintaining 60fps in SOME instances, the same doesn't hold true for the higher tier parts.

And we cannot forget that Spiderman is only one implementation of Ray Tracing, there is always an exception to the norm... Having a lack of scaling between a 3600 and 3900 and even the 5800X3D just shows how GPU bound and not CPU bound that game is."

In games, the 3900X/3950X performs largely similarly to the 3600X/3800X. But in certain other applications, such as virtual machines, yes, more cores are beneficial. The majority of games today — and most likely for the rest of the generation — won't scale past 6-8 cores. So for a gaming build, the main requirement is at least 6 or 8 cores, followed by strong single-threaded performance. The CPU gaming benchmark I provided, along with many others you can search for yourself, shows poor scaling from higher-tier core counts in games versus stronger single-threaded performance.

Also, don't forget that while the higher-core-count 3000-series CPUs do have more cache than the 3600, there is a huge caveat. Ryzen 3000 has 4 cores per CCX, and communication between CCXs has to cross the Infinity Fabric, which carries a latency penalty. Each Zen 2 CCX has 16 MB of L3 cache, whereas the 5800X3D has 8 cores per CCX, and those 8 cores share 96 MB of L3. So with the 5800X3D there is no Infinity Fabric penalty for inter-core communication, plus 96 MB of cache available to all 8 cores.
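A back-of-the-envelope way to see the cache difference (the L3 sizes are the real chip specs from above; the working-set sizes are hypothetical examples of mine):

```python
# Toy model: does a game's hot working set fit in one CCX's L3 slice?
# L3 sizes per CCX are real specs; working-set sizes are made-up examples.

ZEN2_3600 = {"ccx_l3_mb": 16, "cores_per_ccx": 4}   # 2 CCXs of 4 cores each
X3D_5800  = {"ccx_l3_mb": 96, "cores_per_ccx": 8}   # 1 CCX, 8 cores, 3D V-Cache

def fits_in_ccx(working_set_mb, cpu):
    """True if the working set fits in the L3 visible to one CCX."""
    return working_set_mb <= cpu["ccx_l3_mb"]

for ws in (12, 40, 120):
    print(ws, "MB ->", "3600:", fits_in_ccx(ws, ZEN2_3600),
          "| 5800X3D:", fits_in_ccx(ws, X3D_5800))
```

A hypothetical 40 MB working set spills out of a 16 MB Zen 2 CCX slice (forcing Infinity Fabric or memory traffic) but sits comfortably inside the 5800X3D's 96 MB — while a 120 MB set spills on both, which is the "games that will still not fit inside that cache pool" case mentioned earlier.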

And while I agree that Spiderman is only one implementation, there are other games where DF found similar behaviour, such as Hitman's ray tracing on a 10900K. I have posted other DF videos showcasing this behaviour with ray tracing in the past, but I am not going to go through DF's videos and post them all. It's up to you to do your own research, or to go back through this thread and look at the times I have mentioned this.

"Keep in mind that going from 61fps on the 3600XT to 73.5fps on the Ryzen 5800X3D really isn't making your case.
Is it worth buying a new motherboard for an extra 12fps? Or would you be better spending that money on a better higher-tier Ray Tracing capable GPU which will give you a larger return on your investment in regards to framerates in pretty much all games?"

Percentage-wise, it is a pretty big difference, and 73.5 fps keeps that CPU above 60 fps far more of the time, whereas the 3600, averaging 61 fps, will drop below 60 far more frequently. The 5800X3D gives you much more wiggle room than a 3000-series part that is already only averaging 61 fps. But as you say, that's just "one game." There are other games where the 5800X3D's lead is pretty crazy:

https://www.techspot.com/review/2502-upgrade-ryzen-3600-to-5800x3d/

"When CPU limited, the 5800X3D offers Ryzen 5 3600 owners massive performance gains. Using the Radeon 6950 XT at 1080p we see on average 48% greater 1% lows, with average frame rates boosted by 46%. Even at 1440p, the 5800X3D provided up to 35% greater performance on average."
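For what it's worth, the "pretty big difference" claim checks out arithmetically: the Spiderman numbers quoted earlier (61 fps on the 3600XT vs 73.5 fps on the 5800X3D) work out to roughly a 20% uplift, not "an extra 12 fps" in isolation:

```python
# Relative uplift from the frame rates quoted above (3600XT vs 5800X3D).
old_fps, new_fps = 61.0, 73.5
uplift_pct = (new_fps - old_fps) / old_fps * 100
print(f"{uplift_pct:.1f}%")   # prints "20.5%"
```

That's before considering the CPU-limited TechSpot scenarios above, where the gap widens to 35-48%.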

Also, considering he can get a 4080 with the options I am suggesting, skimping the $120 on an X570 board won't net him a higher-tier GPU. Not to mention, since he has a 2700X, he most likely has an X470 board, which is PCIe Gen 3. So going to X570 for $120 has benefits beyond just the new CPU: PCIe Gen 4 for the GPU and SSD, which will help in the long run. To be fair, though, if he does have a 400-series board, he can do an in-place upgrade, since AMD has enabled Ryzen 5000 compatibility even on 300-series boards.

Last edited by Jizz_Beard_thePirate - on 20 August 2022


PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850