
Forums - PC - Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

haxxiy said:
Jizz_Beard_thePirate said:

AMD officially confirms fresh next-gen Zen 6 CPU details

https://overclock3d.net/news/cpu_mainboard/amd-officially-confirms-fresh-next-gen-zen-6-cpu-details/

I wonder if 12 cores per CCD will be enough, Nova Lake looks insane (on paper, at least).

While I love the core count wars between the two companies, ultimately the winner in the consumer space will be whichever is faster in gaming. GPUs can be used to offload a lot of heavy tasks these days, such as video editing and AI, so outside of truly heavily threaded use cases like file compression and virtual machines, I don't think most consumers and creatives will really benefit from a big core count advantage if the "lower core count" CPU is significantly better in gaming. This is why AMD's X3D chips, and 6-8 core parts in general, dominate a lot of sales charts even though, technically, Intel is giving you a lot more cores for the same price.

I really hope that as the next generation gets going, Intel makes some big changes to its direction: longer platform support, refocused efforts in gaming and, of course, big increases in core count. They need to hit AMD with a one-two punch instead of half-assing their efforts, even when they are behind.

The CPU in the PS6 should be nuts though. I'll probably upgrade my CPU the cycle after the PS6 releases.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850





Jizz_Beard_thePirate said:
haxxiy said:

I wonder if 12 cores per CCD will be enough, Nova Lake looks insane (on paper, at least).

While I love the core count wars between the two companies, ultimately the winner in the consumer space will be whichever is faster in gaming. GPUs can be used to offload a lot of heavy tasks these days, such as video editing and AI, so outside of truly heavily threaded use cases like file compression and virtual machines, I don't think most consumers and creatives will really benefit from a big core count advantage if the "lower core count" CPU is significantly better in gaming. This is why AMD's X3D chips, and 6-8 core parts in general, dominate a lot of sales charts even though, technically, Intel is giving you a lot more cores for the same price.

I really hope that as the next generation gets going, Intel makes some big changes to its direction: longer platform support, refocused efforts in gaming and, of course, big increases in core count. They need to hit AMD with a one-two punch instead of half-assing their efforts, even when they are behind.

The CPU in the PS6 should be nuts though. I'll probably upgrade my CPU the cycle after the PS6 releases.

Counter to that... higher core count CPUs tend to last longer.

Case in point... back in the Core 2 Duo days, I opted for a Core 2 Quad even though no games used 4 CPU cores yet, so most games actually ran better on the Core 2 Duo. But over successive years, as games became a little more threaded, that Core 2 Quad lasted a lot longer... I actually still have my Core 2 Quad today, which I use mostly for modding/testing, and it's able to play games like Fortnite competently at 60fps.

I still have my Phenom II X6 PC as well, which is able to play many modern games just fine, provided they don't require the newer SIMD instructions introduced with Bulldozer or later... And the Phenom IIs had IPC roughly around Core 2 levels... unless you overclocked the northbridge, in which case you could approach Sandy Bridge levels clock for clock.




www.youtube.com/@Pemalite

Pemalite said:
Jizz_Beard_thePirate said:

While I love the core count wars between the two companies, ultimately the winner in the consumer space will be whichever is faster in gaming. GPUs can be used to offload a lot of heavy tasks these days, such as video editing and AI, so outside of truly heavily threaded use cases like file compression and virtual machines, I don't think most consumers and creatives will really benefit from a big core count advantage if the "lower core count" CPU is significantly better in gaming. This is why AMD's X3D chips, and 6-8 core parts in general, dominate a lot of sales charts even though, technically, Intel is giving you a lot more cores for the same price.

I really hope that as the next generation gets going, Intel makes some big changes to its direction: longer platform support, refocused efforts in gaming and, of course, big increases in core count. They need to hit AMD with a one-two punch instead of half-assing their efforts, even when they are behind.

The CPU in the PS6 should be nuts though. I'll probably upgrade my CPU the cycle after the PS6 releases.

Counter to that... higher core count CPUs tend to last longer.

Case in point... back in the Core 2 Duo days, I opted for a Core 2 Quad even though no games used 4 CPU cores yet, so most games actually ran better on the Core 2 Duo. But over successive years, as games became a little more threaded, that Core 2 Quad lasted a lot longer... I actually still have my Core 2 Quad today, which I use mostly for modding/testing, and it's able to play games like Fortnite competently at 60fps.

I still have my Phenom II X6 PC as well, which is able to play many modern games just fine, provided they don't require the newer SIMD instructions introduced with Bulldozer or later... And the Phenom IIs had IPC roughly around Core 2 levels... unless you overclocked the northbridge, in which case you could approach Sandy Bridge levels clock for clock.

Personally, it depends on whether you're actively using those additional cores or if they are just sitting there waiting to be used many years later. For a gaming build, a person could get a much better experience by investing more into the GPU than into a CPU with more cores. For example, if someone is deciding between spending an extra $200 to go from 8 to 12 cores or to go from a 9060 XT to a 9070, they would get much more use out of the 9070 than out of those extra cores. It would also be an easier sell to a wider audience, because gamers generally value a higher tier GPU over a higher tier CPU.

On top of that, when it's time to upgrade, you may be getting the higher core count CPU for a mainstream price anyway. A 9700X with 8 cores costs about the same as an i7 7700K with 4 cores used to back in the day. And thanks to AMD's platform longevity, you might not need to upgrade the RAM or motherboard.

But if you are actively using those extra cores, then yea, by all means buy the higher core count setup. My 5950X was such a wonderful CPU with 16 cores; it was awesome for VMs and shader compilation. But after my new job, which is more cloud based, I didn't need 16 cores for my next upgrade, so instead I decided to save $600 CAD and put it towards a 4090 instead of a 4080. And since Zen 6 is coming with 12 cores per CCD, we might soon get 12 core mainstream CPUs.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Jizz_Beard_thePirate said:
Pemalite said:

Counter to that... higher core count CPUs tend to last longer.

Case in point... back in the Core 2 Duo days, I opted for a Core 2 Quad even though no games used 4 CPU cores yet, so most games actually ran better on the Core 2 Duo. But over successive years, as games became a little more threaded, that Core 2 Quad lasted a lot longer... I actually still have my Core 2 Quad today, which I use mostly for modding/testing, and it's able to play games like Fortnite competently at 60fps.

I still have my Phenom II X6 PC as well, which is able to play many modern games just fine, provided they don't require the newer SIMD instructions introduced with Bulldozer or later... And the Phenom IIs had IPC roughly around Core 2 levels... unless you overclocked the northbridge, in which case you could approach Sandy Bridge levels clock for clock.

Personally, it depends on whether you're actively using those additional cores or if they are just sitting there waiting to be used many years later. For a gaming build, a person could get a much better experience by investing more into the GPU than into a CPU with more cores. For example, if someone is deciding between spending an extra $200 to go from 8 to 12 cores or to go from a 9060 XT to a 9070, they would get much more use out of the 9070 than out of those extra cores. It would also be an easier sell to a wider audience, because gamers generally value a higher tier GPU over a higher tier CPU.

On top of that, when it's time to upgrade, you may be getting the higher core count CPU for a mainstream price anyway. A 9700X with 8 cores costs about the same as an i7 7700K with 4 cores used to back in the day. And thanks to AMD's platform longevity, you might not need to upgrade the RAM or motherboard.

But if you are actively using those extra cores, then yea, by all means buy the higher core count setup. My 5950X was such a wonderful CPU with 16 cores; it was awesome for VMs and shader compilation. But after my new job, which is more cloud based, I didn't need 16 cores for my next upgrade, so instead I decided to save $600 CAD and put it towards a 4090 instead of a 4080. And since Zen 6 is coming with 12 cores per CCD, we might soon get 12 core mainstream CPUs.

They aren't just "sitting there waiting to be used many years later" - you tend to have less of a performance penalty if you have a multitude of apps open.
Think: virus scanner, XSplit, Discord, web browser, Steam, GOG, the Epic Store and more can all take resources.

Some games will already use every CPU thread you can give them... Think: Civilization, Cities: Skylines 2, Ashes of the Singularity and more.
Even Cyberpunk sees scaling from 12 cores/24 threads to 16 cores/32 threads.
https://www.tomshardware.com/news/cyberpunk-2077-cpu-scaling-benchmarks
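
If anyone wants to sanity-check how much of that is happening on their own machine, here's a rough sketch (using the third-party psutil package, which you'd have to install yourself) that logs the per-core spread while a game and the usual background apps are running:

```python
# Minimal per-core CPU monitor: run it alongside a game plus the usual
# background apps (Discord, Steam, browser, etc.) and watch how many
# cores are actually being kept busy.
import psutil  # third-party: pip install psutil

for _ in range(10):  # sample for ~10 seconds
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    average = sum(per_core) / len(per_core)
    busiest = max(per_core)
    loaded = sum(c > 80 for c in per_core)
    print(f"avg {average:5.1f}%  busiest core {busiest:5.1f}%  cores above 80%: {loaded}")
```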


And whilst you are correct that we may see 12 cores as the "mainstream part", it's going to be unaffordable for most, as the price of DDR5 RAM and NAND is making any future platform change unobtainable for most anyway... at least for the next few years.
Those who are still on AM4 with a 12 or 16 core processor have many, many years of life left in the tank - especially as the X3D parts have skyrocketed in price on AM4, making a 12 or 16 core part, which is still cheap, the better option for most.

And as gamers we tend to upgrade GPUs more often than CPUs anyway, so being a little more conservative on the GPU, with the intent to upgrade it in 2-3 years' time, isn't the worst decision in the world. (An upgrade you would do anyway.)
The 9060 XT isn't exactly a part that is unplayable in modern games versus the 9070.

I have always gone for the highest core counts and have always gotten stupidly long life out of my systems.




www.youtube.com/@Pemalite

Pemalite said:
Jizz_Beard_thePirate said:

Personally, it depends on whether you're actively using those additional cores or if they are just sitting there waiting to be used many years later. For a gaming build, a person could get a much better experience by investing more into the GPU than into a CPU with more cores. For example, if someone is deciding between spending an extra $200 to go from 8 to 12 cores or to go from a 9060 XT to a 9070, they would get much more use out of the 9070 than out of those extra cores. It would also be an easier sell to a wider audience, because gamers generally value a higher tier GPU over a higher tier CPU.

On top of that, when it's time to upgrade, you may be getting the higher core count CPU for a mainstream price anyway. A 9700X with 8 cores costs about the same as an i7 7700K with 4 cores used to back in the day. And thanks to AMD's platform longevity, you might not need to upgrade the RAM or motherboard.

But if you are actively using those extra cores, then yea, by all means buy the higher core count setup. My 5950X was such a wonderful CPU with 16 cores; it was awesome for VMs and shader compilation. But after my new job, which is more cloud based, I didn't need 16 cores for my next upgrade, so instead I decided to save $600 CAD and put it towards a 4090 instead of a 4080. And since Zen 6 is coming with 12 cores per CCD, we might soon get 12 core mainstream CPUs.

They aren't just "sitting there waiting to be used many years later" - you tend to have less of a performance penalty if you have a multitude of apps open.
Think: virus scanner, XSplit, Discord, web browser, Steam, GOG, the Epic Store and more can all take resources.

Some games will already use every CPU thread you can give them... Think: Civilization, Cities: Skylines 2, Ashes of the Singularity and more.
Even Cyberpunk sees scaling from 12 cores/24 threads to 16 cores/32 threads.
https://www.tomshardware.com/news/cyberpunk-2077-cpu-scaling-benchmarks

And whilst you are correct that we may see 12 cores as the "mainstream part", it's going to be unaffordable for most, as the price of DDR5 RAM and NAND is making any future platform change unobtainable for most anyway... at least for the next few years.
Those who are still on AM4 with a 12 or 16 core processor have many, many years of life left in the tank - especially as the X3D parts have skyrocketed in price on AM4, making a 12 or 16 core part, which is still cheap, the better option for most.

And as gamers we tend to upgrade GPUs more often than CPUs anyway, so being a little more conservative on the GPU, with the intent to upgrade it in 2-3 years' time, isn't the worst decision in the world. (An upgrade you would do anyway.)
The 9060 XT isn't exactly a part that is unplayable in modern games versus the 9070.

I have always gone for the highest core counts and have always gotten stupidly long life out of my systems.

Those things take so few resources with modern hardware that it's not even worth looking into the difference. I haven't seen my 7800X3D anywhere close to 100% CPU usage during games, even paired with a 4090, and even though I run two 4K monitors with plenty of apps, browser tabs, etc. open at all times. I am sure there is a difference, but it's negligible unless you are running CPU-heavy games like the ones you listed. But if those are your main games, then yea, get all the CPU cores you need, same as if you run CPU-heavy applications.

But generally, for gaming-related workloads, the ones that take a lot of resources, such as streaming, will yield better results if you offload them to the GPU anyway, which is one of the reasons so many streamers use Nvidia specifically: its H.264 encoder is that good. And the thing with a game like Cyberpunk is that you will be heavily GPU bound well before the benefits of CPU scaling come into play. In both cases you will get more out of investing in a stronger GPU than a stronger CPU.

The funny thing about what's affordable and what's not is that GPUs, ever since the RTX 3000/RX 6000 generation, have been the one key thing that has always seen awful prices, whether because of crypto, AI or other nonsense. CPUs have been pretty dang cheap and still are, with constant sales. RAM and SSDs have been super cheap for years and have only really gotten expensive now. And funnily enough, RAM prices will also affect GPU prices, since GPUs have VRAM. If anything, the smartest thing to do these days would be to get a GPU with as much VRAM and as many features as possible and hold onto it for as long as possible, since the competitive landscape for CPUs should keep them cheap for the foreseeable future. And when it comes time to sell that GPU, you will get more of a return, because GPU prices seem to get wrecked every generation by some nonsense.

Last edited by Jizz_Beard_thePirate - on 20 December 2025

                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Jizz_Beard_thePirate said:
Pemalite said:

They aren't just "sitting there waiting to be used many years later" - you tend to have less of a performance penalty if you have a multitude of apps open.
Think: virus scanner, XSplit, Discord, web browser, Steam, GOG, the Epic Store and more can all take resources.

Some games will already use every CPU thread you can give them... Think: Civilization, Cities: Skylines 2, Ashes of the Singularity and more.
Even Cyberpunk sees scaling from 12 cores/24 threads to 16 cores/32 threads.
https://www.tomshardware.com/news/cyberpunk-2077-cpu-scaling-benchmarks

And whilst you are correct that we may see 12 cores as the "mainstream part", it's going to be unaffordable for most, as the price of DDR5 RAM and NAND is making any future platform change unobtainable for most anyway... at least for the next few years.
Those who are still on AM4 with a 12 or 16 core processor have many, many years of life left in the tank - especially as the X3D parts have skyrocketed in price on AM4, making a 12 or 16 core part, which is still cheap, the better option for most.

And as gamers we tend to upgrade GPUs more often than CPUs anyway, so being a little more conservative on the GPU, with the intent to upgrade it in 2-3 years' time, isn't the worst decision in the world. (An upgrade you would do anyway.)
The 9060 XT isn't exactly a part that is unplayable in modern games versus the 9070.

I have always gone for the highest core counts and have always gotten stupidly long life out of my systems.

Those things take so few resources with modern hardware that it's not even worth looking into the difference. I haven't seen my 7800X3D anywhere close to 100% CPU usage during games, even paired with a 4090, and even though I run two 4K monitors with plenty of apps, browser tabs, etc. open at all times. I am sure there is a difference, but it's negligible unless you are running CPU-heavy games like the ones you listed. But if those are your main games, then yea, get all the CPU cores you need, same as if you run CPU-heavy applications.

But generally, for gaming-related workloads, the ones that take a lot of resources, such as streaming, will yield better results if you offload them to the GPU anyway, which is one of the reasons so many streamers use Nvidia specifically: its H.264 encoder is that good. And the thing with a game like Cyberpunk is that you will be heavily GPU bound well before the benefits of CPU scaling come into play. In both cases you will get more out of investing in a stronger GPU than a stronger CPU.

The funny thing about what's affordable and what's not is that GPUs, ever since the RTX 3000/RX 6000 generation, have been the one key thing that has always seen awful prices, whether because of crypto, AI or other nonsense. CPUs have been pretty dang cheap and still are, with constant sales. RAM and SSDs have been super cheap for years and have only really gotten expensive now. And funnily enough, RAM prices will also affect GPU prices, since GPUs have VRAM. If anything, the smartest thing to do these days would be to get a GPU with as much VRAM and as many features as possible and hold onto it for as long as possible, since the competitive landscape for CPUs should keep them cheap for the foreseeable future. And when it comes time to sell that GPU, you will get more of a return, because GPU prices seem to get wrecked every generation by some nonsense.

With ray tracing, the CPU can make a dramatic difference, especially if the game is using path tracing or beyond... particularly since building and updating the bounding volume hierarchy (BVH) requires a heavy level of threading.
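
To make the BVH point a bit more concrete, here's a minimal, hypothetical sketch (plain Python, not any engine's actual code) of why that kind of work scales with core count: the per-primitive bounds pass is embarrassingly parallel, so it can be fanned out across however many cores are available before a cheap merge at the top.

```python
# Sketch only: compute leaf bounding boxes for a fake triangle soup in
# parallel, then merge them into a root AABB. Real BVH builders do far
# more (sorting, splitting, refitting), but the scaling story is the same.
from concurrent.futures import ProcessPoolExecutor
import random

def chunk_bounds(tris):
    """Axis-aligned bounding box over a chunk of triangles."""
    xs = [v[0] for t in tris for v in t]
    ys = [v[1] for t in tris for v in t]
    zs = [v[2] for t in tris for v in t]
    return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))

def merge(a, b):
    """Union of two AABBs."""
    return (tuple(min(x, y) for x, y in zip(a[0], b[0])),
            tuple(max(x, y) for x, y in zip(a[1], b[1])))

if __name__ == "__main__":
    # Fake scene: 200,000 random triangles (3 vertices, 3 floats each).
    tris = [[(random.random(), random.random(), random.random())
             for _ in range(3)] for _ in range(200_000)]
    workers = 8  # more physical cores -> more chunks processed at once
    chunks = [tris[i::workers] for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        boxes = list(pool.map(chunk_bounds, chunks))
    root = boxes[0]
    for box in boxes[1:]:
        root = merge(root, box)
    print("root AABB:", root)
```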

Either way, people choose the CPUs they choose based on a variety of factors, but there are definite benefits to having a more heavily threaded CPU with modern games as they are today, even if we ignore the future benefits.

And yes, streaming can be done on the GPU, but that isn't always the best option, as some games may use the encoder/decoder for their built-in cinematics, and the CPU (depending on settings) can often provide higher quality output for Twitch streams.




www.youtube.com/@Pemalite

Pemalite said:
Jizz_Beard_thePirate said:

Those things take so few resources with modern hardware that it's not even worth looking into the difference. I haven't seen my 7800X3D anywhere close to 100% CPU usage during games, even paired with a 4090, and even though I run two 4K monitors with plenty of apps, browser tabs, etc. open at all times. I am sure there is a difference, but it's negligible unless you are running CPU-heavy games like the ones you listed. But if those are your main games, then yea, get all the CPU cores you need, same as if you run CPU-heavy applications.

But generally, for gaming-related workloads, the ones that take a lot of resources, such as streaming, will yield better results if you offload them to the GPU anyway, which is one of the reasons so many streamers use Nvidia specifically: its H.264 encoder is that good. And the thing with a game like Cyberpunk is that you will be heavily GPU bound well before the benefits of CPU scaling come into play. In both cases you will get more out of investing in a stronger GPU than a stronger CPU.

The funny thing about what's affordable and what's not is that GPUs, ever since the RTX 3000/RX 6000 generation, have been the one key thing that has always seen awful prices, whether because of crypto, AI or other nonsense. CPUs have been pretty dang cheap and still are, with constant sales. RAM and SSDs have been super cheap for years and have only really gotten expensive now. And funnily enough, RAM prices will also affect GPU prices, since GPUs have VRAM. If anything, the smartest thing to do these days would be to get a GPU with as much VRAM and as many features as possible and hold onto it for as long as possible, since the competitive landscape for CPUs should keep them cheap for the foreseeable future. And when it comes time to sell that GPU, you will get more of a return, because GPU prices seem to get wrecked every generation by some nonsense.

With ray tracing, the CPU can make a dramatic difference, especially if the game is using path tracing or beyond... particularly since building and updating the bounding volume hierarchy (BVH) requires a heavy level of threading.

Either way, people choose the CPUs they choose based on a variety of factors, but there are definite benefits to having a more heavily threaded CPU with modern games as they are today, even if we ignore the future benefits.

And yes, streaming can be done on the GPU, but that isn't always the best option, as some games may use the encoder/decoder for their built-in cinematics, and the CPU (depending on settings) can often provide higher quality output for Twitch streams.

Yes, ray tracing and path tracing can certainly impact CPU performance, but there are a few important things to point out...

First, in the link that you yourself provided, you can see that the difference between the 5800X "modified" and the 5950X is only 1.5 FPS when running Cyberpunk with ray tracing. The only reason the 5800X is listed as modified is that, at the time, there was a known bug between Ryzen and Cyberpunk where the game didn't use all of the threads on 6 and 8 core CPUs. So Tom's Hardware, according to the article, applied a fix so it would show the true scaling, which really isn't much.

Second, if you look at more modern tests with modern CPUs, such as the 7700X vs the 7950X in games like Alan Wake 2, you can see that 8 vs 16 cores really isn't making much of a difference. In fact, thanks to its V-Cache, the 5800X3D is able to gain more frames than a 16 core CPU like the 5950X. I linked the 720p results to get rid of any potential GPU bottlenecks, but they also have 1080p tests:

https://www.techpowerup.com/review/amd-ryzen-7-9800x3d/17.html

And third, let's say that instead of getting a 5950X, which had an MSRP of $800, they got a 5800X, which had an MSRP of $450, and put that $350 towards the GPU on top of an existing $300 GPU budget, since that's basically the barrier to entry. So instead of getting stuck with a 3060 or 6600, they could have gotten either a 3080 ($700 MSRP) or a 6800 XT ($650 MSRP).

So basically, a person could have gotten roughly a 50% increase in gaming performance by putting that money towards the GPU instead of doubling the cores, and one hell of a return if they sold it at the right time. Granted, crypto shat on everyone 3 months after those GPUs launched, but you know, in theory and all that. You can apply the same logic to modern CPUs too: the 9950X MSRP is $650 and the 9700X MSRP is $350, while the 9070 XT ($600) is $250 more expensive than a 9060 XT 16GB ($350) and gives around a 50% increase in gaming performance.
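
To make that budget math concrete, here's a quick sketch using the MSRPs quoted above (the ~50% gaming uplift is this post's own estimate, not a measured benchmark):

```python
# Two ways to split roughly the same budget, using the MSRPs from the post.
builds = {
    "spend on cores": {"cpu": ("Ryzen 9 9950X", 650), "gpu": ("RX 9060 XT 16GB", 350)},
    "spend on GPU":   {"cpu": ("Ryzen 7 9700X", 350), "gpu": ("RX 9070 XT", 600)},
}

for name, parts in builds.items():
    cpu_name, cpu_price = parts["cpu"]
    gpu_name, gpu_price = parts["gpu"]
    print(f"{name}: {cpu_name} (${cpu_price}) + {gpu_name} (${gpu_price})"
          f" = ${cpu_price + gpu_price}")
# Both builds land within ~$50 of each other, but the second one puts the
# extra $250 where most games actually bottleneck: the GPU.
```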

Keep in mind, I am not saying it's wrong to buy a more expensive CPU over a GPU. Everyone's use case is different, as you said. But for general gaming, I'd always recommend putting more money towards the GPU over the CPU.

Last edited by Jizz_Beard_thePirate - on 21 December 2025

                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Jizz_Beard_thePirate said:
Pemalite said:

With ray tracing, the CPU can make a dramatic difference, especially if the game is using path tracing or beyond... particularly since building and updating the bounding volume hierarchy (BVH) requires a heavy level of threading.

Either way, people choose the CPUs they choose based on a variety of factors, but there are definite benefits to having a more heavily threaded CPU with modern games as they are today, even if we ignore the future benefits.

And yes, streaming can be done on the GPU, but that isn't always the best option, as some games may use the encoder/decoder for their built-in cinematics, and the CPU (depending on settings) can often provide higher quality output for Twitch streams.

Yes, ray tracing and path tracing can certainly impact CPU performance, but there are a few important things to point out...

First, in the link that you yourself provided, you can see that the difference between the 5800X "modified" and the 5950X is only 1.5 FPS when running Cyberpunk with ray tracing. The only reason the 5800X is listed as modified is that, at the time, there was a known bug between Ryzen and Cyberpunk where the game didn't use all of the threads on 6 and 8 core CPUs. So Tom's Hardware, according to the article, applied a fix so it would show the true scaling, which really isn't much.

Second, if you look at more modern tests with modern CPUs, such as the 7700X vs the 7950X in games like Alan Wake 2, you can see that 8 vs 16 cores really isn't making much of a difference. In fact, thanks to its V-Cache, the 5800X3D is able to gain more frames than a 16 core CPU like the 5950X. I linked the 720p results to get rid of any potential GPU bottlenecks, but they also have 1080p tests:

https://www.techpowerup.com/review/amd-ryzen-7-9800x3d/17.html

And third, let's say that instead of getting a 5950X, which had an MSRP of $800, they got a 5800X, which had an MSRP of $450, and put that $350 towards the GPU on top of an existing $300 GPU budget, since that's basically the barrier to entry. So instead of getting stuck with a 3060 or 6600, they could have gotten either a 3080 ($700 MSRP) or a 6800 XT ($650 MSRP).

So basically, a person could have gotten roughly a 50% increase in gaming performance by putting that money towards the GPU instead of doubling the cores, and one hell of a return if they sold it at the right time. Granted, crypto shat on everyone 3 months after those GPUs launched, but you know, in theory and all that. You can apply the same logic to modern CPUs too: the 9950X MSRP is $650 and the 9700X MSRP is $350, while the 9070 XT ($600) is $250 more expensive than a 9060 XT 16GB ($350) and gives around a 50% increase in gaming performance.

Keep in mind, I am not saying it's wrong to buy a more expensive CPU over a GPU. Everyone's use case is different, as you said. But for general gaming, I'd always recommend putting more money towards the GPU over the CPU.

Just to add to this, most publishers build their engines around the weakest non-portable console of the time, which would currently be the Xbox Series S. As long as the Series S is still taken into account, the vast majority of titles won't make use of more than 6 cores/12 threads.
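
As a purely hypothetical illustration of that tuning, a job system sized to a fixed console baseline rather than to the host PC would look something like this (the 12-thread figure is the one quoted above, not anything from a real engine):

```python
# Hypothetical engine-style job pool capped at a console baseline instead of
# the host's hardware thread count.
import os
from concurrent.futures import ThreadPoolExecutor

CONSOLE_BASELINE_THREADS = 12            # ~6 cores / 12 threads, per the post above
host_threads = os.cpu_count() or 1       # e.g. 32 on a 16-core/32-thread desktop
workers = min(host_threads, CONSOLE_BASELINE_THREADS)

print(f"host reports {host_threads} hardware threads; job pool uses {workers}")
with ThreadPoolExecutor(max_workers=workers) as pool:
    # Any threads beyond the baseline mostly sit idle in titles tuned this way.
    results = list(pool.map(lambda n: n * n, range(workers)))
print(results)
```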



Ah! Internet for leisure, How I missed you!

Looks like you've been busy and I have some catching up to do.



Please excuse my bad English.

Former gaming PC: i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Current gaming PC: R5-7600, 32GB RAM 6000MT/s (CL30) and a RX 9060XT 16GB

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.