
Forums - PC Discussion - I kicked Intel to the curb, I'm a Ryzen boi now.

vivster said:

But can it run Crysis?

Edit: I just checked out of curiosity because the price point of $500 seemed a bit high for AMD. My hunch was correct: looks like the 9900K is actually cheaper AND slightly better for gaming than the 3900X. Did not expect that. The 9700K is even cheaper and still better. Looks like the 3700X is basically the same as the 3900X in games but loads cheaper. I'm just gonna hope for everyone's sake that those core-heavy monsters will become useful in gaming at some point.

They will be. The first quad-core and hex-core processors have aged really well.

vivster said:
deskpro2k3 said:

The 9900K is cheaper because it doesn't come with a cooler.

Edit: a sexy cooler.

I have very specific opinions about people who subject their CPUs to stock coolers instead of a sweet Noctua D15.

I also have very specific opinions about people who subject their motherboards to such a large and heavy chunk of metal instead of getting a closed loop water cooler. :P

In saying that... I am still running a D15 in the C2Q rig; it's actually an amazing cooler, it's just big, bulky and heavy...

JRPGfan said:

Another thing is that the 9900K uses more power and produces more heat than the 3900X.

Even if Intel says its TDP is 95 watts, and AMD rates theirs at 105 watts.
(They have different ways of measuring this stuff.)

It's always been that way though, even back in the days when people were debating between a Pentium 4 and an Athlon XP.



--::{PC Gaming Master Race}::--

JRPGfan said:
vivster said:

But can it run Crysis?

Edit: I just checked out of curiosity because the price point of $500 seemed a bit high for AMD. My hunch was correct: looks like the 9900K is actually cheaper AND slightly better for gaming than the 3900X. Did not expect that. The 9700K is even cheaper and still better. Looks like the 3700X is basically the same as the 3900X in games but loads cheaper. I'm just gonna hope for everyone's sake that those core-heavy monsters will become useful in gaming at some point.

^ Note this is from before the AGESA ABBA (boost clock fix) update.

And this is with one of those factory-overclocked 2080 Tis... the CPU isn't the bottleneck for performance.

On average it's probably less than a 1% performance difference between the two for general gaming (at 1440p).

"The 9700K is even cheaper and still better." - vivster

~0.4% performance difference.

Yeah, the "smart" consumer buys a 9600K or 3600X and calls it a day.
Better to save the money for a stronger graphics card instead.

Sounds like I'm good. Though the benchmarks I looked up on computerbase.de had a few more percentage points between them. I can't have my PC subjected to 1% inferior performance when I build a brand-new rig.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

vivster said:
JRPGfan said:

^ Note this is from before the AGESA ABBA (boost clock fix) update.

And this is with one of those factory-overclocked 2080 Tis... the CPU isn't the bottleneck for performance.

On average it's probably less than a 1% performance difference between the two for general gaming (at 1440p).

"The 9700K is even cheaper and still better." - vivster

~0.4% performance difference.

Yeah, the "smart" consumer buys a 9600K or 3600X and calls it a day.
Better to save the money for a stronger graphics card instead.

Sounds like I'm good. Though the benchmarks I looked up on computerbase.de had a few more percentage points between them. I can't have my PC subjected to 1% inferior performance when I build a brand-new rig.

If you want to see bigger gaps between CPUs, you'll probably have to go to PC Games Hardware (pcgh.de), since they test in 720p and reduce some graphics details (not all, since some, like particle effects, are actually more CPU-intensive) to get as far away from a GPU limit as possible.

There, the 3900X does slightly edge out the 9700K in gaming (95.6 to 93.3 points; the 9900K is the gaming baseline at 100 points).

Though that was before the dancing-queen ABBA AGESA update, which gave some games up to 7% more performance, as Hardware Unboxed found in their tests.

Besides, since it lacks SMT, I'd rather take an 8700K than a 9700K if I had the choice. While the former has 2 fewer cores, it has 4 more threads in total and should thus have more reserves for later games.
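To see roughly how such an index works (PCGH's exact methodology is an assumption here, and every FPS figure below is invented for illustration), here's a minimal sketch that takes a geometric mean of per-game FPS relative to the 9900K as the 100-point baseline:

    # Sketch of a normalized CPU gaming index. Assumed methodology:
    # geometric mean of per-game average FPS relative to a baseline CPU.
    # All FPS figures are invented, not real benchmark data.
    from math import prod

    def gaming_index(fps_cpu, fps_baseline):
        """Geometric mean of per-game FPS ratios, scaled so the baseline scores 100."""
        ratios = [c / b for c, b in zip(fps_cpu, fps_baseline)]
        return 100 * prod(ratios) ** (1 / len(ratios))

    fps_9900k = [142, 98, 210, 76]   # baseline CPU, one entry per game
    fps_3900x = [138, 95, 200, 73]   # hypothetical 720p results

    print(gaming_index(fps_9900k, fps_9900k))  # 100.0 by construction
    print(gaming_index(fps_3900x, fps_9900k))  # ~96.35 with these numbers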



How pointless; not many enthusiasts even game at 1080p. It's mostly 1440p/144Hz, which is the sweet spot to play at.



Random_Matt said:
How pointless; not many enthusiasts even game at 1080p. It's mostly 1440p/144Hz, which is the sweet spot to play at.

^ This.

You don't buy a monster PC to game at 720p with low settings.

That's why I linked a 1440p bench, with that ~1% difference between the 3900X and a 9900K.
It shows a real-world situation (where it matters).

The fact is, if you have an overclocked 2080 Ti, there are many CPUs able to fully feed it.
For gaming, it's currently a waste of money to buy a lot of these upper-limit CPUs.

Why waste money on a beefy CPU (for gaming) just to run into a GPU bottleneck?
(Maybe with next-gen games and stronger GPUs than are currently on the market this will change, but right now? Just a waste, IMO.)
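To picture that, here's a toy model (every millisecond cost below is invented, not benchmark data): treat each frame as costing roughly the slower of the CPU's and the GPU's share of the work.

    # Toy bottleneck model: a frame takes about as long as the slower of the
    # CPU and GPU parts, so once the GPU is the limit, a faster CPU adds nothing.
    def fps(cpu_ms, gpu_ms):
        return 1000 / max(cpu_ms, gpu_ms)

    GPU_MS_1440P = 8.0  # hypothetical frame cost of an overclocked 2080 Ti at 1440p
    for name, cpu_ms in [("9900K", 5.5), ("3900X", 5.8), ("3600X", 6.5)]:
        # all three print ~125 fps: the GPU, not the CPU, sets the pace
        print(name, round(fps(cpu_ms, GPU_MS_1440P)))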



Random_Matt said:
How pointless; not many enthusiasts even game at 1080p. It's mostly 1440p/144Hz, which is the sweet spot to play at.

JRPGfan said:
Random_Matt said:
How pointless; not many enthusiasts even game at 1080p. It's mostly 1440p/144Hz, which is the sweet spot to play at.

^ This.

You don't buy a monster PC to game at 720p with low settings.

That's why I linked a 1440p bench, with that ~1% difference between the 3900X and a 9900K.
It shows a real-world situation (where it matters).

The fact is, if you have an overclocked 2080 Ti, there are many CPUs able to fully feed it.
For gaming, it's currently a waste of money to buy a lot of these upper-limit CPUs.

Why waste money on a beefy CPU (for gaming) just to run into a GPU bottleneck?
(Maybe with next-gen games and stronger GPUs than are currently on the market this will change, but right now? Just a waste, IMO.)

Looks like both of you didn't understand why they test in 720p.

They also test in 1080p, 1440p and 4K (the latter only in GPU tests), but the CPU score comes from 720p, as it eliminates the GPU limit. The point is to push the GPU limit as far away as possible to see what the reserves of the CPU are for future, more demanding games, or for doing stuff on the side, like streaming for instance.

1440p tests are good for seeing what the CPUs reach now, but they don't show where their limits are, since those tests are always GPU-bound. So no 1440p bench can tell you if the CPU has any headroom left. Granted, the 720p method isn't perfect (core utilization, or the lack of it, has an effect on the score), but it still shows how much headroom, if any, the CPU has left.
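The same kind of toy frame-cost model as a few posts up shows why (numbers invented again): shrink the GPU's share of the frame, which is what dropping to 720p does, and the CPU ceilings that 1440p was hiding become visible.

    # Same toy model, invented numbers: at 720p the GPU's frame cost shrinks,
    # so each CPU's ceiling (its headroom) becomes measurable.
    def fps(cpu_ms, gpu_ms):
        return 1000 / max(cpu_ms, gpu_ms)

    for name, cpu_ms in [("9900K", 5.5), ("3900X", 5.8), ("3600X", 6.5)]:
        at_1440p = fps(cpu_ms, gpu_ms=8.0)  # GPU-bound: every CPU reads ~125 fps
        at_720p = fps(cpu_ms, gpu_ms=2.5)   # CPU-bound: the ceilings now differ
        print(f"{name}: ~{at_720p:.0f} fps ceiling, "
              f"~{at_720p / at_1440p:.1f}x headroom over 1440p")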



Bofferbrauer2 said:
Random_Matt said:
How pointless; not many enthusiasts even game at 1080p. It's mostly 1440p/144Hz, which is the sweet spot to play at.

JRPGfan said:

^ This.

You don't buy a monster PC to game at 720p with low settings.

That's why I linked a 1440p bench, with that ~1% difference between the 3900X and a 9900K.
It shows a real-world situation (where it matters).

The fact is, if you have an overclocked 2080 Ti, there are many CPUs able to fully feed it.
For gaming, it's currently a waste of money to buy a lot of these upper-limit CPUs.

Why waste money on a beefy CPU (for gaming) just to run into a GPU bottleneck?
(Maybe with next-gen games and stronger GPUs than are currently on the market this will change, but right now? Just a waste, IMO.)

Looks like both of you didn't understand why they test in 720p.

They also test in 1080p, 1440p and 4K (the latter only in GPU tests), but the CPU score comes from 720p, as it eliminates the GPU limit. The point is to push the GPU limit as far away as possible to see what the reserves of the CPU are for future, more demanding games, or for doing stuff on the side, like streaming for instance.

1440p tests are good for seeing what the CPUs reach now, but they don't show where their limits are, since those tests are always GPU-bound. So no 1440p bench can tell you if the CPU has any headroom left. Granted, the 720p method isn't perfect (core utilization, or the lack of it, has an effect on the score), but it still shows how much headroom, if any, the CPU has left.

That's just it though... it's a nonsense test, because it doesn't reflect the real-world usage of the CPU.
It honestly shouldn't matter to anyone who's fastest at 720p (eliminating the GPU limit), because that's never going to show up in real-world usage.

"The point is to push the GPU limit as far away as possible to see what the reserves of the CPU are for future, more demanding games, or for doing stuff on the side, like streaming for instance."

I'd rather wait for the future to get here and then see how they actually perform, instead of using a metric that doesn't matter to try and predict the future.

In my view those 720p tests are just numbers without meaning, for bragging rights.
It's a stupid test if it doesn't actually reflect anything (no one buys these monster CPUs or GPUs to game at 720p).
Also, you can't be sure that what wins a 720p benchmark now will reflect performance in the future.

That makes it a worthless bench.
And a bad way to compare CPUs.

Last edited by JRPGfan - on 20 September 2019

Bofferbrauer2 said:

[...]

JRPGfan said:
Bofferbrauer2 said:

Looks like both of you didn't understand why they test in 720p.

They also test in 1080p, 1440p and 4K (the latter only in GPU tests), but the CPU score comes from 720p, as it eliminates the GPU limit. The point is to push the GPU limit as far away as possible to see what the reserves of the CPU are for future, more demanding games, or for doing stuff on the side, like streaming for instance.

1440p tests are good for seeing what the CPUs reach now, but they don't show where their limits are, since those tests are always GPU-bound. So no 1440p bench can tell you if the CPU has any headroom left. Granted, the 720p method isn't perfect (core utilization, or the lack of it, has an effect on the score), but it still shows how much headroom, if any, the CPU has left.

That's just it though... it's a nonsense test, because it doesn't reflect the real-world usage of the CPU.
It honestly shouldn't matter to anyone who's fastest at 720p (eliminating the GPU limit), because that's never going to show up in real-world usage.

"The point is to push the GPU limit as far away as possible to see what the reserves of the CPU are for future, more demanding games, or for doing stuff on the side, like streaming for instance."

I'd rather wait for the future to get here and then see how they actually perform, instead of using a metric that doesn't matter to try and predict the future.

In my view those 720p tests are just numbers without meaning, for bragging rights.
It's a stupid test if it doesn't actually reflect anything (no one buys these monster CPUs or GPUs to game at 720p).
Also, you can't be sure that what wins a 720p benchmark now will reflect performance in the future.

That makes it a worthless bench.
And a bad way to compare CPUs.


Not stupid or worthless at all. Knowing the CPU's raw power can be useful in many cases: if one uses the PC for other tasks besides gaming, or if some devs decide to push gameplay and game-world complexity instead of graphics; that's for the present, and it's useful for planning future GPU upgrades too.
Taken alone that test would be pointless, but taken together with higher-res tests it helps you come to useful conclusions. For example, if you play at 1080p or 1440p and don't plan a major GPU upgrade soon, you can just buy the cheapest CPU amongst the top 12; but if you plan to upgrade at regular, not-too-long intervals, alternating the CPU one time and the GPU the next, and you stick to high-end GPUs, it's better to consider the top 5 CPUs. Obviously benchmarks alone won't be enough; knowing the architecture is important too, as an important criterion for deciding a CPU upgrade is how soon your favourite devs will start using more cores more efficiently than they currently do. But those benchmarks are also useful for buyers on a budget, as they tell them how cheap they can go without noticing a major performance downgrade, how much cheaper they can go with a downgrade that still keeps performance more than acceptable, or how many compromises they'll have to accept if they absolutely want to, or must, go even cheaper.



Stwike him, Centuwion. Stwike him vewy wuffly! (Pontius Pilate, "Life of Brian")
A fart without stink is like a sky without stars.
TGS, Third Grade Shooter: brand new genre invented by Kevin Butler exclusively for Natal WiiToo Kinect. PEW! PEW-PEW-PEW! 
 


I'd rather sacrifice 1 to 5 fps in some games to have those 4 extra cores, because gaming is not all I do. "B-but deskpro2k3, what about muh 350fps instead of 340fps in esports titles!?"

Conclusion: you can't tell the difference if the fps counter is off.

When the new BIOS boost clock fix from AMD drops this September 30th, I wonder if opinions will change.

source: https://www.techradar.com/news/amds-fix-to-improve-ryzen-3000-cpu-boost-speeds-has-leaked-and-you-can-try-it-for-yourself

According to AMD:

"We understand that there are some users who expressed concerns about their ability to hit the maximum boost frequency of their product. The new BIOS resolves this issue by implementing the performance optimization to enhance the frequency which will add approximately 25-50MHz to the current boost frequencies under various workloads."



CPU: Ryzen 7950X
GPU: MSI 4090 SUPRIM X 24G
Motherboard: MSI MEG X670E GODLIKE
RAM: CORSAIR DOMINATOR PLATINUM 32GB DDR5
SSD: Kingston FURY Renegade 4TB
Gaming Console: PLAYSTATION 5
Alby_da_Wolf said:

Not stupid or worthless at all. Knowing the CPU's raw power can be useful in many cases: if one uses the PC for other tasks besides gaming, or if some devs decide to push gameplay and game-world complexity instead of graphics; that's for the present, and it's useful for planning future GPU upgrades too.
Taken alone that test would be pointless, but taken together with higher-res tests it helps you come to useful conclusions. For example, if you play at 1080p or 1440p and don't plan a major GPU upgrade soon, you can just buy the cheapest CPU amongst the top 12; but if you plan to upgrade at regular, not-too-long intervals, alternating the CPU one time and the GPU the next, and you stick to high-end GPUs, it's better to consider the top 5 CPUs. Obviously benchmarks alone won't be enough; knowing the architecture is important too, as an important criterion for deciding a CPU upgrade is how soon your favourite devs will start using more cores more efficiently than they currently do. But those benchmarks are also useful for buyers on a budget, as they tell them how cheap they can go without noticing a major performance downgrade, how much cheaper they can go with a downgrade that still keeps performance more than acceptable, or how many compromises they'll have to accept if they absolutely want to, or must, go even cheaper.

720p gaming benchmarks are dumb and don't have any real-world performance correlation. How does one know they're not hitting some I/O bottleneck, or game/graphics-code framerate scaling issues? Trying to eliminate one source of bottleneck like the GPU does not eliminate the other potential sources of bottlenecks when measuring CPU performance in isolation...

As for other tasks besides gaming, AMD has a clear advantage in productivity, so that point is moot, and mechanical complexity can also do more than just hit the CPU. What if the newly complex game logic hits the GPU the most?

720p benchmarks might as well be synthetic benchmarks in a sense, since that resolution has severe image-quality issues according to most gamers...