I kicked intel to the curb, I'm a Ryzen boi now.


Alby_da_Wolf said:

Not stupid or worthless at all. Knowing the CPU's raw power can be useful in many cases: if one uses the PC for tasks besides gaming, or if some devs decide to push gameplay and game-world complexity instead of graphics. It can also be useful for planning future GPU upgrades.
Taken alone that test would be pointless, but taken together with higher-resolution tests it helps in reaching useful conclusions. For example: if you play at 1080p or 1440p and don't plan a major GPU upgrade soon, you can just buy the cheapest CPU among the top 12; but if you plan to upgrade at regular, not-too-long intervals, alternating the CPU one time and the GPU the next, and you stick to high-end GPUs, it's better to consider the top 5 CPUs. Obviously benchmarks alone aren't enough; knowing the architecture matters too, since an important criterion for a CPU upgrade is how soon your favourite devs will start using more cores more efficiently than they currently do. Those benchmarks are also useful for buyers on a budget: they show how cheap you can go without noticing a major performance downgrade, how much cheaper you can go while keeping performance still more than acceptable, and how many compromises you'll have to accept if you absolutely want, or need, to go even cheaper.

720p gaming benchmarks are dumb and don't correlate with any real-world performance. How does one know they're not hitting an I/O bottleneck or a framerate-scaling issue in the game/graphics code? Eliminating one source of bottleneck, like the GPU, does not eliminate the other potential bottlenecks, so you still aren't measuring CPU performance in isolation...

As for tasks besides gaming, AMD has a clear advantage in productivity, so that's a moot point, and added game-world complexity can hit more than just the CPU. What if the newly complex game logic hits the GPU hardest?

720p benchmarks might as well be synthetic benchmarks in a sense, since that resolution has severe image-quality issues according to most gamers...
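The bottleneck argument above can be sketched as a toy frame-time model. All the numbers below are invented for illustration, not real measurements; the model simply assumes each frame waits on its slowest stage:

```python
# Toy frame-time model: the slowest stage per frame sets the framerate.
# All millisecond figures are illustrative, not measured.

def fps(cpu_ms, gpu_ms, io_ms):
    """Framerate when the frame waits on its slowest stage."""
    return 1000.0 / max(cpu_ms, gpu_ms, io_ms)

# At high resolution the GPU dominates, so two different CPUs look identical:
print(fps(cpu_ms=6.0, gpu_ms=16.0, io_ms=8.0))  # 62.5 fps
print(fps(cpu_ms=5.0, gpu_ms=16.0, io_ms=8.0))  # 62.5 fps

# At 720p the GPU term shrinks, but a hypothetical 8 ms I/O or engine
# cost can become the new limiter, so the benchmark still isn't a pure
# CPU measurement:
print(fps(cpu_ms=6.0, gpu_ms=3.0, io_ms=8.0))   # 125.0 fps, I/O-bound
```

The point of the sketch: removing the GPU term from the `max()` doesn't guarantee the CPU term is what remains.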




People are still pushing 720p, lol. Again: pointless and irrelevant.



JRPGfan said:
Random_Matt said:
How pointless; not many enthusiasts even game at 1080p. It's mostly 1440p/144Hz, which is the sweet spot for playing.

^ this.

You don't buy a monster PC to game at 720p with low settings.


That's why I linked a 1440p bench, with that ~1% difference between the 3900X and a 9900K.
It shows a real-world situation (where it matters).

The fact is, if you have an overclocked 2080 Ti, there are many CPUs able to fully feed it.
For gaming, it's currently a waste of money to buy a lot of these upper-limit CPUs.

Why waste money on a beefy CPU (for gaming) just to run into a GPU bottleneck?
(Maybe with next-gen games, and stronger GPUs than are currently on the market, this will change. But right now? Just a waste, imo.)

Because some people run programs in conjunction with their games.

And the CPU tends to outlive the GPU; you will upgrade the GPU more than once per CPU, so it makes sense to get a good CPU to start with.

fatslob-:O said:

720p gaming benchmarks are dumb and don't correlate with any real-world performance. How does one know they're not hitting an I/O bottleneck or a framerate-scaling issue in the game/graphics code? Eliminating one source of bottleneck, like the GPU, does not eliminate the other potential bottlenecks, so you still aren't measuring CPU performance in isolation...

As for tasks besides gaming, AMD has a clear advantage in productivity, so that's a moot point, and added game-world complexity can hit more than just the CPU. What if the newly complex game logic hits the GPU hardest?

720p benchmarks might as well be synthetic benchmarks in a sense, since that resolution has severe image-quality issues according to most gamers...

That's exactly right. That said, it's not entirely useless... especially if you game at 720p.

It more or less shows the CPU differences in the best possible scenario, not a real-world one... It's an extra datapoint to base a purchasing decision on.






--::{PC Gaming Master Race}::--

Pemalite said:

That's exactly right. That said, it's not entirely useless... especially if you game at 720p.

It more or less shows the CPU differences in the best possible scenario, not a real-world one... It's an extra datapoint to base a purchasing decision on.

Gaming at 720p is not at all realistic on desktop/home platforms. On portable systems like laptops it would be a realistic scenario: 720p still comes in handy when power-constrained, and the pixel density stays similar enough that the hit to image quality isn't as severe as on bigger displays...

Benchmarking at 720p (especially at low settings) is a bad idea, since it mostly reveals flaws in the game code: framerate scaling is nowhere close to linear in proportion to an increase in a CPU's single-threaded performance. (There comes a point where measuring CPU performance becomes limited by the game code itself, so it is no longer appropriate for benchmarking purposes.)

Using 720p with low settings to measure CPU performance is almost a purely academic exercise rather than a real-world one, because code quality then becomes a factor in itself, so bias can still creep in for other reasons... (It's very hard to measure CPU performance in isolation, because the code can also contribute as a source of bias.)

As I explained before, old code is bad code for benchmarking because it is unmaintained, and poorly designed code is also bad for benchmarking, since much of the game code out there isn't well suited to running at very high framerates. When performing tests we should always stick to realistic cases...

That means benchmarking 32-bit applications is a non-starter, since many of them are obviously unmaintained. And benchmarking at low resolutions/quality presets may reveal more about biases in the game code itself, as observed in the Far Cry games, than about pure CPU performance, since one vendor's CPU may be able to optimise a couple more hot loops than the other vendor's when the programmer built that into the code design...

We no longer do game benchmarks at 480p because of serious image-quality issues, and for the same reasons (to a lesser extent) we shouldn't do game benchmarks at 720p either, especially when the new systems coming out will make that resolution extinct...

IMO, when the next generation comes around, 720p tests need to be buried for good, because that resolution isn't cutting it anymore for a high-end experience, even at high framerates...
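The non-linear-scaling claim above reads like Amdahl's law applied to a single frame: if part of each frame's CPU cost (driver and engine overhead, for instance) doesn't shrink with single-threaded speed, then doubling single-threaded performance less than doubles the framerate. A hedged sketch with invented numbers:

```python
# Amdahl-style frame model: only part of the frame's CPU cost scales
# with single-threaded performance. All numbers are invented.

def frame_time_ms(st_speedup, scalable_ms=8.0, fixed_ms=4.0):
    """fixed_ms stands for engine/driver overhead that doesn't scale."""
    return fixed_ms + scalable_ms / st_speedup

base = 1000.0 / frame_time_ms(1.0)   # 1000/12 ms -> ~83 fps
fast = 1000.0 / frame_time_ms(2.0)   # 1000/8 ms  -> 125 fps
print(base, fast)  # 2x single-thread perf yields only ~1.5x the fps
```

Under these assumed numbers, a 2x single-threaded speedup buys only a 1.5x framerate gain, which is the kind of sub-linear scaling the post describes.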



Nobody is saying the 720p tests are for people that game at 720p. It's a CPU test to determine how much of the CPU itself is being used in the games. You can then use that to gauge potential CPU bottlenecks either in multi-application sessions or future CPU intensive games.

Why is this so hard for some of you to grasp?



Massimus - "Trump already has democrat support."

SpokenTruth said:
Nobody is saying the 720p tests are for people that game at 720p. It's a CPU test to determine how much of the CPU itself is being used in the games. You can then use that to gauge potential CPU bottlenecks either in multi-application sessions or future CPU intensive games.

Why is this so hard for some of you to grasp?

I think everyone grasps that.

What you're not grasping is that it doesn't matter... in the real world.
Also, no two games are exactly the same in their demands or optimisations, and you don't know how future games will run (from running old games at 720p).

Also, it's a bad way to "gauge CPU bottlenecks".

All you need to do is look at the 1440p benchmarks with a 1% difference in performance.
Once CPUs reach that point, where the performance difference between them all is so small, it means the CPU isn't holding anything back anymore.
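The convergence point described above can be illustrated with a toy max()-of-stages model (all figures invented, and "CPU A"/"CPU B" are hypothetical parts): once the GPU's frame time at 1440p exceeds every CPU's frame time, the CPU difference vanishes from the FPS number.

```python
# Toy model: two hypothetical CPUs behind one GPU. Invented numbers.

def fps(cpu_ms, gpu_ms):
    """Framerate set by the slower of the CPU and GPU per frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_1440p = 10.0   # assumed GPU ms per frame at 1440p
gpu_720p = 2.5     # assumed GPU ms per frame at 720p

for name, cpu_ms in [("CPU A", 5.0), ("CPU B", 4.0)]:
    print(name, fps(cpu_ms, gpu_1440p), fps(cpu_ms, gpu_720p))
# At 1440p both CPUs land on 100 fps; only at 720p do they
# separate, 200 fps vs 250 fps.
```

This is the same mechanism from both sides of the argument: the 720p test exposes the CPU gap, while the 1440p test shows the gap is invisible once the GPU is the limiter.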



Eh, the 8700K costs around €350 and performs better according to those benchmarks. So cheaper and better seems to fit.
As a cooler, I use the same old crappy fan I used for the 3770K. Huehuehue



Intel Core i7 8700K | 32 GB DDR 4 PC 3200 | ROG STRIX Z370-F Gaming | Nvidia RTX 2080 Ti 11GB VRAM | Asus PG27UQ gaming on 3840 x 2160 @120 Hz GSYNC HDR| HTC Vive Pro :3

Reached PC Masterrace level.

SpokenTruth said:
Nobody is saying the 720p tests are for people that game at 720p. It's a CPU test to determine how much of the CPU itself is being used in the games. You can then use that to gauge potential CPU bottlenecks either in multi-application sessions or future CPU intensive games.

Why is this so hard for some of you to grasp?

The bolded part is not true...

Having multiple applications open doesn't matter much anymore with multiple cores, when games mostly only hit a couple of threads very hard. And future CPU-intensive games will likely have better code design, so 720p performance today won't necessarily predict 720p performance in the future...



fatslob-:O said:

Gaming at 720p is not at all realistic on desktop/home platforms. If it were portable systems like laptops then it'd be a realistic scenario because 720p would still come in handy for power constrained scenarios and the PPI density would stay similar enough so that the hit to image quality isn't as severe on bigger displays ... 

Absolutely false. 720p is still important and used by a sizeable chunk of the market.
https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam

Often it's desktops using integrated graphics or the lowest-rung discrete GPUs that sit at 720p... It's not just notebooks and portable systems.



--::{PC Gaming Master Race}::--

Pemalite said:

Absolutely false. 720p is still important and used by a sizeable chunk of the market.
https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam

Often it's desktops using integrated graphics or the lowest-rung discrete GPUs that sit at 720p... It's not just notebooks and portable systems.

I think you need to recheck the Steam hardware survey stats, since only 0.4% of all displays run at 720p. The next most common low resolution is 768p, which is rarely supported by desktop monitors and is far more common among laptop displays...

Also, integrated graphics is irrelevant to the discussion at hand, which is about CPU performance, because at that point graphics performance matters more than CPU performance...

720p is especially irrelevant for high-end gaming and shouldn't be used for testing purposes at all ... 

Last edited by fatslob-:O - on 22 September 2019