Pemalite said:

That's exactly right. In saying that, it's not entirely useless... Especially if you game at 720P.

It more or less shows the CPU differences in the best possible scenario, not a real world one... It's an extra datapoint to base a purchasing decision on.

Gaming at 720p is not at all realistic on desktop/home platforms. On portable systems like laptops it would be a realistic scenario, since 720p still comes in handy under power constraints and the pixel density on a smaller screen stays high enough that the hit to image quality isn't as severe as it would be on bigger displays ... 

Benchmarking at 720p (especially at low settings) is a bad idea as it is, since it mostly reveals flaws in the game code: framerate scaling is nowhere close to linear in proportion to an increase in a CPU's single-threaded performance! (There comes a point where measuring CPU performance becomes limited by the game code itself, at which point it's inappropriate for benchmarking purposes.) 
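To put rough numbers on that, here's a toy frame-time model (my own sketch, nothing from a real engine; the millisecond figures are made up purely for illustration). Every frame carries a fixed cost baked into the game code that a faster CPU can't shrink, so doubling single-threaded performance never doubles the framerate:

```python
# Hypothetical per-frame costs, chosen only to illustrate the shape of the curve.
FIXED_FRAME_COST_MS = 2.0   # engine/driver overhead the CPU can't speed up
CPU_BOUND_WORK_MS   = 8.0   # the part that actually scales with CPU speed

for speedup in (1.0, 1.5, 2.0, 4.0):
    frame_time_ms = FIXED_FRAME_COST_MS + CPU_BOUND_WORK_MS / speedup
    fps = 1000.0 / frame_time_ms
    print(f"{speedup:.1f}x single-threaded speed -> {fps:6.1f} fps")
```

With these made-up numbers a 2x faster CPU only gets you from 100 fps to ~167 fps, and a 4x faster one to 250 fps, so past a certain point the benchmark is measuring the game code's fixed overhead rather than the CPU.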

Using 720p with low settings to measure CPU performance is almost a purely academic exercise rather than a real-world one, because code quality itself becomes a factor at that point, so bias can still manifest in the background for other reasons ... (It's very hard to measure CPU performance in isolation, because the code can also contribute its own source of bias.) 

Just as I explained before, old code is bad code for benchmarking because it is unmaintained, and poorly designed code is bad code for benchmarking too, since much of the game code out there isn't well suited to running at very high framerates. When performing tests we should always stick to realistic cases instead ... 

That means benchmarking 32-bit applications is a non-starter, since many of them are obviously unmaintained. Likewise, benchmarking at low resolutions/quality presets may reveal more about biases in the game code itself, as observed in the Far Cry games, than about pure CPU performance, since one vendor's CPU may be able to run a couple more hot loops optimally than the other vendor's CPU when the programmer built that into the code design ... 
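Here's a minimal sketch of what that kind of baked-in bias can look like (entirely hypothetical; the function names, the vendor string, and the dispatch logic are mine, not any real engine's): the game gates a hand-tuned hot loop on a CPU feature or vendor check, so once a 720p/low benchmark becomes CPU-limited it measures that design choice as much as the silicon:

```python
def detect_cpu_features():
    # Stand-in for a real CPUID query; imagine it returns the flags the game checks.
    return {"vendor": "GenuineIntel", "avx2": True}

def update_entities_simd(positions, velocities, dt):
    # Imagined hand-tuned path, written around one vendor's wide SIMD units.
    return [p + v * dt for p, v in zip(positions, velocities)]

def update_entities_scalar(positions, velocities, dt):
    # Generic fallback every other CPU gets; same result, lower throughput.
    out = []
    for p, v in zip(positions, velocities):
        out.append(p + v * dt)
    return out

def pick_update_path(features):
    # Gating the fast path on the vendor string instead of the capability alone
    # is exactly the kind of bias that stays hidden at GPU-bound resolutions.
    if features["vendor"] == "GenuineIntel" and features["avx2"]:
        return update_entities_simd
    return update_entities_scalar
```

At 1440p or 4K that difference drowns in GPU time; at 720p low it shows up in the bars and gets read as a CPU gap.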

We don't do game benchmarks at 480p anymore because of serious image quality issues, and we shouldn't do game benchmarks at 720p anymore either, for the same reasons to a lesser extent, especially when the new systems coming out will make that resolution extinct ... 

IMO when the next generation comes around, 720p tests need to be buried for good, because the resolution isn't cutting it anymore for a high-end experience, even at high framerates ...