freedquaker said:
ninetailschris said:
"The CPUs, on the vast majority of the time, are not the bottleneck, and because those are gaming machines, not general purpose PCs, there is no point in putting a CPU on a console, that is faster than necessary, as long it is not a bottleneck on the games, and have sufficient performance for other tasks such as Bluray playback etc..."

I stopped here.

Do you know anything about CPUs and performance in games? How you respond will determine how I show you to be factually incorrect.

 

I'll be more courteous than your condescending attitude deserves. I do know how to code in several languages (though I'm not a professional programmer) and have worked in AI starting from Prolog (http://en.wikipedia.org/wiki/Prolog), so yes, I know how CPUs work, thank you very much.

The impact of the CPU, with the exception of heavy AI and physics, is small, especially with to-the-metal programming, a very low number of CPU-side API calls, and a parallelizable architecture. I am sure you are aware that AMD's Kaveri (very similar to the PS4's chip, but with half the cores) outperforms pretty much all high-end Intel CPUs in gaming even though those are much beefier chips, and that's without AMD's cores taking advantage of any to-the-metal programming like Mantle, which appears to increase performance by up to 54% in CPU-bound scenarios (which is what's relevant here). That is, of course, unless you have been living under a rock for the last few years, and unless you think you know better than Sony and Microsoft what kind of CPU their own machines need.
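To make the point about CPU-side call counts concrete, here's a minimal sketch, plain C++ with no real graphics API involved: fake_api_call and its cost constant are made-up stand-ins for driver overhead. It shows why batching thousands of draws into a few submissions slashes CPU frame time, which is exactly the kind of overhead Mantle-style thin APIs attack:

```cpp
#include <chrono>
#include <cstdio>

// Hypothetical stand-in for the CPU-side cost of one driver/API call.
// Real per-draw overhead varies wildly; this just burns a fixed chunk of work.
volatile long sink = 0;
void fake_api_call() {
    for (int i = 0; i < 1000; ++i) sink += i;
}

double cpu_frame_ms(int draws, int draws_per_call) {
    auto t0 = std::chrono::steady_clock::now();
    // Each API call costs the same no matter how many draws it carries,
    // so fewer, fatter calls mean less CPU time per frame.
    for (int d = 0; d < draws; d += draws_per_call) fake_api_call();
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(t1 - t0).count();
}

int main() {
    const int draws = 10000;  // a busy scene
    printf("one API call per draw : %.2f ms of CPU time\n", cpu_frame_ms(draws, 1));
    printf("100 draws per batch   : %.2f ms of CPU time\n", cpu_frame_ms(draws, 100));
}
```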

And hey, do you also remember that the original Xbox used a Celeron CPU (a cheap, customized Pentium III variant) that was ridiculed by the industry, while the Xbox still managed to produce much better graphics than not only the PS2 and GameCube but also most PCs with high-end CPUs at the time!

For reference, when the Xbox launched with that supposedly ridiculous 733 MHz Celeron, new-generation 1.4 GHz Pentium IIIs and 2 GHz Pentium 4s (Willamette) were already on the market, practically triple the performance of the Xbox's CPU. Those were days when CPU performance was increasing rapidly and a 733 MHz Celeron was barely enough to play a DVD, so you'd think that extra CPU muscle was vital. Even then, that CPU was sufficient for the Xbox to produce great games!

Oh, man, you've got a lot to learn... Not from me, but from Sony and Microsoft...

As an example, take a look at the CPU performance comparison here, in a parallelized but not low-level-optimized scenario, from which you should see that CPU performance hardly makes any difference in a game like Battlefield 4, even at a load greater than 1080p (1920x1200). Much higher resolutions might start to tax the CPU as well, but nothing between 720p and 1080p does, nope!

http://www.techspot.com/review/734-battlefield-4-benchmarks/page6.html
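The reasoning behind reading a benchmark like that: frame time is roughly the slower of the CPU's and the GPU's work per frame, and resolution loads only the GPU, so if cranking the resolution barely moves the frame rate, the CPU was the bottleneck, and vice versa. A toy model in C++, where every millisecond figure is an invented number purely for illustration:

```cpp
#include <algorithm>
#include <cstdio>

// Toy model: CPU cost per frame is resolution-independent (game logic, draw
// submission); GPU cost scales with pixel count. All numbers are made up.
double frame_ms(double cpu_ms, double gpu_ms_per_mpixel, double megapixels) {
    return std::max(cpu_ms, gpu_ms_per_mpixel * megapixels);
}

int main() {
    const double cpu_ms = 8.0;             // hypothetical CPU work per frame
    const double gpu_ms_per_mpixel = 4.0;  // hypothetical GPU shading cost
    for (double mp : {0.9, 2.1, 2.3}) {    // ~720p, ~1080p, 1920x1200
        double ms = frame_ms(cpu_ms, gpu_ms_per_mpixel, mp);
        printf("%.1f Mpix: %5.1f ms -> %5.1f fps (%s-bound)\n",
               mp, ms, 1000.0 / ms,
               cpu_ms >= gpu_ms_per_mpixel * mp ? "CPU" : "GPU");
    }
}
```

In this toy model a faster CPU only helps at the lowest resolution, where the CPU line is the one setting the frame time.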

Here's a huge tip: modern PC games are weighted towards GPU usage over CPU usage, because modern GPUs can take it.
Feel free to run the same comparison on a game that does more of its calculation on the CPU (just like consoles do), such as Minecraft.
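For contrast, a Minecraft-style game spends much of its frame budget on simulation rather than rendering: every block in every loaded chunk gets ticked on the CPU. Here's a deliberately naive sketch of that pattern; the chunk sizes and the "update rule" are invented for illustration and have nothing to do with Minecraft's actual internals:

```cpp
#include <chrono>
#include <cstdio>
#include <vector>

// Naive world tick: every block in every loaded chunk is visited on the CPU
// each tick. Dimensions and the update rule below are made-up placeholders.
struct Chunk {
    std::vector<unsigned char> blocks = std::vector<unsigned char>(16 * 16 * 256);
};

void tick(std::vector<Chunk>& world) {
    for (auto& c : world)
        for (auto& b : c.blocks)
            b = static_cast<unsigned char>(b * 31 + 7);  // stand-in for game logic
}

int main() {
    std::vector<Chunk> world(200);  // ~13 million blocks loaded
    auto t0 = std::chrono::steady_clock::now();
    tick(world);
    auto t1 = std::chrono::steady_clock::now();
    printf("one world tick: %.1f ms of pure CPU work, zero GPU involved\n",
           std::chrono::duration<double, std::milli>(t1 - t0).count());
}
```

No amount of extra GPU horsepower speeds that loop up, which is why this kind of game scales with the CPU instead.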

Having spent much of last gen working on PS3 and 360 devkits making retail games, and now working on PS4/XBO/PC, I can tell you that your talk about CPU usage on consoles is utter nonsense.