
Forums - Gaming Discussion - Debunking the Myth that Next Gen Consoles are too weak

Oh, and for the Steam survey, keep in mind that it has a limited sample size (Valve hasn't said how big), so it has a margin of error. For example, I have participated three times (you are prompted to opt in when you are included) in all my years on Steam, and they run it every month; I also know people who have never participated. It also only scans the system you are running at the time, so if someone with a monster rig happens to be scanned while using the work laptop they play Football Manager on while on the road, only the laptop gets counted in the stats. The PC market is not static, either: there is an upward trend in hardware specs over time, and as more powerful hardware becomes cheaper and games start requiring more to run, users are forced to upgrade and the average hardware power rises. People with multi-GPU setups aren't properly reflected in the stats, either.
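To put the sample-size point in rough perspective, here is a sketch of the standard margin-of-error formula for a survey proportion. The sample sizes below are purely hypothetical, since Valve doesn't publish the real figure:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p estimated from n samples."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical sample sizes -- Valve doesn't publish the real one.
p = 0.0055  # e.g. a card with a reported 0.55% share
for n in (10_000, 100_000, 1_000_000):
    print(f"n = {n:>9,}: 0.55% +/- {margin_of_error(p, n):.3%}")
```

Even a fairly large sample leaves a sub-1% share with a proportionally wide error band, which is exactly why the survey is only good for broad trends.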

Also, there are, as of last month, 75 million active Steam users. So, for example, the recently released high-end GTX 780, at 0.55% of Steam users (up 100% over the previous month), is ~412,500 people. The Radeon HD 7970, which has double the raw performance of the PS4's GPU, is at 0.9%, which is 675,000 people. The GTX 770 at 1.12% would be 840,000. Of course these numbers are pure bull, because the survey is only reliable for broad trends, but you get the picture.
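The arithmetic above is just share times user base; a quick sketch, with the shares and the 75 million figure taken straight from the post (and all the same caveats about reliability):

```python
# Convert Steam survey share percentages into rough absolute user counts.
# The shares and the 75 million active-user figure come from the post
# above; given the survey's unknown margin of error, these are ballpark
# numbers at best.
ACTIVE_USERS = 75_000_000

gpu_share = {
    "GTX 780": 0.0055,   # 0.55%, up 100% over the previous month
    "HD 7970": 0.0090,   # 0.90%
    "GTX 770": 0.0112,   # 1.12%
}

for gpu, share in gpu_share.items():
    print(f"{gpu}: ~{round(ACTIVE_USERS * share):,} users")
```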



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!


"The CPUs, the vast majority of the time, are not the bottleneck, and because these are gaming machines, not general-purpose PCs, there is no point in putting a CPU in a console that is faster than necessary, as long as it is not a bottleneck in games and has sufficient performance for other tasks such as Blu-ray playback, etc..."

I stopped here.

Do you know anything about CPUs and performance in games? How you respond will determine how I show you to be factually incorrect.



"Excuse me sir, I see you have a weapon. Why don't you put it down and let's settle this like gentlemen"  ~ max

ninetailschris said:
"The CPUs, the vast majority of the time, are not the bottleneck, and because these are gaming machines, not general-purpose PCs, there is no point in putting a CPU in a console that is faster than necessary, as long as it is not a bottleneck in games and has sufficient performance for other tasks such as Blu-ray playback, etc..."

I stopped here.

Do you know anything about CPUs and performance in games? How you respond will determine how I show you to be factually incorrect.

 

I'll be more courteous than your condescending attitude deserves. I do know how to code in several languages (though I'm not a professional programmer) and have worked in AI starting from Prolog (http://en.wikipedia.org/wiki/Prolog), so yes, I know how CPUs work, thank you very much.

The impact of the CPU, with the exception of heavy AI and physics, is small, especially with to-the-metal programming, very low-overhead CPU calls, and a parallelizable architecture. I am sure you are aware that AMD's Kaveri (very similar to the PS4's chip, but with half the cores) keeps up with or outperforms much beefier high-end Intel CPUs in gaming, and that's without console-style to-the-metal optimization like Mantle, which appears to increase performance by up to 54% in CPU-bound scenarios (which is what's relevant here). That is, of course, unless you have been living under a rock for the last few years, or think you know better than Sony and Microsoft what kind of CPU their own machines need.

And hey, do you also remember that the original Xbox used a Celeron CPU (a cheap, customized Pentium III variant) that was ridiculed by the industry, while the Xbox still managed to produce much better graphics than not only the PS2 and GameCube but also most PCs with high-end CPUs at the time?

As a reference, when the Xbox was released with its ridiculous 733 MHz Celeron processor, new-generation Pentium IIIs at 1.4 GHz as well as 2 GHz Pentium 4s (Willamette) were already out, at practically triple the clock of the Xbox's CPU. Back then CPU performance was increasing rapidly and a 733 MHz Celeron was barely enough to play a DVD, so that extra CPU muscle really mattered. Even then, the Xbox managed to produce great games with that kind of CPU!
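For what it's worth, the clock-speed gap is easy to check. These are clock ratios only, ignoring IPC differences, so they support the loose "practically triple" framing rather than a real performance measurement:

```python
# Clock ratios between the original Xbox's 733 MHz CPU and contemporary
# desktop parts mentioned above. Clock speed ignores IPC, so these are
# only rough indicators, not performance measurements.
XBOX_CPU_MHZ = 733

contemporaries = {
    "Pentium III Tualatin 1.4 GHz": 1400,
    "Pentium 4 Willamette 2 GHz": 2000,
}

for cpu, mhz in contemporaries.items():
    print(f"{cpu}: {mhz / XBOX_CPU_MHZ:.2f}x the Xbox's clock")
```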

Oh, man, you've got a lot to learn... Not from me, but from Sony and Microsoft...

As an example, take a look at the CPU performance comparison here, in a parallelized but not low-level-optimized scenario. It should show you that CPU performance hardly makes any difference in a game like Battlefield 4, even at a heavier load than 1080p (1920x1200). Much higher resolutions might start to tax the CPU as well, but nothing between 720p and 1080p will!

http://www.techspot.com/review/734-battlefield-4-benchmarks/page6.html



Playstation 5 vs XBox Series Market Share Estimates

Regional Analysis  (only MS and Sony Consoles)
Europe     => XB1 : 23-24 % vs PS4 : 76-77%
N. America => XB1 :  49-52% vs PS4 : 48-51%
Global     => XB1 :  32-34% vs PS4 : 66-68%

Sales Estimations for 8th Generation Consoles

Next Gen Consoles Impressions and Estimates

Considering the degree to which the console gaming market is expected to shrink, especially according to this forum, I doubt the current gen could have financially afforded better specs. Unfortunately, high-tier PC gaming is not mainstream enough to put its extra power to better use than higher resolutions and AA.

In that regard, developers will have to find other ways to improve gameplay.
As for consoles being powerful enough, I agree on the grounds that the market has set the upper bound.
Any console above $600 is destined for failure, and the PC model doesn't have that limitation.
Ultimately, the PC model is inherently advantageous because consumers can spend as much as they want on hardware, but the disparity between low-end and high-end models is also what makes consoles viable in the first place.
Either streaming or Moore's Law may eventually let the PC replace consoles, but I don't agree that the Steambox will.



In this day and age, with the Internet, ignorance is a choice! And they're still choosing Ignorance! - Dr. Filthy Frank

freedquaker said:
ninetailschris said:
"The CPUs, the vast majority of the time, are not the bottleneck, and because these are gaming machines, not general-purpose PCs, there is no point in putting a CPU in a console that is faster than necessary, as long as it is not a bottleneck in games and has sufficient performance for other tasks such as Blu-ray playback, etc..."

I stopped here.

Do you know anything about CPUs and performance in games? How you respond will determine how I show you to be factually incorrect.

 

I'll be more courteous than your condescending attitude deserves. I do know how to code in several languages (though I'm not a professional programmer) and have worked in AI starting from Prolog (http://en.wikipedia.org/wiki/Prolog), so yes, I know how CPUs work, thank you very much.

The impact of the CPU, with the exception of heavy AI and physics, is small, especially with to-the-metal programming, very low-overhead CPU calls, and a parallelizable architecture. I am sure you are aware that AMD's Kaveri (very similar to the PS4's chip, but with half the cores) keeps up with or outperforms much beefier high-end Intel CPUs in gaming, and that's without console-style to-the-metal optimization like Mantle, which appears to increase performance by up to 54% in CPU-bound scenarios (which is what's relevant here). That is, of course, unless you have been living under a rock for the last few years, or think you know better than Sony and Microsoft what kind of CPU their own machines need.

And hey, do you also remember that the original Xbox used a Celeron CPU (a cheap, customized Pentium III variant) that was ridiculed by the industry, while the Xbox still managed to produce much better graphics than not only the PS2 and GameCube but also most PCs with high-end CPUs at the time?

As a reference, when the Xbox was released with its ridiculous 733 MHz Celeron processor, new-generation Pentium IIIs at 1.4 GHz as well as 2 GHz Pentium 4s (Willamette) were already out, at practically triple the clock of the Xbox's CPU. Back then CPU performance was increasing rapidly and a 733 MHz Celeron was barely enough to play a DVD, so that extra CPU muscle really mattered. Even then, the Xbox managed to produce great games with that kind of CPU!

Oh, man, you've got a lot to learn... Not from me, but from Sony and Microsoft...

As an example, take a look at the CPU performance comparison here, in a parallelized but not low-level-optimized scenario. It should show you that CPU performance hardly makes any difference in a game like Battlefield 4, even at a heavier load than 1080p (1920x1200). Much higher resolutions might start to tax the CPU as well, but nothing between 720p and 1080p will!

http://www.techspot.com/review/734-battlefield-4-benchmarks/page6.html

Here's a huge tip: modern PC games are weighted towards GPU usage over CPU usage, because modern GPUs can take it.
Feel free to do a comparison with a game that does more calculation on the CPU (just like consoles do), such as Minecraft.

Having spent much of last gen working on PS3 and 360 devkits making retail games, and now on PS4/XBO/PC, I can tell you your talk about CPU usage on consoles is utter nonsense.



freedquaker said:


Here is the thing: I agree with pretty much everything you said, so my arguments will mostly try to shed light on where exactly I am coming from, and what perspective I am looking at this from... I have narrowed it down to the highlighted points above...

a) I know I was comparing the system RAM to all of the RAM on the consoles, but hey, I also did what you suggested, several times; it doesn't change anything other than shifting the numbers slightly. I was just saving myself that extra chore. The important thing is that, with or without the video RAM, the relative amount of RAM on consoles has never been this abundant. So in terms of memory, relative to the PC, this generation is incredible. Again, this is not really a comparison to the PC, but a comparison of earlier-generation PlayStations to the PCs of their day (I take the PC as a yardstick for relative performance).


Altering the numbers slightly?
If you have a PC with 4 GB of system RAM and a 1 GB video card, you are omitting an extra 25% of the system's memory. That's not insignificant; it's only half as severe as claiming the PlayStation 3 has just 256 MB of RAM.
Saving yourself time made your post factually inaccurate in a way that skews the results.

Besides, I do agree that RAM-wise the consoles have never been this close to the PC, but the rest of the system has also never been so sub-par and behind; there is a *massive* gap.
RAM won't make up for performance deficits, since it doesn't do any processing of its own. That seems to escape many people on these forums who grab a number and run with it to win an argument.

freedquaker said:


b) When I was saying that the games were not made with this in mind, I was referring to new technologies such as hUMA in the new consoles, rather than to GDDR5 RAM. It's also likely that designing a game for slow RAM and then extrapolating to fast RAM will not yield the same results as designing for fast RAM from scratch, which will affect the game design, etc. Basically, all console-specific (idiosyncratic) features will improve performance.


hUMA isn't a solution to sub-par performance caused by a lack of hardware resources.
hUMA is also merely a stepping stone toward AMD's eventual goals for its Fusion initiative.

freedquaker said:


c) I am not saying MS should have designed a PS4 clone at all. They tried to scale up the X360 design, and while the RAM increased 8-fold, the ESRAM increased only 3.2x, which came at the expense of compute units etc., crippling the machine. They were obviously very near-sighted and couldn't foresee the performance they'd get. If they had gone with a non-unified architecture, with 4 GB DDR3 + 4 GB GDDR5 or something like that (which is nothing like the PS4, but rather like the PS3), they wouldn't have had to sacrifice any performance at all. The main problem today is not really the 40-50% raw performance deficit (to some extent yes, but it's not the major bottleneck); it's the slow RAM and the too-small ESRAM that drag the system down.


 

Bundling two different memory types would have driven up costs.
Consoles are cost-sensitive devices; they only have a limited budget to work with.
Sony's gamble simply paid off: they would have ended up with only half the amount of RAM if higher-density memory modules hadn't been released in time, a gamble Microsoft was obviously not willing to make.

freedquaker said:



The impact of the CPU, with the exception of heavy AI and physics, is small, especially with to-the-metal programming, very low-overhead CPU calls, and a parallelizable architecture. I am sure you are aware that AMD's Kaveri (very similar to the PS4's chip, but with half the cores) keeps up with or outperforms much beefier high-end Intel CPUs in gaming, and that's without console-style to-the-metal optimization like Mantle, which appears to increase performance by up to 54% in CPU-bound scenarios (which is what's relevant here). That is, of course, unless you have been living under a rock for the last few years, or think you know better than Sony and Microsoft what kind of CPU their own machines need.


Kaveri doesn't out-perform all of Intel's high-end CPUs.
The high end being Socket 2011.

The black bars in the charts are Kaveri. Note how potent the three-year-old mid-range Core i5 2500K is.




Now the takeaway from this is: whenever something uses plain-Jane sequential CPU cycles, Intel still has a massive lead. (It should also be noted this only includes the mid-range quad cores, not the high-end hex cores.)
AMD can beat Intel when a workload can leverage the GPU to assist in processing, so by extension an Intel quad paired with a GCN GPU would still put Kaveri to shame.

The fact of the matter is, for decades none of the console manufacturers have taken CPU performance seriously, and why would they? The average Joe only cares about graphics; it's a massive selling point, and consoles are built to match those expectations.
The PC offers a no-compromise gaming solution in terms of fidelity; consoles can't match 1440p, 1600p, 4K, or Eyefinity, if you're willing to pay for it.
I'm afraid these days 720p and 1080p are low-end resolutions I expect out of a mobile phone, not a powerful gaming device.


freedquaker said:

And hey, do you also remember that the original Xbox used a Celeron CPU (a cheap, customized Pentium III variant) that was ridiculed by the industry, while the Xbox still managed to produce much better graphics than not only the PS2 and GameCube but also most PCs with high-end CPUs at the time?

As a reference, when the Xbox was released with its ridiculous 733 MHz Celeron processor, new-generation Pentium IIIs at 1.4 GHz as well as 2 GHz Pentium 4s (Willamette) were already out, at practically triple the clock of the Xbox's CPU. Back then CPU performance was increasing rapidly and a 733 MHz Celeron was barely enough to play a DVD, so that extra CPU muscle really mattered. Even then, the Xbox managed to produce great games with that kind of CPU!

 

The original Xbox didn't use a Celeron.
It used a Pentium III-derived processor, but with half the L2 cache.
If it had been a Celeron it would have had half the cache associativity, which would have knocked CPU performance down by a good 10% or so.

As for the clock speeds, a Pentium III Tualatin running at 1.4 GHz is faster than a Pentium 4 Willamette, provided the workload isn't bandwidth-intensive enough to give the Pentium 4's quad-pumped bus an edge.

Back when the original Xbox launched I had a Pentium III 667, and at the start of the generation I was playing most of the Xbox multi-plats, including Halo, without breaking much of a sweat; eventually I did upgrade to an Athlon Thunderbird in preparation for the games of the time.

Before that I also had a Cyrix PR300+, which was, to put it simply, one of the worst CPUs of all time, slower than a Pentium II 300. Guess what? In theory it could handle not only DVDs but also Blu-rays (provided you copied the Blu-ray to a hard drive first, as USB 1.0 probably isn't fast enough to stream it).
The reason is that, like the original Xbox, PCs these days have dedicated hardware blocks in the GPU to assist with movie playback.
For instance, my single-core 1 GHz Intel Atom tablet doesn't have a decoder in its GPU block, so I installed a Broadcom Crystal HD chip in the Mini PCI-E slot to perform that job.
1080p movies went from unplayable to perfectly playable, barely touching the CPU.

Older computers lacked such functionality, with a couple of exceptions like GPUs from Matrox, the ATI All-in-Wonder cards, and a few S3 and 3dfx cards that targeted those niches back then.
Back then GPUs looked like this, and you can see the dedicated decoder and TV-handling part of the card:



--::{PC Gaming Master Race}::--

lucidium said:
freedquaker said:
 

 

Here's a huge tip: modern PC games are weighted towards GPU usage over CPU usage, because modern GPUs can take it.
Feel free to do a comparison with a game that does more calculation on the CPU (just like consoles do), such as Minecraft.

Having spent much of last gen working on PS3 and 360 devkits making retail games, and now on PS4/XBO/PC, I can tell you your talk about CPU usage on consoles is utter nonsense.


So modern GPUs can take it? You mean most games are GPU-bound? Yes, that is what I have been claiming all along; what else did you think? And my talk about CPU usage is utter nonsense? Then the console makers' approach must have made utter sense all along, because every console I know of has always used a weak CPU. I don't know what you mean by a modern GPU, but all PlayStations, for example, had weak CPUs.

The PlayStation had a 33 MHz CPU, far weaker than even low-end PCs of its day. The PS2 had a 300 MHz CPU, again much weaker than a 733 MHz Pentium III (I am not comparing MHz here; it was more efficient per clock but still slower). The PS3's CPU was not even comparable to a Core 2 Duo. So yes, traditionally all consoles have had weak CPUs. The PS4's CPU, although quite weak in single-threaded performance, has 8 cores instead of the usual 4 for a reason: thanks to low-level calls and a parallelizable architecture, CPU performance will never be an issue. Please don't continue this unnecessary topic. We all know the CPU is not the bottleneck here; if anything, it's the GPU. You cannot prove otherwise with a single example from console history, which is full of weak CPUs that never seemed to be an issue. As a note, not only do MS and Sony have weak CPUs this generation, and in all their past consoles, Nintendo had even weaker ones (the Wii U still uses a CPU design from the late '90s, now that's an issue!), and compared to the Wii/Wii U, the processor in the PS4 is a monster.




Pemalite said:


Just a quick question: why are you quoting CPU performance instead of GAMING performance here? We all know Intel CPUs will demolish Kaveri in CPU-bound scenarios, but most games are not CPU-bound. On the next three pages are the game benchmarks, where Kaveri easily matches or beats high-end Intel CPUs that are much more expensive. You seem to have conveniently posted an entirely irrelevant set of benchmarks and skipped everything that is relevant!

http://www.anandtech.com/show/7677/amd-kaveri-review-a8-7600-a10-7850k/12

http://www.anandtech.com/show/7677/amd-kaveri-review-a8-7600-a10-7850k/13

http://www.anandtech.com/show/7677/amd-kaveri-review-a8-7600-a10-7850k/14

Please, next time you quote something about gaming consoles, quote the gaming performance and the impact of the CPU. And you know what, this is Kaveri, whose graphics are way below what the PS4 has, and without the low-level improvements and driver optimizations, etc...




freedquaker said:

So modern GPUs can take it? You mean most games are GPU-bound?

Modern PC GPUs.
PC games are mostly GPU-bound.

Consoles have to lean on both, because the CPU and GPU in the Xbox One and PS4 are low-to-mid-range PC components.

But then, you have been bouncing back and forth this entire thread trying to make a point that will just never work.

freedquaker said:

Just a quick question: why are you quoting CPU performance instead of GAMING performance here?

I will answer for Pemalite: because in consoles the CPU is used for number crunching, physics, AI, the network stack, the runtime image, memory management, and pool addressing, so your flip-flopping between PC and console points is useless.

To be blunt, the PS4 is as far behind modern PC tech as the PS3 was; in fact, it's FURTHER behind.



kanageddaamen said:
Stats don't matter; show me the results. You could almost see a bigger jump from early last gen to late last gen, thanks to the move to deferred rendering/lighting, which greatly increased the realism of scene lighting. I have seen nothing, graphics-wise, from any of the three consoles that comes close to ANY previous generational leap.

The graphical improvements are marginal at this point.


And I remember clearly when the X360 was being called the Xbox 1.5 in its first year because it supposedly wasn't a true generational jump, while now everybody agrees it was a massive improvement. Let's wait two years before we start saying this gen didn't improve anything; this happens every gen.