
Forums - Gaming - Debunking the Myth that Next Gen Consoles are too weak

freedquaker said:

You know what, I have no objection to any of that. What you are missing is that

a) Those games are not taking advantage of close-to-the-metal programming, and are hampered by the high-level access of DirectX and OpenGL.

b) Consoles can utilize their CPUs much more efficiently, with much faster CPU calls etc. This doesn't mean they'll magically have more CPU muscle, but it means the CPU is less of a bottleneck and is needed far less.

c) Consoles are designed for parallelism this time around, and it will be taken advantage of, so single-threaded performance is no longer the issue. There is a reason eight cores are in there.

d) Weak CPUs have always been the case in modern consoles, and their makers would have to be real idiots to have put them there otherwise. Again, this doesn't mean they wouldn't occasionally benefit from faster CPUs, but obviously the added performance is not worth it and is better spent elsewhere.

e) For years I have hardly ever heard developers complaining about a lack of CPU performance (with the exception of the Wii). The main complaint has always been the amount of memory, which is now handled handsomely.

 

It's time to surface and face the realities of actual life, rather than diving into unrealistic technicalities which hardly make any practical difference.


Well, if you are happy with games that have severely limited simulation complexity, then sure, a round of anemic CPUs is fine. Developers will always design around the bottlenecks of a system, and having a weak CPU means designing games which only offer superficial levels of simulation. Basically you will get games that are designed around a weak CPU: open-world games where nothing is simulated except the things you are looking at and everything else just pops out of existence, linear cinematic experiences, and simple arena shooters, because that is what devs can do with a weak CPU and a reasonable GPU. Personally I would have liked more games where the game worlds were more complex and persistent. Having lots of RAM is nice, but data is useless if you can't process it. Don't get me wrong, I like high-res textures as much as the next guy, but if that is all devs can use it for, because they don't have much CPU power but plenty of GPU power, then that is disappointing. And before you say it: I know GPGPU will allow the GPUs to take over some of the simpler simulation work like particle/cloth physics, and maybe some pathfinding and hit detection, but it has its limitations.
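The kind of work that maps well onto GPGPU, as described above, is the "same independent update for every element" variety. A minimal Python sketch (plain CPU code standing in for what a GPU kernel would do; all numbers invented for illustration):

```python
def step_particles(particles, dt=1.0 / 60.0, gravity=-9.81):
    """One physics step over (x, y, vx, vy) tuples. Each particle's
    update depends only on its own state, so the loop body is
    embarrassingly parallel -- exactly the kind of work GPGPU handles
    well. Game logic with cross-entity dependencies (AI, economy)
    does not decompose this cleanly, which is the 'limitations' part."""
    out = []
    for (x, y, vx, vy) in particles:
        vy += gravity * dt                      # gravity acts on each particle alone
        out.append((x + vx * dt, y + vy * dt, vx, vy))
    return out

# Four identical test particles at height 10, drifting right.
particles = [(0.0, 10.0, 1.0, 0.0)] * 4
stepped = step_particles(particles)
```

Pathfinding or hit detection can sometimes be batched the same way, but anything where one entity's result feeds the next entity's decision stays on the CPU.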

A few other notes: while the PS360's CPUs were very limited in terms of IPC compared to PC CPUs of the time (honestly, in terms of IPC they were closer to NetBurst than anything, IIRC), they actually held their own in terms of FLOPS. Especially the Cell, which held its own even against top-end CPUs at the time. That's why PS3s were used as cheap supercomputers, and why the PS3 contributed so much to Folding@home back in the day. No one is doing to-the-metal programming for the CPU in games in this day and age. Engines are written in C++ for the most part on all platforms, and actual game code is mostly done in high-level scripting languages. As for API overhead, sure, that does have an impact, but games can still be CPU heavy. For example:

The non-rendered Civ V AI benchmark should have next to zero API overhead, and the A10 gets absolutely crushed.

 

And this is why you haven't heard complaints about the CPUs in the PS360:

"

The first point relates to all of the things that are usually handled by the CPU and the second point relates to things that are traditionally processed by the GPU. Over the successive platform generations the underlying technology has changed, with each generation throwing up its own unique blend of issues:

  • Gen1: The original PlayStation had an underpowered CPU and could draw a small number of simple shaded objects.
  • Gen2: PlayStation 2 had a relatively underpowered CPU but could fill the standard-definition screen with tens of thousands of transparent triangles.
  • Gen3: Xbox 360 and PlayStation 3 had the move to high definition to contend with, but while the CPUs (especially the SPUs) were fast, the GPUs were underpowered in terms of supporting HD resolutions with the kind of effects we wanted to produce.

In all of these generations it was difficult to maintain a steady frame-rate as the amount happening on-screen would cause either the CPU or GPU to be a bottleneck and the game would drop frames. The way that most developers addressed these issues was to alter the way that games appeared, or played, to compensate for the lack of power in one area or another and maintain the all-important frame-rate."

on the PS4/XBOne

"Removing these "bubbles" in the CPU pipeline combined with removing some nasty previous-gen issues like load-hit stores means that the CPUs Instruction Per Cycle (IPC) count will be much higher. A higher IPC number means that the CPU is effectively doing more work for a given clock cycle, so it doesn't need to run as fast to do the same amount of work as a previous generation CPU. But let's not kid ourselves here - both of the new consoles are effectively matching low-power CPUs with desktop-class graphics cores.

So how will all of this impact the first games for the new consoles? Well, I think that the first round of games will likely be trying to be graphically impressive (it is "next-gen" after all) but in some cases, this might be at the expense of game complexity. The initial difficulty is going to be using the CPU power effectively to prevent simulation frame drops and until studios actually work out how best to use these new machines, the games won't excel. They will need to start finding that sweet spot where they have a balanced game engine that can support the required game complexity across all target consoles. This applies equally to both Xbox One and PlayStation 4, though the balance points will be different, just as they are with 360 and PS3."

http://www.eurogamer.net/articles/digitalfoundry-the-secret-developers-what-hardware-balance-actually-means-for-game-creators
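The article's IPC argument boils down to throughput being roughly IPC × clock, not clock alone. A toy calculation, with the IPC figures assumed purely for illustration (an in-order, stall-heavy last-gen core versus a Jaguar-class out-of-order core), not taken from the article or any benchmark:

```python
def throughput(ipc, clock_ghz):
    """Billions of instructions retired per second: IPC x clock."""
    return ipc * clock_ghz

# Hypothetical numbers, only to show why a lower clock can still win.
last_gen = throughput(ipc=0.2, clock_ghz=3.2)  # in-order core losing cycles to stalls
jaguar   = throughput(ipc=1.0, clock_ghz=1.6)  # out-of-order core, half the clock
```

Despite running at half the clock, the higher-IPC core retires more work per second, which is the article's point about not needing to "run as fast to do the same amount of work".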



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!

dahuman said:

Nah, the holding back will happen much faster this time around, just because the consoles are more similar to PCs in a lot of ways. From the dev point of view that's very beneficial, but it also means diminishing returns will hit much faster, because devs are already familiar with the general coding, and PC ports will run better in general, resulting in an even more noticeable performance gap. What we will most likely see from PC ports is more tessellation and maybe some better assets, and that's about it. But pure console gamers will be wowed without problems by 30FPS console games, because they don't really know what to expect from this hardware, since the 7th gen started in... 2005... so any jump seems big. That won't keep people like me from being disappointed, because we know how much power modern gaming PCs really have. It doesn't matter though, TBH; I have a PC anyway, so I won't be envious about anything, and games matter the most. I mean hell, I have a fucking Wii U and I like the games just fine. This whole console power talk is a fun pastime, is about it.

I'm pretty sure low-end PCs hold back high-end PCs more than consoles do. Frequently upgrading one's computer isn't a mainstream practice, and Microsoft is discontinuing support for XP, which people are still using. When developing for PC, the lowest common denominator is the most important thing. So if anything, consoles, and especially the next-gen consoles, will raise the lower bound further than last gen's mid-range PC. Considering the movement toward mobile development, slowing the rising cost of development is a priority.

If anything, the focus on graphics and resolution is holding the industry back imo.

torok said:
kanageddaamen said:
Stats don't matter; show me the results. You can almost see a larger jump from early last gen to late last gen, due to the move to deferred rendering/lighting, which greatly increased the realism of scene lighting. I have seen nothing, graphics-wise, from any of the three consoles that comes close to ANY other generational leap.

The graphical improvements are marginal at this point.


And I remember well when the X360 was being called the Xbox 1.5 in its first year because it wasn't a true generational jump, while now everybody agrees it was a massive improvement. Let's wait two years before we start saying this gen didn't improve anything; this happens every gen.

/thread



In this day and age, with the Internet, ignorance is a choice! And they're still choosing Ignorance! - Dr. Filthy Frank

Tagged: zarx giving interesting info on choosing mid-to-low-end components with acceptable performance for my next PC.



Stwike him, Centuwion. Stwike him vewy wuffly! (Pontius Pilate, "Life of Brian")
A fart without stink is like a sky without stars.
TGS, Third Grade Shooter: brand new genre invented by Kevin Butler exclusively for Natal WiiToo Kinect. PEW! PEW-PEW-PEW! 
 


freedquaker said:
Pemalite said:


Just a quick question: why are you quoting the CPU performance instead of the GAMING performance here? We all know Intel CPUs will demolish Kaveri in CPU-bound scenarios, but most games are not CPU bound. On the next three pages there are the game benchmarks, where Kaveri easily matches or catches up with high-end Intel CPUs that are much more expensive. You seem to have conveniently posted an entirely irrelevant set of benchmarks and skipped everything that is relevant!

http://www.anandtech.com/show/7677/amd-kaveri-review-a8-7600-a10-7850k/12

http://www.anandtech.com/show/7677/amd-kaveri-review-a8-7600-a10-7850k/13

http://www.anandtech.com/show/7677/amd-kaveri-review-a8-7600-a10-7850k/14

Please, next time you quote something about gaming consoles, quote the gaming performance and the impact of the CPU. And you know what, this is KAVERI, whose integrated graphics are way below what the PS4 has, and without the low-level improvements and driver optimizations, etc...

It's logical why I am not comparing GPU performance.
AMD has the best integrated graphics; it's going to have an edge in gaming-related tasks over Intel's integrated graphics, thus making CPU performance far less important.
It's still going to get pounded by an Intel quad-core with a discrete GPU.

Your claim that Kaveri can compete with high-end Intel CPUs was falsified; the only time you can claim it can is when it's GPU bound. (The irony.)
Let's not forget either that Kaveri is AMD's high-end APU. Jaguar is not based upon that architecture; instead it's based on Kabini, which is an evolutionary step up from Brazos. Performance is magnitudes slower with Kabini.

So am I skipping relevant benchmarks? Nope. I'm showing completely CPU-bound tasks, not GPU-bound tasks, in order not to skew the disparity that exists.

freedquaker said:


You know what, I have no objection to any of that. What you are missing is that

a) Those games are not taking advantage of close-to-the-metal programming, and are hampered by the high-level access of DirectX and OpenGL.


Developers have a choice not to use it on the PC.
The irony is... A Majority of games on the Xbox 360 and Playstation 3 also didn't use "to the metal" programming methods, it's a slow and expensive task to develop for that and it's pointless unless you're building your own game engine, most games however used a low-level API. - You just have to look at the plethora of game engines that employed/abused the Unreal engine.


freedquaker said:

c) Consoles are designed for parallelism this time around, and it will be taken advantage of, so single-threaded performance is no longer the issue. There is a reason eight cores are in there.

 


There is always a need for more single-threaded performance.
CPUs are designed to be excellent at sequential processing, GPUs at parallel processing, and not all tasks can be parallelised due to timing dependencies.
It's pretty short-sighted to say it's not important.
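The limit on parallelising work with timing dependencies is captured by Amdahl's law. A small sketch, with the 20% serial fraction purely assumed for illustration:

```python
def amdahl_speedup(serial_fraction, cores):
    """Maximum speedup when `serial_fraction` of the work cannot be
    split across cores (Amdahl's law)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# If 20% of a frame's work is inherently serial, eight cores give at
# most ~3.3x, and no core count can ever exceed 5x:
eight_core = amdahl_speedup(0.2, 8)
limit      = amdahl_speedup(0.2, 10**9)
```

This is why single-threaded performance still matters even on an eight-core console: the serial slice of each frame sets a hard floor on frame time.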

dahuman said:


It's scary how we did a lot of the same things, the only difference is I moved away from super highend these days lol.... I got a lenovo with a 1.6Ghz Atom years ago and I ripped out the wireless in it and tossed the Crystal HD in there and just hooked up a USB wireless to it lol.... Also the Cyrix shit OMG those were the days.... they sucked so bad lol. I still have the original All-in-Wonder sitting in my drawer as a memory piece and all the connectors too LOL!

I still have my old 3dfx Voodoo 2. :D GLIDE gaming baby!




www.youtube.com/@Pemalite

zarx said:
freedquaker said:

You know what, I have no objection to any of that. What you are missing is that

a) Those games are not taking advantage of close-to-the-metal programming, and are hampered by the high-level access of DirectX and OpenGL.

b) Consoles can utilize their CPUs much more efficiently, with much faster CPU calls etc. This doesn't mean they'll magically have more CPU muscle, but it means the CPU is less of a bottleneck and is needed far less.

c) Consoles are designed for parallelism this time around, and it will be taken advantage of, so single-threaded performance is no longer the issue. There is a reason eight cores are in there.

d) Weak CPUs have always been the case in modern consoles, and their makers would have to be real idiots to have put them there otherwise. Again, this doesn't mean they wouldn't occasionally benefit from faster CPUs, but obviously the added performance is not worth it and is better spent elsewhere.

e) For years I have hardly ever heard developers complaining about a lack of CPU performance (with the exception of the Wii). The main complaint has always been the amount of memory, which is now handled handsomely.

 

It's time to surface and face the realities of actual life, rather than diving into unrealistic technicalities which hardly make any practical difference.


Well, if you are happy with games that have severely limited simulation complexity, then sure, a round of anemic CPUs is fine. Developers will always design around the bottlenecks of a system, and having a weak CPU means designing games which only offer superficial levels of simulation. Basically you will get games that are designed around a weak CPU: open-world games where nothing is simulated except the things you are looking at and everything else just pops out of existence, linear cinematic experiences, and simple arena shooters, because that is what devs can do with a weak CPU and a reasonable GPU. Personally I would have liked more games where the game worlds were more complex and persistent. Having lots of RAM is nice, but data is useless if you can't process it. Don't get me wrong, I like high-res textures as much as the next guy, but if that is all devs can use it for, because they don't have much CPU power but plenty of GPU power, then that is disappointing. And before you say it: I know GPGPU will allow the GPUs to take over some of the simpler simulation work like particle/cloth physics, and maybe some pathfinding and hit detection, but it has its limitations.

A few other notes: while the PS360's CPUs were very limited in terms of IPC compared to PC CPUs of the time (honestly, in terms of IPC they were closer to NetBurst than anything, IIRC), they actually held their own in terms of FLOPS. Especially the Cell, which held its own even against top-end CPUs at the time. That's why PS3s were used as cheap supercomputers, and why the PS3 contributed so much to Folding@home back in the day. No one is doing to-the-metal programming for the CPU in games in this day and age. Engines are written in C++ for the most part on all platforms, and actual game code is mostly done in high-level scripting languages. As for API overhead, sure, that does have an impact, but games can still be CPU heavy. For example:

The non-rendered Civ V AI benchmark should have next to zero API overhead, and the A10 gets absolutely crushed.

 

And this is why you haven't heard complaints about the CPUs in the PS360:

"

The first point relates to all of the things that are usually handled by the CPU and the second point relates to things that are traditionally processed by the GPU. Over the successive platform generations the underlying technology has changed, with each generation throwing up its own unique blend of issues:

  • Gen1: The original PlayStation had an underpowered CPU and could draw a small number of simple shaded objects.
  • Gen2: PlayStation 2 had a relatively underpowered CPU but could fill the standard-definition screen with tens of thousands of transparent triangles.
  • Gen3: Xbox 360 and PlayStation 3 had the move to high definition to contend with, but while the CPUs (especially the SPUs) were fast, the GPUs were underpowered in terms of supporting HD resolutions with the kind of effects we wanted to produce.

In all of these generations it was difficult to maintain a steady frame-rate as the amount happening on-screen would cause either the CPU or GPU to be a bottleneck and the game would drop frames. The way that most developers addressed these issues was to alter the way that games appeared, or played, to compensate for the lack of power in one area or another and maintain the all-important frame-rate."

on the PS4/XBOne

"Removing these "bubbles" in the CPU pipeline combined with removing some nasty previous-gen issues like load-hit stores means that the CPUs Instruction Per Cycle (IPC) count will be much higher. A higher IPC number means that the CPU is effectively doing more work for a given clock cycle, so it doesn't need to run as fast to do the same amount of work as a previous generation CPU. But let's not kid ourselves here - both of the new consoles are effectively matching low-power CPUs with desktop-class graphics cores.

So how will all of this impact the first games for the new consoles? Well, I think that the first round of games will likely be trying to be graphically impressive (it is "next-gen" after all) but in some cases, this might be at the expense of game complexity. The initial difficulty is going to be using the CPU power effectively to prevent simulation frame drops and until studios actually work out how best to use these new machines, the games won't excel. They will need to start finding that sweet spot where they have a balanced game engine that can support the required game complexity across all target consoles. This applies equally to both Xbox One and PlayStation 4, though the balance points will be different, just as they are with 360 and PS3."

http://www.eurogamer.net/articles/digitalfoundry-the-secret-developers-what-hardware-balance-actually-means-for-game-creators

That's a good article. Here's what I got from it (which I already knew):

-The CPU performance has increased significantly over the last generation, but not by an order of magnitude.

-The GPU performance has increased by orders of magnitude (which people have been contesting).

-Because there is ample GPU in place now, we may run into situations where GPGPU may be needed (again, nothing unknown).

-The CPU performance is limited (as expected), but it does not hinder GPU performance (this is how and why developers lean more heavily on graphics and rendering).

 

At the end of the day, the article gave me nothing (although I enjoyed reading it) except confirmation that CPU performance is NOT the bottleneck for the GPU. Yes, it is limited for brute CPU operations, but games simply DON'T NEED BETTER CPUs for 1080p graphics on a console (maybe on a PC, but definitely not on a console). Think about it: the XB1 and PS4 have basically very similar CPUs, but one has a much beefier GPU and faster RAM, and this leads to a huge difference in gaming. Again, here is the...

Question: "Why is there such a huge gap between the XB1 and PS4?"
  => Answer: "Because of the GPU and faster RAM."

Question: "But don't we also need a faster CPU along with a better GPU to utilize that performance?"
  => Answer: "In CPU-bound scenarios, yes, although to a lesser extent on consoles; but obviously the majority of games, even on PC, are not CPU bound. Therefore, as in the XB1-PS4 comparison, where both use the same CPU, one has much better performance JUST BECAUSE OF THE GPU AND THE FASTER RAM, so the additional CPU benefit would be negligible."

And BY THE WAY, you VERY CONVENIENTLY again posted possibly one of the MOST CPU-BOUND games, A STRATEGY GAME with lots of AI elements etc. That's an unbelievably INCONVENIENT example for a console, where strategy games are one in a million. My goodness, a perfect example of how people want to see the world and bend it to their liking :) Funny...

4lc0h0l said:
I am with Lucidium on this one, the original poster just goes with the numbers that have the single most higher % value which is stupid... learn to sum % think a little and then construct a better argument...

Before calling something stupid, first UNDERSTAND it. What single "most higher value"? What is this, English? Is that a superlative or a comparative? Learn to sum % of what? I speak five languages, but not this one! All the numbers I believe you are referring to are taken from Steam's website, and the memory amounts are calculated accordingly with weighted averages. Before speaking out against it, FYI, I am an economics professor with bachelor's degrees in architecture and economics, dealing with econometrics, statistics, non-parametric analysis such as DEA, and linear programming on a daily basis, so I know what I am doing. But you don't seem to know what you're talking about... just saying.
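The weighted-average method described here can be sketched as follows. The bucket shares and midpoints below are made up for illustration; they are not the actual Steam survey figures:

```python
# Hypothetical survey buckets: (share of respondents, bucket midpoint in GB).
buckets = [
    (0.10,  2.0),
    (0.30,  4.0),
    (0.40,  8.0),
    (0.20, 16.0),
]

# Weighted average: each bucket's midpoint weighted by its share.
weighted_avg_ram = sum(share * gb for share, gb in buckets)

# Sanity check: the shares should cover the whole sample.
total_share = sum(share for share, _ in buckets)
```

The same pattern applies to any of the survey's categorical stats (GPU tiers, core counts): multiply each category's value by its share and sum.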



Playstation 5 vs XBox Series Market Share Estimates

Regional Analysis  (only MS and Sony Consoles)
Europe     => XB1 : 23-24 % vs PS4 : 76-77%
N. America => XB1 :  49-52% vs PS4 : 48-51%
Global     => XB1 :  32-34% vs PS4 : 66-68%

Sales Estimations for 8th Generation Consoles

Next Gen Consoles Impressions and Estimates

freedquaker said:

That's a good article. Here's what I got from it (which I already knew):

-The CPU performance has increased significantly over the last generation, but not by an order of magnitude.

-The GPU performance has increased by orders of magnitude (which people have been contesting).

-Because there is ample GPU in place now, we may run into situations where GPGPU may be needed (again, nothing unknown).

-The CPU performance is limited (as expected), but it does not hinder GPU performance (this is how and why developers lean more heavily on graphics and rendering).

 

At the end of the day, the article gave me nothing (although I enjoyed reading it) except confirmation that CPU performance is NOT the bottleneck for the GPU. Yes, it is limited for brute CPU operations, but games simply DON'T NEED BETTER CPUs for 1080p graphics on a console (maybe on a PC, but definitely not on a console). Think about it: the XB1 and PS4 have basically very similar CPUs, but one has a much beefier GPU and faster RAM, and this leads to a huge difference in gaming. Again, here is the...

Question: "Why is there such a huge gap between the XB1 and PS4?"
  => Answer: "Because of the GPU and faster RAM."

Question: "But don't we also need a faster CPU along with a better GPU to utilize that performance?"
  => Answer: "In CPU-bound scenarios, yes, although to a lesser extent on consoles; but obviously the majority of games, even on PC, are not CPU bound. Therefore, as in the XB1-PS4 comparison, where both use the same CPU, one has much better performance JUST BECAUSE OF THE GPU AND THE FASTER RAM, so the additional CPU benefit would be negligible."

And BY THE WAY, you VERY CONVENIENTLY again posted possibly one of the MOST CPU-BOUND games, A STRATEGY GAME with lots of AI elements etc. That's an unbelievably INCONVENIENT example for a console, where strategy games are one in a million. My goodness, a perfect example of how people want to see the world and bend it to their liking :) Funny...

You are missing the point; there is a lot more to wanting more powerful consoles than just flashy graphics. Sure, if you only want shallow, flashy games, then there is no problem having a weak CPU and a decent GPU. The whole point of showing a CPU-bound title was to show that there are things that need CPU power for reasons completely unrelated to graphics. There is no reason why console games couldn't be designed with much more advanced simulation if they had the CPU power available.

For example, I would love to see a The Elder Scrolls game with a full Radiant AI system as it was originally intended. I want to raid a trade convoy, which causes food prices in a nearby town to rise due to a shortage, which causes a peasant to go out and try to poach deer and get caught by a guard. Instead we will probably get another game where NPCs pop into existence two cells away from the player, just follow a preset path, or stand in place until the player talks to them or something aggros them, but now with 50% flashier graphics! I want bustling cities with hundreds of NPCs, instead of a great capital with a population of 20.
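The convoy-raid chain of cause and effect is, at bottom, a supply/demand update that is cheap per town but adds up across thousands of simulated agents. A toy illustration (the pricing rule, elasticity, and all numbers are invented, not taken from any real engine):

```python
def update_price(base_price, supply, demand, elasticity=0.5):
    """Toy supply/demand rule: price rises as supply falls relative
    to demand. Purely illustrative of the kind of per-tick economy
    update a Radiant-AI-style world would run everywhere at once."""
    return base_price * (demand / max(supply, 1)) ** elasticity

town_food_supply = 100
price_before = update_price(10.0, town_food_supply, demand=100)

town_food_supply -= 75          # the player raids a trade convoy
price_after = update_price(10.0, town_food_supply, demand=100)
# Scarcer food now costs more, which could push an NPC toward poaching.
```

The arithmetic itself is trivial; the CPU cost comes from running rules like this for every town, NPC, and resource every tick, even off-screen, which is exactly the workload a weak CPU forces developers to cut.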

There is no reason that console games have to be shallow simulations.



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!

zarx said:

You are missing the point; there is a lot more to wanting more powerful consoles than just flashy graphics. Sure, if you only want shallow, flashy games, then there is no problem having a weak CPU and a decent GPU. The whole point of showing a CPU-bound title was to show that there are things that need CPU power for reasons completely unrelated to graphics. There is no reason why console games couldn't be designed with much more advanced simulation if they had the CPU power available.

For example, I would love to see a The Elder Scrolls game with a full Radiant AI system as it was originally intended. I want to raid a trade convoy, which causes food prices in a nearby town to rise due to a shortage, which causes a peasant to go out and try to poach deer and get caught by a guard. Instead we will probably get another game where NPCs pop into existence two cells away from the player, just follow a preset path, or stand in place until the player talks to them or something aggros them, but now with 50% flashier graphics! I want bustling cities with hundreds of NPCs, instead of a great capital with a population of 20.

There is no reason that console games have to be shallow simulations.

Cloud



In this day and age, with the Internet, ignorance is a choice! And they're still choosing Ignorance! - Dr. Filthy Frank

Dr.Henry_Killinger said:

Cloud


Sure, if everyone had a fiber connection and lived within 40ms of a server, the cloud would be a valid answer.
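The 40ms figure makes more sense as part of an input-to-photon budget. A back-of-the-envelope sketch, with every value assumed for illustration rather than measured:

```python
# Rough cloud-gaming latency budget in milliseconds (all values assumed).
network_rtt   = 40   # round trip between player and server
encode_decode = 10   # video encode on the server + decode on the client
server_frame  = 16   # one 60fps frame of simulation and rendering

total_input_to_photon = network_rtt + encode_decode + server_frame

# Compare against a single local 60fps frame (~16ms) to see how many
# frames of extra lag the cloud path adds even under good conditions.
local_frame = 16
extra_frames_of_lag = total_input_to_photon / local_frame
```

Even with fiber-class RTT, offloading simulation to the cloud adds several local frames of input lag, which is why it only suits slow-changing state (economy, off-screen AI), not anything the player reacts to directly.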



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!

ummm, Google Fiber?



In this day and age, with the Internet, ignorance is a choice! And they're still choosing Ignorance! - Dr. Filthy Frank

Dr.Henry_Killinger said:
ummm, Google Fiber?


Consoles are sold worldwide; Google Fiber reaches only a very, very tiny select few in the USA.
Do you really wish to design a game for such a tiny demographic?




www.youtube.com/@Pemalite