
No Goblin co-founder: Saying PS4 is “More Powerful” Than Xbox One Or Vice Versa Is Really Misrepresenting How Games Work

radha said:
the best part is that Naughty Dog will develop for the most powerful one and will optimise like crazy.


They do make PC games? Nice.




God, I feel so bad for this guy. The crap he's gonna get from fanboys for talking about crap he works on. XD

I mean, seriously, for all this hullabaloo about differences in "power", what differences do we actually get even in multiplat releases? Slight differences in resolution and frame rate that the lay gamer won't even care about?

I don't know about you modern console fanboys, but back in my day consoles had differences in power that were noticeable and affected what kinds of games a console could have. The SNES was capable of Mode 7 and superior colors, so it often got more colorful games and games with "fake" 3D thanks to built-in hardware scaling (the Genesis was capable of similar scaling on the software side, but it was so difficult to implement that only a handful of games ever did it, and it never looked as good). In the next generation, the N64 was easily capable of the smoothest and largest 3D worlds; it was a 3D powerhouse. The PS1 had lesser 3D capabilities but used CDs, which allowed for much larger, more cinematic experiences. The SEGA Saturn produced the best 2D visuals of its generation by a landslide but struggled with 3D due to its use of quads and its multiple processors. On paper those processors should have made it at least as powerful as the PS1, but only a handful of games ever came close to demonstrating that, because the damn thing was virtually impossible to work with.

I mean my god, in the generation after that we at least had some serious graphical effects that Xbox and GameCube could pull off that PS2/Dreamcast couldn't (like fur shading on characters).

These days the differences between the two graphically comparable consoles can only be measured in frame rate and resolution... something no one outside of fanboys and graphics geeks even cares about. What general consumer cares about "900p locked at 30 frames vs 1080p at 30 frames and above"? Lord knows I don't. Of course, I have a PC that runs circles around both, and I game mostly on Wii U, 3DS and Vita these days, so I guess I have an outsider's perspective on this silly console feud.



Locknuts said:

This is not possible. It couldn't run the games it does with lower memory bandwidth than the PS360.

On the second point, I think people underestimate the importance of SDKs. Microsoft know their software and my understanding is that developers universally praise the Microsoft dev kits. Sony I'm not sure about but with the discrepancy in hardware specs, I would say they must be behind on the software side. I can't really see any other explanation. I can't see developers intentionally sabotaging their own work for parity's sake.


1 - It is. It's in the hardware specs. The Wii U has 12.8 GB/s, the PS3 has 24.9 GB/s to the main memory (15.5 GB/s to the GPU's memory) and the 360 has 22.4 GB/s. The Wii U runs the games it does because its GPU is much beefier than the PS360 ones. But if we look closely, a lot of Wii U games at 720p ship with no AA, something that wasn't common on the PS360, where at least an FXAA solution was always used. So it's not only possible, it's right there in the system specs.

2 - From what devs have said, the SDKs are close. There was some talk about the X1's actually having problems with excessive overhead. The current games reflect this: a lot of titles are 900p on X1 and 1080p on PS4 (about 44% more pixels), roughly in line with the difference in GPU power. We have Unity running worse on PS4, but the publisher publicly stated they would limit the PS4 version's resolution. In every other game the PS4 has a resolution and/or framerate advantage, even in extreme cases such as Ground Zeroes, where we have 720p vs 1080p (a 2.25x difference in pixel count).
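
For reference, here's the pixel math behind those resolution comparisons (a plain Python back-of-the-envelope, nothing platform-specific assumed):

# Rough pixel-count comparison for the resolutions mentioned above.
resolutions = {
    "720p": (1280, 720),
    "900p": (1600, 900),
    "1080p": (1920, 1080),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(round(pixels["1080p"] / pixels["900p"], 2))   # 1.44 -> ~44% more pixels than 900p
print(round(pixels["1080p"] / pixels["720p"], 2))   # 2.25 -> the Ground Zeroes case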



JustBeingReal said:
walsufnir said:
JustBeingReal said:
walsufnir said:
JustBeingReal said:

I think I'll take Ubisoft's own GPU physics simulation over this guy's flawed analysis. GPU simulation has been capable of handling AI on AMD GPUs since 2009, and the difference in GPU compute between PS4 and Xbox One shows the PS4 to be almost 2x more capable in that area, and that's before the recent SDK launch.

GPU compute absolutely offsets, and far exceeds, the marginally faster CPU performance the XB1 apparently has, even though developers have said that the PS4's CPU has actually performed faster than the XB1's.

Sony have unveiled some details about their 2.0 SDK for PS4, which includes a low-level API for GPU physics simulation. The fact is that the PS4 is more capable not just in graphics processing; everything else should also run significantly faster on it.

Perhaps it's Sony's fault that they've only recently started rolling out dev kit updates that actually take advantage of what GPU compute can offer, or perhaps it's just not that well optimized yet.

As it stands, many current game engines still focus their physics and AI processing on the CPU rather than the GPU, but moving that work over is something that will benefit all platforms, so it only makes sense that in the not-too-distant future even third-party developers will update their engines to use GPGPU code. With this now a standard part of Sony's latest SDK, newly announced games revealed over the next few months should start taking advantage of these features on PS4.


But GPGPU uses the same hardware you use for rendering. The simulation that showed the PS4 advantage was using the GPUs only for GPGPU purposes, so you won't see that full advantage in any game. Yes, the PS4 has dedicated hardware for an async approach (rendering and GPGPU), but it doesn't have additional ALUs for that. Most of the hardware is still shared, so it will be interesting to see in what ways this can be used, but I wouldn't expect wonders from it.

 

Graphics rendering never maxes out 100% of the GPU's time. GPU downtime is exactly what compute queues are designed to take advantage of; it's all about making sure processing time isn't wasted. What I'm talking about is using the hardware as efficiently as possible, without letting resources go unused.

An example of this is something like Assassin's Creed Unity, where Ubisoft has stated that if it weren't for the weak CPUs in the PS4 and Xbox One they could run the game at 100 FPS. In practice the game runs at less than a third of that speed most of the time, wasting GPU downtime that could otherwise be used for physics and AI.
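
To put rough numbers on that idea (just a back-of-the-envelope sketch in Python, taking Ubisoft's 100 FPS figure at face value and ignoring everything else that eats into a frame):

# If the GPU could in theory render a frame in ~10 ms (100 FPS) but the CPU caps
# the game at ~30 FPS (33.3 ms frames), much of each frame is idle GPU time.
frame_time_ms = 1000 / 30     # ~33.3 ms per frame at 30 FPS
gpu_render_ms = 1000 / 100    # ~10 ms of graphics work per frame if 100 FPS were possible
idle_gpu_ms = frame_time_ms - gpu_render_ms

print(round(idle_gpu_ms, 1))                      # ~23.3 ms of GPU time per frame
print(round(idle_gpu_ms / frame_time_ms * 100))   # ~70% of the frame, in principle free for compute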

If we look at Ubisoft's benchmarks and Sony's recent SDK 2.0 slides (http://www.dualshockers.com/2014/11/23/ps4-sdk-2-0-revealed-includes-a-lot-of-interesting-new-tech-and-features-game-developers-can-use/), the CPU can be used for the physics or AI that the player directly interacts with, the more close-up, specific stuff, while the GPU could easily handle huge crowds filled with either. The GPU can take on part of the demand, so it can easily be used to make up for the weaker CPU.

The PS4 does have additional ALUs compared to the XB1, along with extra texture mapping units and ROPs, so higher resolutions, better AA and AF, and more demanding textures can all be taken advantage of, while even better physics and AI simulation remains easily programmable for developers.
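
For context on those unit counts, here's a quick comparison in plain Python; the figures are the commonly cited GCN-based specs (64 ALUs and 4 texture units per CU), so treat them as ballpark rather than gospel:

# Commonly cited shader/texture/raster unit counts for the two GPUs.
ps4 = {"CUs": 18, "ALUs": 18 * 64, "TMUs": 18 * 4, "ROPs": 32}
xb1 = {"CUs": 12, "ALUs": 12 * 64, "TMUs": 12 * 4, "ROPs": 16}

for unit in ps4:
    print(unit, ps4[unit], xb1[unit], round(ps4[unit] / xb1[unit], 2))
# CUs  18   12   1.5
# ALUs 1152 768  1.5
# TMUs 72   48   1.5
# ROPs 32   16   2.0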


But you can't use it if other resources or channels are already congested. I'm not saying it can't be used, but the GPU is restricted by and dependent on other resources, and those are also used by the CPU and by other GPU contexts, so we'll have to see to what extent devs will actually be able to use it; I'm not sure it amounts to that much. There will be a trade-off between using the GPU for graphics or for GPGPU; you can't have both without penalizing one or the other.

 

That's the whole point of how the APU has been designed: developers can easily queue up compute commands and have them waiting for whenever GPU hardware is free to process them. The GPU is never completely saturated with graphical tasks; there's always plenty of time when stream processors are sitting idle, and that's when AI and physics get handled. Remember I said efficiency?

You can be sure developers will be able to use this to a great extent, because it's what they asked for when Sony did all of its research into what hardware to put in the PS4.

There's no need for a trade-off here, because using resources as efficiently as possible is part of the system's design.

Not only that, you could also use one full CU of the GPU for generic tasks alongside the CPU and run async compute on the others. The PS4 has a lot of versatility, with a 50% higher CU count than the XOne; a very simple concept that some have missed. And while the XOne CPU alone appears slightly faster on paper than the PS4 CPU alone, that doesn't seem to hold in actual games; it looks like the PS4 CPU performs faster. Do I remember wrong? But whatever the truth about the CPU alone, the combination of the CPU, the PS4's 18 CUs and the extra memory bandwidth makes the Sony machine superior in EVERYTHING: graphics, physics, animations, AI, etc., in any combination. It's up to the developers.
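
Putting rough numbers on the CU argument (a simple Python sketch using the widely reported GPU clocks of 800 MHz for PS4 and 853 MHz for Xbox One; this is peak theoretical throughput only and says nothing about real-world utilization):

# Peak single-precision throughput: CUs * 64 ALUs * 2 FLOPs per clock * clock speed.
def peak_tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000.0

ps4_total = peak_tflops(18, 0.800)           # ~1.84 TFLOPS
xb1_total = peak_tflops(12, 0.853)           # ~1.31 TFLOPS
ps4_minus_one_cu = peak_tflops(17, 0.800)    # ~1.74 TFLOPS left if one CU is reserved for compute

print(round(ps4_total, 2), round(xb1_total, 2), round(ps4_total / xb1_total, 2))  # 1.84 1.31 1.41
print(round(ps4_minus_one_cu, 2))            # still above the Xbox One's total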

This is just the beginning : http://www.dualshockers.com/2014/11/23/ps4-sdk-2-0-revealed-includes-a-lot-of-interesting-new-tech-and-features-game-developers-can-use/



”Every great dream begins with a dreamer. Always remember, you have within you the strength, the patience, and the passion to reach for the stars to change the world.”

Harriet Tubman.

Wow, this thread - so I should dismiss a game dev, you know, a person who actually makes games on the system, who says that on-paper specs don't tell the full story, and instead listen to all the armchair fanboys who probably don't have a clue how either system really works but think they know better. Wow.



Techmaster said:
Wow, this thread - so I should dismiss a game dev, you know, a person who actually makes games on the system, who says that on-paper specs don't tell the full story, and instead listen to all the armchair fanboys who probably don't have a clue how either system really works but think they know better. Wow.


Yes, because a fanboy emotionally invested in the outcome of an argument is FAR more reliable than a guy who makes games for a living. You didn't know that?



Techmaster said:
Wow, this thread - so I should dismiss a game dev, you know, a person who actually makes games on the system, who says that on-paper specs don't tell the full story, and instead listen to all the armchair fanboys who probably don't have a clue how either system really works but think they know better. Wow.

Tech is tech. If you have time, you might read this: http://www.dualshockers.com/2014/11/23/ps4-sdk-2-0-revealed-includes-a-lot-of-interesting-new-tech-and-features-game-developers-can-use/

Sometimes devs don't yet have enough resources and knowledge to push new hardware, and sometimes they make a statement just to keep everybody happy, or because a company pushed them to say so. You didn't know this? Is it a big surprise to you?



”Every great dream begins with a dreamer. Always remember, you have within you the strength, the patience, and the passion to reach for the stars to change the world.”

Harriet Tubman.

nuckles87 said:
Techmaster said:
Wow, this thread - so I should dismiss a game dev, you know, a person who actually makes games on the system, who says that on-paper specs don't tell the full story, and instead listen to all the armchair fanboys who probably don't have a clue how either system really works but think they know better. Wow.


Yes, because a fanboy emotionally invested in the outcome of an argument is FAR more reliable than a guy who makes games for a living. You didn't know that?

Many developers have made games and lied to the media, or simply said something to be polite to everybody. I'm not a fanboy if I say so, right?

I'm not a fanboy if I say that a console with a 50% higher CU count, which can handle both graphics and generic tasks, and far more effective memory bandwidth is a more capable system in everything, right? If I say 1+1 = 2, am I a fanboy?



”Every great dream begins with a dreamer. Always remember, you have within you the strength, the patience, and the passion to reach for the stars to change the world.”

Harriet Tubman.

Nate4Drake said:
Not only that, you could also use one full CU of the GPU for generic tasks alongside the CPU and run async compute on the others. The PS4 has a lot of versatility, with a 50% higher CU count than the XOne; a very simple concept that some have missed. And while the XOne CPU alone appears slightly faster on paper than the PS4 CPU alone, that doesn't seem to hold in actual games; it looks like the PS4 CPU performs faster. Do I remember wrong? But whatever the truth about the CPU alone, the combination of the CPU, the PS4's 18 CUs and the extra memory bandwidth makes the Sony machine superior in EVERYTHING: graphics, physics, animations, AI, etc., in any combination. It's up to the developers.

This is just the beginning : http://www.dualshockers.com/2014/11/23/ps4-sdk-2-0-revealed-includes-a-lot-of-interesting-new-tech-and-features-game-developers-can-use/


Yep, developers can dedicate a chunk of the GPU specifically to compute processing all the time if needed; it's a flexible design, so developers can use as big or as small a chunk of the GPU for whatever they need. Mark Cerny has described the way the GPU can be shared as a slice of the milliseconds available per frame of processing time, mainly because it's more efficient to slot compute commands in wherever time is available within each frame, and that would probably be far more suitable in graphically demanding games. Developers could definitely take the same amount of processing performance from the GPU as the CPU offers, double up on resources for AI and physics, and still have more hardware left for graphics than the XB1 offers.
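
A toy illustration of that "slice of the milliseconds" idea (a deliberately simplified Python sketch, not actual PS4 API code; the job names, costs and single-queue model are made up purely for illustration):

# Toy frame scheduler: after the frame's graphics work is accounted for, fill
# whatever is left of the 33.3 ms budget with queued compute jobs.
FRAME_BUDGET_MS = 1000 / 30

def fill_idle_time(graphics_ms, compute_queue):
    # compute_queue: list of (job_name, estimated_ms) pairs, purely illustrative.
    remaining = FRAME_BUDGET_MS - graphics_ms
    scheduled = []
    for job, cost in compute_queue:
        if cost <= remaining:
            scheduled.append(job)
            remaining -= cost
    return scheduled, remaining

jobs = [("cloth_sim", 4.0), ("crowd_ai", 6.0), ("particles", 3.0)]
scheduled, leftover = fill_idle_time(graphics_ms=22.0, compute_queue=jobs)
print(scheduled, round(leftover, 1))   # ['cloth_sim', 'crowd_ai'] 1.3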

 

As for how fast the PS4's CPU is in reality compared to the XB1's: there was a verified developer on NeoGAF who outright stated that the PS4's CPU is faster, though I don't think any developer has come out and said that officially. The overall design of the APU could lead to the CPU performing better if it's a more efficient layout.

 

People need to stop looking at these systems as one component versus another and instead look at the full package: the sum total of what the APU and RAM have to offer. The PS4 is definitely better here, and it's also designed to make efficient use of the extra resources the overall system offers.



Oh, so Xbox is better at gameplay now? I've heard everything.

 

Honestly, the whole article reads as: "We can't be bothered to optimize for each platform, so we'll pretend they're the same down to the lowest common denominator. Aren't we great developers?!"

This isn't a beneficial mindset from developers for either PS or XB customers.