walsufnir said:
JustBeingReal said:
I think I'll take Ubisoft's own GPU physics simulation over this guy's flawed analysis. GPU simulation has been capable of handling AI since 2009 on AMD GPUs, and the difference in GPU compute between PS4 and Xbox One shows the PS4 to be almost 2X more capable in that area, and that's before the recent SDK launch. GPU compute absolutely offsets, and far exceeds, the marginally faster CPU performance XB1 apparently has, even though developers have said that PS4's CPU actually performed faster than XB1's. Sony have unveiled some details about their 2.0 SDK for PS4, which includes a low-level API for GPU physics simulation. The fact is that PS4 is more capable not just in graphics processing; everything else should also run significantly faster on PS4 too.

Perhaps it's Sony's fault, because it's only recently that they've started rolling out dev kit updates that actually take advantage of the benefits GPU compute can offer, or it's just not that well optimized yet. As it stands, many current engines used for game development still seem to focus their physics and AI processing on the CPU, not the GPU, but GPGPU is something that will benefit all platforms, so it only makes logical sense that in the not-too-distant future even 3rd party developers will start updating their engines to use GPGPU code. With this being a standard part of Sony's latest SDK, newly announced games revealed over the next few months should start taking advantage of these features on PS4.
|
But GPGPU uses the same hardware you use for rendering. The simulation that showed PS4's advantage was using the GPU purely for GPGPU work, so you won't see that advantage in any real game. Yes, PS4 has dedicated hardware for an async approach (rendering plus GPGPU), but it doesn't have additional ALUs for that; most of the hardware is still shared. It will be interesting to see in which ways this can be used, but I wouldn't expect wonders from it.
|
Graphics rendering never saturates 100% of GPU time, and that downtime is exactly what compute queues are designed to exploit. It's all about making sure processing time isn't wasted, so what I'm talking about is using the hardware as efficiently as possible, without letting resources that would otherwise go unused be wasted.
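To make the idea concrete, here's a toy model of a single frame, not actual GPU or console API code: the frame is divided into time slices, rendering occupies some of them, and an async compute queue soaks up whatever is left. All numbers and names here are made up for illustration.

```python
def schedule_frame(render_busy, compute_jobs):
    """Toy GPU-frame model (illustrative only, not a real scheduler).
    render_busy: one boolean per GPU time slice (True = slice used by rendering).
    compute_jobs: units of compute work waiting in the async queue.
    Returns (compute_jobs_done, idle_slices_wasted)."""
    done = 0
    wasted = 0
    for busy in render_busy:
        if busy:
            continue  # slice consumed by graphics work
        if done < compute_jobs:
            done += 1  # async compute fills the idle gap
        else:
            wasted += 1  # nothing queued: the slice is simply lost
    return done, wasted

# A 10-slice frame where rendering only occupies 6 slices:
frame = [True, True, False, True, False, True, False, True, True, False]
print(schedule_frame(frame, compute_jobs=3))   # → (3, 1)
print(schedule_frame(frame, compute_jobs=10))  # → (4, 0)
```

With enough compute work queued, every idle slice gets used; with too little, slices go to waste, which is the point being made above.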
An example of this is something like Assassin's Creed Unity, where Ubisoft stated that if it weren't for the weak CPUs in the PS4 and Xbox One they could run the game at 100FPS. They're actually running the game at less than a third of that speed most of the time, wasting all of that GPU downtime, which could otherwise be used for physics and AI.
If we look at Ubisoft's benchmarks and Sony's recent SDK 2.0 slides (http://www.dualshockers.com/2014/11/23/ps4-sdk-2-0-revealed-includes-a-lot-of-interesting-new-tech-and-features-game-developers-can-use/), the CPU can be used for the physics or AI the player directly interacts with, the close-up, specific stuff, while the GPU could easily handle huge crowds of either. The GPU can take on part of the demand, so it can easily be used to make up for the weaker CPU.
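A minimal sketch of that kind of workload split might look like this. This is my own hypothetical partitioning scheme, not anything from Sony's SDK or Ubisoft's engine: agents near the player go to the latency-sensitive CPU path, and the distant crowd goes to a bulk GPU batch. Positions are 1-D numbers purely for brevity.

```python
def split_agents(agent_positions, player_pos, radius):
    """Hypothetical partitioning: agents within `radius` of the player go to
    the CPU set (close-up, interactive physics/AI); everyone else goes to the
    GPU batch (large crowds simulated in bulk). Illustrative only."""
    cpu_set, gpu_batch = [], []
    for pos in agent_positions:
        if abs(pos - player_pos) <= radius:
            cpu_set.append(pos)    # player can interact with these directly
        else:
            gpu_batch.append(pos)  # background crowd, processed en masse
    return cpu_set, gpu_batch

agents = [1.0, 2.5, 40.0, 41.0, 3.0, 100.0]
cpu_set, gpu_batch = split_agents(agents, player_pos=2.0, radius=2.0)
print(cpu_set)    # → [1.0, 2.5, 3.0]
print(gpu_batch)  # → [40.0, 41.0, 100.0]
```

The design point is that the small CPU set stays cheap and responsive while the GPU absorbs the crowd, which is the division of labour the slides describe.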
PS4 does have additional ALUs compared to XB1, and it also has extra texture mapping units and ROPs, so higher resolutions, better AA and AF, and more demanding textures can all be taken advantage of, while developers can also program even better physics and AI simulation.
Another thing to bear in mind is that XB1 doesn't have as many compute queues as PS4. This is important because when XB1's stream processors are free they can sit idle and essentially go to waste, whereas developers working on PS4 can queue up more commands for when the same gap occurs in PS4's processing time. It's not just packing in more graphical hardware; it's packing in hardware that makes more out of that extra hardware.
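To illustrate why the queue count matters, here's a toy extension of the idle-time idea, again not real hardware behaviour: suppose each idle GPU slice can start at most one job per hardware compute queue. The queue counts and slice counts below are made-up illustrative numbers, not the consoles' real specs.

```python
def fill_idle_time(idle_slices, pending_jobs, num_queues):
    """Toy model (illustrative only): each idle GPU slice can dispatch at
    most `num_queues` queued compute jobs, one per hardware queue.
    Returns how many pending jobs get done within the frame."""
    done = 0
    for _ in range(idle_slices):
        # More queues means more jobs can be picked up per idle slice.
        take = min(num_queues, pending_jobs - done)
        done += take
    return done

# Same idle time, same backlog of small jobs, different queue counts:
print(fill_idle_time(idle_slices=4, pending_jobs=16, num_queues=2))  # → 8
print(fill_idle_time(idle_slices=4, pending_jobs=16, num_queues=8))  # → 16
```

Under these assumptions the machine with fewer queues leaves half the backlog undone in the same idle time, which is the shape of the argument above: the extra queues exist to keep the extra ALUs fed.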