
Tech Talk Part 1: PS4 GPU Compute Explained, and some other stuff: AC:Unity CPU load, XB1 ESRAM 204GB/s, etc.

Nice OP. I remember Housemarque explaining how they used GPGPU compute in Resogun, and boy, did it look good when it came out. I can't wait to see the PS4's future games once this feature is more commonly utilized.

It's also great that the PS4 contains the Onion and Garlic buses, so that the CPU and GPU, respectively, can directly access the GDDR5 memory pool.



Dark_Feanor said:
You didn't have to repost your whole explanation; I've read and understood the point you are trying to make.

But first things first: AC:U is not the first 30fps, sub-1080p game; there is another that uses the same engine (or a very similar one), and it's also an open-world game (surely it's also cross-gen), but you understand. And also remember that AC4 needed a patch to reach 1080p.

Remember the 9 women. And you might also be familiar with the "Dining Philosophers" problem.

The CPU and GPU have to share the same memory pool. The CPU has to have room to do its calculations; as @ethomaz said, they are not using GPGPU.

If they had extra time to tune the engine for the extra compute units to work without overloading the memory bus, they could reach 1080p on both consoles, maybe...

But there is time to market; they have to ship the game right now, the best way they can.

PS: Good luck with your best app ever; hope you don't spend too much time optimising for every platform...

There is nothing wrong with a game being sub-1080p, and the fact that they needed to patch the PS4 version of AC4 to reach 1080p pretty much tells you all you need to know about Ubisoft. I was just pointing out why the reason or excuse they gave is BS. As I said, there is a reason for why they did what they did. And I know what it is; it's nothing bad per se... just that the reason they gave is not true, and honestly insulting if you know anything about how game render pipelines work.

And their use of GPGPU has nothing to do with this.

I also think some people don't realize how little bandwidth or actual data any CPU task uses. To help put things in perspective: the CPU you are using to read this right now (assuming you are reading this on some sort of desktop/laptop whose OS runs in DDR3 RAM, and assuming it's among the fastest DDR3 out there) has a peak memory bandwidth of under 20GB/s; more like 17.8GB/s to be exact (though depending on the CPU it can be higher). Now do you think a game that will run on that same PC, along with its memory-hungry OS, all within the confines of that 20GB/s, is somehow bandwidth-starved on consoles with 60GB/s+ and 190GB/s+ of memory bandwidth?
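For anyone who wants to check the arithmetic, peak DRAM bandwidth is just transfer rate times bytes per transfer times channel count. A minimal Python sketch (the module rates below are nominal DDR3/GDDR5 figures, used purely as illustration):

```python
def peak_bandwidth_gbs(transfer_rate_mts: int, bus_width_bits: int = 64,
                       channels: int = 1) -> float:
    """Theoretical peak DRAM bandwidth in GB/s.

    bandwidth = transfers/s * bytes per transfer * channels
    """
    return transfer_rate_mts * 1e6 * (bus_width_bits / 8) * channels / 1e9

# Single-channel DDR3-1333: ~10.7 GB/s; dual-channel: ~21.3 GB/s.
print(peak_bandwidth_gbs(1333, channels=1))
print(peak_bandwidth_gbs(1333, channels=2))

# A PS4-style 256-bit GDDR5 bus at 5500 MT/s: ~176 GB/s.
print(peak_bandwidth_gbs(5500, bus_width_bits=256))
```

That last figure is where the big console numbers come from: the wide bus and the high transfer rate multiply together, while a typical desktop of the era sits on a 128-bit (dual-channel) DDR3 setup.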

Oh, and thanks for the well wishes... I have only been working on that particular app on and off for the past 3 years lol. It's becoming clear to me it's not a one-man job.

Fun fact:
Do you know you can completely saturate all the processing resources of a CPU, and literally bring that CPU to its knees, with code no bigger than 4KB?
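The claim is easy to demonstrate: a tiny busy-loop per core is all it takes to peg a CPU, and it touches almost no memory while doing so. A sketch in Python (process-based to sidestep the GIL; `spin` and `saturate_all_cores` are illustrative names, not from any real API):

```python
import multiprocessing as mp
import time

def spin(seconds: float) -> int:
    """Busy-loop of pure arithmetic: no I/O, almost no memory traffic,
    yet it keeps one core pegged near 100% for the whole duration."""
    deadline = time.perf_counter() + seconds
    n = 0
    while time.perf_counter() < deadline:
        n += 1
    return n

def saturate_all_cores(seconds: float = 1.0) -> None:
    # One spinning worker per logical core maxes out the whole CPU.
    workers = [mp.Process(target=spin, args=(seconds,))
               for _ in range(mp.cpu_count())]
    for w in workers:
        w.start()
    for w in workers:
        w.join()

if __name__ == "__main__":
    saturate_all_cores(1.0)
```

The point being made above: a workload can be 100% compute-bound while consuming a trivial slice of memory bandwidth, which is why CPU load and bandwidth pressure are separate questions.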



Intrinsic said:

There is nothing wrong with a game being sub-1080p, and the fact that they needed to patch the PS4 version of AC4 to reach 1080p pretty much tells you all you need to know about Ubisoft. I was just pointing out why the reason or excuse they gave is BS. As I said, there is a reason for why they did what they did. And I know what it is; it's nothing bad per se... just that the reason they gave is not true, and honestly insulting if you know anything about how game render pipelines work.

And their use of GPGPU has nothing to do with this.

I also think some people don't realize how little bandwidth or actual data any CPU task uses. To help put things in perspective: the CPU you are using to read this right now (assuming you are reading this on some sort of desktop/laptop whose OS runs in DDR3 RAM, and assuming it's among the fastest DDR3 out there) has a peak memory bandwidth of under 20GB/s; more like 17.8GB/s to be exact. Now do you think a game that will run on that same PC, along with its memory-hungry OS, all within the confines of that 20GB/s, is somehow bandwidth-starved on consoles with 60GB/s+ and 190GB/s+ of memory bandwidth?

Oh, and thanks for the well wishes... I have only been working on that particular app on and off for the past 3 years lol. It's becoming clear to me it's not a one-man job.

Fun fact:
Do you know you can completely saturate all the processing resources of a CPU, and literally bring that CPU to its knees, with code no bigger than 4KB?


Unless we put on tinfoil hats and assume a big conspiracy to downgrade the game, we have to assume Ubi stumbled into some kind of problem reaching 1080p while retaining the full scope of the game.

And as you pointed out, console memory bandwidth is very high, and devs try to feed as much as possible to the GPU. But the CPU-bottleneck claim is understandable:

If the CPU is demanding too much bandwidth, that chunk will be unavailable to the GPU for as long as the CPU is busy. And a lot of computation has to be done in the 33ms frame, back and forth CPU->GPU; even if the GPU becomes idle, it has to wait for the CPU to finish its tasks and free the memory.
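To make the 33ms figure concrete: at 30fps each frame has a budget of 1000/30 ≈ 33.3ms, and whether the CPU and GPU work overlap or serialize decides what fits in it. A toy Python model (the millisecond costs and the `frame_fits` helper are hypothetical, purely for illustration):

```python
FRAME_RATE = 30
FRAME_BUDGET_MS = 1000 / FRAME_RATE  # ~33.3 ms per frame at 30 fps

def frame_fits(cpu_ms: float, gpu_ms: float, pipelined: bool = True) -> bool:
    """In a pipelined renderer the CPU prepares frame N+1 while the GPU
    draws frame N, so the budget applies to the slower of the two.
    In a fully serial loop (GPU waits on the CPU every frame) the two
    costs add up, which is the stall described above."""
    cost = max(cpu_ms, gpu_ms) if pipelined else cpu_ms + gpu_ms
    return cost <= FRAME_BUDGET_MS

# 20 ms of CPU work + 25 ms of GPU work misses 30 fps when serialized...
print(frame_fits(20, 25, pipelined=False))  # False
# ...but fits comfortably when the two overlap.
print(frame_fits(20, 25, pipelined=True))   # True
```

This is a deliberately simplified model (it ignores bus contention entirely), but it shows why a GPU left waiting on CPU tasks can blow a frame budget even when neither processor is individually over 33ms.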

I don't know if the whole PS4 memory pool can be read/written in the same cycles, or how deep the pipeline could get before the CPU starts starving. If I remember correctly, even Cerny said that was the reason they considered eDRAM for a long time during development, but they decided on a big and fast pool of GDDR5.

There are bottlenecks everywhere. It doesn't matter how advanced the architecture is.



Dark_Feanor said:


Unless we put on tinfoil hats and assume a big conspiracy to downgrade the game, we have to assume Ubi stumbled into some kind of problem reaching 1080p while retaining the full scope of the game.

And as you pointed out, console memory bandwidth is very high, and devs try to feed as much as possible to the GPU. But the CPU-bottleneck claim is understandable:

If the CPU is demanding too much bandwidth, that chunk will be unavailable to the GPU for as long as the CPU is busy. And a lot of computation has to be done in the 33ms frame, back and forth CPU->GPU; even if the GPU becomes idle, it has to wait for the CPU to finish its tasks and free the memory.

I don't know if the whole PS4 memory pool can be read/written in the same cycles, or how deep the pipeline could get before the CPU starts starving. If I remember correctly, even Cerny said that was the reason they considered eDRAM for a long time during development, but they decided on a big and fast pool of GDDR5.

There are bottlenecks everywhere. It doesn't matter how advanced the architecture is.

There is more to all this, though... but no need to get into all that.

The simple reason why the game runs at 900p on both consoles is optimization, or the lack thereof. ESRAM does not give the XB1 any kind of advantage over the PS4, but I will talk about that some other time. In truth, there is not a single thing hardware-wise that the XB1 has that is better... well, except the higher-clocked CPU.

The game was simply built primarily on the XB1. Plain and simple. They got the game running at 900p/30fps on the XB1 and are just going to (if they haven't already) do a very simple straight port to the PS4. It is easier porting to the PS4 from the XB1 than porting to the XB1 from the PS4. Basically, if you can get anything running on the XB1, then it will run with minimal effort on the PS4.

There is nothing wrong with them prioritizing XB1 development; if I were them, it's exactly what I would have done too. It's just that they didn't feel the need to get the PS4 version optimized; they figured 900p is good enough for everyone. Which is OK... their excuse just wasn't true, though.



Intrinsic said:
ethomaz said:

I agree, what Ubi said is bullshit.

AC didn't even use GPU Compute.

To be fair, I don't think any game made for the PS4 thus far, or coming out in the next year, even from Sony first party, has bothered to start using GPU compute yet.


Only Housemarque with Resogun and Q-Games with their new game (it's explained here) use compute.

Cerny also gave a roadmap for GPGPU use, and in fact no engine yet runs as a proper PS4-spec engine with GPGPU.
Cerny told Gamasutra that those engines will debut in 2015 (here).

Guerrilla used SPURS (SPU Runtime System), which allows developers to simulate the Cell's SPUs on the PS4.

And to finish, even Guerrilla Games, Sucker Punch, Naughty Dog and Quantic Dream (for The Dark Sorcerer demo) use PS3 engines upgraded for the PS4.



Dark_Feanor said:
Source?

Is that your opinion? If so, what are your qualifications?

But you asked:

"what happend to the 40% more GPU that the PS4 has? "

9 pregnant women don't output 1 baby each month.

That is basically why we have seen a lot of resolution deltas, and only Tomb Raider with a locked vs. unlocked frame rate. And that is why, in games with the same resolution on both consoles, the XOne only takes a ~10% hit in intense scenes.

9 pregnant women output 9 babies, and that, if correctly scheduled, can lead to an output of 1 baby each month. What's the point?
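That exchange is the classic latency-vs-throughput distinction: adding more parallel workers (or compute units) doesn't make any single task finish sooner, it raises how many tasks finish per unit of time. A small Python sketch of the staggered-scheduling idea (`delivery_months` is an illustrative name for this analogy, nothing more):

```python
def delivery_months(latency=9, workers=9):
    """Stagger the start of each 'worker' by one month. Every job still
    takes `latency` months (latency is unchanged), but once the pipeline
    is full, completions arrive once per month (throughput improves)."""
    return [start + latency for start in range(workers)]

print(delivery_months())  # [9, 10, 11, 12, 13, 14, 15, 16, 17]
```

Which is exactly why a GPU with 40% more compute units can push 40% more pixels per frame without making any individual pixel render faster.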



No offense, but I trust a developer over a forum poster.

There are a lot of know-it-all articles on the net from people who have agendas.



TheAdjustmentBureau said:
No offense, but I trust a developer over a forum poster.

There are a lot of know-it-all articles on the net from people who have agendas.

None taken... You could also, you know, try to educate yourself about these things. If a firefighter tells you fire isn't gonna burn you if you put your hand in it, should you trust him just because he is a firefighter?

What you are saying is basically that a dev/producer cannot lie about the game they are making. And sorry, what exactly is my agenda here?

Ah well, never mind.



TheAdjustmentBureau said:
No offense, but I trust a developer over a forum poster.

There are a lot of know-it-all articles on the net from people who have agendas.


So, if you have no agenda and you trust the developer, trust this from a Ubisoft employee: http://www.worldsfactory.net/2014/10/15/ps4-gpgpu-doubles-xb1-gpgpu-ubisoft-test



Predictions for end of 2014 HW sales:

 PS4: 17m   XB1: 10m    WiiU: 10m   Vita: 10m

 

Aerys said:
TheAdjustmentBureau said:
No offense, but I trust a developer over a forum poster.

There are a lot of know-it-all articles on the net from people who have agendas.


So, if you have no agenda and you trust the developer, trust this from a Ubisoft employee: http://www.worldsfactory.net/2014/10/15/ps4-gpgpu-doubles-xb1-gpgpu-ubisoft-test

Exactly. I have quickly learnt that people who are too quick to dismiss anything are usually the ones that have agendas. Especially when it's so easy to get valid and accurate information these days.

What's really funny is that everything I have said are things that developers have said themselves, or that anyone with a snippet of computer knowledge can see for themselves with just a little research.

I would really have loved for him to point out whatever it was that I said that he felt was not true. But he didn't, so I didn't bother explaining.