
Forums - Gaming - Tech Talk Part 1: PS4 GPU Compute Explained, and some other stuff; AC: Unity CPU load, XB1 ESRAM 204 GB/s, etc.

Intrinsic said:

Not once did I imply, in relation to Assassin's Creed, that GPU compute on the PS4 could somehow be used to improve the performance of the game. Rather, what I implied was that if both the PS4 and XB1 spend the same or nearly the same time processing their CPU-based tasks before passing the render data on to the GPU, the fact that the PS4's GPU is simply more powerful means that, in the time left for both GPUs to do their thing, the PS4 should be able to do more GPU-based work than the XB1. I can't even begin to understand how you or anyone could read all this and conclude I was saying GPU compute could be used to improve AC:U on the PS4. The other guy in this thread made the same accusation; I just assumed he, and now maybe you too, simply didn't understand what I was saying.


If you're bottlenecked by your CPU, it doesn't matter what the GPU can do, especially if your rendering/lighting engine is CPU-bound. I'm calling bullshit on your calling bullshit.

In order for Unity to be updated such that the GPU could be used to improve AC:U on the PS4, the entire game engine would likely have to be rewritten, not only away from nVidia tech optimization but also specifically to utilize PS4 hardware, giving the game two separate development chains... which is exactly what developers were trying to get away from in this age of skyrocketing costs. (There's also doubt about whether the CPU-bound bottlenecks can even be circumvented by GPGPU, since we lack details on the whole matter (GPGPU is not some magic wand that suddenly offloads CPU workloads), aside from an insider developer stating that getting this game working at all on these consoles with high-fidelity graphics has been nothing short of a clusterfuck nightmare.)

Moreover, if you offload an inefficient process from the CPU to GPGPU, you're robbing the game of the ability to spend that GPU power on other aspects of IQ; this is, after all, a zero-sum power game we're working with. For all we know, when Unity launches, the PS4 version may boast better texture detail and other graphical qualities thanks to its stronger GPU, qualities that would have had to be sacrificed in the scenario where you offload a potentially ruinously inefficient process from the CPU to GPGPU just to reach 1080p.

I don't disagree with the rest of your statements, as they're common-sense facts: the PS4 has a stronger GPU and can do more with said GPU and its ACEs (and the lack of ESRAM). Yeah, Ubisoft is full of shit, but they're not crippling the game out of some twisted sense of satisfaction. There are deadlines, costs, and who knows what else that you and I aren't privy to in our armchair development discussions.




GPGPU during GPU idle time is basically a free boost. It's like a car's turbocharger, which uses the exhaust gases to increase power.

Does anyone know the X1's capabilities in this area? I know the PS4's 64 compute queues (the X1 has 2) really help with this idle time, since you have the full 1.84 teraflops to use during it.

My expectation: Microsoft didn't really add to the GPGPU features you get out of the box, mainly because of their work on other parts of the APU and their cloud focus. They can add more support in an SDK, but I don't think they can add more compute queues with a software change.



PS, PS2, Gameboy Advance, PS3, PSP, PS4, Xbox One

Intrinsic said:
TheAdjustmentBureau said:


There's no derail. Just because someone sees through something and it doesn't fit your agenda. AC Unity is multiplatform. Sucker Punch had far more time with the PS4 than the Unity team. Infamous had 25 Boca max onscreen. The OP asked for Xbox One vs PS4.

Wow.. ah well...

I guess you would still find a way to make this about the XB1 vs PS4 even if the OP were talking about the Wii U and the Dreamcast.

Someone pointing out the bad in something does not mean he is automatically saying something else is better. That kind of mindset will make you see everything as if it were some sort of fanboy war. Do whatever makes you happy, I guess. If you think I am picking on the XB1 here, I wonder what you'll say when you read part two.


In your OP you very clearly brought into the equation a parity clause that doesn't exist, calling BS on developers' info, etc., to fit an agenda. It's clear after your comments what you intend with these tech talk posts. So I'll let you guys carry on. I'll leave you to it.



XB1 vs PS4 is not a derail. Making this thread about "agendas" and referenda on users' habits or intentions is definitely a derail. Personal jabs stop NOW or bans will follow.



Monster Hunter: pissing me off since 2010.

Intrinsic said:

It's interesting, because if, say, the CPU-heavy load takes 25 ms to complete its task for the next frame and then passes the render instructions off to the GPU, then theoretically limiting the resolution means they wanted to give the GPU less work to do, so it spits out the frame on time and still hits that 33 ms limit. But this is where there's a problem with the story. If the GPUs have only 8 ms to complete their task and the XB1 completes that with a 900p frame, what happened to the 40% more GPU that the PS4 has? The PS4 should be able to complete the exact same task 40% faster than the XB1.

What you're describing is a case where the game is trying to reduce input latency, but that's not usually how things are done, except in, for example, VR games (Oculus, Morpheus, etc.).

Usually games use pipelining, so the GPU processes frame N while the CPU processes frame N+1, giving both the CPU and GPU the full 33 ms.
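That pipelined schedule can be sketched with a toy throughput model (a minimal illustration with made-up timings, not real console measurements):

```python
# Toy model of frame pipelining: the CPU prepares frame N+1 while the
# GPU renders frame N, so each processor gets the full frame budget
# instead of splitting it. All numbers are illustrative, not measured.

CPU_MS = 20.0   # hypothetical CPU time per frame
GPU_MS = 13.0   # hypothetical GPU time per frame
FRAMES = 100

# Serial (no pipelining): each frame waits for the CPU, then the GPU.
serial_total = FRAMES * (CPU_MS + GPU_MS)

# Pipelined: after the first frame fills the pipe, a finished frame
# comes out every max(CPU_MS, GPU_MS) milliseconds.
pipelined_total = (CPU_MS + GPU_MS) + (FRAMES - 1) * max(CPU_MS, GPU_MS)

print(f"serial:    {serial_total / FRAMES:.1f} ms per frame")
print(f"pipelined: {pipelined_total / FRAMES:.1f} ms per frame")
```

Once the pipe is full, frame time is limited by the slower of the two stages rather than their sum, which is why both processors effectively get the whole budget.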



My 8th gen collection

beeje13 said:

GPGPU during GPU idle time is basically a free boost. It's like a car's turbocharger, which uses the exhaust gases to increase power.

Does anyone know the X1's capabilities in this area? I know the PS4's 64 compute queues (the X1 has 2) really help with this idle time, since you have the full 1.84 teraflops to use during it.

My expectation: Microsoft didn't really add to the GPGPU features you get out of the box, mainly because of their work on other parts of the APU and their cloud focus. They can add more support in an SDK, but I don't think they can add more compute queues with a software change.

The XB1 is also capable of GPU compute, though in the more traditional sense of it. The PS4 can run 64 simultaneous GPGPU queues while the XB1 runs 4. This is simply because more of the PS4's GPU can be used for it. In addition, the PS4's GPU has hUMA, which basically means the GPU can read and write directly to system memory and completely bypass the CPU's caches. In effect, GPGPU compute on the PS4 is an independent task that doesn't take up any CPU cycles.

And yes, MS and Sony approached their APUs very differently. It's actually funny: Sony didn't really add anything to their APU; they just tried to develop what was already on there as much as possible, while MS added ESRAM and data move engines to theirs. The XB1's APU is actually the more expensive of the two.



TheAdjustmentBureau said:

There's no derail. Just because someone sees through something and it doesn't fit your agenda. AC Unity is multiplatform. Sucker Punch had far more time with the PS4 than the Unity team. Infamous had 25 Boca max onscreen. The OP asked for Xbox One vs PS4.

However, Ubisoft has way more people working for them and had already released two huge games: Assassin's Creed IV: Black Flag and Watch Dogs. You don't have any evidence that Sucker Punch had "far more time" with the PS4 than the Unity team.

At the end of the day, this thread is about the PS4's tech, specifically its ability to do GPGPU compute. The major takeaway is that while the PS4 and X1 APUs seem identical, they're actually very different from each other. As Intrinsic already mentioned, the PS4 can run significantly more GPGPU queues and has hUMA, something the X1 doesn't have. The PS4's GPU can bypass the CPU caches and directly access the GDDR5. The X1's APU, on the other hand, doesn't have hUMA, and its CPU cannot directly access the ESRAM. In contrast, the Garlic and Onion buses in the PS4 let the GPU and CPU, respectively, directly access its memory pool. The point of the OP was to teach the ins and outs of the PS4 (and X1). Nothing more, nothing less.



Vena said:


If you're bottlenecked by your CPU, it doesn't matter what the GPU can do, especially if your rendering/lighting engine is CPU-bound. I'm calling bullshit on your calling bullshit.

In order for Unity to be updated such that the GPU could be used to improve AC:U on the PS4, the entire game engine would likely have to be rewritten, not only away from nVidia tech optimization but also specifically to utilize PS4 hardware, giving the game two separate development chains... which is exactly what developers were trying to get away from in this age of skyrocketing costs. (There's also doubt about whether the CPU-bound bottlenecks can even be circumvented by GPGPU, since we lack details on the whole matter (GPGPU is not some magic wand that suddenly offloads CPU workloads), aside from an insider developer stating that getting this game working at all on these consoles with high-fidelity graphics has been nothing short of a clusterfuck nightmare.)

Moreover, if you offload an inefficient process from the CPU to GPGPU, you're robbing the game of the ability to spend that GPU power on other aspects of IQ; this is, after all, a zero-sum power game we're working with. For all we know, when Unity launches, the PS4 version may boast better texture detail and other graphical qualities thanks to its stronger GPU, qualities that would have had to be sacrificed in the scenario where you offload a potentially ruinously inefficient process from the CPU to GPGPU just to reach 1080p.

I don't disagree with the rest of your statements, as they're common-sense facts: the PS4 has a stronger GPU and can do more with said GPU and its ACEs (and the lack of ESRAM). Yeah, Ubisoft is full of shit, but they're not crippling the game out of some twisted sense of satisfaction. There are deadlines, costs, and who knows what else that you and I aren't privy to in our armchair development discussions.

OK, forget about GPGPU for a moment. As far as Ubisoft is concerned, and according to them, they are only using it for the tech that deals with cloth (soft-body) physics. They said this themselves, but this isn't about whether Ubisoft uses GPU compute or not.

I have tried to explain how a render pipeline works. I would have gone into the whole deferred vs. forward rendering thing, but that's big enough to be a thread of its own. So let me put it simply.

  • You need the CPU + GPU to complete their respective tasks in time for the next frame to go to the screen. At 30 fps you have a time allotment of 33 ms.
  • The CPU starts first (frame 1) and does all its processing so it can feed the GPU with what it needs to render the frame. Let's say it takes a hypothetical 20 ms to finish its task.
  • Once finished, it hands the instructions over to the GPU. The CPU's work on frame 1 is done at this point, and it immediately starts working on frame 2.
  • The GPU still needs to render frame 1 and output it to the screen. It now has exactly 13 ms to do this, so the entire first frame takes 33 ms in total: 20 ms CPU + 13 ms GPU.
  • No matter how CPU-bound a game is, at the point where the GPU is rendering frame 1 with the 13 ms it has, it's ALL on the GPU, because the CPU is busy calculating frame 2.
  • So basically, at this point both the XB1's and the PS4's GPUs have exactly 13 ms to spit out a frame.
Do you understand the problem now?
  • Both consoles have approximately the same time to render a frame: 13 ms.
  • The XB1, in that allotted time with its 12-CU GPU, manages to render and output a frame at 900p.
  • How is it possible that the PS4, with its 18-CU GPU, still only manages to render the same 900p in the same amount of time?
  • It's taking the PS4's GPU the same time to do the exact same amount of work even though the PS4's GPU is 50% more powerful than the XB1's. Don't you get it? In part two of this series, where I talk about the ESRAM, I went into calculating the difference between 900p and 1080p: 1080p has ~44% more pixels than 900p. That's typically where the extra power of the PS4 goes. And what's funny is that by default, with minimal optimization, the PS4 will have that GPU-based advantage every single time.
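For reference, the arithmetic behind these bullet points can be checked in a few lines (the 20 ms / 13 ms split is the same hypothetical one used above):

```python
# Quick check of the numbers in the bullet points: the 30 fps frame
# budget, the hypothetical 20 ms CPU / 13 ms GPU split, and how many
# more pixels 1080p has over 900p.

frame_budget_ms = 1000.0 / 30           # ~33.3 ms per frame at 30 fps
cpu_ms = 20.0                           # hypothetical CPU share
gpu_ms = frame_budget_ms - cpu_ms       # what's left for the GPU

pixels_900p = 1600 * 900                # 1,440,000 pixels
pixels_1080p = 1920 * 1080              # 2,073,600 pixels
extra_pixels = pixels_1080p / pixels_900p - 1

print(f"frame budget {frame_budget_ms:.1f} ms, GPU gets {gpu_ms:.1f} ms")
print(f"1080p has {extra_pixels:.0%} more pixels than 900p")
```

Note that 1920×1080 works out to exactly 44% more pixels than 1600×900.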
I never said Ubisoft set out to limit or restrict the PS4; I think I actually referred to the whole parity talk as nonsense. What simply happened here is that the XB1 was the lead platform. Ubi got the game running at 900p@30fps, then ported the game over to the PS4 and found it wasn't as well optimized there, because it had actually been optimized for the other console. So rather than optimize again, they just took the extra power the PS4 already had and let the PS4 do all the work. For the PS4 to be running at 900p just means the game is not optimized for it. And contrary to what you're saying, all devs are supposed to optimize for the respective hardware. And believe me when I say that this gen it's probably 95% easier optimizing from XB1 to PS4 than it was from 360 to PS3.


Intrinsic said:
I never said Ubisoft set out to limit or restrict the PS4; I think I actually referred to the whole parity talk as nonsense. What simply happened here is that the XB1 was the lead platform. Ubi got the game running at 900p@30fps.

 

I understand the details, you don't need to list them out to me.

As for the quoted: yes, this was my point. They built the game on the lowest common denominator. But, as I said, listen to the Bombcast: there is some ambiguity as to whether even the PS4 is capable of more than what Unity produces. The Bombcast went into this, into the insider dev info, and into a test against experts. There is a real issue here in that, simply put, optimizing for GPGPU may not have been able to fix things, or offloading some inefficient process onto the GPU may have cost too much.

Like I said, it's a zero-sum game, but we, as armchair developers, don't know what the sum involves.



Without being on the development team, we'll never really know what could have been with Unity. The PS4 is a bit stronger than the XB1, for sure, in terms of what the OP has written as well as other things, but in the case of Unity, who knows whether that was enough to make a meaningful difference.