
Forums - Sony Discussion - Low clock speed a challenge for PS4 development too (Project CARS dev)


curl-6 said:
fatslob-:O said:
curl-6 said:
zarx said:

The trouble with using the GPGPU to make up for the weakness of the CPU is that GPGPU is only good at tasks that can be highly parallelized. Which leaves the serialised tasks which the Jaguar CPU is weak at.

Out of curiosity, what kind of tasks are serialised or parallelized?

Tasks such as managing the resources of the game engine are usually serialized whereas rendering is parallelized. That is why GPUs have been doing NOTHING BUT GRAPHICS for a very long time until recently.

Thanks. I'm trying to educate myself on this stuff, haha.

There are other tasks related to game interaction such as physics simulations too.

https://www.youtube.com/watch?v=O04ErnJ8USY (Destructible environment.)

https://www.youtube.com/watch?v=LhcgjPFykXY (Fluid dynamics.) 
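To make the serial-versus-parallel distinction concrete, here is a toy Python sketch (not from the thread; both functions and their data are made up for illustration). The first task is parallelizable because every element can be processed independently, which is the kind of work a GPU's thousands of threads excel at; the second forms a dependency chain, where each step needs the previous result, which is the kind of serialised work that suits a CPU.

```python
# Parallelizable: each pixel is independent, so thousands of GPU
# threads could each handle one element at the same time.
def brighten(pixels, amount):
    return [min(p + amount, 255) for p in pixels]  # no element depends on another

# Serialized: step i needs the result of step i-1, so the work is a
# dependency chain that cannot be split across threads.
def simulate_balance(start, transactions):
    balance = start
    for t in transactions:
        balance = balance + t  # must wait for the previous iteration
    return balance

print(brighten([10, 250, 100], 20))        # [30, 255, 120]
print(simulate_balance(100, [5, -20, 7]))  # 92
```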



fatslob-:O said:
curl-6 said:
fatslob-:O said:
curl-6 said:
zarx said:

The trouble with using the GPGPU to make up for the weakness of the CPU is that GPGPU is only good at tasks that can be highly parallelized. Which leaves the serialised tasks which the Jaguar CPU is weak at.

Out of curiosity, what kind of tasks are serialised or parallelized?

Tasks such as managing the resources of the game engine are usually serialized whereas rendering is parallelized. That is why GPUs have been doing NOTHING BUT GRAPHICS for a very long time until recently.

Thanks. I'm trying to educate myself on this stuff, haha.

There are other tasks related to game interaction such as physics simulations too.

https://www.youtube.com/watch?v=O04ErnJ8USY (Destructible environment.)

https://www.youtube.com/watch?v=LhcgjPFykXY (Fluid dynamics.) 

And are those parallelized or serialized?



curl-6 said:
fatslob-:O said:
curl-6 said:
fatslob-:O said:
curl-6 said:
zarx said:

The trouble with using the GPGPU to make up for the weakness of the CPU is that GPGPU is only good at tasks that can be highly parallelized. Which leaves the serialised tasks which the Jaguar CPU is weak at.

Out of curiosity, what kind of tasks are serialised or parallelized?

Tasks such as managing the resources of the game engine are usually serialized whereas rendering is parallelized. That is why GPUs have been doing NOTHING BUT GRAPHICS for a very long time until recently.

Thanks. I'm trying to educate myself on this stuff, haha.

There are other tasks related to game interaction such as physics simulations too.

https://www.youtube.com/watch?v=O04ErnJ8USY (Destructible environment.)

https://www.youtube.com/watch?v=LhcgjPFykXY (Fluid dynamics.) 

And are those parallelized or serialized?

Obviously parallelized LOL. Those effects can ONLY hope to ever run on a CPU and if you looked at the first video on top you will see it running on a GTX 680. 



fatslob-:O said:

Obviously parallelized LOL. Those effects can ONLY hope to ever run on a CPU and if you looked at the first video on top you will see it running on a GTX 680. 

Don't you mean GPU? I thought those were for parallelized tasks while CPUs did things in series?



curl-6 said:
fatslob-:O said:

Obviously parallelized LOL. Those effects can ONLY hope to ever run on a CPU and if you looked at the first video on top you will see it running on a GTX 680

Don't you mean GPU? I thought those were for parallelized tasks while CPUs did things in series?

You need to be a little sharper than that.



fatslob-:O said:
curl-6 said:
fatslob-:O said:

Obviously parallelized LOL. Those effects can ONLY hope to ever run on a CPU and if you looked at the first video on top you will see it running on a GTX 680

Don't you mean GPU? I thought those were for parallelized tasks while CPUs did things in series?

You need to be a little sharper than that.

Isn't a GTX 680 a GPU though? If so, then clearly it can't "only hope to ever run on a CPU".



curl-6 said:
fatslob-:O said:
curl-6 said:
fatslob-:O said:

Obviously parallelized LOL. Those effects can ONLY hope to ever run on a CPU and if you looked at the first video on top you will see it running on a GTX 680

Don't you mean GPU? I thought those were for parallelized tasks while CPUs did things in series?

You need to be a little sharper than that.

Isn't a GTX 680 a GPU though? If so, then clearly it can't "only hope to ever run on a CPU".

It's just me belittling CPUs in general and how truly weak they are compared to GPUs.

After all, a CPU can "ONLY HOPE" to ever pull these things off.



Hynad said:

In the first few years of the console cycle, sure. But after that, I think we'll see a lot of devs doing it. I don't think coding to the metal on the PS4 will be as difficult and time consuming as it is with the PS3.


I don't think we will see many developers "coding to the metal".
It was a rarity on the PlayStation 3 and Xbox 360, so with that in mind one has to assume the status quo will continue into this next generation.

Most games use third-party game engines and third-party middleware, then adapt it all to the low-level and/or high-level APIs and call it a day.
For example, Mass Effect, Batman, Bioshock, Borderlands and Gears of War all use the Unreal Engine, which interfaces with the low- or high-level API.
Oblivion and Skyrim use Gamebryo or a variation thereof, with various middleware like SpeedTree in Oblivion. (Oblivion used a high-level API on the consoles, which is why it ran and looked like crap there, yet ran like silk on PCs vastly slower than the Xbox 360.)
Don't forget the plethora of EA games pushing Frostbite, plus the CryEngine, Unity and Source Engine powered games, which aren't done to-the-metal either. (Probably many more in that list!)

The obvious cases where "coding to the metal" had its advantages were the first-party developers who built their own game engines or heavily customised an old one, as was the case with Halo 4. (Albeit it used a lot of pre-baked lighting and shadowing, removed the tessellated water, and had simpler geometric architecture in order to drive up everything else.)
Such games really had nothing else come near them in terms of image quality, even after a year on the market.

However, the PS4 should be easier to code to the metal on than the PS3: there are lots of third-party open-source tools, plenty of documentation, and skilled developers who have been working with x86 and AMD/ATI GPUs for decades. That's going to pay off in spades, not just for the PlayStation but for the Xbox and PC too.



--::{PC Gaming Master Race}::--

curl-6 said:
RenCutypoison said:
Great article, but multi-threading difficulties are no news. Time for devs to start optimizing. (Remember Jonathan Blow.)

This is the first I've heard of devs having difficulties with the PS4's clock speed, so unless I missed something, it's news in that sense.

It was foreseeable though, given the issues Wii U had with the same problem, and the lower clock speeds of PS4/Xbone compared to PS3/360.


It is compared to a high-end PC, not the PS3/360. The PS3 had a 550MHz-clocked GPU; the PS4's is clocked at around 853MHz.



RenCutypoison said:
curl-6 said:
RenCutypoison said:
Great article, but multi-threading difficulties are no news. Time for devs to start optimizing. (Remember Jonathan Blow.)

This is the first I've heard of devs having difficulties with the PS4's clock speed, so unless I missed something, it's news in that sense.

It was foreseeable though, given the issues Wii U had with the same problem, and the lower clock speeds of PS4/Xbone compared to PS3/360.


It is compared to a high-end PC, not the PS3/360. The PS3 had a 550MHz-clocked GPU; the PS4's is clocked at around 853MHz.

The PS4's CPU runs at around 1.6GHz, just half of the PS3's and 360's 3.2GHz CPUs.
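A quick back-of-the-envelope check of that comparison. Note that raw clock is only part of per-core speed: work done per cycle (IPC) also matters, and it differs between the older in-order console cores and the newer out-of-order Jaguar cores. The IPC figures in this sketch are illustrative guesses, not measured values.

```python
# Clock speeds as cited in the thread.
ps3_cpu_ghz = 3.2   # PS3 / Xbox 360 CPU clock
ps4_cpu_ghz = 1.6   # commonly reported PS4 Jaguar clock

print(ps3_cpu_ghz / ps4_cpu_ghz)  # 2.0 -- half the clock, as stated

# A crude per-core throughput proxy: clock times instructions-per-clock.
def relative_speed(ghz, ipc):
    return ghz * ipc

# With hypothetical IPC values (0.6 for an in-order core, 1.2 for an
# out-of-order core), the per-core gap shrinks well below 2x:
print(relative_speed(3.2, 0.6) / relative_speed(1.6, 1.2))  # 1.0
```

The point of the toy numbers is only that "half the clock" does not automatically mean "half the per-core performance"; actual IPC depends on the workload and the microarchitecture.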