the-pi-guy said:
VanceIX said:

Fixed it in an edit myself right after posting, sorry.

And your example is off. The CPU will in almost all cases need just as much RAM as the GPU, especially if you are playing a high-end game with a lot happening on screen. In that case the GPU suffers, because the CPU will be using 3-4GB of RAM and leaving little for the GPU, or vice versa, so the dev has to choose between optimizing RAM usage for the CPU or for the GPU. When you have dedicated graphics, the GPU can use all 2-3GB of its own GDDR5, and the CPU can use the full 8GB available to it, without either running into problems.

Now, if the PS4 had 12GB of RAM, with 9GB available for devs, this wouldn't be a problem, as both the CPU and GPU would have plenty of room to work with, but that isn't the case. 5.5GB shared between both leaves little breathing room.

What? Then what is the use of having 8GB for the CPU and 3GB for the GPU in a PC? By that logic, you should only need 3GB for the CPU.

Killzone used 3 GB for video memory and 1.5 GB for other things.   

‣ Three memory areas:
‣ System (CPU): 1,536 MB
‣ Shared (CPU + GPU): 128 MB
‣ Video (GPU): 3,072 MB
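
For what it's worth, those three areas add up to roughly 4.6GB. A quick back-of-the-envelope tally in Python, using the figures from the breakdown above and the ~5.5GB developer budget mentioned earlier in this thread (that budget number is the one quoted above, not an official spec):

# Tally of the Killzone memory areas listed above.
# The ~5.5GB developer budget is the figure quoted earlier in this thread, not an official spec.
areas_mb = {
    "System (CPU)": 1536,
    "Shared (CPU + GPU)": 128,
    "Video (GPU)": 3072,
}

total_mb = sum(areas_mb.values())   # 4736 MB, about 4.62 GB
budget_mb = 5.5 * 1024              # assumed developer budget from the thread

print(f"total: {total_mb} MB ({total_mb / 1024:.2f} GB)")
print(f"headroom vs ~5.5 GB: {budget_mb - total_mb:.0f} MB")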

There are no problems with having shared memory.  

There are no problems with having shared memory, but there is a problem with having too little of it.

And I was referring simply to the PS4, not PCs, on the GPU/CPU memory optimization point. On a PC, the 8GB of system memory provides much more buffer space than the PS4 has, period, because 8GB of dedicated RAM is much better than 3GB, and a good dev will use as much RAM as they can to speed up their game.

And in Infamous SS, most of the available memory went to the system, with only 1.5 GB dedicated to the GPU.
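
To put some rough numbers on the trade-off being argued here: with dedicated pools the CPU and GPU each own their full budget, while with a unified pool the same total has to be split, so giving one side more means taking it from the other. A small Python sketch of the budgeting, using hypothetical round figures (8GB + 3GB for the PC case, 5.5GB for the unified case, as discussed above) rather than any official spec; it only illustrates the split, not actual engine behavior:

def split_unified(total_gb, cpu_share):
    # Split one unified pool between CPU and GPU by a chosen ratio.
    cpu_gb = total_gb * cpu_share
    gpu_gb = total_gb - cpu_gb
    return cpu_gb, gpu_gb

# Dedicated pools (PC-style): each side owns its memory outright.
pc_cpu_gb, pc_gpu_gb = 8.0, 3.0
print(f"dedicated: CPU {pc_cpu_gb} GB, GPU {pc_gpu_gb} GB")

# Unified pool (console-style): ~5.5 GB for the game, split however the dev chooses.
for share in (0.3, 0.5, 0.7):
    cpu_gb, gpu_gb = split_unified(5.5, share)
    print(f"unified 5.5 GB, CPU share {share:.0%}: CPU {cpu_gb:.2f} GB, GPU {gpu_gb:.2f} GB")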


