Zkuq said:
Intrinsic said:
You need to understand how games run to begin with. In simple terms:

It takes a certain amount of time to render each frame. Both the CPU and the GPU need to finish their tasks before a new frame can be produced.

Even if you make the GPU 10 times more powerful, if it still takes the CPU the same amount of time to finish its tasks as before, then the 10x GPU will still give you the same maximum framerate.

In other words, a better GPU alone isn't enough for higher (let alone double) framerates if the CPU remains the same. Yes, the PS4 Pro's CPU is clocked slightly higher, but unless a game was already internally running at 45-55 fps while locked to 30 fps on the PS4, the PS4 Pro won't suddenly make it run at 60 fps.
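A minimal sketch of that bottleneck model, with made-up frame times: the frame rate is capped by whichever processor takes longer per frame.

```python
# Whichever of CPU or GPU takes longer per frame sets the maximum
# frame rate. The millisecond figures below are hypothetical.
def max_fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

# CPU-bound case: CPU needs 33.3 ms, GPU needs 20 ms -> ~30 fps.
print(round(max_fps(33.3, 20.0)))  # 30
# A 10x faster GPU (2 ms) changes nothing while the CPU still takes 33.3 ms.
print(round(max_fps(33.3, 2.0)))   # still 30
```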

Does the workload of the CPU increase linearly with framerate, though? If you (want to) double the framerate, does the CPU's workload also double, or is it less (and if so, how much less)? I'm talking about a vague 'typical game' here. At first glance I'd imagine the CPU doesn't get that much more work to do if the framerate is simply increased, because game logic should still be able to run at the same pace as before (at least ideally). I don't doubt what you said, I just want to understand better how big a difference a framerate increase makes for the CPU.

It depends on the algorithmic complexity of the task, but even if we assume it's always just linear, doubling the framerate means double the amount of data for the CPU to process in the same time frame; thus, double the clock is needed (assuming the IPC stays the same, of course).
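Under that linear assumption, the required CPU speedup is just the ratio of the framerates, since each frame's budget shrinks proportionally. A trivial sketch (the fps values are only examples):

```python
# Linear-scaling assumption: the same per-frame CPU work must fit
# into a smaller time budget, so required throughput (clock * IPC)
# scales with the target/base framerate ratio.
def required_cpu_speedup(base_fps: float, target_fps: float) -> float:
    return target_fps / base_fps

print(required_cpu_speedup(30, 60))  # 2.0 -> double the clock at equal IPC
```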