
Forums - Sony - Why does PS4 Pro not have an option for 60fps?

 

Which do you prefer for Pro?

1080p 60fps with a lot better graphics: 118 (81.38%)
2K 60fps with better graphics: 12 (8.28%)
4K 30fps upscaled with better graphics: 7 (4.83%)
4K 60fps upscaled with similar graphics: 8 (5.52%)

Total: 145

It's up to the devs, but Sony didn't upgrade the CPU, so it can be an issue. ROTR is bringing 3 modes: 4K@30fps, 1080p@30fps with better visuals, and 1080p with unlocked framerate.

I've also read somewhere that multiplayer games won't be allowed to have better performance on the Pro, but most MP games are already 60 fps anyway.




2K is quite an interesting term... because it almost certainly doesn't mean what the OP meant when he created the poll. 2K is nearly the same as 1080p. In the "nK" naming, the horizontal resolution is roughly n thousand pixels. For example, 4K means the horizontal resolution is about 4000 pixels.
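To put rough numbers on that naming convention, here's a quick sketch (the "nK" labels are marketing approximations, not exact values):

```python
# Common display resolutions and their approximate "nK" labels.
# The "nK" name refers to the horizontal pixel count in thousands,
# which is why 1080p (1920 wide) is effectively "2K".
resolutions = {
    "1080p (Full HD)": (1920, 1080),  # ~1.9K wide, i.e. "2K"
    "1440p (QHD)":     (2560, 1440),  # ~2.6K wide
    "4K (UHD)":        (3840, 2160),  # ~3.8K wide, i.e. "4K"
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w}x{h} -> roughly {w / 1000:.1f}K wide")
```

Note that 4K UHD has exactly twice the width and height of 1080p, i.e. four times the pixel count.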

Anyway, whatever 2k is supposed to mean, effects and such beat resolution almost any day, and good framerate beats almost everything else. That makes 1080p, 60 FPS, and superior effects the best choice.



PS4 Pro was not designed with converting 30fps games to 60fps games in mind.

Eurogamer reported that Sony were considering two different outcomes: the $399 system we're seeing now (the one that went out to devs and was leaked) or a potential $499 system with a new CPU. The latter would have made 60fps very comfortable for devs to achieve, but Sony opted for the more consumer-friendly price tag. Business-wise they made the right decision.



Intrinsic said:
You need to understand how games run to begin with. In simple terms:


It takes a certain amount of time for each frame to be rendered. Both the CPU and the GPU need to finish their tasks, which results in a new frame.

Even if you make a GPU 10 times more powerful, if it still takes the CPU the same amount of time to finish its tasks as it did before, then the 10x GPU will still give you the same maximum framerate.

In other words, having a better GPU isn't enough for higher/double framerates if the CPU remains the same. Yes, there is a slight upclock to the CPU, but unless a game was already internally running at 45-55fps but locked to 30fps on the PS4, the PS4 Pro wouldn't suddenly make that game start running at 60fps.
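The point above can be sketched as a toy model (the frame times here are hypothetical, not real PS4 measurements): each frame is gated by whichever of the CPU and GPU finishes last, so speeding up only the GPU eventually hits a CPU wall.

```python
def max_fps(cpu_ms, gpu_ms):
    """Maximum framerate when CPU and GPU work can overlap:
    the slower of the two gates each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 30.0  # hypothetical CPU time per frame
gpu_ms = 33.0  # hypothetical GPU time per frame

print(max_fps(cpu_ms, gpu_ms))       # ~30 fps: GPU-bound
print(max_fps(cpu_ms, gpu_ms / 10))  # ~33 fps: 10x faster GPU, now CPU-capped
```

With a 10x faster GPU the framerate barely moves, because the 30 ms of CPU work per frame now sets the ceiling.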

Does the workload of the CPU increase linearly with framerate though? If you (want to) double the framerate, is the workload of the CPU also doubled or is it less (and if it is, how much less)? I'm talking about a vague 'typical game' here. At first glance I would imagine the CPU doesn't get that much more work to do if the framerate is simply increased, because game logic should still be able to run at the same pace as before (at least ideally). I don't doubt what you said, I just want to understand better how big of a difference a framerate increase makes for the CPU.



Zkuq said:
Intrinsic said:
You need to understand how games run to begin with. In simple terms:


It takes a certain amount of time for each frame to be rendered. Both the CPU and the GPU need to finish their tasks, which results in a new frame.

Even if you make a GPU 10 times more powerful, if it still takes the CPU the same amount of time to finish its tasks as it did before, then the 10x GPU will still give you the same maximum framerate.

In other words, having a better GPU isn't enough for higher/double framerates if the CPU remains the same. Yes, there is a slight upclock to the CPU, but unless a game was already internally running at 45-55fps but locked to 30fps on the PS4, the PS4 Pro wouldn't suddenly make that game start running at 60fps.

Does the workload of the CPU increase linearly with framerate though? If you (want to) double the framerate, is the workload of the CPU also doubled or is it less (and if it is, how much less)? I'm talking about a vague 'typical game' here. At first glance I would imagine the CPU doesn't get that much more work to do if the framerate is simply increased, because game logic should still be able to run at the same pace as before (at least ideally). I don't doubt what you said, I just want to understand better how big of a difference a framerate increase makes for the CPU.

It depends on the algorithmic complexity of the task, but even if we assume it's always just linear: double the framerate -> double the amount of data to process on the CPU in the same time frame, thus double the clock is needed (assuming the same IPC, of course).




Wow, look at the poll results! That's quite telling. The overwhelming majority is perfectly happy with 1080p and prefers a higher framerate.

I wonder how many have actually seen 1440p or 4K gaming in action. I have never seen a game in those resolutions, only video content on TVs in electronics stores.



GOWTLOZ said:
aLkaLiNE said:

I think the real takeaway from these threads is that we have a whole slew of Xbox gamers interested in the PS4 Pro. Come on in, the water's fine (:

 

OT: up to the devs, always has been. 60fps was achieved on the PS2, so it's really a matter of priority. Better graphics evidently market better than higher framerates or more pixels. Diminishing returns? Idk

Sony should make it a requirement, as it's driving many gamers from console to PC; 60fps matters a lot to many people.

I eagerly await your source of information, considering it's the second-fastest-selling console ever, which directly contradicts everything you just said.



Because the console was designed to improve resolution, not framerate, hence the nearly unchanged CPU. Too many people seem to think the GPU is all that matters.



The PS4 has an option for 60fps so I'm assuming the Pro does too.



 



OttoniBastos said:
Zkuq said:

Does the workload of the CPU increase linearly with framerate though? If you (want to) double the framerate, is the workload of the CPU also doubled or is it less (and if it is, how much less)? I'm talking about a vague 'typical game' here. At first glance I would imagine the CPU doesn't get that much more work to do if the framerate is simply increased, because game logic should still be able to run at the same pace as before (at least ideally). I don't doubt what you said, I just want to understand better how big of a difference a framerate increase makes for the CPU.

It depends on the algorithmic complexity of the task, but even if we assume it's always just linear: double the framerate -> double the amount of data to process on the CPU in the same time frame, thus double the clock is needed (assuming the same IPC, of course).

This is not correct.

It depends on the type of task.

Graphics tasks, such as painting the geometry, shaders and polygons, are heavily GPU-dependent. A twice-as-fast GPU can render images roughly twice as fast, but the increased number of draw calls the CPU has to issue for that only increases the CPU load by 10% or so in a typical graphics-intensive game, because most of the CPU tasks remain unchanged, such as calculations for AI, physics, scripts, user input, etc.

So the PS4 Pro is perfectly balanced as long as the games remain within the same generation, so to speak. It's designed so that you can double the framerate or resolution or other tasks that the GPU handles, such as better lighting, anti-aliasing, filtering, shadows and post-processing visual effects, but not for more complex physics (think Battlefield and its destruction, which is calculated by the CPU) or AI and world systems (think all the behaviour, interaction and positions of units in an RTS, or all the world simulation, stats and items handled by the CPU in Skyrim).

So in summary, doubling the framerate does not require double the computational power for AI, physics or simulation (typical CPU tasks), but it does require a GPU that is at least twice as fast. In nearly all games, the AI behaviour, physics and simulation stay the same no matter whether the game is played on a weak PC or a strong PC, on the PS4 Pro or on the Wii U.
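That split can be sketched as a toy model (all numbers hypothetical): simulation work (AI, physics) can tick at a fixed rate regardless of the display framerate, while render submission (draw calls) runs once per displayed frame, so only the latter grows when you double fps.

```python
def cpu_load_ms_per_second(fps, sim_hz=30, sim_ms=20.0, submit_ms=3.0):
    """CPU work per wall-clock second, in milliseconds:
    simulation runs at a fixed tick rate (sim_hz ticks/s, sim_ms each);
    render submission runs once per displayed frame (submit_ms each)."""
    return sim_hz * sim_ms + fps * submit_ms

base = cpu_load_ms_per_second(30)     # 30*20 + 30*3 = 690 ms of CPU work per second
doubled = cpu_load_ms_per_second(60)  # 30*20 + 60*3 = 780 ms of CPU work per second

print(doubled / base)  # ~1.13: doubling fps raises CPU load ~13%, not 2x
```

With these made-up numbers, doubling the framerate only adds about 13% CPU load, in the same ballpark as the "10% or so" figure above, while the GPU, which does its full job every frame, genuinely needs to be twice as fast.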