curl-6 said:
linkink said:

The CPU plays a major role in maintaining frame rate, which is a huge part of making the graphics look better, whether that's at 60fps, at 30fps, or a game struggling to maintain 30fps. We don't know what tricks they are doing/have learned over the years to offload CPU work to the GPU; better CPU = less work for the GPU in most cases. The stuff Sony exclusives have pulled off with PS4 hardware is just as amazing as what they did with PS3, and it's a very popular opinion online. I'm always seeing "how are they getting the PS4 to do this" posts on message boards, and if The Last of Us 2 matches the E3 trailer, well, that's another mind-blowing feat.

That the games reflect this is your opinion, and a very arguable one.

As Pemalite says, that is not how it works, at all.

The fact is, PS4 is a less complex system than the PS3 was. This is a good thing as it has made it easier for developers to create games for it. But it also means that the kind of progression seen from launch to end of life with the PS3 just cannot happen on PS4 because the system's power was fully accessible from day 1.

That's not to say there has been no progress; rendering techniques and development tools have improved over the years, and this is reflected in games looking better over time. But because there are no large gains to be made from coming to grips with exotic hardware, it's just not possible for PS3's graphical progression to ever be matched on PS4.

The PS3 of course is a more complex system. That doesn't mean the PS4 can't see similar gains, which it has, and it's not some crazy over-the-top opinion. After all, the 360 did see similar gains, if not more, and it wasn't nearly as complex as the PS3. Bigger gains come from rendering techniques and development tools; I think the 360, XB1, and PS4 have proven that. So to say PS4's graphical progression can't match the PS3's because the PS3 was more complex is 100% false.

Pemalite said:
linkink said:

Well, I really don't know how it works, but from what I see on PC, better CPU = higher frame rate, so I would think it would make it easier for developers to get much better results for graphics. If a game can run at 60fps with a much better CPU, doesn't that leave more room for graphics to improve at 30fps?

Except...
Better GPU = higher frame rate.
Faster RAM = higher frame rate.
Tighter RAM timings = higher frame rate.
More memory channels = higher frame rate.
Higher quality motherboard = higher frame rate. (Thanks to better routing resulting in shorter trace lengths, chipset quality, etc.)
More RAM = sometimes higher frame rate.
Faster storage = sometimes higher frame rate. (I.e. better streaming of assets.)

It's a little disingenuous to state that "better CPU = higher framerate" and come to the conclusion you did.

The framerate a game operates at is influenced by each and every single component in a console or PC and the load placed upon said components.

However... the CPU does a lot to assist the rendering of a game, no doubt about it... such as issuing draw calls.
But whether you are GPU or CPU limited depends entirely on the game, the game engine, how many characters/objects are on screen, and so many other factors... even how many audio cues are playing... And that bottleneck can shift instantly depending on the game's scene.
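To make the CPU-bound vs GPU-bound idea concrete, here's a rough sketch of how an engine might compare the two sides each frame. This is not any real engine's code; the workloads and the GPU timing are made-up placeholders standing in for real measurements (a real engine would use GPU timestamp queries):

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

// Placeholder workloads standing in for real engine work; the sleeps and the
// fixed GPU time are made up purely for illustration.
static void update_simulation()    { std::this_thread::sleep_for(std::chrono::milliseconds(6)); }
static void submit_draw_calls()    { std::this_thread::sleep_for(std::chrono::milliseconds(4)); }
static double gpu_render_time_ms() { return 12.0; } // pretend the GPU took 12 ms last frame

int main() {
    using clock = std::chrono::steady_clock;

    for (int frame = 0; frame < 5; ++frame) {
        auto cpu_start = clock::now();
        update_simulation();   // CPU cost grows with characters, objects, audio cues...
        submit_draw_calls();   // CPU cost grows with the number of draw calls
        double cpu_ms =
            std::chrono::duration<double, std::milli>(clock::now() - cpu_start).count();

        double gpu_ms = gpu_render_time_ms(); // GPU cost grows with resolution and effects

        // Whichever side takes longer is the bottleneck for this frame,
        // and that can flip from one scene (or even one frame) to the next.
        std::printf("frame %d: %s (CPU %.1f ms, GPU %.1f ms)\n",
                    frame, cpu_ms > gpu_ms ? "CPU-bound" : "GPU-bound", cpu_ms, gpu_ms);
    }
    return 0;
}
```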

It's a very complex topic either way... And not one easily explained comprehensively in a singular post on a forum either.

But if a game is running at 60fps, then without question, halving the framerate (and thus doubling the available render time) will open up the capability to improve visual fidelity, but that can be said regardless of the CPU's influence.
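To put numbers on "doubling the available render time": at 60fps each frame has to be finished in roughly 16.7 ms, while at 30fps the budget is roughly 33.3 ms. A trivial sketch of that arithmetic:

```cpp
#include <cstdio>

int main() {
    // The frame-time budget is just the reciprocal of the target frame rate.
    const double budget_60fps_ms = 1000.0 / 60.0; // ~16.7 ms per frame
    const double budget_30fps_ms = 1000.0 / 30.0; // ~33.3 ms per frame

    std::printf("60 fps budget: %.1f ms per frame\n", budget_60fps_ms);
    std::printf("30 fps budget: %.1f ms per frame\n", budget_30fps_ms);
    std::printf("Dropping from 60 to 30 fps frees up %.1f ms per frame for extra visual work.\n",
                budget_30fps_ms - budget_60fps_ms);
    return 0;
}
```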

And considering how many games employ dynamic resolution these days and often sit below 4K anyway... we are generally GPU limited first and foremost, to a point.
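For anyone unfamiliar with dynamic resolution: the basic idea is that the engine watches how long the GPU is taking and scales the render resolution down (or back up toward native) to stay inside the frame budget. A minimal sketch of that feedback loop, with the frame budget, limits, and GPU timings all made up for illustration:

```cpp
#include <algorithm>
#include <cstdio>

// Minimal sketch of a dynamic-resolution controller: if the GPU is over budget,
// render fewer pixels next frame; if it has headroom, render more (up to native 4K).
int main() {
    const double target_ms = 33.3;              // 30 fps frame budget
    const int native_w = 3840, native_h = 2160; // native 4K output
    double scale = 1.0;                         // fraction of native resolution per axis

    // Pretend GPU timings for a few frames (a real engine would measure these).
    const double gpu_frame_time_ms[] = { 30.0, 38.0, 41.0, 35.0, 31.0, 28.0 };

    for (double gpu_ms : gpu_frame_time_ms) {
        // Simple proportional adjustment toward the target frame time.
        scale *= target_ms / gpu_ms;
        scale = std::clamp(scale, 0.5, 1.0); // never drop below 50% of native per axis

        std::printf("GPU %.1f ms -> rendering at %dx%d (%.0f%% of native per axis)\n",
                    gpu_ms, int(native_w * scale), int(native_h * scale), scale * 100.0);
    }
    return 0;
}
```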

Thanks for the explanation.