linkink said:
Well, I really don't know how it works, but from what I see on PC, better CPU = higher frame rate. I would think that would make it easier for developers to get much better graphics results. If a game can run at 60fps with a much better CPU, doesn't that leave more room for graphics to improve at 30fps?
Except...
Better GPU = higher frame rate.
Faster RAM = higher frame rate.
Tighter RAM timings = higher frame rate.
More memory channels = higher frame rate.
Higher quality motherboard = higher frame rate. (Thanks to better routing resulting in shorter trace lengths, chipset quality, etc.)
More RAM = sometimes higher frame rate.
Faster storage = sometimes higher frame rate. (i.e. better streaming of assets.)
It's a little disingenuous to state that "better CPU = higher framerate" and come to the conclusion you did.
The framerate a game operates at is influenced by each and every single component in a console or PC and the load placed upon said components.
However... The CPU does a lot to assist the rendering of a game, no doubt about it... Such as preparing and issuing draw calls.
But whether you are GPU or CPU limited depends entirely on the game, the game engine, how many characters/objects are on screen and so many other factors... Even how many audio cues are occurring... And that bottleneck can shift instantly depending on the game's scene.
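To make that "the bottleneck shifts" point concrete, here's a rough Python sketch with made-up numbers. The max(CPU time, GPU time) model and every figure in it are assumptions for illustration, not measurements from any real game or engine:

```python
# Minimal sketch (hypothetical numbers) of how the CPU/GPU bottleneck can
# flip from scene to scene. In a pipelined renderer the delivered frame time
# is roughly the longer of the two stages.

scenes = [
    # (name, cpu_ms, gpu_ms) -- illustrative values, not measurements
    ("quiet corridor",        6.0, 12.0),   # few draw calls, heavy shading -> GPU bound
    ("crowded market",       18.0, 14.0),   # many characters/objects -> CPU bound
    ("particle-heavy battle", 9.0, 22.0),   # lots of overdraw -> GPU bound again
]

for name, cpu_ms, gpu_ms in scenes:
    frame_ms = max(cpu_ms, gpu_ms)          # simple model: the slower stage sets the pace
    fps = 1000.0 / frame_ms
    bottleneck = "CPU" if cpu_ms > gpu_ms else "GPU"
    print(f"{name:22s} {frame_ms:5.1f} ms/frame (~{fps:3.0f} fps), {bottleneck} limited")
```

Whichever stage takes longer in a given frame sets the pace, so the same title can be CPU limited in a crowded town and GPU limited in a particle-heavy fight.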
It's a very complex topic either way... And not one easily explained comprehensively in a singular post on a forum either.
But if a game is running at 60fps, then without question, halving the framerate (and thus doubling the available render time per frame) opens up room to improve visual fidelity. That can be said regardless of the CPU's influence.
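The render-time arithmetic is just 1000 ms divided by the target framerate, nothing game-specific. A quick sketch:

```python
# Frame budget at a given target framerate: 1000 ms divided by fps.
for fps in (120, 60, 30):
    budget_ms = 1000.0 / fps
    print(f"{fps:3d} fps -> {budget_ms:5.2f} ms per frame")

# 60 fps -> ~16.67 ms per frame; 30 fps -> ~33.33 ms, i.e. double the time
# for every system (CPU and GPU alike) to do its work each frame.
```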
And considering how many games employ dynamic resolution these days and often sit below 4K anyway... We are generally GPU limited first and foremost, to a point.
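Just to illustrate what dynamic resolution is doing under the hood, here's a toy feedback controller. It assumes GPU cost scales roughly with rendered pixel count and that you can read back a per-frame GPU timing; the constants, names and heuristic are all made up for illustration and don't reflect any specific engine's implementation:

```python
# Toy dynamic-resolution controller (illustrative only). Assumes GPU time
# scales roughly linearly with rendered pixel count, which is a simplification.

TARGET_MS = 16.6          # 60 fps budget
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def adjust_scale(scale, measured_gpu_ms):
    """Nudge the resolution scale toward the frame-time target."""
    error = measured_gpu_ms / TARGET_MS          # >1.0 means over budget
    new_scale = scale / (error ** 0.5)           # sqrt because pixel count ~ scale^2
    return max(MIN_SCALE, min(MAX_SCALE, new_scale))

scale = 1.0
for gpu_ms in (20.0, 18.0, 16.0, 14.5):          # pretend GPU timings over a few frames
    scale = adjust_scale(scale, gpu_ms)
    print(f"gpu {gpu_ms:4.1f} ms -> render scale {scale:.2f} "
          f"(~{int(2160 * scale)}p if the output target is 4K)")
```

When the GPU is the limiter, the engine sheds pixels to hold the frame budget, which is exactly why so many console games sit below native 4K.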

www.youtube.com/@Pemalite