LegitHyperbole said:
Otter said:

Sounds like you're talking about camera logic, which is ultimately separate from frame rate. All games have slightly different settings which inform how quickly the camera updates/follows the player, etc. For example, a third-person camera that just snaps to the player's new location/orientation is likely to feel choppier than one which slowly interpolates, but the latter may feel laggy if it moves too slowly. There's thumbstick sensitivity, deadzones and a lot of other stuff which can affect the feel. Frame rate can make these more or less noticeable.
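Just to make the snap-vs-interpolate point concrete, here's a minimal sketch (not taken from any of the games mentioned; the struct, function names and followSpeed value are all illustrative assumptions):

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Snap: the camera jumps straight to the target, so any unevenness in the
// player's motion is reproduced 1:1 and tends to read as "choppy".
Vec3 snapFollow(const Vec3& target) { return target; }

// Smoothed: the camera closes a fraction of the remaining distance each frame.
// The exponential form keeps the feel consistent across frame rates, but a low
// followSpeed makes the camera trail the player and read as "laggy" instead.
Vec3 smoothFollow(const Vec3& cam, const Vec3& target, float followSpeed, float dt) {
    float t = 1.0f - std::exp(-followSpeed * dt);
    return { cam.x + (target.x - cam.x) * t,
             cam.y + (target.y - cam.y) * t,
             cam.z + (target.z - cam.z) * t };
}

int main() {
    Vec3 cam{0.0f, 0.0f, 0.0f};
    Vec3 player{10.0f, 0.0f, 0.0f};    // player has just moved 10 units away
    const float dt = 1.0f / 30.0f;     // frame time at a locked 30 fps
    for (int frame = 1; frame <= 5; ++frame) {
        cam = smoothFollow(cam, player, 8.0f, dt);
        std::printf("frame %d: camera x = %.2f\n", frame, cam.x);
    }
    return 0;
}
```

Two games can both hold a perfect 30 and still feel completely different depending on which of these behaviours (and what followSpeed) they use.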

Cyberpunk has pretty heavy input latency, which is a separate issue altogether: it takes a while for the game to register player input because of all the world logic it first runs through, and a lower FPS only makes it worse. I'd say it's probably typical of modern games to have more input latency because of the complexity of their worlds and character animations.
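None of us knows Cyberpunk's actual engine internals, but the general shape of the problem is that input sampled at the top of a frame only reaches the screen after the whole simulate/animate/render/present chain has run, and every stage is longer when frames are longer. A generic illustration (stage names and timings are made up, this is not CDPR's code):

```cpp
#include <cstdio>

int main() {
    const double frameMs = 1000.0 / 30.0;   // ~33.3 ms per frame at 30 fps

    // Everything between sampling the pad and the new image reaching the
    // screen is latency the player feels. Real engines also pipeline frames,
    // which adds even more delay on top of this.
    const char* stages[] = { "sample input", "world/AI logic",
                             "character animation", "render", "present" };
    double elapsed = 0.0;
    for (const char* stage : stages) {
        elapsed += frameMs / 5.0;   // pretend each stage costs 1/5 of a frame
        std::printf("%-20s done ~%.1f ms after the button press\n",
                    stage, elapsed);
    }
    return 0;
}
```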

If it was, then performance modes wouldn't solve it. Anyway, aside from that, there are differing levels of choppiness: Alan Wake 2 is clearly smoother than Cyberpunk, The Witcher 3 or Dragon's Dogma 2 in their quality modes, and it's not me mistaking it while switching, it is a night-and-day difference.

Frame rate objectively changes latency, so it would have an impact.
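To put rough numbers on it (simple frame-time arithmetic, not measurements of any specific game): at a locked 30 fps each frame takes 1000/30 ≈ 33 ms, versus ≈ 17 ms at 60 fps, so every frame your input has to travel through before it shows up on screen costs roughly twice as long at 30. That's why performance modes feel more responsive even when the underlying game logic is identical.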

There are differing levels of smoothness in 30fps experiences, but again it goes back to the game logic, so it's just a case of discerning what aspect of the game (camera movement/pacing, post-processing, latency) is making Alan Wake 2 a smoother experience for you, because there is objectively no difference in frame rate if both games are hitting a solid 30.