
Forums - Gaming - Quality Mode or Performance? (Poll)

 

I choose, by default...

Quality mode: 5 (9.80%)
Performance mode: 38 (74.51%)
Never go near setting at all: 1 (1.96%)
I wish every game had a b...: 7 (13.73%)

Total: 51

90 seems to be the height of what affects camera movement and gameplay. I know people can perceive higher than this, but outside of VR, where you can tell very apparently when something isn't hitting above 90 because there's a drag on your head movements, the only game where I ever noticed a frame rate higher than around 60 was Minecraft in creative mode. It made the camera and movement more stable, and I noticed it right up to about the 80s, which was my goal to keep it at when modding, but after 100... 90 is a very good cap, but tbh I'm happy with 60fps on consoles. It's already more than I thought we'd have as standard, and it's nice seeing it become a default.

That said, 30fps was never as bad as it is today; it's not 30fps anymore but some abomination of 30fps. I played Bloodborne last year and it's nothing like PS5 quality mode 30fps, where the camera drags or cycles between different weighted feelings; it actually feels relatively smooth. Yeah, it's a drawback all right, but it's rarely going to get you killed. 30fps these days makes me feel like I'm playing GTA Vice City and whatever the hell was wrong with that game's framerate.



Norion said:
Hiku said:

I don't know how common it is, but I've seen people here and there say they can't tell. I tried showing a friend the difference between 30 and 60 in Resident Evil 4, and he couldn't tell.

I think he didn't even understand the concept, so maybe he didn't know what he was looking for, despite me describing in what way it's smoother.
Considering that, how many casual players understand what frames per second is?

I truly don't understand how someone couldn't tell since even the difference between 60 and over 100 is big enough that just moving a mouse around feels way smoother. The first time I set my monitor to 144hz the sudden jump in smoothness with even just that was genuinely stunning.

I don't think most people's brains register this sort of detail; it just gets dismissed. I noticed, once I got a new laptop and was working between my old monitor and the laptop, that the mouse was smoother on the laptop's 240hz screen. But until I had the direct comparison, moving between the two screens, I never noticed, because in both cases the mouse was doing its job effectively. Even until this discussion it's not something I register on a day-to-day basis.



LegitHyperbole said:

90 seems to be the height of what affects camera movement and gameplay. I know people can perceive higher than this, but outside of VR, where you can tell very apparently when something isn't hitting above 90 because there's a drag on your head movements, the only game where I ever noticed a frame rate higher than around 60 was Minecraft in creative mode. It made the camera and movement more stable, and I noticed it right up to about the 80s, which was my goal to keep it at when modding, but after 100... 90 is a very good cap, but tbh I'm happy with 60fps on consoles. It's already more than I thought we'd have as standard, and it's nice seeing it become a default.

That said, 30fps was never as bad as it is today; it's not 30fps anymore but some abomination of 30fps. I played Bloodborne last year and it's nothing like PS5 quality mode 30fps, where the camera drags or cycles between different weighted feelings; it actually feels relatively smooth. Yeah, it's a drawback all right, but it's rarely going to get you killed. 30fps these days makes me feel like I'm playing GTA Vice City and whatever the hell was wrong with that game's framerate.

Bloodborne has uneven frame pacing, so I'm very curious what PS5 30fps experience you're comparing it to. Potentially you are going into a game and toggling between 30 and 60fps, which will then make your brain see 30fps as even more stuttery because it's just been reading and adjusting to 60. Aside from frame consistency and motion blur, there is no magic that makes some 30fps games smoother than others, unless the games have wildly different speeds of action and camera movement.



Trust me, once you hit 100+ then even 60 fps sucks balls.



Otter said:
LegitHyperbole said:

90 seems to be the height of what affects camera movement and gameplay. I know people can perceive higher than this, but outside of VR, where you can tell very apparently when something isn't hitting above 90 because there's a drag on your head movements, the only game where I ever noticed a frame rate higher than around 60 was Minecraft in creative mode. It made the camera and movement more stable, and I noticed it right up to about the 80s, which was my goal to keep it at when modding, but after 100... 90 is a very good cap, but tbh I'm happy with 60fps on consoles. It's already more than I thought we'd have as standard, and it's nice seeing it become a default.

That said, 30fps was never as bad as it is today; it's not 30fps anymore but some abomination of 30fps. I played Bloodborne last year and it's nothing like PS5 quality mode 30fps, where the camera drags or cycles between different weighted feelings; it actually feels relatively smooth. Yeah, it's a drawback all right, but it's rarely going to get you killed. 30fps these days makes me feel like I'm playing GTA Vice City and whatever the hell was wrong with that game's framerate.

Bloodborne has uneven frame pacing, so I'm very curious what PS5 30fps experience you're comparing it to. Potentially you are going into a game and toggling between 30 and 60fps, which will then make your brain see 30fps as even more stuttery because it's just been reading and adjusting to 60. Aside from frame consistency and motion blur, there is no magic that makes some 30fps games smoother than others, unless the games have wildly different speeds of action and camera movement.

Nah, some 30fps games are better than others. Alan Wake 2 is very smooth, for example, while Cyberpunk has a sluggish feel, and... ugh, I can't think of any now, but I know there are some that affect the camera movement more than others. I'm not talking choppiness here, I'm talking actual gameplay interference, where the camera will jerk or move at different speeds, and no, it's not frame pacing. FF16 and RoboCop: Rogue City are examples that come to mind where the camera and gameplay are affected, but I know there are more than that. Yeah, Bloodborne isn't great, but it never interferes with the camera, or at least not enough that I've noticed it causing a death.



LegitHyperbole said:

Nah, some 30fps games are better than others. Alan Wake 2 is very smooth, for example, while Cyberpunk has a sluggish feel, and... ugh, I can't think of any now, but I know there are some that affect the camera movement more than others. I'm not talking choppiness here, I'm talking actual gameplay interference, where the camera will jerk or move at different speeds, and no, it's not frame pacing. FF16 and RoboCop: Rogue City are examples that come to mind where the camera and gameplay are affected, but I know there are more than that. Yeah, Bloodborne isn't great, but it never interferes with the camera, or at least not enough that I've noticed it causing a death.

Sounds like you're talking about camera logic, which is ultimately separate from frame rate. All games have slightly different settings which inform how quickly the camera updates/follows the player etc. For example, a third-person camera that just snaps to the player's new location/orientation is likely to feel choppier than one which slowly interpolates, but the latter may feel laggy if it moves too slowly. There's thumbstick sensitivity, deadzones and a lot of other stuff which can affect the feeling. Framerate can make these more or less noticeable.
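The snap-vs-interpolate distinction can be sketched in a few lines (a hypothetical 1D camera with made-up values, not code from any of these games): a snapping camera jumps straight to the target each frame, while a smoothed one moves only a fraction of the remaining distance.

```python
# Minimal 1D sketch of two camera-follow styles. Positions and the
# smoothing factor are illustrative, not taken from any real game.

def snap_camera(cam, target):
    # Instantly at the target: maximally responsive, but every target
    # jump is fully visible, which reads as choppy at low frame rates.
    return target

def smoothed_camera(cam, target, smoothing=0.2):
    # Move 20% of the remaining distance each frame: smoother motion,
    # but the camera trails the target; too small a factor feels laggy.
    return cam + (target - cam) * smoothing

cam = 0.0
target = 10.0
for _ in range(5):
    cam = smoothed_camera(cam, target)
print(round(cam, 2))  # 6.72: still trailing the target after 5 frames
```

Note that per-frame fractional smoothing like this is frame-rate dependent (more frames per second means the camera converges faster), which is one concrete way the same camera logic can feel different in 30fps and 60fps modes.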

Cyberpunk has pretty heavy input latency, which is a separate issue altogether: it takes a while for the game to register player input because of all the world logic it first runs through, and a lower FPS only makes it worse. I'd say it's probably typical of modern games to have more input latency because of the complexity of the worlds and character animations.



Norion said:

Part of it might be me having the DPI set kinda high, so even small movements of the wrist can move the cursor a lot, and it taking slightly longer to react when I stop moving my wrist at 60hz makes a difference when I'm used to 144hz.

It's been a while since I've looked into it, but I think you need much higher than 240hz to have truly perfect clarity, to the extent that there's a push for 1000hz monitors. Considering that 500hz ones have been a thing since 2023, ones with 1000hz might just be a couple of years away at this point. In fact, I just looked that up, and it seems 2027 is a likely year for it. They'll just be 1080p at first though, so it's still some more years after that until monitors that are truly perfect refresh-rate and resolution wise are available.

I guess it depends on the speed at which things travel on a screen. For perfect clarity you need to be able to follow objects with your eyes without them 'jumping' over the screen.

The higher the resolution, the more steps you need to stop moving objects skipping pixels.

Since we know the max resolution of the human eye is about 90 pixels per degree, you can also say objects need to make 90 'steps' per degree to appear as perfectly smooth movement. For 1080p (1920x1080), watching from a distance at which the monitor covers 21 degrees of your FOV gives you perfect resolution on that 1080p monitor.

An object that crosses the screen left to right needs to make 1920 steps. At 1000 fps, the fastest it can move with perfect clarity is crossing the screen in 1.92 seconds. If I move my pointer left to right on my 1080p 144hz screen, 2 seconds to cross the screen is indeed not a stable pointer; at 1000hz it should be. At 144hz, the max pointer speed for perfect clarity is about 13 seconds to cross the screen.

2 seconds to cross the screen sounds like a lot but there's also a cut off for how fast you can follow moving objects.

From Google
"Smooth pursuit movements: These smooth, continuous movements are slower than saccades, moving at up to 100 degrees per second. They require constant feedback to track a moving object."

My example does 10.5 degrees per second... At 100 degrees per second and 90 steps per degree, around 9,000 fps (call it 10,000) should cover perfect pursuit movement at the limit of human vision. There's a limit at least.
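The arithmetic above can be checked in a few lines, assuming the stated figures (90 pixels per degree of eye resolution, a 1920-pixel-wide screen, and a 100 deg/s smooth-pursuit ceiling):

```python
# Back-of-the-envelope check of the numbers in the post above.

SCREEN_WIDTH_PX = 1920          # horizontal pixels on a 1080p screen
EYE_PX_PER_DEGREE = 90          # stated max resolution of the human eye
PURSUIT_MAX_DEG_PER_S = 100     # stated smooth-pursuit speed limit

# Screen width in degrees of FOV when viewed at "retina" distance.
screen_fov_deg = SCREEN_WIDTH_PX / EYE_PX_PER_DEGREE  # ~21.3 degrees

def min_crossing_time(fps):
    """Fastest left-to-right crossing (in seconds) at which the object
    still advances at most one pixel per frame, i.e. no visible skipping."""
    return SCREEN_WIDTH_PX / fps

print(min_crossing_time(1000))  # 1.92 s at 1000 fps
print(min_crossing_time(144))   # ~13.3 s at 144 hz

# Angular speed of a 2-second crossing on that ~21-degree-wide screen.
print(screen_fov_deg / 2)       # ~10.7 deg/s (the post rounds to 21 deg, giving 10.5)

# Frame rate needed for one-pixel steps during smooth pursuit at 100 deg/s.
print(PURSUIT_MAX_DEG_PER_S * EYE_PX_PER_DEGREE)  # 9000 fps
```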



Performance



 

My youtube gaming page.

http://www.youtube.com/user/klaudkil

Otter said:
LegitHyperbole said:

Nah, some 30fps games are better than others. Alan Wake 2 is very smooth, for example, while Cyberpunk has a sluggish feel, and... ugh, I can't think of any now, but I know there are some that affect the camera movement more than others. I'm not talking choppiness here, I'm talking actual gameplay interference, where the camera will jerk or move at different speeds, and no, it's not frame pacing. FF16 and RoboCop: Rogue City are examples that come to mind where the camera and gameplay are affected, but I know there are more than that. Yeah, Bloodborne isn't great, but it never interferes with the camera, or at least not enough that I've noticed it causing a death.

Sounds like you're talking about camera logic, which is ultimately separate from frame rate. All games have slightly different settings which inform how quickly the camera updates/follows the player etc. For example, a third-person camera that just snaps to the player's new location/orientation is likely to feel choppier than one which slowly interpolates, but the latter may feel laggy if it moves too slowly. There's thumbstick sensitivity, deadzones and a lot of other stuff which can affect the feeling. Framerate can make these more or less noticeable.

Cyberpunk has pretty heavy input latency, which is a separate issue altogether: it takes a while for the game to register player input because of all the world logic it first runs through, and a lower FPS only makes it worse. I'd say it's probably typical of modern games to have more input latency because of the complexity of the worlds and character animations.

If it was, then performance modes wouldn't solve it. Anyway, aside from that, there are differing levels of choppiness. Alan Wake 2 is clearly smoother than Cyberpunk, The Witcher 3 or Dragon's Dogma 2 in their quality modes, and it's not me mistaking it while switching; it is a night and day difference.

Last edited by LegitHyperbole - on 06 January 2025

Typically performance, but for me I go with whichever is more stable. For example, Star Wars Jedi Fallen Order I used performance mode, but for Jedi Survivor I preferred quality because it stayed more consistent than performance mode. I'd rather a stable 30 than a 60 that often drops to 50 or below.



Switch: SW-3707-5131-3911
XBox: Kenjabish