
Quality Mode or Performance? (Poll)

 

I choose, by default...

Quality mode - 5 votes (9.80%)
Performance mode - 38 votes (74.51%)
Never go near setting at all. - 1 vote (1.96%)
I wish every game had a b... - 7 votes (13.73%)

Total: 51 votes
Random_Matt said:

30 fps should be illegal.

As long as it's rock solid, with consistent frame timing, some games (e.g. turn-based RPGs) play just fine at 30fps.

It really does depend on the game though.
Some games are not at their best running at 30fps, even with decent frame timing/pacing.
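
For what "rock solid frame timing" means in practice, here's a minimal 30fps pacing loop (a purely illustrative Python sketch; update, render and keep_running are made-up hooks, not any real engine's API):

import time

TARGET_FPS = 30
FRAME_TIME = 1.0 / TARGET_FPS  # 33.3 ms budget per frame

def run_at_fixed_pace(update, render, keep_running):
    # Pace frames against a fixed deadline so every frame is presented at the
    # same 33.3 ms cadence; consistent pacing is what makes 30fps feel "solid".
    next_deadline = time.perf_counter() + FRAME_TIME
    while keep_running():
        update(FRAME_TIME)            # simulate one fixed step
        render()                      # draw the frame
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)     # burn off leftover budget instead of racing ahead
        next_deadline += FRAME_TIME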




Always performance mode. The higher fps makes the picture look better to me, with less stutter, more so than higher resolution does. So: better performance, more precise inputs and movements, and a better presentation to me. I have no reason to use any other mode.



JRPGfan said:
Hiku said:

Almost always performance mode, unless it's a game where fps doesn't really matter.

The problem is that 60 fps isn't marketable in still images. For that, quality mode is more effective.
And hell, some people can't even tell the difference between 30fps and 60fps in videos. Especially not when they're not compared side by side. And they're usually not in trailers.

Not even if the videos show movement? Or someone turning the camera view?

I think most people can tell if it's a fast-paced game.

I don't know how common it is.
But I've seen people here and there say they can't, and when I tried showing a friend the difference between 30 and 60 in Resident Evil 4, he couldn't tell.

I think he didn't even understand the concept, so maybe he didn't know what to look for, even though I described in what way it's smoother.
Considering that, how many casual players even know what frames per second means?



JRPGfan said:
Random_Matt said:

30 fps should be illegal.

As long as it's rock solid, with consistent frame timing, some games (e.g. turn-based RPGs) play just fine at 30fps.

It really does depend on the game though.
Some games are not at their best running at 30fps, even with decent frame timing/pacing.

Racing games were always fine to me as well at 30fps, as long as they were stable and optimized for input lag. DriveClub and Motorstorm on PS3 were great. I didn't really notice the difference between GT5 and those; honestly, I can't tell from memory what each game ran at. What I can remember are games that suffered from screen tearing. I'd rather have a locked 30fps (without triple buffering, judder, or uneven frame pacing) than 60fps with screen tearing.

It became an issue when TVs got slower. CRT TVs had no added input lag; they were direct-display, so a 60Hz CRT meant at most ~16ms of added lag to finish drawing a frame. Then LCD TVs needed to do all kinds of processing, pushing input lag up to 120ms. That made it important to reduce game-side input lag, since triple-buffered 30fps games were already over 100ms on their own. Combined with the TV lag, not a pleasant experience. NFS Shift 2 had the worst input lag I've ever seen; it was undriveable at speed. Someone measured it at around 200ms.

Nowadays all TVs have a game mode, reducing their input lag to as little as ~8ms on 120Hz panels. Well-made 30fps games are perfectly fine again with game mode enabled. For most games the difference between 50-60ms and 25-30ms of input lag doesn't matter; game mode is what makes the difference!
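
To put rough numbers on that latency budget, here's a back-of-the-envelope sketch (the stage values are illustrative assumptions, not measurements of any particular game or TV):

# Crude end-to-end input-lag estimate: one frame to simulate/render the input,
# plus time queued in the swap chain, plus whatever the display adds.
def input_lag_ms(fps, queued_frames, display_ms):
    frame_ms = 1000.0 / fps
    return frame_ms + queued_frames * frame_ms + display_ms

# Illustrative scenarios in the spirit of the numbers above (assumed values):
print(input_lag_ms(30, 2, 120))  # triple-buffered 30fps + slow LCD   -> ~220 ms
print(input_lag_ms(30, 1, 8))    # double-buffered 30fps + game mode  -> ~75 ms
print(input_lag_ms(60, 1, 8))    # 60fps + game mode                  -> ~41 ms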

Visually, turning the camera is what makes 30fps stand out. 60fps looks better, and 120fps gives really smooth turning. But you get used to it, plus you don't turn constantly in most games. In twitch shooters it definitely makes a difference, yet in most games it just looks a bit nicer. Same in VR, where it's between 60, 90 and 120fps: 90fps looks nicer than 60fps with reprojection.

GT7 with frame generation simulating native 120fps looks nicer than 60fps with reprojection: same input lag, cleaner image. But apparently the difference isn't that big, because when I played "My First GT" I didn't notice until finishing it that it lacked the new frame-generation mode and was running in the old 60fps with reprojection. I only noticed because the HUD was stable (the new mode has some bugs where frame generation creates double images of HUD elements), not because the environment looked blurry. Yet switching back to GT7, it did look a bit clearer again.

Image and frame-pacing stability come first, then higher (stable) fps to make turning the camera look better. Frame generation (with motion vectors) can do that as well, which is where it seems we're headed.
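
Very roughly, frame generation with motion vectors does something like this toy sketch (nothing here reflects GT7's actual implementation; it just shows the idea of warping the last rendered frame halfway along its per-pixel motion, and why unhandled HUD elements can double up):

# Toy forward-warp: splat each pixel of the last rendered frame halfway along
# its motion vector to synthesize an in-between frame. Real implementations
# deal with disocclusion holes, HUD masking, etc.; this sketch does not.
def generate_midframe(frame, motion_vectors):
    h, w = len(frame), len(frame[0])
    mid = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            dx, dy = motion_vectors[y][x]                  # pixels moved since the last frame
            nx = min(max(int(round(x + dx / 2)), 0), w - 1)
            ny = min(max(int(round(y + dy / 2)), 0), h - 1)
            mid[ny][nx] = frame[y][x]                      # place the pixel at its halfway position
    return mid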



Hiku said:
JRPGfan said:

Not even if the videos show movement? Or someone turning the camera view?

I think most people can tell if it's a fast-paced game.

I don't know how common it is.
But I've seen people here and there say they can't, and when I tried showing a friend the difference between 30 and 60 in Resident Evil 4, he couldn't tell.

I think he didn't even understand the concept, so maybe he didn't know what to look for, even though I described in what way it's smoother.
Considering that, how many casual players even know what frames per second means?

I truly don't understand how someone couldn't tell, since even the difference between 60 and over 100 is big enough that just moving a mouse around feels way smoother. The first time I set my monitor to 144Hz, the sudden jump in smoothness from even just that was genuinely stunning.



Norion said:
Hiku said:

I don't know how common it is.
But I've seen people here and there say they can't, and when I tried showing a friend the difference between 30 and 60 in Resident Evil 4, he couldn't tell.

I think he didn't even understand the concept, so maybe he didn't know what to look for, even though I described in what way it's smoother.
Considering that, how many casual players even know what frames per second means?

I truly don't understand how someone couldn't tell, since even the difference between 60 and over 100 is big enough that just moving a mouse around feels way smoother. The first time I set my monitor to 144Hz, the sudden jump in smoothness from even just that was genuinely stunning.

Do you actually look at your mouse pointer when moving it? I look to where I want it to go and just put it there in one move (hence it's so critical that the mouse pointer speed is always the same).

I just set my refresh rate to 60Hz (from 144Hz). I don't see a difference in the mouse pointer :/ Back to 144Hz, still the same. Well, maybe if I concentrate on the pointer and move it back and forth at a constant speed. At 60Hz I see the pointer more clearly than at 144Hz... but it moves in steps rather than with the more (naturally) motion-blurred, smoother movement.

I did notice the difference with CRTs, since higher refresh rates reduce flicker. CRT monitors below 100Hz were more tiring to look at all day. But higher refresh rates on those caused pointer trails, with frames overlapping. (Then they made pointer trails a feature!)
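
A quick way to see why the pointer still reads as discrete steps even at 144Hz is to work out how far it jumps between refreshes at a typical flick speed (the speed used here is an assumed example, nothing measured):

# How far does the pointer jump between refreshes?
# Assume a quick flick: crossing a 2560-pixel-wide screen in 0.25 seconds.
POINTER_SPEED_PX_PER_S = 2560 / 0.25  # 10,240 px/s (assumed example speed)

for hz in (60, 144, 240, 1000):
    step_px = POINTER_SPEED_PX_PER_S / hz
    print(f"{hz:4d} Hz: pointer moves in ~{step_px:.0f} px steps")
# ~171 px at 60 Hz, ~71 px at 144 Hz, ~43 px at 240 Hz, ~10 px at 1000 Hz:
# still far from the 1 px per frame needed for motion to read as continuous.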



SvennoJ said:
Norion said:

I truly don't understand how someone couldn't tell, since even the difference between 60 and over 100 is big enough that just moving a mouse around feels way smoother. The first time I set my monitor to 144Hz, the sudden jump in smoothness from even just that was genuinely stunning.

Do you actually look at your mouse pointer when moving it? I look to where I want it to go and just put it there in one move (hence it's so critical that the mouse pointer speed is always the same).

I just set my refresh rate to 60Hz (from 144Hz). I don't see a difference in the mouse pointer :/ Back to 144Hz, still the same. Well, maybe if I concentrate on the pointer and move it back and forth at a constant speed. At 60Hz I see the pointer more clearly than at 144Hz... but it moves in steps rather than with the more (naturally) motion-blurred, smoother movement.

I did notice the difference with CRTs, since higher refresh rates reduce flicker. CRT monitors below 100Hz were more tiring to look at all day. But higher refresh rates on those caused pointer trails, with frames overlapping. (Then they made pointer trails a feature!)

A mixture of those, I suppose, and that's strange to me, since when I set it to 60Hz to test it out, it feels kinda choppy and the increased input lag is noticeable, making it harder to move it around quickly and accurately. Like, at 60Hz I can't effortlessly glide it to where I wanna put it as easily.



Norion said:

A mixture of those, I suppose, and that's strange to me, since when I set it to 60Hz to test it out, it feels kinda choppy and the increased input lag is noticeable, making it harder to move it around quickly and accurately. Like, at 60Hz I can't effortlessly glide it to where I wanna put it as easily.

Interesting. Maybe I'm just too used to my mouse. I never think about moving it; I just put it where it needs to be. The distance on the table is still the same!

But I do see quite a noticeable difference when I move the page up and down by dragging the scrollbar. That's a lot more 'stuttery' at 60Hz. Still not smooth at 144Hz either, but maybe a better showcase than just the pointer :)

Human motion blur is weird anyway. When you're driving (or better, as a passenger), try to tell where the road ahead (or the landscape to the side) turns blurry closer to the car. It's not a predefined line, and sometimes you see snapshots of sharp images in between the blur shooting by. (Your eyes tend to lock onto moving detail to sample an image.) That's why I dislike motion blur: the stuff I focus on isn't blurred in real life, only the background is (unless it rotates, of course; you can't lock onto that).

I wonder if 240Hz is enough to eliminate the difference between real movement and on-screen movement? 144Hz is not enough yet to see a solid mouse pointer while moving it around and following it with your eyes. But it is indeed better than at 60, and a lot easier to keep your eyes locked on the pointer while moving it.

Trackmania Turbo on PSVR was a nice demonstration of 60 vs 120fps. Looking at rotating helicopter blades in one of the levels (with no motion blur applied by the game), 120fps simply showed twice as many blade images as 60fps did. Still far too slow to actually see the blur you'd see in real life. Same when looking sideways at a fence while at speed: twice as many fence posts at 120 as at 60, but no natural blur. Your mind puts the discrete images together instead of seeing them as a 'smear'.

I guess the only way to get that (without adding motion blur) is to run at such a high fps that things only move 1 pixel per frame, not skipping any. Then your mind collects the whole path over the sample time instead of multiple discrete images. Which is exactly what motion blur does, of course, except the game doesn't know what you're looking at or following with your eyes.

Anyway, higher fps means less motion blur is needed = better image!
(But only if that doesn't mean the image has to be much lower resolution to reach 60fps; always trade-offs.)
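
To put a number on that "1 pixel per frame" idea, here's a tiny sketch (the screen width and pan speed are assumed example values):

# Required fps so that a moving object shifts at most 1 pixel per frame.
def fps_for_one_pixel_per_frame(speed_px_per_s):
    return speed_px_per_s  # 1 px/frame  =>  frames per second = pixels per second

# Example: a pan that sweeps a 3840-pixel-wide image in one second would need
# ~3840 fps to never skip a pixel; even 240 Hz leaves ~16 px gaps per frame.
pan_speed = 3840 / 1.0
print(fps_for_one_pixel_per_frame(pan_speed))   # 3840.0
print(pan_speed / 240)                          # ~16 px per frame at 240 Hz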



SvennoJ said:
Norion said:

A mixture of those, I suppose, and that's strange to me, since when I set it to 60Hz to test it out, it feels kinda choppy and the increased input lag is noticeable, making it harder to move it around quickly and accurately. Like, at 60Hz I can't effortlessly glide it to where I wanna put it as easily.

Interesting. Maybe I'm just too used to my mouse. I never think about moving it; I just put it where it needs to be. The distance on the table is still the same!

But I do see quite a noticeable difference when I move the page up and down by dragging the scrollbar. That's a lot more 'stuttery' at 60Hz. Still not smooth at 144Hz either, but maybe a better showcase than just the pointer :)

Human motion blur is weird anyway. When you're driving (or better, as a passenger), try to tell where the road ahead (or the landscape to the side) turns blurry closer to the car. It's not a predefined line, and sometimes you see snapshots of sharp images in between the blur shooting by. (Your eyes tend to lock onto moving detail to sample an image.) That's why I dislike motion blur: the stuff I focus on isn't blurred in real life, only the background is (unless it rotates, of course; you can't lock onto that).

I wonder if 240Hz is enough to eliminate the difference between real movement and on-screen movement? 144Hz is not enough yet to see a solid mouse pointer while moving it around and following it with your eyes. But it is indeed better than at 60, and a lot easier to keep your eyes locked on the pointer while moving it.

Trackmania Turbo on PSVR was a nice demonstration of 60 vs 120fps. Looking at rotating helicopter blades in one of the levels (with no motion blur applied by the game), 120fps simply showed twice as many blade images as 60fps did. Still far too slow to actually see the blur you'd see in real life. Same when looking sideways at a fence while at speed: twice as many fence posts at 120 as at 60, but no natural blur. Your mind puts the discrete images together instead of seeing them as a 'smear'.

I guess the only way to get that (without adding motion blur) is to run at such a high fps that things only move 1 pixel per frame, not skipping any. Then your mind collects the whole path over the sample time instead of multiple discrete images. Which is exactly what motion blur does, of course, except the game doesn't know what you're looking at or following with your eyes.

Anyway, higher fps means less motion blur is needed = better image!
(But only if that doesn't mean the image has to be much lower resolution to reach 60fps; always trade-offs.)

Part of it might be that I have the DPI set kinda high, so even small wrist movements move it a lot; the pointer taking slightly longer to react when I stop moving my wrist at 60Hz makes a difference when I'm used to 144Hz.

It's been a while since I've looked into it, but I think you need much higher than 240Hz for truly perfect clarity, to the extent that there's a push for 1000Hz monitors. Considering that 500Hz ones have been a thing since 2023, 1000Hz might just be a couple of years away at this point; in fact, I just looked it up and 2027 seems a likely year for it. They'll only be 1080p at first though, so it'll be some more years than that until monitors that are truly perfect in both refresh rate and resolution are available.

Last edited by Norion - on 06 January 2025