
Quality Mode or Performance? (Poll)

 

I choose, by default...

Quality mode: 5 votes (9.80%)
Performance mode: 38 votes (74.51%)
Never go near setting at all: 1 vote (1.96%)
I wish every game had a b...: 7 votes (13.73%)

Total: 51
LegitHyperbole said:
Otter said:

Sounds like you're talking about camera logic, which is ultimately separate from frame rate. All games have slightly different settings which inform how quickly the camera updates/follows the player etc. For example, a third-person camera that just snaps to the player's new location/orientation is likely to feel choppier than one which slowly interpolates. But the latter may feel laggy if it moves too slowly. There's thumbstick sensitivity, deadzones and a lot of other stuff which can affect the feeling. Framerate can make these more or less noticeable.
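(Illustrative aside: a minimal sketch of that snap-vs-interpolate difference, using a generic exponential-smoothing follow camera with made-up numbers, not code from any particular engine.)

```python
import math

def update_camera(cam_pos, player_pos, dt, smoothing=8.0, snap=False):
    """Move a follow camera toward the player once per frame.

    snap=True  -> camera jumps straight to the target (can look choppy at low fps).
    snap=False -> exponential smoothing; frame-rate independent because the
                  blend factor is derived from dt, but too low a 'smoothing'
                  value starts to feel laggy instead.
    """
    if snap:
        return player_pos
    # Fraction of the remaining distance to close this frame.
    t = 1.0 - math.exp(-smoothing * dt)
    return cam_pos + (player_pos - cam_pos) * t

# Same camera logic at 30fps vs 60fps: the smoothed camera ends up in almost
# the same place either way, which is why perceived "choppiness" is often
# about the follow/interpolation settings rather than the frame rate itself.
cam30 = cam60 = 0.0
player = 10.0
for _ in range(30):
    cam30 = update_camera(cam30, player, dt=1 / 30)
for _ in range(60):
    cam60 = update_camera(cam60, player, dt=1 / 60)
print(round(cam30, 3), round(cam60, 3))  # nearly identical after 1 second
```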

Cyberpunk has pretty heavy input latency, which is a separate issue altogether: it takes a while for the game to register player input because of all the world logic it first runs through, and a lower FPS only makes it worse. I'd say it's probably typical of modern games to have more input latency because of the complexity of the worlds and character animations.

If it was, then performance modes wouldn't solve it. Anyway, aside from that, there are differing levels of choppiness. Alan Wake 2 is clearly smoother than Cyberpunk, The Witcher 3 or Dragon's Dogma 2 in their quality modes, and it's not me mistaking it while switching; it is a night and day difference.

Framerate objectively changes latency so it would impact it.

There are differing levels of smoothness in 30fps experiences, but again it goes back to the game logic, so it's just a case of discerning which aspect of the game (camera movement/pacing, post-processing, latency) is making Alan Wake a smoother experience for you, because there is objectively no difference in frame rate if both games are hitting a solid 30.



Otter said:
LegitHyperbole said:

If it was, then performance modes wouldn't solve it. Anyway, aside from that, there are differing levels of choppiness. Alan Wake 2 is clearly smoother than Cyberpunk, The Witcher 3 or Dragon's Dogma 2 in their quality modes, and it's not me mistaking it while switching; it is a night and day difference.

Framerate objectively changes latency so it would impact it.

There are differing levels of smoothness in 30fps experiences, but again it goes back to the game logic, so it's just a case of discerning which aspect of the game (camera movement/pacing, post-processing, latency) is making Alan Wake a smoother experience for you, because there is objectively no difference in frame rate if both games are hitting a solid 30.

Strange that they can't make them all a smooth 30fps, whatever the problem may be. If one studio can do it, they all should be aiming for it to be smooth instead of forcing performance mode in games that don't need it. 



Otter said:
Norion said:

I truly don't understand how someone couldn't tell since even the difference between 60 and over 100 is big enough that just moving a mouse around feels way smoother. The first time I set my monitor to 144hz the sudden jump in smoothness with even just that was genuinely stunning.

I don't think most people's brains register this sort of detail; they just dismiss it. I noticed, once I got a new laptop and was working between my old monitor and the laptop, that the mouse is smoother on the laptop's 240hz screen. But until I had the direct comparison moving between the 2 screens, I never noticed, because the mouse was in both cases doing its job effectively. Even until this discussion it's not something which I register on a day-to-day basis.

Since about 75% of PS5 users go for performance mode I dunno about that but then again it seems my bf didn't notice much difference when he tried out my PC. I could get used to 60hz again with little issue if I was forced to use that but when setting it back to over 100 I'd probably feel amazed at the jump in smoothness all over again.

SvennoJ said:
Norion said:

A part of it might be me having the DPI set kinda high, so even small movements of the wrist move it a lot; it taking slightly longer to react when I stop moving my wrist at 60hz makes a difference when I'm used to it being at 144hz.

It's been a while since I've looked into it, but I think you need much higher than 240hz to have truly perfect clarity, to the extent that there's a push for 1000hz monitors. Considering that 500hz ones have been a thing since 2023, 1000hz ones might just be a couple of years away at this point. In fact, I just looked that up and it seems 2027 is a likely year for it. They'll just be 1080p at first though, so it'll be some more years than that till monitors that are truly perfect refresh-rate and resolution wise are available.

I guess it depends on the speed at which things travel on a screen. For perfect clarity you need to be able to follow objects with your eyes without them 'jumping' over the screen.

The higher the resolution, the more steps you need to stop moving objects from skipping pixels.

Since we know the max resolution of the human eye is about 90 pixels per degree, you can also say objects need to make 90 'steps' per degree to appear as perfectly smooth movement. For 1080p (1920x1080), watching it from a distance where the monitor covers 21 degrees of your fov gives you perfect resolution on that 1080p monitor.

An object that crosses the screen left to right needs to make 1920 steps. At 1000 fps, the fastest the pointer can move while staying perfectly clear is a 1.92-second traverse of the screen. If I move my pointer left to right on my 1080p 144hz screen, 2 seconds to cross the screen is indeed not a stable pointer; at 1000hz it should be. At 144hz the max pointer speed is close to 14 seconds to cross the screen.

2 seconds to cross the screen sounds like a lot but there's also a cut off for how fast you can follow moving objects.

From Google
"Smooth pursuit movements: These smooth, continuous movements are slower than saccades, moving at up to 100 degrees per second. They require constant feedback to track a moving object."

My example does 10.5 degrees per second... 10,000 fps should cover perfect pursuit movement at highest perceptual human vision. There's a limit at least.
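(A quick back-of-the-envelope check of the numbers above; the 90 pixels-per-degree figure and the 100 degrees-per-second pursuit limit come from the post, the rest is just arithmetic.)

```python
# Re-running the post's arithmetic, not measurements from any study.
width_px   = 1920        # 1080p horizontal resolution
eye_ppd    = 90          # assumed max resolution of the eye, pixels per degree
screen_fov = width_px / eye_ppd          # ~21.3 degrees of fov for full sharpness

for hz in (144, 1000):
    # For perfect clarity the object may advance at most 1 pixel per refresh,
    # so the fastest clear full-screen traverse takes width_px / hz seconds.
    print(f"{hz}hz: fastest clear traverse = {width_px / hz:.1f} s")
    # 144hz -> ~13.3 s ("close to 14"), 1000hz -> ~1.9 s

# That ~1.9 s traverse over ~21 degrees is ~11 degrees/second of eye movement
# (the post's ~10.5 with rounder numbers).
print(f"angular speed at 1000hz: {screen_fov / (width_px / 1000):.1f} deg/s")

# Smooth pursuit tops out around 100 deg/s; at 90 steps per degree that needs
# roughly 100 * 90 = 9000 updates per second, hence the ~10,000 fps ceiling.
print("steps/s ceiling for pursuit:", 100 * eye_ppd)
```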

I'll likely be upgrading my monitor in the next 1-2 years so I'll make the jump from 1440p 144hz to 4k 240hz which should be plenty for close to a decade. Also I wonder just how much GPU power it would take to run something like maxed out Cyberpunk at that fps lol.

Last edited by Norion - on 06 January 2025

Otter said:

Framerate objectively changes latency so it would impact it.

There are differing levels of smoothness in 30fps experiences, but again it goes back to the game logic, so it's just a case of discerning which aspect of the game (camera movement/pacing, post-processing, latency) is making Alan Wake a smoother experience for you, because there is objectively no difference in frame rate if both games are hitting a solid 30.

That's only true for the time it takes for a frame to appear. Game engines aren't tied to frame rate anymore, and some, like racing games, poll the controller input and execute the physics simulation at a much higher rate. Forza claimed to run the physics at 360hz, iRacing the same.
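(For illustration, the usual way that decoupling is done is a fixed-timestep loop that advances the simulation independently of rendering; a bare-bones sketch of the generic pattern, not Forza's or iRacing's actual code.)

```python
import time

PHYSICS_HZ = 360                 # simulation rate claimed for Forza/iRacing
PHYSICS_DT = 1.0 / PHYSICS_HZ

def game_loop(poll_input, step_physics, render, target_fps=30):
    """Fixed-timestep physics decoupled from the render rate.

    The physics advances in constant 1/360 s slices no matter whether the
    renderer manages 30 or 60 fps; rendering just draws the latest state.
    """
    accumulator = 0.0
    previous = time.perf_counter()
    while True:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now

        # Run as many fixed physics steps as real time demands.
        while accumulator >= PHYSICS_DT:
            step_physics(poll_input(), PHYSICS_DT)
            accumulator -= PHYSICS_DT

        render()                 # once per display frame
        time.sleep(max(0.0, 1.0 / target_fps - (time.perf_counter() - now)))
```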

Controller lag also plays a role:

Wirelessly, the PS5 controller averages 600-800hz. Wired, it runs at 1000hz (and can even be boosted to 8000hz on PC).
However, on Windows it defaults to 250hz over USB.



That's the polling rate or update rate from the controller; internally they run slower:
In general PlayStation controllers have an update rate of 250Hz (250 times a second) and Xbox controllers have an update rate of 124Hz (124 times a second).

So there's a real difference between 250hz polling (the Windows default) and 1000hz polling, on top of the internal update rate.


All this information comes from someone optimizing Rocket League input on PC
https://steamcommunity.com/sharedfiles/filedetails/?l=french&id=2419919131
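(Rough rule of thumb: an input event waits on average half a period at each of those stages, so the rates above translate into added latency roughly like this; simple averaging, not measurements.)

```python
def avg_wait_ms(rate_hz):
    """Average wait for the next tick of a stage running at rate_hz."""
    return 1000.0 / rate_hz / 2.0

# Internal controller update (~250hz on PlayStation) plus USB polling.
for label, poll_hz in (("Windows default 250hz polling", 250),
                       ("1000hz polling", 1000)):
    total = avg_wait_ms(250) + avg_wait_ms(poll_hz)
    print(f"{label}: ~{total:.1f} ms added before the game even sees the input")
# 250hz update + 250hz polling  -> ~4 ms
# 250hz update + 1000hz polling -> ~2.5 ms
```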

Then the physics update time is added onto that before a frame can be rendered.

With a locked frame rate you can reduce input lag with predictive input tracking. At 30fps you know it takes (should take) a bit less than 33ms to render a frame, thus you can forward-predict the physics 33ms to compensate for the render delay (and a bit more to compensate further for transmission delay). Polling at 360hz and executing the physics in 2.8ms, you can bring the input latency way down. And with forward prediction the difference between 33ms and 16ms render time can be negated (of course not for digital input like jumping, but while actively steering you can extrapolate the analog input).
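(A minimal sketch of that kind of forward prediction for an analog axis; the 360hz polling and ~33ms lookahead numbers are the ones above, the function and sample values are made up.)

```python
def predict_axis(samples, lookahead_s):
    """Extrapolate an analog input (e.g. steering) ahead by the render delay.

    samples: list of (timestamp_s, value) from high-rate polling (e.g. 360hz).
    lookahead_s: how far to predict, e.g. ~0.033 for a 30fps frame plus margin.
    Works for continuous analog input; digital presses (jump) can't be predicted.
    """
    (t0, v0), (t1, v1) = samples[-2], samples[-1]
    rate = (v1 - v0) / (t1 - t0)           # how fast the stick is moving
    predicted = v1 + rate * lookahead_s
    return max(-1.0, min(1.0, predicted))  # clamp to the valid stick range

# Stick sweeping from 0.50 to 0.52 over one 360hz poll interval (~2.8ms):
history = [(0.0000, 0.50), (0.0028, 0.52)]
print(predict_axis(history, lookahead_s=0.033))  # ~0.76, the value ~33ms from now
```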

GTS and GT7 do all this predictive physics stuff all the time for online racing, where they need to match up the other cars to your time frame to make it all appear lag free. With 50ms latency, moving at 200mph (about 90 m/s), 50ms makes a difference of 4.5 meters, more than a car length. Hence the other cars are predicted forward for latency to make neck-and-neck racing possible without you feeling any extra input lag. Yet in fact no two races are the same; every person sees the race slightly differently.

It's relevant in single player as well for you to start braking on time. You want to see where the car is 'now', not where it was when the game logic update started. I've worked with this stuff in GPS navigation where the poll rate is only once per second. You want to show on screen where the car (pointer) is now, not 2 seconds ago. With the weak hardware we used and low variable frame rate, the engine continuously measured the total input latency to adjust forward prediction for the next render. The last step before starting the render was a forward prediction call to make sure instructions are announced on time and when you see your position on screen it matches where you are on the road, eliminating perceived lag.
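(The GPS case is the same idea, plain dead reckoning: push the last fix forward along the measured speed and heading by the measured total latency before drawing. A toy sketch with made-up values.)

```python
import math

def dead_reckon(x_m, y_m, speed_mps, heading_deg, latency_s):
    """Forward-predict a position from the last fix.

    Positions are metres on a local flat grid to keep the toy simple; real
    navigation code would do this on the geodetic coordinates."""
    heading = math.radians(heading_deg)
    return (x_m + speed_mps * latency_s * math.sin(heading),   # east
            y_m + speed_mps * latency_s * math.cos(heading))   # north

# Last GPS fix is 1.2 s old (1 s poll interval + processing/render delay),
# car doing 25 m/s (90 km/h) heading due north:
print(dead_reckon(0.0, 0.0, 25.0, 0.0, 1.2))   # (0.0, 30.0) -> draw it 30 m ahead
```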


But yes there's still the objective difference in waiting 16ms or 33 ms for the next frame to show on the screen.
Which also adds an extra 16ms to digital input delays.


Nowadays input latency mostly depends on how fast you can poll and execute the physics and what kind of game it is.

If the game has to blend animations, for walking for example, it will need more time to make a smooth transition after you press the jump button. The more natural movement you want from the characters, the more input delay you need in order to eliminate jerky motions.

If it's an arcade shooter, you can likely execute the physics super fast and reduce the time between pressing a button and shot fired down close to the render and display time. So there the fps becomes the limiting factor.



Norion said:
Otter said:

I don't think most people's brains register this sort of detail; they just dismiss it. I noticed, once I got a new laptop and was working between my old monitor and the laptop, that the mouse is smoother on the laptop's 240hz screen. But until I had the direct comparison moving between the 2 screens, I never noticed, because the mouse was in both cases doing its job effectively. Even until this discussion it's not something which I register on a day-to-day basis.

Since about 75% of PS5 users go for performance mode I dunno about that but then again it seems my bf didn't notice much difference when he tried out my PC. I could get used to 60hz again with little issue if I was forced to use that but when setting it back to over 100 I'd probably feel amazed at the jump in smoothness all over again.

SvennoJ said:

I guess it depends on the speed at which things travel on a screen. For perfect clarity you need to be able to follow objects with your eyes without them 'jumping' over the screen.

The higher the resolution, the more steps you need to stop moving objects from skipping pixels.

Since we know the max resolution of the human eye is about 90 pixels per degree, you can also say objects need to make 90 'steps' per degree to appear as perfectly smooth movement. For 1080p (1920x1080), watching it from a distance where the monitor covers 21 degrees of your fov gives you perfect resolution on that 1080p monitor.

An object that crosses the screen left to right needs to make 1920 steps. At 1000 fps, the fastest the pointer can move while staying perfectly clear is a 1.92-second traverse of the screen. If I move my pointer left to right on my 1080p 144hz screen, 2 seconds to cross the screen is indeed not a stable pointer; at 1000hz it should be. At 144hz the max pointer speed is close to 14 seconds to cross the screen.

2 seconds to cross the screen sounds like a lot but there's also a cut off for how fast you can follow moving objects.

From Google
"Smooth pursuit movements: These smooth, continuous movements are slower than saccades, moving at up to 100 degrees per second. They require constant feedback to track a moving object."

My example does 10.5 degrees per second... 10,000 fps should cover perfect pursuit movement at highest perceptual human vision. There's a limit at least.

I'll likely be upgrading my monitor in the next 1-2 years so I'll make the jump from 1440p 144hz to 4k 240hz which should be plenty for close to a decade. Also I wonder just how much GPU power it would take to run something like maxed out Cyberpunk at that fps lol.

That's where frame generation comes in. There's really not much point to rendering the entire frame 240 times per second. What you need are motion vectors for elements on screen to generate the in-between frames. GT7 does that now in VR on PS5 Pro to simulate 120fps from 60fps. DLSS 3 does that on PC.
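(A toy version of the idea: warp the last rendered frame half-way along per-pixel motion vectors to fake the in-between frame. This is nothing like the real DLSS 3 / GT7 pipelines, which also handle disocclusion, blending and so on; it's just the core trick.)

```python
import numpy as np

def generate_midframe(frame, motion_px):
    """Synthesize an in-between frame by pushing each pixel half-way along its
    motion vector (toy nearest-neighbour splat).

    frame:     (H, W) array of pixel values from the last rendered frame.
    motion_px: (H, W, 2) array of (dy, dx) per-pixel motion to the next frame.
    """
    h, w = frame.shape
    mid = np.zeros_like(frame)
    ys, xs = np.mgrid[0:h, 0:w]
    ty = np.clip((ys + motion_px[..., 0] * 0.5).round().astype(int), 0, h - 1)
    tx = np.clip((xs + motion_px[..., 1] * 0.5).round().astype(int), 0, w - 1)
    # Splat every source pixel at its half-way position; where several pixels
    # land on the same spot, keep the brightest (toy conflict rule).
    np.maximum.at(mid, (ty, tx), frame)
    return mid

# A bright dot moving 4 pixels to the right per rendered frame shows up
# 2 pixels along in the generated in-between frame.
frame = np.zeros((5, 8)); frame[2, 1] = 1.0
motion = np.zeros((5, 8, 2)); motion[2, 1] = (0, 4)
print(np.argwhere(generate_midframe(frame, motion) == 1.0))  # [[2 3]]
```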

That doesn't reduce input lag further of course, yet the difference between 120 fps (8.3ms) and 240 fps (4.2ms) is imperceptible anyway. The human limit is 13ms, which corresponds to 77 fps.

https://www.pubnub.com/blog/how-fast-is-realtime-human-perception-and-technology/
Humans can respond to incoming visual stimuli at a maximum rate of about 13 milliseconds. Any data stream faster than this would surpass our perceptual limits.


Anything over 90fps only looks more stable while tracking/following moving objects with your eyes.

(Response time is 150-200 ms for humans so the whole round trip between perceiving and pressing shoot/jump is already mostly on the human side)


I also ran into this article, the human brain does all sorts of forward prediction
https://www.sciencealert.com/to-help-us-see-a-stable-world-our-brains-keep-us-15-seconds-in-the-past#:~:text=Instead%20of%20seeing%20the%20latest,time%20can%20help%20stabilize%20perception.

While you can quickly respond to changes in the center of your vision, the total picture you live in is an amalgamation of the last 15 seconds!



From this you can also see how Eye Tracked Foveated Rendering is the big cost saver for VR rendering. It would help with monitors as well of course. It's pretty wasteful to render the entire screen at 4K resolution when you only see sharp detail in a little dot. Perhaps VR improvements will come back to flat screen monitors. A large 27" 4K monitor with eye tracking sensors could save tons of GPU power.

Your visual acuity drops off steeply away from the center of your gaze:

Sitting right on top of a PC monitor, it fills up to 60 degrees of your fov. At 30 degrees from the center of your gaze you're down to 10%, or 9 pixels per degree (or 6 based on 20/20 vision; 60 pixels per degree corresponds to 20/20 vision, and 90 is the limit at which people can perceive a difference on screen).
9 pixels per degree over 60 degrees corresponds to only 540x303 for the whole screen. Yet game engines put as much effort into rendering where you aren't looking as where you are.

With eye tracking that can tell how far away you are sitting and where you are looking (plus the size of the screen), you can optimize the render resolution and add foveated rendering to save on pixel pushing and increase fps instead. Resolutions in games could then be defined in pixels per degree rather than set values that look different depending on screen size and distance.
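(To put rough numbers on it: using the 90 and 9 pixels-per-degree figures above and assuming a hypothetical 10x10 degree full-detail window around the gaze point, a two-level foveated split needs only a fraction of the pixels of uniform full-detail rendering.)

```python
def pixels(fov_h_deg, fov_v_deg, ppd):
    """Pixel count needed to cover a fov patch at a given pixels-per-degree."""
    return (fov_h_deg * ppd) * (fov_v_deg * ppd)

screen_h, screen_v = 60.0, 60.0 * 9 / 16           # monitor filling 60 deg, 16:9

full_detail    = pixels(screen_h, screen_v, 90)    # everything at foveal sharpness
periphery_only = pixels(screen_h, screen_v, 9)     # the post's ~540x303 figure
# Two-level foveation: a 10x10 degree window around the gaze point at 90 ppd,
# everything else at 9 ppd (window size is an assumption for the example).
foveated = pixels(10, 10, 90) + periphery_only

print(f"uniform 90 ppd: {full_detail / 1e6:5.1f} MPix")
print(f"uniform 9 ppd:  {periphery_only / 1e6:5.1f} MPix  (the ~540x303 figure)")
print(f"two-level:      {foveated / 1e6:5.1f} MPix  "
      f"(~{full_detail / foveated:.0f}x fewer pixels than uniform full detail)")
```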

Render smarter, not harder :)