Norion said:
Otter said:

I don't think most people's brains register this sort of detail; it just gets dismissed. I noticed, once I got a new laptop and was working between my old monitor and the laptop, that the mouse is smoother on the laptop's 240hz screen. But until I had the direct comparison of moving between the two screens, I never noticed, because in both cases the mouse was doing its job effectively. Even up until this discussion it's not something I register on a day-to-day basis.

Since about 75% of PS5 users go for performance mode, I dunno about that, but then again it seems my bf didn't notice much difference when he tried out my PC. I could get used to 60hz again with little issue if I was forced to, but when setting it back to over 100 I'd probably feel amazed at the jump in smoothness all over again.

SvennoJ said:

I guess it depends on the speed at which things travel on a screen. For perfect clarity you need to be able to follow objects with your eyes without them 'jumping' over the screen.

The higher the resolution, the more steps you need to stop moving objects from skipping pixels.

Since we know the max resolution of the human eye is about 90 pixels per degree, you can also say objects need to make 90 'steps' per degree to appear as perfectly smooth movement. For 1080p (1920x1080), watching from a distance at which the monitor covers about 21 degrees of your fov (1920 / 90) gives you perfect resolution on that 1080p monitor.

An object that crosses the screen left to right needs to make 1920 steps. At 1000 fps, the fastest a pointer can move while staying perfectly clear is therefore 1.92 seconds to traverse the screen. If I move my pointer left to right on my 1080p 144hz screen, 2 seconds to cross the screen is indeed not a stable pointer; at 1000hz it should be. At 144hz the max pointer speed works out to about 13 seconds to cross the screen.

2 seconds to cross the screen sounds like a lot but there's also a cut off for how fast you can follow moving objects.

From Google:
"Smooth pursuit movements: These smooth, continuous movements are slower than saccades, moving at up to 100 degrees per second. They require constant feedback to track a moving object."

My example does 10.5 degrees per second (21 degrees in 2 seconds)... At 100 degrees per second and 90 steps per degree that's 9,000 steps per second, so roughly 10,000 fps should cover perfect pursuit movement at the highest perceptual limit of human vision. There's a limit at least.
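A quick back-of-the-envelope version of that math in code, using the ~90 pixels-per-degree and ~100 degrees-per-second figures above (all rough approximations):

```python
# Back-of-the-envelope sketch of the numbers above. Assumes ~90 pixels per
# degree as the eye's resolution limit and ~100 deg/s max smooth pursuit.
H_PIXELS = 1920          # horizontal resolution (1080p)
PPD_EYE = 90             # approx. max resolution of the eye, pixels per degree
PURSUIT_DEG_PER_S = 100  # approx. max smooth-pursuit speed

def min_crossing_time(h_pixels, fps):
    """Seconds an object needs to cross the screen to advance <= 1 pixel per frame."""
    return h_pixels / fps

print(min_crossing_time(H_PIXELS, 1000))  # 1.92 s at 1000 fps
print(min_crossing_time(H_PIXELS, 144))   # ~13.3 s at 144 Hz

# fps needed so a 100 deg/s pursuit target still advances <= 1 pixel per frame
print(PPD_EYE * PURSUIT_DEG_PER_S)        # 9000 -> "roughly 10,000 fps"
```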

I'll likely be upgrading my monitor in the next 1-2 years, so I'll make the jump from 1440p 144hz to 4k 240hz, which should be plenty for close to a decade. Also I wonder just how much GPU power it would take to run something like maxed-out Cyberpunk at that fps lol.

That's where frame generation comes in. There's really not much point in rendering the entire frame 240 times per second. What you need are motion vectors for elements on screen, so you can generate the in-between frames. GT7 does that now in VR on PS5 Pro to simulate 120fps from 60fps; DLSS 3 does that on PC.
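Not how GT7 or DLSS 3 do it internally, but a toy sketch of the idea: warp one rendered frame along per-pixel motion vectors to fake an in-between frame (real frame generation also handles disocclusion, blending with the next frame, and so on):

```python
import numpy as np

def generate_inbetween(frame, motion, t=0.5):
    """Toy frame generation: splat each pixel of `frame` part-way along its
    motion vector to approximate a frame at time t between two real frames.

    frame:  (H, W, 3) rendered frame
    motion: (H, W, 2) per-pixel motion in pixels (dx, dy) toward the next frame
    """
    h, w = frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    new_x = np.clip((xs + t * motion[..., 0]).astype(int), 0, w - 1)
    new_y = np.clip((ys + t * motion[..., 1]).astype(int), 0, h - 1)
    out = np.zeros_like(frame)
    out[new_y, new_x] = frame[ys, xs]  # forward splat; holes/overlaps ignored here
    return out
```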

That doesn't reduce input lag further of course, but the difference between 8.3ms at 120 fps and 4.2ms at 240 fps won't make a perceptible difference anyway. The human limit is about 13ms, which corresponds to 77 fps.
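The frame-time arithmetic behind those numbers:

```python
# Frame time for a given fps, and the fps matching a 13 ms perceptual limit
for fps in (120, 240):
    print(fps, "fps ->", 1000 / fps, "ms")   # ~8.3 ms and ~4.2 ms
print(1000 / 13, "fps")                      # ~77 fps for a 13 ms limit
```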

https://www.pubnub.com/blog/how-fast-is-realtime-human-perception-and-technology/
"Humans can respond to incoming visual stimuli at a maximum rate of about 13 milliseconds. Any data stream faster than this would surpass our perceptual limits."


Anything over 90fps only looks more stable while tracking/following moving objects with your eyes.

(Response time is 150-200 ms for humans so the whole round trip between perceiving and pressing shoot/jump is already mostly on the human side)


I also ran into this article; the human brain does all sorts of forward prediction:
https://www.sciencealert.com/to-help-us-see-a-stable-world-our-brains-keep-us-15-seconds-in-the-past

While you can quickly respond to changes in the center of your vision, the total picture you live in is an amalgamation of the last 15 seconds!



From this you can also see how Eye Tracked Foveated Rendering is the big cost saver for VR rendering. It would help with monitors as well of course. It's pretty wasteful to render the entire screen at 4K resolution when you only see sharp detail in a little dot. Perhaps VR improvements will come back to flat screen monitors. A large 27" 4K monitor with eye tracking sensors could save tons of GPU power.
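As a rough sketch of how much that could save, assume a simple two-level split: a ~5 degree foveal circle at full 4K detail and everything else at quarter resolution per axis (both numbers are placeholder assumptions, not what any real headset or monitor uses):

```python
import math

# Rough pixel-count saving for a two-level foveated render of a 4K frame.
# The 5-degree fovea radius and quarter-res periphery are assumed values.
SCREEN_W, SCREEN_H = 3840, 2160
FOV_H_DEG = 60                       # horizontal fov the screen covers up close
PPD = SCREEN_W / FOV_H_DEG           # ~64 pixels per degree on this setup
FOVEA_RADIUS_DEG = 5
PERIPHERY_SCALE = 0.25               # per-axis resolution scale outside the fovea

full = SCREEN_W * SCREEN_H
fovea = math.pi * (FOVEA_RADIUS_DEG * PPD) ** 2
periphery = (full - fovea) * PERIPHERY_SCALE ** 2
print(f"{(fovea + periphery) / full:.0%} of the full-res pixel count")  # ~10%
```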

This is how your visual acuity drops off away from the center of your vision:

Sitting close in front of a PC monitor, it fills up to 60 degrees of your fov. At 30 degrees from the center of your gaze you're down to 10%, or 9 pixels per degree (or 6 if you go by 20/20 vision; 60 ppd corresponds to 20/20 vision, and 90 is the limit at which people can perceive a difference on screen).
9 pixels per degree over 60 degrees corresponds to only 540x303 for the whole screen. Yet game engines put as much effort into rendering where you aren't looking as where you are.
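That arithmetic, spelled out:

```python
PERIPHERAL_PPD = 9   # ~10% of the ~90 ppd foveal limit
FOV_H_DEG = 60       # fov the monitor covers up close

width = PERIPHERAL_PPD * FOV_H_DEG   # 540
height = width * 9 // 16             # 303 on a 16:9 screen
print(width, height)                 # 540 x 303
```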

With eye tracking that can tell how far away you are sitting and where you are looking (plus the size of the screen), you can optimize the render resolution and add foveated rendering to save on pixel pushing and increase fps instead. Resolutions in games could then be defined in pixels per degree rather than set values that look different depending on screen size and distance.
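A sketch of what that could look like: deriving pixels per degree from screen size and viewing distance (the monitor size, distance, and function here are just illustrative, not any real API):

```python
import math

def pixels_per_degree(h_pixels, screen_width_m, distance_m):
    """Approximate pixels per degree at the screen centre for a viewer at distance_m."""
    pixel_m = screen_width_m / h_pixels
    deg_per_pixel = math.degrees(2 * math.atan(pixel_m / (2 * distance_m)))
    return 1 / deg_per_pixel

# Hypothetical 27" 4K monitor (~0.60 m wide) viewed from 0.7 m:
print(pixels_per_degree(3840, 0.60, 0.7))  # ~78 ppd, below the ~90 ppd ceiling above
```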

Render smarter, not harder :)