Chrkeller said:

I must be super sensitive, because for me 120 fps is every bit as big a jump over 60 as 60 is over 30.  I've tested it out in a few games like RE4 where it tracks accuracy, and I get rather large gains.  When games like TLoU and oddly TTW have select areas that run poorly, I noticed immediately and confirmed it via software that displays fps.  In both games there was an area that dropped to the low 80s; I caught the drop immediately, it stands out like a sore thumb.  

I'll leave my prediction: consoles will start offering more and more high-fps modes...  it will become a thing.  

Perhaps once you game a long time at 120 fps, you get used to it and 60 fps just seems sluggish.  Like a conditioning aspect.  With the exception of Nintendo and a handful of PC games (Souls, Hades), I have not played any games that didn't average 100+ fps in two years.  

This sounds like you're very attentive to your own performance and skill level, which adds up. More competitive gamers (even if only competing with themselves) are more likely to notice.

Although I play games on medium/hard settings, I'm typically just there for the escapism. I don't score-track, I don't platinum anything. It's mostly 3rd person.  So your relationship with games, the genres you spend most of your time playing, as well as the setup, all for sure play a role. 

"At 2 m (6.5 ft) viewing distance, a 55–65" 4K TV roughly matches the detail of a 24–27" 1080p monitor at arm’s length."

Pretty much even the best home TV setups are giving only midrange detail compared to playing on a PC.
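
That quoted figure roughly checks out as a back-of-envelope angular-resolution estimate. Here's a minimal sketch, assuming a 16:9 panel, roughly 0.6 m for "arm's length", and the common ~60 pixels-per-degree acuity rule of thumb (all three numbers are my assumptions, not figures from the post):

```python
import math

def resolvable_pixels(diag_in, h_pixels, distance_m, acuity_ppd=60):
    """Estimate horizontally resolvable pixels for a 16:9 screen.

    Assumes ~60 pixels/degree visual acuity (about 1 arcminute),
    a common rule of thumb rather than a hard limit.
    """
    width_m = diag_in * (16 / math.hypot(16, 9)) * 0.0254     # 16:9 panel width in metres
    fov_deg = 2 * math.degrees(math.atan(width_m / 2 / distance_m))  # horizontal field of view
    acuity_limited = fov_deg * acuity_ppd                      # pixels the eye can actually resolve
    return min(h_pixels, acuity_limited), fov_deg

# 55"/65" 4K TV at 2 m vs a 24"/27" 1080p monitor at ~0.6 m (arm's length)
for label, diag, px, dist in [("55\" 4K TV @ 2 m", 55, 3840, 2.0),
                              ("65\" 4K TV @ 2 m", 65, 3840, 2.0),
                              ("24\" 1080p @ 0.6 m", 24, 1920, 0.6),
                              ("27\" 1080p @ 0.6 m", 27, 1920, 0.6)]:
    res, fov = resolvable_pixels(diag, px, dist)
    print(f"{label}: ~{fov:.0f} deg wide, ~{res:.0f} resolvable horizontal pixels")
```

Under those assumptions the TV at 2 m only subtends about 34-40 degrees, so acuity caps it at roughly 2,000-2,400 resolvable horizontal pixels, while the 1080p monitor up close is capped by its own 1,920 pixels. That's why the two setups land in the same ballpark, and why a 1440p/4K monitor at desk distance pulls ahead.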

Our brains are adaptive and get conditioned too, so generationally it can change. Console gamers have grown up on aim assist, large reticles, motion blur, and fixed framerates decided by the dev (up until PS5/Series X).
