
Forums - Gaming - How important is 60fps to you?

 

What do you think?

Anything less is unacceptable: 20 votes (16.26%)
It's very important: 40 votes (32.52%)
It's nice, but 30fps is still fine: 44 votes (35.77%)
It's not important: 19 votes (15.45%)
Total: 123 votes
JackHandy said:
curl-6 said:

I had a very similar experience with the Donkey Kong Country games. Playing them on an HDTV I died all the time and wondered if I was just a worse gamer than when I was younger, but once I played them on a CRT again it suddenly felt instantaneously responsive, like how I remembered it.

Looks a hell of a lot better than on an HDTV too.

While I understand the reasoning behind it, it does feel strange when you're playing a game on a modern HDTV and it looks far worse. There's this moment of weird disconnect where you're like, wait... why did we think liquid crystal displays were better again? When a TV manufactured in 1999 looks and performs better than a TV made in 2021, something is wrong. Clearly, the people designing these things didn't take any of this into consideration.

Yeah, it is strange; we've effectively traded in responsiveness and that soft smoothness of the image for sharpness it seems.

Digital Foundry did an interesting piece on it: https://www.eurogamer.net/articles/digitalfoundry-2019-modern-games-look-beautiful-on-crt-monitors

Speaking of older games, before I even knew what framerate was, I used to switch quite happily between Star Fox at 15fps and Mario at 60fps on my SNES. I think the first game that ever made me really notice framerate was Banjo-Tooie; that shit was extremely inconsistent, and it was more the fluctuation I noticed, when it would drop into almost slow motion.

Last edited by curl-6 - on 26 May 2021

KratosLives said:

It's annoying me right now. I'm considering holding off on playing my current PS4 games and waiting till I can get a PS5, because I'm picking up on the less responsive 30fps. I just finished playing Sekiro and RE8 and kept telling myself during the game, I wish this was 60fps. Now I'm playing through Kingdom Hearts 3, and although the gameplay is hectic and beautiful, I can tell it's running at 30fps and can imagine a smoother experience at 60fps. Once you experience 60fps there is no going back.

Perfect experiment for those in doubt: load up TLOU1 Remastered, switch to 30fps mode, play for a minute, then switch back to 60fps, play, and revert again. The difference is night and day.

Now I have to try to play through at 30fps... Sucks I can't find a damn PS5.

I did that. It was in fact the first game I played that offered that option. It did very little, if anything, for me. I went back to quality mode.



It takes genuine talent to see greatness in yourself despite your absence of genuine talent.

JackHandy said:

While I understand the reasoning behind it, it does feel strange when you're playing a game on a modern HDTV and it looks far worse. There's this moment of weird disconnect where you're like, wait... why did we think liquid crystal displays were better again? When a TV manufactured in 1999 looks and performs better than a TV made in 2021, something is wrong. Clearly, the people designing these things didn't take any of this into consideration.

People wanted bigger TVs, but you can't carry bigger CRTs. I thought my 34" one was 160 pounds; it's actually 175 pounds:
https://www.cnet.com/products/panasonic-ct-34wx53-tau-34-crt-tv-1080i/

It's a great TV and could do split screen between different inputs. I could play Gears of War on 360 with component cables while my wife watched TV on the same screen, evenly split. No clue why that isn't possible anymore.

Btw, analog TV looked better than HD TV lol. HD cable was better when they introduced it in 2004; since then they have reduced the bandwidth while still sticking to MPEG-2. So now it's 6 to 7 Mbps MPEG-2 720p/1080i. That's a bit more bandwidth than Netflix 1080p, yet one day when our cable was out, we watched TV on our old analog set. There were still a couple of over-the-air channels and wow: better colors, no compression artifacts, stable picture quality in fast-moving scenes, strobe lights, confetti, etc. Sure it was lower res, but also a much nicer picture to look at.

Anyway, the general public was all about bigger and flatter, while watching badly compressed 4:3 SD programs stretched wide to 16:9. (The digital quality of SD channels is awful.) Analog TV had its problems, but if you did your cabling right with the right splitters and amplifiers, it gave a beautiful picture. It did take a bit of improvisation to watch DVR-recorded programs in the bedroom; or rather, just switch the cable before going to the bedroom and use an infrared relay transmitter. It worked better than streaming!



curl-6 said:
JackHandy said:

The thing I notice right away when playing on a CRT is how there's almost zero input lag. I was trying to play SMB3 on my 4K TV via the Switch one day and kept dying in ways that I never used to die when I was younger. Curious, I went downstairs, popped the actual cart into my actual NES, turned it on, grabbed my controller, and within seconds I was literally flying through that game on my CRT. It was crazy how much more accurate my button presses and timing were. It was as if I were some sort of cyborg ninja utilizing the full brunt of the force!

After that experience, I completely gave up trying to play anything retro on newer consoles (I own all three). From now on, if it's PS2 or older, it's on a CRT via an actual disc/cart. There's just no other way.

I had a very similar experience with the Donkey Kong Country games. Playing them on an HDTV I died all the time and wondered if I was just a worse gamer than when I was younger, but once I played them on a CRT again it suddenly felt instantaneously responsive, like how I remembered it.

Looks a hell of a lot better than on an HDTV too.

CRTs aren't "pixel perfect", so any pixelated mess is easily made to look good.
In fact, developers would often leverage the scanlines and dithering of a CRT, drawing a checkerboard-pattern sprite in order to fake a transparency effect.
On a modern, pixel-perfect LCD it doesn't read as a transparent asset but as a checkerboard sprite... and it looks extremely jarring.
For example, LCD vs CRT: [image comparison]
Obviously you can "simulate" these effects on an LCD panel by introducing horizontal blur.
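To make the checkerboard trick concrete, here's a minimal sketch in plain Python (not from any actual game's code): half the pixels belong to the sprite and half to the background, so once a CRT's bleed averages neighbouring pixels together, the sprite reads as roughly 50% opacity.

```python
# Build a checkerboard "transparency" sprite mask: 1 = sprite pixel drawn,
# 0 = background shows through. On a blurry CRT adjacent pixels bleed
# together, so this reads as ~50% opacity; on a sharp LCD you just see
# the checkerboard.
def checkerboard(w, h):
    return [[(x + y) % 2 for x in range(w)] for y in range(h)]

mask = checkerboard(8, 8)
coverage = sum(map(sum, mask)) / (8 * 8)
print(coverage)  # 0.5 -- half the pixels drawn, i.e. fake 50% transparency
```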

So yes, these older games are designed for older displays and will definitely look and play better on older displays.


SvennoJ said:

In some games it gets too hot (98°C) and thermally throttles, which causes severe stutters, hence I disable turbo for FH4 and FS2020. Testing Ori right now, the temp stays in the high 70s / low 80s with turbo enabled, so it's not as taxing. However, now that I'm paying closer attention to it, it's not smooth while running either: lots of very noticeable judder, even when jumping (flying through the air), where scroll speed should be constant. It's very distracting.

My CPU does go up to 3.8 GHz while Ori runs. I see no other settings, just resolution (set to 1080p), full screen, motion blur off, VSync on. Without VSync the judders are less, but still no constant scroll speed. More playable anyway, but I need to start over I see. No clue what I'm doing anymore, running back and forth in the same area.

No clue how to change the other settings (hyperthreading, affinity)

You can set the affinity in Task Manager; you just assign your game to specific CPU cores, e.g. CPU cores 0, 2, 4, 6.

You *should* be able to disable Hyper-Threading in the BIOS, but only if your manufacturer allows it... If you can't, just keep the game off the hyper-threaded logical cores in Task Manager so you aren't running the game on only two physical cores but four threads.
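For what it's worth, the affinity you set in Task Manager is stored as a bitmask with one bit per logical processor. Here's a rough Python sketch of how that mask is built; the core list [0, 2, 4, 6] is just an example, match it to your own CPU.

```python
# Build the CPU affinity bitmask that Windows tools (Task Manager,
# "start /affinity <hex>") operate on: one bit per logical processor.
def affinity_mask(cores):
    mask = 0
    for c in cores:
        mask |= 1 << c  # set the bit for logical processor c
    return mask

# Even-numbered logical processors: on many Hyper-Threaded CPUs these
# correspond to one thread per physical core.
mask = affinity_mask([0, 2, 4, 6])
print(hex(mask))  # 0x55
```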

Turbo is meant to run your CPU hot, but after a period of time it should "settle" at a sustainable clockrate for any given task.

JackHandy said:
curl-6 said:

I had a very similar experience with the Donkey Kong Country games. Playing them on an HDTV I died all the time and wondered if I was just a worse gamer than when I was younger, but once I played them on a CRT again it suddenly felt instantaneously responsive, like how I remembered it.

Looks a hell of a lot better than on an HDTV too.

While I understand the reasoning behind it, it does feel strange when you're playing a game on a modern HDTV and it looks far worse. There's this moment of weird disconnect where you're like, wait... why did we think liquid crystal displays were better again? When a TV manufactured in 1999 looks and performs better than a TV made in 2021, something is wrong. Clearly, the people designing these things didn't take any of this into consideration.

CRTs did have a multitude of advantages: really good contrast, low input latency, no motion blur... But they were heavy, power hungry, expensive to make large, and aren't pixel perfect.
OLEDs tend to erode many of the advantages CRTs had, such as motion clarity and contrast.

Either way, low-resolution games like SNES titles will always look better on a CRT; they were simply designed for it.

Last edited by Pemalite - on 27 May 2021


www.youtube.com/@Pemalite

Just jumped between the 30fps Horizon demo and the 60fps Dying Light 2 showing.
- Appreciated the smoothness of Dying Light 2
- Horizon blew my mind, and it even had a few dips under 30

So yeah, really depends on the game but 30fps is perfectly acceptable and sometimes is necessary for those wow moments on affordable hardware.



Pemalite said:
SvennoJ said:

In some games it gets too hot (98°C) and thermally throttles, which causes severe stutters, hence I disable turbo for FH4 and FS2020. Testing Ori right now, the temp stays in the high 70s / low 80s with turbo enabled, so it's not as taxing. However, now that I'm paying closer attention to it, it's not smooth while running either: lots of very noticeable judder, even when jumping (flying through the air), where scroll speed should be constant. It's very distracting.

My CPU does go up to 3.8 GHz while Ori runs. I see no other settings, just resolution (set to 1080p), full screen, motion blur off, VSync on. Without VSync the judders are less, but still no constant scroll speed. More playable anyway, but I need to start over I see. No clue what I'm doing anymore, running back and forth in the same area.

No clue how to change the other settings (hyperthreading, affinity)

You can set the affinity in Task Manager; you just assign your game to specific CPU cores, e.g. CPU cores 0, 2, 4, 6.

You *should* be able to disable Hyper-Threading in the BIOS, but only if your manufacturer allows it... If you can't, just keep the game off the hyper-threaded logical cores in Task Manager so you aren't running the game on only two physical cores but four threads.

Turbo is meant to run your CPU hot, but after a period of time it should "settle" at a sustainable clockrate for any given task.

It doesn't settle with FS2020, but that game's not well optimized.

Here's the difference between boost and no boost on my laptop with FS2020

FS2020 hammers one core, making it boost up to 4 GHz all the time, causing thermal throttling and the large dips in GPU usage, which get more frequent as everything gets hotter and hotter. Without boost the fps is a bit lower, but more stable, and temps stay in the low 70s. With boost my left USB port starts getting iffy after a while, I assume from the excessive heat. No issues without boost.

This was before I upgraded to 32GB RAM. I cleaned the vents out as well while upgrading, but it was pretty clean; it didn't make any difference for temps anyway. I have it sitting on a metal table, with a couple of small wood blocks at the back end to leave a bigger air space underneath to draw in air, and of course clear space at the back to blast it away. FS2020 doesn't model the plane engine overheating, but I still need to keep my laptop temp in check :)



I tried Ori again: affinity set to even-numbered processors only (0, 2, 4, etc.) and confirmed it's only using those. Boost enabled, V-Sync disabled, no heat issues (stays below 70°C), yet it still stutters. It's the game logic or something, as it's not temporary dips in fps; the actual running speed is affected, and scroll speed is not constant. Which is very annoying for a sidescroller. I guess I can play it on Xbox one day; Blind Forest is a bad port.



What I think a lot of people tend to overlook or not realize is that going from 30fps to 60fps isn't necessarily just a matter of turning down the resolution; the CPU is also a factor.

If a game is limited by the amount of physics, NPCs, and world simulation it's pushing, you could run it in 240p and it would still be 30fps.

60fps means half as much processing for the CPU as well as the GPU, so 60fps games have to be simpler.



curl-6 said:

60fps means half as much processing for the CPU as well as the GPU, so 60fps games have to be simpler.

I'm sorry, but wtf Curl...



Mankind, in its arrogance and self-delusion, must believe they are the mirrors to God in both their image and their power. If something shatters that mirror, then it must be totally destroyed.

Chazore said:
curl-6 said:

60fps means half as much processing for the CPU as well as the GPU, so 60fps games have to be simpler.

I'm sorry, but wtf Curl...

It's true; at 60fps the CPU has 16.67ms of frametime to work with versus 33.33ms in a 30fps game.

That leaves half as much processing time for CPU tasks like physics, world simulation, AI, etc. That's why a game like, say, Breath of the Wild wouldn't be 60fps on Wii U even if you dropped its graphics to N64 level; it would still be CPU-bound by the physics.
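The budget being described is just the reciprocal of the target framerate; a minimal sketch in Python, numbers only:

```python
# Frametime budget: how long the CPU (and GPU) have to produce one frame.
def frame_budget_ms(fps):
    return 1000.0 / fps

print(round(frame_budget_ms(30), 2))  # 33.33 ms per frame at 30fps
print(round(frame_budget_ms(60), 2))  # 16.67 ms per frame at 60fps
```

Doubling the framerate halves this budget, which is why simulation-heavy games can stay CPU-bound at 30fps no matter how far you drop the resolution.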

Last edited by curl-6 - on 27 May 2021

240 frames per second. Maybe 600 when I get my 5050 Ti.