Why 30 or 60fps, but never 48, 50, etc... ?


You also have to consider V-Sync. If the frame rate doesn't divide evenly into the refresh rate (30fps does, into both 60 and 120Hz), visual problems occur (like screen tearing). So locking the game at one of those in-between frame rates wouldn't really benefit it.
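The divisibility point can be sketched in a few lines (hypothetical helper, just to illustrate the arithmetic):

```python
# Illustrative sketch (hypothetical helper, not from any real engine):
# under v-sync each frame must be shown for a whole number of refresh
# cycles, so the refresh rate needs to be an integer multiple of the
# frame rate for even pacing.

def frame_pacing(fps, refresh_hz):
    """Return refreshes per frame if fps divides evenly, else None."""
    if refresh_hz % fps == 0:
        return refresh_hz // fps  # every frame held equally long: smooth
    return None  # uneven hold times: judder (or tearing without v-sync)

for fps in (30, 48, 50, 60):
    print(fps, frame_pacing(fps, 60))
```

On a 60Hz panel, 30 and 60 come back with clean multiples (2 and 1), while 48 and 50 don't fit, which is the whole reason those locks are avoided.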

That's also one reason some in the PC community (whom a few like to call 'elitists') brag about their monitors and refresh rates (especially if they have beefy rigs). It isn't necessarily elitism (at least, not always), but satisfaction with a measurable technical superiority.

Last edited by CGI-Quality - on 27 May 2019


VAMatt said:
Many (maybe all) TV manufacturers make up marketing terms like "super 120" to refer to some (likely not very effective) enhancement techniques that they build into the TV. This is done to make the consumer think their TV is running at something higher than the 60Hz it actually is.

Not that there aren't 120Hz units out there. But just because you see 120 or 140 on the marketing materials somewhere doesn't necessarily mean the TV is running at that rate.

Yeah, I've had a couple of models that claimed 120 on the box, but when you switch inputs between devices it always says 60.



Anyway, I think most people are unaware of these specifics and imagine instead that all screens have variable refresh rates and resolutions, when unfortunately neither is likely true.



Since Blu-ray came out, most modern TVs have supported 24Hz playback, yet no dev has ever dared to use the true cinematic frame rate :)

With variable refresh rate, games will still aim for 30 or 60, as it will be decades before all TVs are HDMI 2.1 with VRR. The best to hope for is that games will support an unlocked frame rate mode that uses VRR. Then the capabilities of each specific TV come into question: there is no standard for the minimum and maximum frame rates VRR must support, so each TV will likely have its own sweet spot.

I do like the 144Hz display on my laptop; running games locked at 36, 48 or 72fps is a lot nicer than the standard 30/60 steps. Plus, the higher the base refresh rate, the less impact a dropped frame has.
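This is easy to check (assumed helper name, purely illustrative): every divisor of the refresh rate is a frame rate that v-sync can hold with perfectly even pacing, so a 144Hz panel simply has more usable locks than a 60Hz one.

```python
# Rough illustration (assumed helper, for this thread only): list the
# frame rates that divide a given refresh rate evenly, i.e. the locks
# v-sync can hold with no judder.

def even_locks(refresh_hz, minimum=24):
    """Frame rates >= minimum that divide the refresh rate evenly."""
    return [f for f in range(minimum, refresh_hz + 1) if refresh_hz % f == 0]

print(even_locks(60))   # [30, 60]
print(even_locks(144))  # [24, 36, 48, 72, 144]
```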

VRR will have to prove itself. It removes screen tearing, yet introduces judder. A locked, stable frame rate is still better than swinging between 40 and 90 depending on what happens on screen. For racing, stability is key, and with VRR comes variable input/display lag. Solve one problem, introduce another.



SvennoJ said:
Since Blu-ray came out, most modern TVs have supported 24Hz playback, yet no dev has ever dared to use the true cinematic frame rate :)

With variable refresh rate, games will still aim for 30 or 60, as it will be decades before all TVs are HDMI 2.1 with VRR. The best to hope for is that games will support an unlocked frame rate mode that uses VRR. Then the capabilities of each specific TV come into question: there is no standard for the minimum and maximum frame rates VRR must support, so each TV will likely have its own sweet spot.

I do like the 144Hz display on my laptop; running games locked at 36, 48 or 72fps is a lot nicer than the standard 30/60 steps. Plus, the higher the base refresh rate, the less impact a dropped frame has.

VRR will have to prove itself. It removes screen tearing, yet introduces judder. A locked, stable frame rate is still better than swinging between 40 and 90 depending on what happens on screen. For racing, stability is key, and with VRR comes variable input/display lag. Solve one problem, introduce another.

With TVs accepting input other than 30/60/120Hz, it should also become possible to lock to frame rates other than those. GT Sport on PS5, for example, might include a mode that locks to 70 or 80Hz on VRR-supporting TVs.




It's a problem of TV standards. From the very beginning, console games were locked to whatever TV standard their region used (60Hz NTSC, 50Hz PAL).

On PCs, however, back in the CRT days, you could choose your refresh rate, but once LCDs became widespread, 60Hz crept in as the standard.
That's why very old GPU benchmarks had a cut-off point at 40fps; that was considered as low as you should go before controls and presentation became too sluggish.



Lafiel said:
SvennoJ said:
Since Blu-ray came out, most modern TVs have supported 24Hz playback, yet no dev has ever dared to use the true cinematic frame rate :)

With variable refresh rate, games will still aim for 30 or 60, as it will be decades before all TVs are HDMI 2.1 with VRR. The best to hope for is that games will support an unlocked frame rate mode that uses VRR. Then the capabilities of each specific TV come into question: there is no standard for the minimum and maximum frame rates VRR must support, so each TV will likely have its own sweet spot.

I do like the 144Hz display on my laptop; running games locked at 36, 48 or 72fps is a lot nicer than the standard 30/60 steps. Plus, the higher the base refresh rate, the less impact a dropped frame has.

VRR will have to prove itself. It removes screen tearing, yet introduces judder. A locked, stable frame rate is still better than swinging between 40 and 90 depending on what happens on screen. For racing, stability is key, and with VRR comes variable input/display lag. Solve one problem, introduce another.

With TVs accepting input other than 30/60/120Hz, it should also become possible to lock to frame rates other than those. GT Sport on PS5, for example, might include a mode that locks to 70 or 80Hz on VRR-supporting TVs.

The option probably won't happen until the PS5 Pro comes out. I doubt performance and quality modes will be standard from the beginning, and since most TVs will still be locked to 60Hz input, it's going to be 30 or 60 again, perhaps with an unlocked frame rate option for VRR.

Another issue that can crop up is overheating, loud fan noise, or more pop-in. With VRR you basically always run at max, while a locked frame rate leaves idle time to spare the hardware a bit or do other things in the background. I think VRR will be used more to catch the frame rate dropping below target (preventing screen tearing) than to make games run faster. It will be a crutch for cross-platform games: instead of some games being locked to 30 on PS4 Pro and 60 on Xbox One X, there's now the option to run at 45 on one and 60 on the other.



Suppose I'd like to pick a 4K Samsung TV that can match 60fps; any suggestions?
Because, if I read correctly, their 100Hz TVs are natively 50Hz.



 "I think people should define the word crap" - Kirby007

Join the Prediction League http://www.vgchartz.com/predictions

Many TVs that "market" the fact that they are 100Hz+ panels usually aren't true 100Hz+ panels, but instead use frame interpolation to "fake" it.

Some TVs are 50Hz, and some consoles will output at 50Hz, which fits perfectly with games running v-sync at 50fps as well.

But the main reason games aren't locked to 48/50fps is that those numbers don't divide evenly into 60Hz, so you end up with screen tearing... Whereas something like 30fps can just have each frame displayed twice as long as at 60fps, which is generally a smooth translation.
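A quick sketch of that cadence problem (illustrative code only, not any real display pipeline): spreading N frames over M refreshes when M isn't a multiple of N forces some frames to stay on screen longer than others.

```python
# Sketch of the cadence problem (illustrative only): count how many
# refreshes each of one second's frames is held for on a 60Hz panel.

def cadence(fps, refresh_hz):
    """Refresh count for each of one second's frames."""
    held = [0] * fps
    for r in range(refresh_hz):
        held[r * fps // refresh_hz] += 1  # which frame this refresh shows
    return held

print(set(cadence(30, 60)))  # {2} - every frame shown twice: smooth
print(set(cadence(48, 60)))  # {1, 2} - uneven hold times: judder
```

At 30fps every frame is held for exactly two refreshes, while at 48fps the hold time alternates between one and two refreshes, which is exactly the judder being described.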

Last edited by Pemalite - on 27 May 2019

Because modern TVs are still very very stupid. Which is why God invented monitors.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.