
30FPS vs 60FPS. Can you tell the difference?

 

Can you tell the difference?

Yes, clearly.  233 votes (59.29%)
Not really.    106 votes (26.97%)
See results.    54 votes (13.74%)

Total: 393

I have never been able to tell the difference (although I do have very poor eyesight) but I've always felt 30fps was more than sufficient.



pezus said:
Zkuq said:

Too bad because there is a difference. But in case you want something that's easier to see, see this.

Now that's a comparison...seriously, check this out and try to tell yourself you don't see a difference.

Not only does it show a clear difference in fps but that link also confirms to me why I always like to disable motion blur when possible. Motion blur only makes it worse for me.

Set both to 30 fps, one with motion blur and one without; I prefer the one without.



pezus said:

No, not in that way. I meant that movies are filmed at something like 24 FPS, so you can clearly notice the Wagon-Wheel Effect when watching them. So the camera is limiting what we see. But, if we are just watching something in real life ...that's a whole different story (because its FPS is (most often) not limited).

"People said you can lock the FPS at a specific value, so it never drops. That is not true. Capping the FPS is different, FPS still drop, when you have a higher cap, you are less likely to drop below the threshold 30FPS."

Well, what those people stated is partly true. If you're not playing a demanding game and/or your computer is very good, the game will not drop frames. It's really simple: try an old game like Quake 3 and limit the (max) FPS to different values (e.g. 15/30/60/100). Huge difference!
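
As a rough illustration of what a frame cap amounts to, here is a toy Python sketch (assuming a plain single-threaded game loop; this is not how Quake 3 or Source actually implement it). The key point it shows is that a cap only sleeps away leftover time on fast frames; it cannot speed up slow ones, which is why a capped game can still drop below its cap.

    import time

    TARGET_FPS = 30                  # hypothetical cap; try 15/30/60/100 as suggested above
    FRAME_BUDGET = 1.0 / TARGET_FPS  # time budget per frame, in seconds

    def update_and_render():
        """Stand-in for the real per-frame work; pretend it costs ~5 ms."""
        time.sleep(0.005)

    def run_capped_loop(num_frames=300):
        for _ in range(num_frames):
            start = time.perf_counter()
            update_and_render()
            elapsed = time.perf_counter() - start
            if elapsed < FRAME_BUDGET:
                # fast frame: sleep away the leftover budget so we don't exceed the cap
                time.sleep(FRAME_BUDGET - elapsed)
            # slow frame: nothing can be done -- the FPS simply drops below the cap

    run_capped_loop()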

Also, screen tearing does not occur if you employ V-sync, and even if you don't, the amount of tearing depends on the game. Some games almost never tear and you sure as hell won't notice it, while others tear quite a bit. Anyway, screen tearing doesn't really matter when discussing different levels of FPS.
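
To picture why tearing shows up without V-sync, here is a minimal sketch with made-up numbers (assuming a 60 Hz, 1080-line display whose scan-out starts at t = 0; real display timing has blanking intervals and more detail). A buffer swap that lands partway through a refresh puts the seam partway down the screen; V-sync simply holds the swap until the next vertical blank.

    REFRESH_HZ = 60
    SCANLINES = 1080
    PERIOD = 1.0 / REFRESH_HZ        # ~16.7 ms per refresh

    def tear_line(swap_time_s, vsync=False):
        """Return the scanline where the old/new frame seam appears,
        or None when V-sync defers the swap to the next refresh."""
        if vsync:
            return None
        phase = (swap_time_s % PERIOD) / PERIOD   # fraction of the screen already scanned out
        return int(phase * SCANLINES)

    print(tear_line(0.0125))              # swap 12.5 ms into a refresh -> seam ~3/4 down (line 810)
    print(tear_line(0.0125, vsync=True))  # None: no visible tear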

I agree with you though, it's nice when someone backs up their opinion...even if I don't agree :D

I remember changing the FPS for CS: Source, but I never saw a difference. If there is a difference, there must be more being added to the game than just the extra frames. It's likely that added effects which make movements appear smoother also take place.

As for screen tearing, play The Witcher 2 or Crysis on max. My friend's computer can handle Crysis, but not The Witcher 2, and there is screen tearing when everything is set to ultimate (he normally plays on high).

What I do notice, however, much more than FPS, is resolution. Man, does Mass Effect 2 on PC look good in native 1080p. It blows consoles out of the water, and Crysis/The Witcher 2 are even better looking. Next gen damn well better go 1080p.

Also, I do agree with you that console games look better at 60 FPS than at 30 FPS, but I blame FPS dips. In that Wikipedia article it says that after 24 FPS people don't notice the boost in FPS for videos.

As for PCs, again... this isn't really a good argument, but I was watching the Discovery Channel a long time ago and it was explaining why dogs don't like TV. Basically, dogs and wolves see at 80 FPS, so a TV show which runs at 30 FPS has a flickering screen to them. So if you ever see an old cathode-ray TV in the background of an old movie or something, you too will see a flicker. Dogs can see a screen change frames, whereas people cannot see the actual transition because our eyes don't pick up signals that fast. So to add to my previous argument: I know we can definitely see low FPS, but at high FPS we don't notice as much of a change because we can't pick up that many frames at once. Dogs will, but only when you show them videos above 80 FPS (which is a lot).

As for the argument in general, I think we both could learn a bit more if we took a course on this. You're good to debate with too, pretty smart unlike most people here, but you are an older member and you have seen some of the stupidity that is possible.

As for 3D, I have 20/20 vision, and just by changing my focus point I can see PS3 games and movies in 3D or 2D when I wear those glasses. Also, when I play in 3D I notice the screen flickers a lot, but others don't.

Zkuq, something isn't right about that video. Both balls are crazy blurry, and when you put them at 30 FPS they lag a lot more than they should. When I used to play PC games, I always played them at 30 FPS, and I never saw lag there.

Wait, there is something seriously wrong with that website. My laptop doesn't go up to 90 FPS, yet I can see a difference between 90 FPS and 60 FPS. Those objects are being displayed poorly at low FPS; I don't think they are accurately representing any FPS, because physically 60 and 90 FPS on my screen should be identical, but they're not. My laptop also doesn't support 120 FPS, and that setting runs too... WTH?

As for filming The Hobbit at a higher FPS, maybe to make more authentic-looking effects they need more frames to generate the effect? I don't know, there could be other factors that make the film look better. The only film I noticed looked particularly better than others was Batman, but that was largely filmed in IMAX or at high resolution, and that shows a lot more.



What is with all the hate? Don't read GamrReview Articles. Contact me to ADD games to the Database
Vote for the March Most Wanted / February Results

pezus said:

I actually stumbled on a really good article on this, which explains how our eyes work and discusses the 'supposed' FPS limit of ours: http://amo.net/NT/02-21-01FPS.html . It's really informative so I suggest you read it.

This mentions the 1/220 sec flash test again:

The USAF, in testing their pilots for visual response time, used a simple test to see if the pilots could distinguish small changes in light. In their experiment a picture of an aircraft was flashed on a screen in a dark room at 1/220th of a second. Pilots were consistently able to "see" the afterimage as well as identify the aircraft. This simple and specific situation not only proves the ability to perceive 1 image within 1/220 of a second, but the ability to interpret higher FPS.

A 30 fps camera will also pick up an image flashed like that. What would be interesting to know is at what rate the human eye can pick up differences between 2 subsequent frames. Show 2 frames of 1/220 sec directly after each other: can you see both, or do you see the sum of the 2?

I suspect the latter, which is why fast-moving objects still look bad at 60 fps. You see a bit of frame 1 mixed with frame 2, so you see the object in 2 places at once.
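
Putting some illustrative numbers on that (the object speed and screen width below are just assumptions to show the scale): a 1/220 s flash is much shorter than a whole frame at common rates, so a camera or display still catches it inside one frame, and at 60 fps a fast-moving object jumps a large distance between consecutive frames, which is what makes the two positions visible at once.

    def frame_ms(fps):
        return 1000.0 / fps

    for fps in (24, 30, 60, 220):
        print(f"{fps:>3} fps -> {frame_ms(fps):5.1f} ms per frame")
    # at 220 fps a frame lasts ~4.5 ms, so the whole 1/220 s flash fits inside
    # a single ~33 ms exposure at 30 fps

    # hypothetical object crossing a 1920-px-wide screen in half a second
    CROSS_TIME_S = 0.5
    for fps in (60, 120, 240):
        step = 1920 / (CROSS_TIME_S * fps)
        print(f"{fps:>3} fps -> {step:5.1f} px jump between consecutive frames")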

I do agree that adding motion blur to games is pointless. The human eye and brain track things that you are focussing on, compensating for the motion so you can see them clearly. Have you ever sat in a train staring out the window at the tracks speeding by? The faster you go the harder it gets to track the individual rungs until it all becomes a blur, yet every now and again your eyes still manage to catch on and give you a still frame.

Tracking fast-moving objects doesn't work with fixed framerates on screens; objects merely jump from 1 spot to the next. Adding motion blur doesn't help either. Sure, it simulates how things look that you are not currently focussing on, or that are moving so fast you can't keep up with them; those will have blur in real life too. But giving everything that moves motion blur makes everything look blurry, making it impossible to see clearly the thing you are trying to track.

Instead of motion blur, faster-moving objects should be displayed at a higher framerate. This is how 120 Hz and 240 Hz TVs work: adding in-between frames makes it easier to track the moving elements. Basically, the distance an object travels between two frames should be constant for all moving elements. Unfortunately we're still stuck with fixed display refresh rates and, more importantly, a cap of 60 fps over HDMI. If we could send 240 fps to the screen, it would be easier to start experimenting with variable frame rates for different elements. (Assuming hardware won't be powerful enough for a while to update the whole scene at 240 fps.)
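
A minimal sketch of the idea behind those in-between frames (simple linear interpolation of known object positions; real 120/240 Hz TVs have to estimate motion vectors from the video itself, so this only shows the principle): each synthesized frame moves the object by a constant small step instead of one big jump.

    def in_between_positions(pos_a, pos_b, factor):
        """Positions for the synthesized frames between two source frames,
        so the object advances by a constant small step per displayed frame."""
        (xa, ya), (xb, yb) = pos_a, pos_b
        return [(xa + (xb - xa) * i / factor, ya + (yb - ya) * i / factor)
                for i in range(1, factor)]

    # two 60 fps source frames; factor=4 yields the three extra frames a 240 Hz set inserts
    print(in_between_positions((100, 300), (164, 300), 4))
    # -> [(116.0, 300.0), (132.0, 300.0), (148.0, 300.0)]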

The human visual system is just as impressive as a stabilizer as it is at focusing quickly and tracking objects. Your eyes are constantly moving and scanning without you being consciously aware of it. Look at someone else's eyes and you'll be amazed at how much they jump about depending on what they look at. Yet they experience a perfectly stable mental image.

The mental image you experience is built from all the elements you track with your eyes to form one cohesive, stable image. Parts you're not currently paying attention to are pretty much ignored / filled in automatically. Look up inattentional blindness to see what I mean. If games only knew what you were looking at, that could save a lot of rendering time. Actually, a lot of games do know. But I guess it's not good for screenshots and YouTube videos if game X only shows high visual fidelity where you are aiming.





Adding to previous posts... Practically ALL movies made for movie theaters are at 24 fps, including 3D movies (even Avatar is 24 fps, although Cameron originally wanted 48 fps). Even fast-moving action movies are only 24 fps, because most projectors and other hardware can't show a higher fps. Of course, if the action is fast enough, 24 fps sometimes doesn't seem like enough, but for most movies it has always been enough. Actually, TV movies and series have a higher fps than movies in theaters.

pezus said:
Michael-5 said:
pezus said:
 

No, not in that way. I meant that movies are filmed at something like 24 FPS, so you can clearly notice the Wagon-Wheel Effect when watching them. So the camera is limiting what we see. But, if we are just watching something in real life ...that's a whole different story (because its FPS is (most often) not limited).

"People said you can lock the FPS at a specific value, so it never drops. That is not true. Capping the FPS is different, FPS still drop, when you have a higher cap, you are less likely to drop below the threshold 30FPS."

Well, what those people stated is partly true. If you're not playing a demanding game and/or your computer is very good, the game will not drop frames. It's really simple: try an old game like Quake 3 and limit the (max) FPS to different values (e.g. 15/30/60/100). Huge difference!

Also, screen tearing does not occur if you employ V-sync, and even if you don't, the amount of tearing depends on the game. Some games almost never tear and you sure as hell won't notice it, while others tear quite a bit. Anyway, screen tearing doesn't really matter when discussing different levels of FPS.

I agree with you though, it's nice when someone backs up their opinion...even if I don't agree :D

I remember changing the FPS for CS: Source, but I never saw a difference. If there is a difference, there must be more being added to the game than just the extra frames. It's likely that added effects which make movements appear smoother also take place. - No direct effects are added, just more frames, but with more frames there is of course more data. That's also the case when going from 10 to 30 FPS.

As for screen tearing, play The Witcher 2 or Crysis on max. My friend's computer can handle Crysis, but not The Witcher 2, and there is screen tearing when everything is set to ultimate (he normally plays on high). - Then he isn't enabling V-sync. V-sync fixes those problems. Does he run The Witcher 2 at 60 FPS on high and then maybe 40-50 on ultimate? That would explain why he gets the tearing.

What I do notice, however, much more than FPS, is resolution. Man, does Mass Effect 2 on PC look good in native 1080p. It blows consoles out of the water, and Crysis/The Witcher 2 are even better looking. Next gen damn well better go 1080p. - Completely agree on that. Resolution is probably the most important "graphical factor" imo.

Also, I do agree with you that console games look better at 60 FPS than at 30 FPS, but I blame FPS dips. In that Wikipedia article it says that after 24 FPS people don't notice the boost in FPS for videos. - What do you mean when you say you blame FPS dips? Let's say you're running a game at a stable 30 FPS vs. a stable 60 FPS (so there are no FPS drops/dips). I didn't notice Wikipedia saying people don't notice increases over 24 FPS for videos. This is something Wikipedia said: "The human visual system does not see in terms of frames; it works with a continuous flow of light information.[12] A related question is, “how many frames per second are needed for an observer to not see artifacts?” However, this question also does not have a single straight-forward answer. If the image switches between black and white each frame, the image appears to flicker at frame rates slower than 30 FPS (interlaced). In other words, the flicker fusion point, where the eyes see gray instead of flickering, tends to be around 60 FPS (inconsistent). However, fast moving objects may require higher frame rates to avoid judder (non-smooth, linear motion) artifacts — and the retinal fusion point can vary in different people, as in different lighting conditions."

- And this: "Though the extra frames, when not filtered correctly, can produce a somewhat video-esque quality to the whole, the improvement to motion-heavy sequences is undeniable."

As for PCs, again... this isn't really a good argument, but I was watching the Discovery Channel a long time ago and it was explaining why dogs don't like TV. Basically, dogs and wolves see at 80 FPS, so a TV show which runs at 30 FPS has a flickering screen to them. So if you ever see an old cathode-ray TV in the background of an old movie or something, you too will see a flicker. Dogs can see a screen change frames, whereas people cannot see the actual transition because our eyes don't pick up signals that fast. So to add to my previous argument: I know we can definitely see low FPS, but at high FPS we don't notice as much of a change because we can't pick up that many frames at once. Dogs will, but only when you show them videos above 80 FPS (which is a lot). - No, the actual reason for that is that their so-called "flicker fusion frequency", or in other words the minimum FPS they need in order to see a moving picture, is much higher than it is for us humans. So for them, watching a 24 FPS movie is like us watching a 10 FPS movie, i.e. unwatchable. It's not the same thing as the "max FPS" they can perceive. Read the article I mention below; it explains in more detail that we don't see in frames.

As for the argument in general, I think we both could learn a bit more if we took a course on this. You're good to debate with too, pretty smart unlike most people here, but you are an older member and you have seen some of the stupidity that is possible. - Without a doubt! This is an interesting subject for sure and you've really pushed me to research it further.

As for 3D, I have 20/20 vision, and just by changing my focus point I can see PS3 games and movies in 3D or 2D when I wear those glasses. Also, when I play in 3D I notice the screen flickers a lot, but others don't. - Yeah, that's one of the flaws of "active-shutter" glasses. Flicker can be noticeable because each eye is effectively only receiving half of the TV's/monitor's refresh rate.
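
A quick sketch of that per-eye halving (the ~60 FPS flicker fusion figure comes from the Wikipedia passage quoted above and is only a rough threshold that varies with the person and the lighting): once the per-eye rate falls below it, flicker tends to become visible.

    FLICKER_FUSION_HZ = 60   # rough threshold from the quote above; varies per person/lighting

    def per_eye_hz(panel_hz):
        """Active-shutter 3D alternates left/right images, so each eye
        only receives every other refresh."""
        return panel_hz / 2.0

    for panel in (60, 120, 240):
        eye = per_eye_hz(panel)
        verdict = "likely visible flicker" if eye < FLICKER_FUSION_HZ else "flicker mostly fused"
        print(f"{panel} Hz panel -> {eye:.0f} Hz per eye: {verdict}")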

Zkuq, something isn't right about that video. Both balls are crazy blurry, and when you put them at 30 FPS they lag a lot more than they should. When I used to play PC games, I always played them at 30 FPS, and I never saw lag there.

Wait, there is something seriously wrong with that website. My laptop doesn't go up to 90 FPS, yet I can see a difference between 90 FPS and 60 FPS. Those objects are being displayed poorly at low FPS; I don't think they are accurately representing any FPS, because physically 60 and 90 FPS on my screen should be identical, but they're not. My laptop also doesn't support 120 FPS, and that setting runs too... WTH? - It actually displays a warning for me when I try to go over 60 FPS, saying that since my screen doesn't support more than 60 I won't see a difference. And I don't see a difference.

As for filming The Hobbit at a higher FPS, maybe to make more authentic-looking effects they need more frames to generate the effect? I don't know, there could be other factors that make the film look better. The only film I noticed looked particularly better than others was Batman, but that was largely filmed in IMAX or at high resolution, and that shows a lot more. - I think resolution and FPS are about equally noticeable in films. And 48 FPS = more frames = more data = better effects, so FPS over 30 noticeably changes things for the better (smoother).

I actually stumbled on a really good article on this, which explains how our eyes work and discusses the 'supposed' FPS limit of ours: http://amo.net/NT/02-21-01FPS.html . It's really informative so I suggest you read it.

Sorry for the late reply, I've been really caught up with school and work. The debate is largely over, and we both learned a lot. I think the conclusion we do agree upon is that for media on TV we can only see up to a specific FPS; beyond that, if there is any difference, we can't notice it. For the real world, I'll admit I just don't know what's going on; I would have to read up on it. I do agree with you about the bird flying by a plane window: our eyes need to focus, and that causes motion blur (better seen driving in real life, or in GT5 on Trial Mountain or Forza on the old Le Mans circuit, going 250 km/h plus through a forest).

For V-sync, I've heard him use the term before, but I usually don't pay attention because I don't have a PC specced like that. The Witcher 2 on ultimate is above Crysis 1 on max; it's hard to run. His FPS drops to 20-30 because his computer can't handle the processing load. He can also only run it for a few minutes at that spec. On high (which is about Crysis 1 on max) he is fine.

We can quote the wiki all day and get nowhere. When I say I blame FPS dips, I mean that a 30 FPS game may dip to 24 FPS. Even GT5 dips to 15 FPS when playing in 3D, and to 30 FPS when playing at 720p under optimal conditions. If GT5's FPS dips to 25% or 50% of its selected output, I think most games probably dip in FPS too. Anything below 30 FPS is noticeable.

Also, FPS aside, screen tearing is more noticeable. If the game can't run smoothly at 60 FPS, 30 FPS is the better choice.

For Zkuq's link: it also gives me a warning above 60 FPS, but 90 FPS and 120 FPS look noticeably better than 60 FPS. 60 FPS is way, way too blurry. This makes no sense to me; the only options in my mind are that the site is flawed, or that motion blur is less noticeable when the frames are shorter (that blur distorted the video quality).

As for filming... THAT WAS MY ENTIRE ARGUMENT! lol. I agree with you there; that was the argument I tried to make for why higher FPS looks better. I also apply that logic to games. Man, I should use math equations to explain things more, it's so simple.

That article scares me because it feels like it's dedicated to me. Someone else on this thread told me to check it out, and I felt like he wrote it for me.

SvennoJ makes the same argument I make for observing stuff in real life. Even if our eyes are limited to 30 FPS like I believe they are, it doesn't mean it's impossible to perceive data at higher frequencies.

As for the dogs comment, I like to believe the Discovery Channel over that link. Probably not the smartest idea, but it makes sense to me.

So tired, going back to sleep. Great debate, I wish we both knew more about the field or had time to look into it to learn more.



What is with all the hate? Don't read GamrReview Articles. Contact me to ADD games to the Database
Vote for the March Most Wanted / February Results