pezus said:
Michael-5 said:
pezus said:
 

No, not in that way. I meant that movies are filmed at something like 24 FPS, so you can clearly notice the Wagon-Wheel Effect when watching them. So the camera is limiting what we see. But if we are just watching something in real life, that's a whole different story (because its FPS is, most often, not limited).

"People said you can lock the FPS at a specific value, so it never drops. That is not true. Capping the FPS is different, FPS still drop, when you have a higher cap, you are less likely to drop below the threshold 30FPS."

Well, what those people stated is partly true. If you're not playing a demanding game and/or your computer is very good, the game will not drop frames. It's really simple: try an old game like Quake 3 and limit the (max) FPS to different values (e.g. 15/30/60/100). Huge difference!
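(A quick aside on what a frame cap actually does: it only stops the game from going over the target, it can't stop drops below it. Here's a minimal sleep-based sketch in Python, my own illustration rather than anything from Quake 3 or a real engine:)

```python
# Minimal FPS-cap sketch: render as usual, then sleep away whatever is left of
# the frame's time budget. The cap only limits the top end; a slow frame still
# drops below the target and nothing here can prevent that.
import time

def run_capped(update_and_render, fps_cap=60, frames=300):
    target = 1.0 / fps_cap                      # e.g. ~16.7 ms per frame at a 60 FPS cap
    for _ in range(frames):
        start = time.perf_counter()
        update_and_render()                     # game logic + drawing for this frame
        elapsed = time.perf_counter() - start
        if elapsed < target:
            time.sleep(target - elapsed)        # idle out the remainder of the budget
        # if elapsed > target, the frame simply ships late: that's an FPS drop

# example: run_capped(lambda: None, fps_cap=30)
```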

Also, screen tearing does not occur if you use V-sync, and even if you don't, the amount of tearing depends on the game. Some games almost never tear and you sure as hell won't notice it, while others tear quite a bit. Anyway, screen tearing doesn't really matter when discussing different FPS levels.
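(To show what tearing actually is: it happens when the frame buffer is swapped while the display is still scanning out the previous frame. A toy calculation below, my own illustration with an assumed 60 Hz, 1080-line display:)

```python
# Toy model of a tear line: if the buffer swap lands mid-scanout, the lines
# already drawn show the old frame and everything below shows the new one.
REFRESH_HZ = 60
LINES = 1080
scanout_time = 1.0 / REFRESH_HZ               # ~16.7 ms to scan the screen top to bottom

def tear_line(seconds_into_scanout):
    """Scanline where the old frame ends and the new frame begins."""
    return int((seconds_into_scanout / scanout_time) * LINES)

print(tear_line(0.008))   # ~518: the top ~518 lines are the old frame, the rest the new one
# V-sync simply delays the swap until the scanout finishes, so no frame is ever split.
```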

I agree with you though, it's nice when someone backs up their opinion...even if I don't agree :D

I remember changing the FPS for CS: Source, but I never saw a difference. If there is a difference, there must be more being added to the game than just the extra frames. It's likely that added effects which make movements appear smoother also take place. - No direct effects are added, just more frames, but with more frames there is of course more data. That's also the case when going from 10 to 30 FPS.

As for screen tearing, play Witcher 2 or Crysis on max. My friend's computer can handle Crysis, but not Witcher 2, and there is screen tearing when everything is set to Ultimate (he normally plays on high). - Then he isn't enabling V-sync. V-sync fixes those problems. Does he run Witcher 2 at 60 FPS on high and then maybe 40-50 on Ultimate? That would explain why he gets the tearing.

What I did notice, however, much more than FPS, is resolution. Man, does Mass Effect 2 on PC look good in native 1080p. It blows consoles out of the water, and Crysis/Witcher 2 look even better. Next gen damn better go 1080p. - Completely agree on that. Resolution is probably the most important "graphical factor" imo.

Also, I do agree with you that console games look better at 60 FPS than at 30 FPS, but I blame FPS dips. In that Wikipedia article it says that above 24 FPS people don't notice the boost in FPS for videos. - What do you mean when you say you blame FPS dips? Let's say you're running a game at a stable 30 FPS vs. a stable 60 FPS (so there are no FPS drops/dips). I didn't notice Wikipedia saying people don't notice increases over 24 FPS for videos. This is something Wikipedia said: "The human visual system does not see in terms of frames; it works with a continuous flow of light information.[12] A related question is, “how many frames per second are needed for an observer to not see artifacts?” However, this question also does not have a single straight-forward answer. If the image switches between black and white each frame, the image appears to flicker at frame rates slower than 30 FPS (interlaced). In other words, the flicker fusion point, where the eyes see gray instead of flickering tends to be around 60 FPS (inconsistent). However, fast moving objects may require higher frame rates to avoid judder (non-smooth, linear motion) artifacts — and the retinal fusion point can vary in different people, as in different lighting conditions."
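(Just to put numbers next to the rates in that quote, here is the simple frame-time arithmetic, nothing beyond 1000 / fps:)

```python
# How long a single frame stays on screen at common frame rates.
for fps in (24, 30, 48, 60, 90, 120):
    print(f"{fps:>3} FPS -> {1000 / fps:.1f} ms per frame")
# 24 FPS -> 41.7 ms, 30 -> 33.3 ms, 60 -> 16.7 ms, 120 -> 8.3 ms.
# The quoted passage puts the flicker-fusion point around 60 Hz, i.e. a flash
# needs to be on the order of ~16 ms or shorter before it blends into "gray".
```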

- And this: "Though the extra frames when not filtered correctly, can produce a somewhat video-esque quality to the whole, the improvement to motion heavy sequences is undeniable."

As for PCs, again... This isn't really a good argument, but I was watching the Discovery Channel a long time ago and it was explaining why dogs don't like TV. Basically, dogs and wolves see at 80 FPS, so a TV show which runs at 30 FPS has a flickering screen to them. So if you ever see an old cathode-ray TV in the background of an old movie or something, you too will see a flicker. Dogs can see a screen change frames, whereas people cannot see the actual transition because our eyes don't pick up signals that fast. So to add to my previous argument, I know we can definitely see low FPS, but at high FPS we don't notice as much of a change because we can't pick up that many frames at once. Dogs will, but only when you show them videos above 80 FPS (which is a lot). - No, the actual reason for that is that their so-called "flicker fusion frequency", or in other words the minimum FPS they need to see a moving picture, is much higher than it is for us humans. So for them, watching a 24 FPS movie is like us watching a 10 FPS movie, i.e. unwatchable. It's not the same thing as the "max FPS" they can perceive. Read the article I mention below; it explains in more detail that we don't see in frames.

As for the argument in general, I think we both could learn a bit more if we took a course on this, and you're good to debate with too, pretty smart unlike most people here, but you are an older member and you have seen some of the stupidity that is possible. - Without a doubt! This is an interesting subject for sure and you've really pushed me to research it further.

As for 3D, I have 20/20 vision, and just by changing my focus point I can see PS3 games and movies in 3D or 2D when I wear those glasses. Also, when I play in 3D, I notice the screen flickers a lot, but others don't. - Yeah, that's one of the flaws of the "active-shutter" glasses. Flicker can be noticeable because each eye is effectively only receiving half of the TV's/monitor's refresh rate.
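(The halving is easy to quantify; a quick illustration, assuming the usual alternating left/right shutter scheme:)

```python
# Active-shutter 3D alternates frames between the eyes, so each eye effectively
# gets half of the panel's refresh rate.
for panel_hz in (60, 120, 240):
    per_eye = panel_hz / 2
    print(f"{panel_hz} Hz panel -> {per_eye:.0f} Hz per eye "
          f"({1000 / per_eye:.1f} ms between the frames one eye sees)")
# At 120 Hz each eye sees 60 Hz, right around the flicker-fusion region, which is
# why some people notice the flicker and others don't.
```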

Zkuq: Something isn't right about that video. Both balls are crazy blurry, and when you put them at 30 FPS they lag a lot more than they should. When I used to play PC games, I always played at 30 FPS, and I never saw lag there.

Wait, there is something seriously wrong with that website. My laptop doesn't go up to 90 FPS, yet I can also see a difference between 90 FPS and 60 FPS. Those objects are being displayed poorly at low FPS; I don't think they are accurately representing any FPS, because physically 60 and 90 FPS on my screen should be identical, but they aren't. My laptop also doesn't support 120 FPS, and that setting runs too... WTH? - It actually displays a warning for me when I try to go over 60 FPS, saying that since my screen doesn't support more than 60 I won't see a difference. And I don't see a difference.

As for filming The Hobbit at a higher FPS, maybe to make more authentic-looking effects they need more frames to generate the effect? I don't know, there could be other factors that make the film look better. The only film I noticed looked particularly better than others was Batman, but that was largely filmed in IMAX or high res, and that shows a lot more. - I think resolution and FPS are about equally noticeable in films. And 48 FPS = more frames = more data = better effects => FPS over 30 noticeably changes things for the better (smoother).

I actually stumbled on a really good article on this, which explains how our eyes work and discusses the 'supposed' FPS limit of ours: http://amo.net/NT/02-21-01FPS.html . It's really informative so I suggest you read it.

Sorry for the late reply, I've been really caught up with school and work. The debate is largely over, and we both learned a lot. I think the conclusion we do agree upon is that for media on TV we can only see up to a specific FPS; beyond that, if there is any difference, we can't notice it. For the real world, I'll admit I just don't know what's going on. I would have to read up on it. I do agree with you about the bird flying by a plane window: our eyes need to focus, and that causes motion blur (better seen driving in real life, or in GT5 on Trial Mountain or Forza's old Le Mans circuit, going 250 km/h plus through a forest).

For V-sync, I've heard him use the term before, but I usually don't pay attention because I don't have a PC specced like that. Witcher 2 on Ultimate is more demanding than Crysis 1 on max; it's hard to run. His FPS drops to 20-30 because his computer can't handle the processing load. He can also only run it for a few minutes at that setting. On high (which is about Crysis 1's max) he is fine.
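(One more V-sync detail that could explain those numbers: with plain double buffering on a 60 Hz screen, V-sync makes the displayed rate snap to 60, 30, 20, 15... A rough sketch of that quantization, assuming no triple buffering or adaptive V-sync, which largely avoid it:)

```python
# With double-buffered V-sync, a finished frame must wait for the next vertical
# refresh, so on a 60 Hz screen the displayed rate is 60 divided by a whole number.
import math

REFRESH_HZ = 60
refresh_interval_ms = 1000 / REFRESH_HZ        # ~16.7 ms

def vsynced_fps(render_time_ms):
    """Displayed FPS when every frame takes render_time_ms to draw."""
    intervals = max(1, math.ceil(render_time_ms / refresh_interval_ms))
    return REFRESH_HZ / intervals

for t in (10, 20, 40, 60):
    print(f"{t} ms per frame -> {vsynced_fps(t):.0f} FPS on screen")
# 10 ms -> 60 FPS, 20 ms -> 30 FPS, 40 ms -> 20 FPS, 60 ms -> 15 FPS.
```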

We can quote the wiki all day and get nowhere. When I say I blame FPS dips, I mean that a 30 FPS game may dip to 24 FPS. I mean, even GT5 dips to 15 FPS when playing in 3D, and to 30 FPS when playing in 720p under optimal conditions. If GT5's FPS dips to 25% or 50% of its selected output, I think most games probably dip in FPS too. Anything below 30 FPS is noticeable.
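(Those percentages, spelled out, assuming GT5's 60 FPS target; the dip figures are the ones quoted above:)

```python
# Dips expressed as a fraction of the target frame rate.
target_fps = 60
for dipped in (30, 15):
    print(f"{dipped} FPS is {dipped / target_fps:.0%} of a {target_fps} FPS target")
# 30 FPS is 50%, 15 FPS is 25%; both land well inside the range where drops get obvious.
```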

Also, FPS aside, screen tearing is more noticeable. If the game can't run smoothly at 60 FPS, 30 FPS is the better choice.

As for Zkuq's link: it also gives me a warning above 60 FPS, but 90 FPS and 120 FPS look noticeably better than 60 FPS. 60 FPS is way, way too blurry. This makes no sense to me; the only options in my mind are that the site is flawed, or that motion blur is less noticeable when the frames are shorter (that blur distorted the video quality).

As for filming... THAT WAS MY ENTIRE ARGUMENT! lol. I agree with you there; that was the argument I tried to make for why higher FPS looks better. I also apply that logic to games. Man, I should use math equations to explain things more, it's so simple.

That article scares me because it's like it was dedicated to me. Someone else on this thread told me to check it out, and I felt he wrote it for me.

SvennoJ makes the same argument I make for observing stuff in real life. Even if our eyes are limited to 30 FPS like I believe they are, that doesn't mean it's impossible to perceive data at higher frequencies.

As for the dogs comment, I like to believe the Discovery Channel over that link. Probably not the smartest idea, but it makes sense to me.

So tired, going back to sleep. Great debate, I wish we both knew more about the field or had time to look into it to learn more.


