Michael-5 said:
pezus said:
No, not in that way. I meant that movies are filmed at something like 24 FPS, so you can clearly notice the wagon-wheel effect when watching them. The camera is limiting what we see. But if we are just watching something in real life, that's a whole different story (because its FPS is, most often, not limited).
"People said you can lock the FPS at a specific value, so it never drops. That is not true. Capping the FPS is different, FPS still drop, when you have a higher cap, you are less likely to drop below the threshold 30FPS."
Well, what those people stated is partly true. If you're not playing a demanding game and/or your computer is very good, the game will not drop frames. It's really simple: try an old game like Quake 3 and cap the max FPS at different values (e.g. 15/30/60/100). Huge difference!
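For anyone wondering what an FPS cap actually does, here's a minimal sketch of a frame limiter (not taken from any particular engine, just an illustration): after each frame the loop sleeps so that no frame takes less than 1/cap seconds, but if the frame's work itself takes longer than that budget, the frame rate still drops below the cap.

```python
import time

def run_capped(frame_work, fps_cap=60, frames=120):
    """Run `frames` iterations of `frame_work`, sleeping so the loop
    never exceeds `fps_cap`. Returns the effective FPS achieved."""
    budget = 1.0 / fps_cap                     # time allowed per frame
    start = time.perf_counter()
    for _ in range(frames):
        frame_start = time.perf_counter()
        frame_work()                           # stands in for game logic + rendering
        elapsed = time.perf_counter() - frame_start
        if elapsed < budget:
            time.sleep(budget - elapsed)       # wait out the rest of the frame budget
    return frames / (time.perf_counter() - start)

# Cheap frames stay at the cap; expensive frames fall below it regardless.
print(run_capped(lambda: time.sleep(0.002), fps_cap=60))  # ~60 FPS
print(run_capped(lambda: time.sleep(0.030), fps_cap=60))  # ~33 FPS, the cap can't help
```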
Also, screen tearing does not occur if you enable V-sync, and even if you don't, the amount of tearing depends on the game. Some games almost never tear and you sure as hell won't notice it, while others tear quite a bit. Anyway, screen tearing doesn't really matter when discussing different levels of FPS.
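To illustrate why V-sync prevents tearing: without it, the buffer swap happens the moment a frame is ready, which can land partway through the screen's scanout, so the top and bottom of the screen come from different frames. With V-sync, the swap waits for the next refresh boundary. A toy timeline model of that idea (not any real driver's behavior, and the frame times are made up):

```python
import math

REFRESH_HZ = 60
REFRESH_INTERVAL = 1.0 / REFRESH_HZ          # one scanout period, ~16.7 ms

# Assumed times (seconds) at which the GPU finishes rendering each frame.
frame_ready = [0.003, 0.021, 0.034, 0.052]

for t in frame_ready:
    # No V-sync: swap immediately; if that lands mid-scanout the display
    # shows part of the old frame and part of the new one (a tear).
    fraction_into_scanout = (t % REFRESH_INTERVAL) / REFRESH_INTERVAL

    # V-sync: hold the swap until the next vertical blank, so every
    # refresh shows exactly one complete frame.
    next_vblank = math.ceil(t / REFRESH_INTERVAL) * REFRESH_INTERVAL

    print(f"frame ready at {t*1000:5.1f} ms: without V-sync the swap lands "
          f"{fraction_into_scanout:.0%} into the scanout (tear); with V-sync "
          f"it waits until {next_vblank*1000:5.1f} ms")
```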
I agree with you though, it's nice when someone backs up their opinion...even if I don't agree :D
I remember changing the FPS for CS: Source, but I never saw a difference. If there is a difference, there must be more being added to the game with the extra frames. It's likely that added effects which make movements appear smoother also take place. - No direct effects are added, just more frames, but with more frames there is of course more data. That's also the case when going from 10 to 30 FPS.
As for screen tearing, play Witcher 2 or Crysis on max. My friend's computer can handle Crysis, but not Witcher 2, and there is screen tearing when everything is set to ultimate (he normally plays on high). - Then he isn't enabling V-sync; V-sync fixes those problems. Does he run Witcher 2 at 60 FPS on high and then maybe 40-50 on ultimate? That would explain why he gets the tearing.
What I did notice, however, much more than FPS, is resolution. Man, does Mass Effect 2 on PC look good in native 1080p. It blows consoles out of the water, and Crysis/Witcher 2 are even better looking. Next gen damn better go 1080p. - Completely agree on that. Resolution is probably the most important "graphical factor" imo.
Also, I do agree with you that console games look better at 60 FPS than at 30 FPS, but I blame FPS dips. In that Wikipedia article it says that after 24 FPS people don't notice the boost in FPS for videos. - What do you mean when you say you blame FPS dips? Let's say you're running a game at a stable 30 FPS vs. a stable 60 FPS (so there are no FPS drops/dips); the 60 FPS game will still look noticeably smoother. I didn't notice Wikipedia saying people don't notice increases over 24 FPS for videos. This is something Wikipedia said: "The human visual system does not see in terms of frames; it works with a continuous flow of light information.[12] A related question is, “how many frames per second are needed for an observer to not see artifacts?” However, this question also does not have a single straight-forward answer. If the image switches between black and white each frame, the image appears to flicker at frame rates slower than 30 FPS (interlaced). In other words, the flicker fusion point, where the eyes see gray instead of flickering tends to be around 60 FPS (inconsistent). However, fast moving objects may require higher frame rates to avoid judder (non-smooth, linear motion) artifacts — and the retinal fusion point can vary in different people, as in different lighting conditions."
- And this: "Though the extra frames when not filtered correctly, can produce a somewhat video-esque quality to the whole, the improvement to motion heavy sequences is undeniable."
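On the stable 30 vs. 60 FPS point above, the simple arithmetic of why 60 still looks smoother even without drops: for the same on-screen speed, each frame at 30 FPS has to jump twice as far as each frame at 60 FPS, so the motion is coarser. A quick sketch with an assumed object speed:

```python
SPEED_PX_PER_S = 600  # assumed horizontal speed of an object on screen

for fps in (10, 30, 60, 120):
    step = SPEED_PX_PER_S / fps   # how far the object jumps between frames
    print(f"{fps:3d} FPS -> the object moves {step:5.1f} px per frame")

# The bigger the per-frame jump, the more the motion reads as stutter
# rather than smooth movement.
```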
As for PCs, again... This isn't really a good argument, but I was watching the Discovery Channel a long time ago and it was explaining why dogs don't like TV. Basically, dogs and wolves see at around 80 FPS, so a TV show which runs at 30 FPS has a flickering screen to them. So if you ever see an old cathode-ray TV in the background of an old movie or something, you too will see a flicker. Dogs can see a screen change frames, whereas people cannot see the actual transition because our eyes don't pick up signals that fast. So to add to my previous argument, I know we can definitely see low FPS, but at high FPS we don't notice as much of a change because we can't pick up that many frames at once. Dogs will, but only when you show them videos above 80 FPS (which is a lot). - No, the actual reason for that is that their so-called "flicker fusion frequency", or in other words the minimum FPS they need in order to perceive motion as continuous rather than flickering, is much higher than for us humans. So for them, watching a 24 FPS movie is like us watching a 10 FPS movie: unwatchable. It's not the same thing as the "max FPS" they can perceive. Read the article I mention below; it explains in more detail that we don't see in frames.
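The rough proportion behind that "24 FPS to a dog is like 10 FPS to us" point, using the ballpark thresholds mentioned in this thread (around 80 Hz for dogs and somewhere around 30 Hz for humans; the real values vary with the individual, the lighting and the stimulus), works out like this:

```python
# Ballpark flicker-fusion thresholds in Hz; treat these as illustrative only,
# since real thresholds vary by individual, lighting and stimulus.
HUMAN_FUSION_HZ = 30
DOG_FUSION_HZ = 80

def human_equivalent_fps(fps_shown_to_dog):
    """Scale a frame rate shown to a dog to the rate that would sit at the
    same fraction of a human's flicker-fusion threshold."""
    return fps_shown_to_dog * HUMAN_FUSION_HZ / DOG_FUSION_HZ

print(human_equivalent_fps(24))  # 9.0 -> roughly the "10 FPS movie" feeling
```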
As for the argument in general, I think we both could learn a bit more if we took a course in this. You're good to debate with too, pretty smart unlike most people here, but you are an older member and you have seen some of the stupidity that is possible. - Without a doubt! This is an interesting subject for sure and you've really pushed me to research it further.
As for 3D, I have 20/20 vision, and just by changing my focus point I can see PS3 games and movies in 3D or 2D when I wear those glasses. Also, when I play in 3D, I notice the screen flickers a lot, but others don't. - Yeah, that's one of the flaws of the "active-shutter" glasses. Flicker can be noticeable because each eye is effectively only receiving half of the TV's/monitor's refresh rate.
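The per-eye arithmetic behind that flicker, assuming a shutter system that simply alternates left-eye and right-eye frames:

```python
def per_eye_rate(panel_refresh_hz):
    """Active-shutter 3D alternates frames between the eyes, so each eye
    effectively sees only half of the panel's refresh rate."""
    return panel_refresh_hz / 2

for hz in (60, 120, 240):
    print(f"{hz} Hz panel -> {per_eye_rate(hz):.0f} Hz per eye")

# At 60 Hz per eye or below, some people start to notice the flicker.
```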
Zkuq, something isn't right about that video. Both balls are crazy blurry, and when you put them at 30 FPS they lag a lot more than they should. When I used to play PC games, I always played them at 30 FPS, and I never saw lag there.
Wait, there is something seriously wrong with that website. My laptop doesn't go up to 90 FPS, yet I can also see a difference between 90 FPS and 60 FPS. Those objects are being displayed poorly at low FPS; I don't think they are accurately representing any FPS, because physically 60 and 90 FPS on my screen should be identical, but they're not. My laptop also doesn't support 120 FPS, and that setting runs too... WTH? - It actually displays a warning for me when I try to go over 60 FPS, saying that since my screen doesn't support more than 60 I won't see a difference. And I don't see a difference.
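A rough way to see why the panel's refresh rate is the bottleneck: a 60 Hz screen only refreshes 60 times a second, so no matter how fast the site animates, at most 60 distinct frames per second can ever reach your eyes. A toy calculation under that assumption (60 Hz is just a typical laptop panel):

```python
REFRESH_HZ = 60  # assumed refresh rate of a typical laptop panel

for render_fps in (30, 60, 90, 120):
    # At most one complete frame is shown per refresh, so anything rendered
    # beyond the refresh rate never reaches the eye as a whole frame.
    shown = min(render_fps, REFRESH_HZ)
    wasted = max(render_fps - REFRESH_HZ, 0)
    print(f"rendered at {render_fps:3d} FPS -> {shown} whole frames shown per "
          f"second, {wasted} frames per second never fully displayed")
```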
As for filming The Hobbit at a higher FPS, maybe to make more authentic-looking effects they need more frames to generate the effect? I don't know, there could be other factors that make the film look better. The only film I noticed looked particularly better than others was Batman, but that was largely filmed in IMAX or high res, and that shows a lot more. - I think resolution and FPS are equally noticeable in films. And 48 FPS = more frames = more data = better effects => FPS over 30 noticeably changes things for the better (smoother).