24 frames per second is starting to look awkward in 4k.


I prefer film in...

24 fps: 28 votes (62.22%)
30 fps: 3 votes (6.67%)
48 fps: 4 votes (8.89%)
60 fps: 4 votes (8.89%)
North of 60...: 3 votes (6.67%)
Any/indifferent/comments/middle America: 3 votes (6.67%)

Total: 45
SvennoJ said:
thismeintiel said:

~ snip

It's not more lifelike, it's more dreamlike, which is what movies should be. You can do a lot more with exposure and lighting at 24 fps. You can suggest a lot more with 24 fps. You can hide a lot less at 48 fps or 60 fps. Games choose to hide things in the dark to suggest more than there is. That works for movies as well; however, a lower frame rate is just as effective.

Games also use all kinds of filters to make them look more 'realistic', especially motion blur. That pretty much contradicts the whole 'higher frame rate is better' argument. Games try so hard to simulate (bad) cameras, it's not even funny anymore. Bloom, blur, lens flares, chromatic aberration, depth of field, camera shake, fake HDR, film grain. Yet without them, games look sterile and fake.

Some of those effects sometimes help games look more realistic (of course, this depends on the genre/scenario) and most of them do not hide the frame rate. Motion blur is the only active feature that truly masks objects that are moving, and even then, it is more effective at lower frame rates. Turning many of those features off, in my experience, doesn't leave a feeling of a sterile production, but then, that again depends on a number of external factors. 

The choice between lower and higher frame rates simply comes down to preference.




CGI-Quality said:
SvennoJ said:

It's not more lifelike, it's more dreamlike, which is what movies should be. You can do a lot more with exposure and lighting at 24 fps. You can suggest a lot more with 24 fps. You can hide a lot less at 48 fps or 60 fps. Games choose to hide things in the dark to suggest more than there is. That works for movies as well; however, a lower frame rate is just as effective.

Games also use all kinds of filters to make them look more 'realistic', especially motion blur. That pretty much contradicts the whole 'higher frame rate is better' argument. Games try so hard to simulate (bad) cameras, it's not even funny anymore. Bloom, blur, lens flares, chromatic aberration, depth of field, camera shake, fake HDR, film grain. Yet without them, games look sterile and fake.

Some of those effects sometimes help games look more realistic (of course, this depends on the genre/scenario) and most of them do not hide the frame rate. Motion blur is the only active feature that truly masks objects that are moving, and even then, it is more effective at lower frame rates. Turning many of those features off, in my experience, doesn't leave a feeling of a sterile production, but then, that again depends on a number of external factors. 

The choice between lower and higher frame rates simply comes down to preference.

Why do those effects make it look more realistic? Do they really make games look more realistic, or simply more like what you are used to seeing from a camera on a screen? Many of those effects do not make sense in VR at all. Depth of field: useless. Motion blur: nope, you can follow things with your eyes. Lens flares, film grain, bloom, chromatic aberration (which already comes with the lenses, unfortunately): none of it makes any sense. Fake HDR breaks immersion, and camera shake is a definite no. VR aims for realism, yet can't use all the techniques that make games look more 'realistic'!

Frame rate of course does make sense in VR, since you're turning your head yourself and any stutter is bad. However, the animation can still run at lower frame rates. It will be interesting to see how games develop in VR once resolution goes up. Are they going to look more fake, or will there be a new bag of tricks to add 'realism'?

My favorite movies run between 8 and 18 fps, although those are averages and different elements are animated at different frame rates. (The arithmetic behind these figures is sketched after the list.)

Nausicaa = 56078/(116*60+27.05) = 8.03
Laputa = 69262/(124*60+4.22) = 9.30
Totoro = 48743/(86*60+20.14) = 9.41
Hotaru = 54660/(88*60+26.19) = 10.30
Kiki = 67317/(102*60+46.12) = 10.92
Omohide = 73719/(118*60+49.05) = 10.34
Porco = 58443/(93*60+18.19) = 10.44
Umi = 25530/(72*60) = 5.91
PomPoko = 82289/(118*60+59.01) = 11.53
Mimi = 64491/(111*60+0.12) = 9.68
OYM = 8053/(6*60+48.21) = 19.73
Mononoke = 144043/(133*60+24.22) = 18.00
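
For anyone curious, the figures above are just a frame count (presumably the number of distinct drawings) divided by the runtime in seconds. A quick Python sketch of that arithmetic, reusing a few of the numbers from the list:

    # Average drawn frame rate = frame count / runtime in seconds.
    # The counts and runtimes below are the ones quoted in the list above.

    def avg_fps(frames, minutes, seconds):
        """Average animation frame rate over the film's runtime."""
        return frames / (minutes * 60 + seconds)

    films = {
        "Nausicaa": (56078, 116, 27.05),
        "Totoro":   (48743, 86, 20.14),
        "Mononoke": (144043, 133, 24.22),
    }

    for title, (frames, m, s) in films.items():
        print(f"{title}: {avg_fps(frames, m, s):.2f} fps on average")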

Traditional animation uses key frames with exaggerated expressions and movements to enhance the animation. Variable frame rate and frame interpolation mess that up completely. For Monsters University they tried to calculate the hair physics. It turned out that, based on filling in the movement between key frames, it wouldn't work at all: impossible G-forces screwing with the calculations. It's an art to give the brain just enough to fill in the blanks in the way the director wants you to. Perhaps that's why games are still not considered art :) Not enough control over content delivery.



CGI-Quality said:
thismeintiel said:

Lol, 30 FPS is not choppy.  Let's not use hyperbole to try and win a debate.  There is nothing choppy about a locked in 30 FPS. 

And every game could run at 60 FPS if devs so chose.  Sure, they would have to lower some of the fidelity, but it is possible.  Doom, for example, is still a really good looking game.  And while it is really fun to play, that 60 FPS screams that I am playing a video game.  Things do not move that smoothly across your eyes in the real world.  This is the same reason 48 FPS film failed.  While many people are fine with higher framerates in games, as it is a game, they do not want that in their films, where it is supposed to be capturing lifelike images.  So, there is much more to the 30 FPS vs 60 FPS and 24 FPS vs 48 FPS debates than just HW limitations.  It is most often a preference. If I were making a game aimed at complete realism, there is no way I would want it to run higher than 40 FPS.

Nothing hyperbolic about that statement, and there is no debate to win. You either prefer higher frame rates or you don't. Yes, by comparison to both what I see in real life and 60+fps gaming, 30fps is choppy. If you truly think it runs 'nearly as good as real life', then it is clear that you haven't spent much time playing at higher frame rates or you're just choosing the lower frame rate for.....whatever reason. Remove the motion blur (what many 30fps games hide behind) and you have a choppier experience. That's Game Design 101.

It is 100% hyperbolic to say that 30 FPS is choppy.  Movies run at 6 FPS less than those games, and they are not choppy, either.  And it's not just from experience that I know this, it is just a scientific fact.  It only takes ~20 FPS to fool the brain that a series of images are actually in motion, without the choppiness of something like stop motion animation.  At 24 FPS and 30 FPS, it is impossible for it to look choppy.  Sure, it's not as smooth as 60 FPS, but nothing choppy about it.  And motion blur is not to remove any kind of choppiness from low framerate, it is to address image ghosting from previous frames, mainly caused by turning the camera quickly.  It's also used to simulate something our eyes naturally do with motion.  A game running at 18 FPS isn't going to magically look smooth because you threw some motion blur at it.

Well, glad we agree that it is about preference.  30 FPS for games is here to stay because many think it actually looks more cinematic.  Same goes for 24 FPS for film. 



SvennoJ said:
CGI-Quality said:

Some of those effects sometimes help games look more realistic (of course, this depends on the genre/scenario) and most of them do not hide the frame rate. Motion blur is the only active feature that truly masks objects that are moving, and even then, it is more effective at lower frame rates. Turning many of those features off, in my experience, doesn't leave a feeling of a sterile production, but then, that again depends on a number of external factors. 

The choice between lower and higher frame rates simply comes down to preference.

Why do those effects make it look more realistic? Do they really make games look more realistic, or simply more like what you are used to seeing from a camera on a screen? Many of those effects do not make sense in VR at all. Depth of field: useless. Motion blur: nope, you can follow things with your eyes. Lens flares, film grain, bloom, chromatic aberration (which already comes with the lenses, unfortunately): none of it makes any sense. Fake HDR breaks immersion, and camera shake is a definite no. VR aims for realism, yet can't use all the techniques that make games look more 'realistic'!

That is why I said the genre and scenario are relevant. As an example, no, lens flares won't make much sense in a 3rd person game with a character jumping from one platform to the next, but certain FPS titles make use of them in ways that make total sense. It's still a case-by-case basis, though. Very few features do I find totally unnecessary (chromatic aberration being the main one).

But none of that really matters when talking frame rates (unless they inherently interfere with it ~ which is a whole other matter). Those are mainly visual flair.

 

thismeintiel said: 
CGI-Quality said: 

~ snip

It is 100% hyperbolic to say that 30 FPS is choppy. 

Nope.

Movies run at 6 FPS less than those games, and they are not choppy, either. 

Movies and games cannot be compared 1:1, because one is interactive and the other is not. One thing is for sure: movies that run at higher Hz appear smoother.

And it's not just from experience that I know this, it is just a scientific fact.  It only takes ~20 FPS to fool the brain that a series of images are actually in motion, without the choppiness of something like stop motion animation. 

I've heard of no scientific fact that can tell one person what looks choppy to someone else, but I'd be interested to see a source for such.

 At 24 FPS and 30 FPS, it is impossible for it to look choppy. 

Nope, it isn't.

Sure, it's not as smooth as 60 FPS, but nothing choppy about it. 

By comparison, it is (I assume you don't have a monitor capable of 120+Hz).

And motion blur is not to remove any kind of choppiness from low framerate, it is to address image ghosting from previous frames, mainly caused by turning the camera quickly. 

Never argued that motion blur removed anything. It attempts to mask lower frame rates (much of what we saw last gen).

It's also used to simulate something our eyes naturally do with motion.  A game running at 18 FPS isn't going to magically look smooth because you threw some motion blur at it.

Indeed, which is why I wouldn't argue that it, nor 30fps, is smooth.

 Well, glad we agree that it is about preference.  30 FPS for games is here to stay because many think it actually looks more cinematic.  Same goes for 24 FPS for film. 

30fps is here to stay (on console) because of power constraints and designers preferring visuals over framerate. Film is a different beast entirely and I've not heard of any correlation between the two.

My take.

And check this out: http://www.technologyx.com/featured/understanding-frame-rate-look-truth-behind-30v60-fps/

Fantastic read that attempts to make sense of much of this.

Last edited by CGI-Quality - on 13 January 2019


Yeah, sorry...going from 1080p to 4K does not magically make 24fps look worse.



CGI-Quality said:

~ snip

Hmm. An article that completely disregards artistic choice or preference, and claims it is always about limitations.  Then it goes on to address things I barely see anyone say, creating a strawman.  No one is debating that people can see the difference.  The point is that some prefer one over the other depending on the goal of realism.  But, PC Master Race argument it is then.  I'm just going to stop this "debate" here then.  We will just agree to disagree.



For FPS (first person shooters), sure, response time is important, same as in VR. Animation would be fine at 30 fps though, as long as camera movement runs at a high frame rate. Didn't Halo 4 reduce the frame rate for things further away?

Here's a link trying to explain why HFR is less immersive:
https://gizmodo.com/5969817/the-hobbit-an-unexpected-masterclass-in-why-48-fps-fails

PSVR is on the right track with reprojection. It keeps the camera movement (your head movements) smooth at 120fps, while the game runs at 60fps. Even watching 24fps movies on it is fine, as your head movements (and the projected screen) are still updated at 120fps. The Lost Bear, a PSVR platformer that plays out on a stage while you sit in a theater, has animation that runs at a much lower frame rate. It looks perfectly fine.
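
Roughly, the idea works like this (a simplified sketch, not Sony's actual implementation; the pixels-per-degree scale is made up): the headset keeps sampling your head pose at 120Hz, and on refreshes where no new 60fps game frame is ready, it re-shows the last frame shifted to match where your head is now.

    # Toy illustration of reprojection: the game renders at 60 Hz, the
    # display refreshes at 120 Hz, and the in-between refreshes reuse the
    # last rendered frame, shifted by how far the head has turned since.
    # PIXELS_PER_DEGREE is a made-up screen-space scale for the example.

    PIXELS_PER_DEGREE = 12.0

    def reprojection_shift(yaw_at_render, yaw_now):
        """Horizontal shift (in pixels) applied to the stale frame."""
        return (yaw_now - yaw_at_render) * PIXELS_PER_DEGREE

    head_yaw = [0.0, 0.4, 0.8, 1.2, 1.6, 2.0]   # degrees, sampled at 120 Hz

    last_rendered_yaw = 0.0
    for refresh, yaw in enumerate(head_yaw):
        if refresh % 2 == 0:                     # a fresh 60 Hz game frame
            last_rendered_yaw = yaw
            print(f"refresh {refresh}: new frame rendered at {yaw:.1f} deg")
        else:                                    # reprojected refresh
            shift = reprojection_shift(last_rendered_yaw, yaw)
            print(f"refresh {refresh}: last frame reprojected by {shift:.1f} px")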

Perhaps in the future we'll get object-based movement, like Dolby Atmos for sound. The only way to achieve truly smooth motion is to limit the steps objects take to 1 pixel at a time. When you follow an object with your eyes, you collect its light on your retina. If it skips across the screen, you can't focus on it. Sometimes that's intentional.
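
To put some example numbers on that (a 3840 pixel wide 4K screen and an object crossing it in 5 seconds, both picked purely for illustration):

    # How fast you'd have to refresh for 1-pixel steps, and how big the
    # per-frame jumps are at common frame rates. Example numbers only.

    SCREEN_WIDTH_PX = 3840     # 4K screen width
    PAN_SECONDS = 5            # object crosses the full screen in 5 s

    print(f"1 px per frame would need {SCREEN_WIDTH_PX / PAN_SECONDS:.0f} fps")

    for fps in (24, 30, 60, 120):
        jump = SCREEN_WIDTH_PX / (fps * PAN_SECONDS)
        print(f"{fps:>3} fps: the object jumps {jump:.0f} px every frame")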

I'm not sure what impact variable frame rate will have. I do a lot of racing, so much that I notice the difference in display lag when switching to my other TV. I have to adjust my brake and turn-in points slightly in the faster cars so I'm not too early or too late. However, when the game runs anywhere between 40 and 80fps, won't that throw me off all the time? The difference is minor, until devs take variable frame rate as a pass to put anything from 20 to 60fps out there as acceptable. At least for racing, fps and display lag don't matter all that much as long as they are constant. Any slowdown or judder immediately throws me off.
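
For a rough sense of why that lag matters, here's the distance a car covers during the extra delay (speeds and lag values are just example numbers, not measurements of my TVs):

    # Distance a car covers during extra display lag at racing speeds.

    def metres_during_lag(speed_kmh, lag_ms):
        return (speed_kmh / 3.6) * (lag_ms / 1000.0)

    for speed_kmh in (100, 200, 300):
        for lag_ms in (16, 50, 100):
            d = metres_during_lag(speed_kmh, lag_ms)
            print(f"{speed_kmh} km/h with {lag_ms:>3} ms extra lag -> "
                  f"braking point shifts ~{d:.1f} m")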

Anyway, I agree: for interactive media, especially first-person games, a higher frame rate is preferable.



thismeintiel said:
CGI-Quality said:

~ snip

Hmm. An article that completely disregards artistic choice or preference, and claims it is always about limitations.  Then it goes on to address things I barely see anyone say, creating a strawman.  No one is debating that people can see the difference.  The point is that some prefer one over the other depending on the goal of realism.  But, PC Master Race argument it is then.  I'm just going to stop this "debate" here then.  We will just agree to disagree.

If that's what you got from that article, you totally missed the point. But, yeah, it is probably best for you to move on. :P




pikashoe said:
Higher frame rates look awful in film. They make sets and effects look bad. Also, the comment about vinyl is just factually wrong. Vinyl is the least compressed way to listen to recorded music. It has flaws, but in terms of range vinyl is superior to CDs, tapes, MP3s, etc.

I legit thought that sentence, complete with the use of the word "fools" and the misspelling of vinyl, was the start of this thread's lean into actually being a joke... but then I read on... and it wasn't. Missed potential, imo.



Fancy hearing me on an amateur podcast with friends gushing over one of my favourite games? https://youtu.be/1I7JfMMxhf8

John2290 said:
4k 60 fps just looks so incredibly smooth. Even 30 fps is infinitely better than 24 fps, but getting to 60 fps, yes it does look odd at first, but when you get used to it after 5 or 10 minutes you see it for the new cinematic rate that it is and the nostalgia goes out the window. Watching these trailers on Youtube at 4k 60 fps and then going back to normal trailers is sickening. I want these films at 60 fps so badly.

Ok, English is not my native language, but I was under the impression that "nostalgia" was the act of yearning for something long past.

So... how could I feel nostalgia for something that I use every day in the present? Like, I'm at home watching my regular 24fps movie, then I set up my newly bought 4K TV and then BOOM! 10 minutes later my "nostalgia" goes out the window.  Am I missing something here?

As for the smoothness of it, well, idk, but when I was watching The Hobbit it felt odd to me during the whole movie, and in some parts I even felt a little dizzy.  And I had no idea what was going on at that time. It wasn't until much later that I learned that it was filmed at a higher frame rate.

And idk, but it seems to me like you are trying to push your preferences onto everybody else.  And sorry, but that's not how it works.