
While you're all hyped about an upgrade, along comes a downgrade in disguise

Source: http://www.engadgethd.com/2007/10/01/whats-the-deal-with-24p/

What's the deal with 24p?

Posted Oct 1st 2007 2:13PM by Ben Drawbaugh

It's one buzzword after another in the world of HD, and the latest is 24p. But what's the big deal? How can fewer frames per second be better? Like most things that don't make sense, the reason isn't a good one. Without going into much detail: movies have been shot at 24 fps for years, and watching a movie at any other frame rate means there's some funny business going on. The problem is that most TVs can't display video at this rate, so even if your player can output it, you're probably still watching it at 60Hz. And to top it off, if you don't see any judder now, why worry about it in the first place? Either way, if you want to try to understand the entire mess, follow the read link over to HighDefDigest and read their comprehensive write-up on What's the Big Deal About 1080p24?

Edit: I looked into it a bit more and it seems there's some confusion between 24p and 24Hz. 24Hz is the rate films are shot at, and they're best viewed at that rate.

/brain explode 




With avid movie buffs, myself included, having the ability to watch movies at 24Hz, or frames per second, is a big deal. This is the speed at which they were meant to be viewed, just as they are played in theaters, and the truest goal of an enthusiast's setup is to replicate the theater experience at home. For years, when movies went to VHS or DVD or were shown on TV, they played at 30Hz, the same playback rate as all television. That's why some guys make such a big deal out of 24Hz; just check out avsforum or highdefforum and you'll see what I mean.

Hope that helped!



It does kind of make sense. When you convert a 24Hz signal into a 30Hz signal, those extra six frames per second have to come from somewhere. The easiest way to get them is to repeat every fourth frame from the original 24fps source. What you end up with is a slight judder. Barely noticeable, but there are always some people who notice (or at least claim to notice) these things.
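In case it helps, here's a minimal Python sketch of that "repeat every fourth frame" idea. The function name and frame numbers are made up for illustration; real NTSC gear does the duplication with interlaced fields rather than whole frames.

def frames_24_to_30(source_frames):
    """Stretch 24 fps to 30 fps by showing every 4th frame twice."""
    output = []
    for i, frame in enumerate(source_frames):
        output.append(frame)
        if i % 4 == 3:            # every 4th source frame...
            output.append(frame)  # ...is repeated, which is what causes judder
    return output

one_second = list(range(24))             # 24 source frames
print(len(frames_24_to_30(one_second)))  # -> 30 output frames

Those six duplicated frames per second are exactly the "extra six frames" above.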




That's exactly right... I do feel that as HD penetration increases we will see movies filmed and played back at 60Hz, but not for a while.



It is a bit unfortunate that two incompatible standards were picked. Would it have been that difficult to make the TVs 48Hz or 72Hz?
Isn't over-the-air TV 24Hz as well?





Eh. This is just another minor upgrade that videophiles will go ga-ga over and nobody else will care about. No big deal.



"'Casual games' are something the 'Game Industry' invented to explain away the Wii success instead of actually listening or looking at what Nintendo did. There is no 'casual strategy' from Nintendo. 'Accessible strategy', yes, but ‘casual gamers’ is just the 'Game Industry''s polite way of saying what they feel: 'retarded gamers'."

 -Sean Malstrom

It's 24Hz to match your eyes' own refresh rate. I thought eyes only "took a picture" 24 times a second, so there's no need for more; anything above 24Hz just means there are frames you don't even get to see.

And if that's true about repeating every fourth frame for 30Hz, then that's just silly; I mean, that would actually create as much jitter as running below 20Hz.

I could be wrong about 24Hz, because I vaguely remember it being only 20 pictures per second for the eyes.

----
Kind of off-topic: I remember seeing/reading something about pigeons' eyes taking about twice as many pictures per second as human eyes (I assume it's the same for most animals that are considered to have better eyesight than humans), so watching a cinema screen to them is like us watching a slide show.



For years, the Americans have been using an evil technology called telecine to "convert" 24 fps to 30 fps (actually 23.976 to 29.97) using interlaced imagery. Essentially it plays most frames normally and generates new frames by combining one field of an image (a field being every other line of the full image) with the alternate field of the next frame, creating a kind of smoother transition. Americans have grown up on it and so are used to it (it's not THAT jarring), but people from PAL regions can tell the difference. In PAL regions all TVs display at least 50Hz (25 full frames a second), so very simply, movies are sped up from 24 fps to 25 fps. The difference is small enough that you can't tell, but it does make each movie a few minutes shorter: a two-hour film runs 120 × 24/25 = 115.2 minutes, nearly five minutes less.
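Here's a rough Python sketch of that field-weaving (usually called 2:3 pulldown), with letters standing in for film frames; the function name and cadence grouping are illustrative, not any real encoder's API. Four film frames become ten fields, i.e. five interlaced video frames, and two of those five mix fields from adjacent film frames:

def pulldown_2_3(film_frames):
    """Expand film frames into fields with a 2:3 cadence, then pair fields."""
    fields = []
    for i, frame in enumerate(film_frames):
        count = 2 if i % 2 == 0 else 3  # alternate 2 fields, 3 fields, 2, 3...
        fields += [frame] * count
    # Pair consecutive fields into interlaced video frames (top + bottom).
    return [(fields[i], fields[i + 1]) for i in range(0, len(fields) - 1, 2)]

print(pulldown_2_3(["A", "B", "C", "D"]))
# [('A', 'A'), ('B', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'D')]

The ('B', 'C') and ('C', 'D') frames are the "combined" ones described above, and that 5-frame cycle is the cadence some people see as judder.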



TWRoO said:
It's 24Hz to match your eyes' own refresh rate. I thought eyes only "took a picture" 24 times a second, so there's no need for more; anything above 24Hz just means there are frames you don't even get to see.

And if that's true about repeating every fourth frame for 30Hz, then that's just silly; I mean, that would actually create as much jitter as running below 20Hz.

I could be wrong about 24Hz, because I vaguely remember it being only 20 pictures per second for the eyes.

----
Kind of off-topic: I remember seeing/reading something about pigeons' eyes taking about twice as many pictures per second as human eyes (I assume it's the same for most animals that are considered to have better eyesight than humans), so watching a cinema screen to them is like us watching a slide show.

Your eyes do not take snapshots at all; the rods and cones in your eyes receive light and react to it in a fluid, non-discrete manner.

The required frame rate for a video to look 'smooth' varies depending on the picture source. 24 fps is good enough for movies because the film camera does not take discrete, instantaneous images; rather, it captures the light coming through the lens over a large part of each 1/24 of a second (typically about half of it, with a 180-degree shutter), so the individual frames are a bit blurry where things are in motion. This blurriness is what allows your brain to interpret the video as smooth motion.

If you played a computer game at 24 fps, you would easily be able to see jittering, because the images generated by your computer are not blurred; they are discrete. Most people can tell a difference up to 60 fps. Some claim to be able to tell a difference above 60 fps, but this usually only matters when there is very fast motion on the screen. The faster something is moving, the greater the change from one frame to the next, so you need a higher frame rate to compensate.
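A toy Python illustration of that difference: the film camera averages light over the exposure (blurring fast motion), while a game samples one instant per frame. The one-dimensional "position" and all the numbers are made up purely for the demo.

def position(t):
    """A hypothetical object moving at 100 units per second."""
    return 100.0 * t

def game_frame(t):
    """Instantaneous sample: a sharp, discrete image at time t."""
    return position(t)

def film_frame(t, exposure=1 / 24, samples=16):
    """Average many sub-samples across the exposure to mimic motion blur."""
    return sum(position(t + exposure * k / samples)
               for k in range(samples)) / samples

print(game_frame(0.5))   # 50.0 exactly: one crisp instant
print(film_frame(0.5))   # ~52: the motion is smeared across the exposure

The smeared film frame is what lets 24 fps read as smooth, while the crisp game frame needs a higher rate to avoid visible stepping.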



That's why the new Sony LCDs and Samsung plasmas are all capable of a 120Hz refresh rate. 120 is divisible by 24, 30, and 60, so they can display 24fps movies as well as 30fps and 60fps games without any uneven pulldown.
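The arithmetic, spelled out in a few lines of Python (the constant name is just for the demo):

PANEL_HZ = 120
for source_fps in (24, 30, 60):
    repeats = PANEL_HZ // source_fps
    print(f"{source_fps} fps -> each frame shown {repeats}x at {PANEL_HZ}Hz")
# 24 fps -> each frame shown 5x at 120Hz
# 30 fps -> each frame shown 4x at 120Hz
# 60 fps -> each frame shown 2x at 120Hz

Every source frame gets an equal number of refreshes, so there's no 3:2-style cadence left to judder.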

I can't wait until we get the new Sony XBR4 for Christmas. :D