vizunary said:
Lingyis said:
vizunary said:
@Lingyis, i see your point with 24fps, but an interlaced display still only refreshes 540 lines at a time. from all the HD research i've done, my understanding is that the video processor in an interlaced display can't refresh the lines consecutively the way progressive scan does. i may be incorrect, but i believe i understand the tech correctly.

well, entroper already answered it. i'll just repeat in a wordier way.

yes, interlaced means half of 1080 lines = 540 lines at a time, and like you said, it doesn't refresh all the lines at once like progressive, which would be 1080 lines at a time. however, refreshes happen 60 times every second. this means that for 24fps movies, you have more refreshes than necessary. essentially each movie frame gets "stretched" across either 3 or 2 refreshes, which is what entroper meant by "3:2 pulldown".
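to make the "stretching" concrete, here's a minimal sketch of the 3:2 cadence (the function name is mine, just for illustration): alternating frames get held for 3 fields, then 2 fields, so 24 frames fill exactly 60 field refreshes per second.

```python
# sketch of 3:2 pulldown: 24 film frames/sec -> 60 interlaced fields/sec
def pulldown_32(frames):
    """Hold each film frame for alternating 3 and 2 field refreshes."""
    fields = []
    for i, frame in enumerate(frames):
        repeats = 3 if i % 2 == 0 else 2
        fields.extend([frame] * repeats)
    return fields

one_second = list(range(24))       # 24 film frames
fields = pulldown_32(one_second)
print(len(fields))                 # -> 60, i.e. a 60 Hz refresh rate
```

note the 12 frames held for 3 fields plus 12 held for 2 fields: 36 + 24 = 60, which is why 24fps maps onto a 60 Hz display with no information lost.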

so your 1080p TV is working with more information than it really needs. therefore, as long as your TV does the right thing with the signal (which according to some isn't always the case), there would be zero visual difference between a 1080i and a 1080p video output source for 24fps movies.


 

that does make some sense, i guess i'm just not visualising it correctly... i still think that in heavy, fast-paced action films (many of which use a 60fps signal) there would be a noticeable difference, at least to enthusiasts. you're very probably correct in saying most people would never notice. i happened to notice when i was picking out my TV, but then again i can see the red, blue, and green shades on an LCD display and the "rainbow" effect on DLP sets, so i'm not the norm. BTW, for you guys thinking this is somehow a death knell for Blu-ray, think again. the Toshiba A2 (this model) has been $199 w/rebate for quite some time already; it's nothing new, just a little more accessible is all.

I believe the overwhelmingly vast majority of films are encoded at 24fps.  One of the movie publishers experimented with higher frame rates for fast-panning scenes.  I don't recall which publisher, but I only saw it in a few films and it was definitely "weird" because all of a sudden everything got sharper, and then it went away when the action/panning stopped.

I'm not aware of any other use of frame rates higher than 24fps for a major motion picture release, though that doesn't mean they aren't out there.  I'm also not aware of any major CGI movie being rendered at anything other than 24fps.  Cars and The Incredibles -- two Disney exclusives -- are both 24fps.  So is Shrek.

As for your point about interlaced vs. progressive: it's true that progressive beats interlaced on a CRT.  That doesn't apply to digital sets like LCDs and plasmas.  My understanding is that LCDs and plasmas are inherently progressive, because they don't scan the image onto the screen line by line like a CRT; they display the entire image at once.

The "1080i and 1080p are the same" argument holds as long as the source is 30fps or less, and either the display or the source handles de-interlacing properly, so the image is displayed without loss.

You'll definitely need to do more research, but after having read a LOT of articles, my understanding is that on an LCD or plasma, 1080i and 1080p, with proper deinterlacing, are identical for movies recorded at 24fps.
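Here's a minimal sketch (function names are mine, purely for illustration) of why proper deinterlacing is lossless for film content: since both fields of a 24fps frame come from the same instant in time, weaving the odd and even lines back together reconstructs the original 1080-line frame exactly.

```python
# sketch: both fields of one film frame weave back into the full frame
def split_into_fields(frame):
    """Split a full frame (a list of lines) into its odd/even line fields."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Recombine two fields of the same film frame into a full frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

original = [f"line{n}" for n in range(1080)]
top, bottom = split_into_fields(original)        # two 540-line fields
assert weave(top, bottom) == original            # reconstruction is lossless
```

This is the case where 1080i carries the full 1080p information; with true 60fps video the two fields come from different instants, which is where interlacing can actually cost you detail.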



I hate trolls.

Systems I currently own:  360, PS3, Wii, DS Lite (2)
Systems I've owned: PS2, PS1, Dreamcast, Saturn, 3DO, Genesis, Gamecube, N64, SNES, NES, GBA, GB, C64, Amiga, Atari 2600 and 5200, Sega Game Gear, Vectrex, Intellivision, Pong.  Yes, Pong.