Sound Of Rain said:
archbrix said:
You're welcome. 
http://hometheater.about.com/od/hometheatervideobasics/qt/The-Difference-Between-720p-And-1080i.htm
720p: "the entire image (720 lines or pixel rows) is sent every 60th of a second (or twice every 30th of a second)."
1080i: "1080i, since it is interlaced, only sends 540 lines (or half the detail) every 60th of a second, with all the detail sent every 30th of a second. On the surface, 1080i produces more detail than 720p, but since the increased detail is only sent every 1/30th of a second, rather than 1/60 of a second, fast moving objects, will exhibit slight interlacing artifacts - which can appear to look like jagged edges or a very slight blurred effect."
|
720p then? Can the naked eye actually notice the blurred effect since it's only every 30th of a second?
|
Well, like that article says, it has a lot to do with how good the particular TV's video processing is. I would suggest trying both settings and seeing which looks better to you:
"the most telltale sign that a processor is not doing a good job is to look for any jagged edges on objects in the image. This will be more noticeable on incoming 1080i signals as the TVs processor only has to scale the resolution up to 1080p or down to 720p (or 768p), but also has to perform a task called "deinterlacing". Deinterlacing requires that the TV's processor combine the odd and even lines or pixel rows of the incoming interlaced 1080i image into a single progressive image to be displayed at least every 60th of a second. Some processors do this very well, and some don't."
From my personal experience it kinda comes down to this: if you were only viewing still photos, 1080i would produce more detail, but with a picture that's constantly in motion (such as a show or a game), 720p is superior. Although the 720p image has fewer pixels overall, the entire picture being drawn in a single pass (progressively) looks better and smoother in motion.
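For what it's worth, the back-of-the-envelope numbers (plain Python, just my own illustration) show why it's a trade-off rather than a clear win for either:

```python
# Full-frame pixel counts for each format:
full_720p  = 1280 * 720     # 921,600 pixels
full_1080i = 1920 * 1080    # 2,073,600 pixels

# What actually arrives in a single 1/60 s pass:
per_pass_720p  = full_720p        # the complete picture, every pass
per_pass_1080i = full_1080i // 2  # 1,036,800 pixels, but only half the rows;
                                  # a full frame needs two passes (1/30 s)

print(per_pass_720p, per_pass_1080i)  # 921600 1036800
```

So per pass 1080i actually pushes slightly more raw pixels, but they never form a complete picture on their own, which is why motion handling (and the quality of the deinterlacer) ends up deciding which looks better.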