| archbrix said: "the most telltale sign that a processor is not doing a good job is to look for any jagged edges on objects in the image. This will be more noticeable on incoming 1080i signals as the TV's processor not only has to scale the resolution up to 1080p or down to 720p (or 768p), but also has to perform a task called "deinterlacing". Deinterlacing requires that the TV's processor combine the odd and even lines or pixel rows of the incoming interlaced 1080i image into a single progressive image to be displayed at least every 60th of a second. Some processors do this very well, and some don't." From my personal experience it kinda comes down to this: if you were only viewing still photos, 1080i would produce more detail, but with a picture that's constantly in motion (such as a show or a game), 720p is superior. Although there are theoretically fewer pixels in the 720p image, the entire picture being rendered in a single pass (progressively) looks better and smoother in motion. |
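For anyone curious what that "combine the odd and even lines" step actually means, here's a rough NumPy sketch of the simplest form of it ("weave" deinterlacing). The function name, field sizes and array shapes are just my own illustration, not anything a real TV's processor exposes:

```python
import numpy as np

def weave(top_field: np.ndarray, bottom_field: np.ndarray) -> np.ndarray:
    """Interleave two 540-line fields back into one 1080-line progressive frame."""
    frame = np.empty((top_field.shape[0] * 2,) + top_field.shape[1:],
                     dtype=top_field.dtype)
    frame[0::2] = top_field      # even rows carry the top field
    frame[1::2] = bottom_field   # odd rows carry the bottom field
    return frame

# 1080i60 delivers 60 half-height fields per second; weaving each pair
# gives 30 full 1920x1080 progressive frames per second.
top = np.zeros((540, 1920, 3), dtype=np.uint8)
bottom = np.zeros((540, 1920, 3), dtype=np.uint8)
print(weave(top, bottom).shape)   # (1080, 1920, 3)
```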
Except when the game is 30fps and the TV de-interlaces correctly, it's as stable as viewing still photos: every frame gets sent in two halves (fields), even though the signal is always 1080i60. For 60fps games you might see this effect:
either a bit blurry where it's correctly de-interlaced, or two frames woven together (not de-interlaced at all).
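To put some numbers behind the 30fps vs 60fps difference, here's another small sketch (again, just my own toy model, not any TV's actual firmware). When the source is 30fps, both fields of a pair come from the same game frame, so weaving them is a lossless reconstruction; when the source is 60fps, the two fields come from different frames, so a plain weave stitches two moments in time together:

```python
import numpy as np

def split_fields(frame: np.ndarray):
    """Split a progressive frame into its top (even-row) and bottom (odd-row) fields."""
    return frame[0::2], frame[1::2]

def weave(top: np.ndarray, bottom: np.ndarray) -> np.ndarray:
    """Interleave two fields back into one progressive frame."""
    frame = np.empty((top.shape[0] * 2,) + top.shape[1:], dtype=top.dtype)
    frame[0::2], frame[1::2] = top, bottom
    return frame

frame_n  = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)   # game frame N
frame_n1 = np.roll(frame_n, 8, axis=1)                                # frame N+1, scene shifted

# 30fps source: both fields come from the same frame -> weave is exact.
top_n, bottom_n = split_fields(frame_n)
print(np.array_equal(weave(top_n, bottom_n), frame_n))        # True

# 60fps source: fields come from two different frames -> jagged "combing" on motion.
_, bottom_n1 = split_fields(frame_n1)
combed = weave(top_n, bottom_n1)
print(np.array_equal(combed, frame_n), np.array_equal(combed, frame_n1))  # False False
```

That second case is exactly the tradeoff described above: either soft (the processor blends/interpolates instead of weaving) or jagged (it weaves two different frames together).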