Actually, this thread, like the other one, is full of misinformation. Interlace "flicker" is an artifact of CRT displays, NOT of LCDs or Plasmas. Why do people who know nothing about the technology feel compelled to give advice that is so wrong?
LCD and Plasma panels are inherently progressive -- in other words, they MUST de-interlace every interlaced frame they display, each and every frame. That means the comparison really boils down to the source frame rate. Do some research of your own; there is so much misinformation out there from people who don't know what they're talking about that it truly is confusing.
At 30fps or less, 1080i is going to beat 720p on an LCD or Plasma, because the two fields can simply be woven back together into a full 1080-line frame when the set de-interlaces. For motion above 30fps (true 60-fields-per-second content), the de-interlacer has to interpolate the missing lines, and that causes some loss of image quality.
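To make that concrete, here is a minimal sketch of the two cases (plain Python with NumPy, purely illustrative -- no actual TV runs code like this): "weave" recovers the full 1080-line frame when both fields come from the same instant, while "bob" has to line-double a single 540-line field when they don't.

```python
import numpy as np

HEIGHT, WIDTH = 1080, 1920  # assumed 1080i frame geometry

def weave(top_field, bottom_field):
    """Interleave two fields into one full frame.
    Lossless when both fields were captured at the same instant
    (e.g. film or other content at <= 30 full frames per second)."""
    frame = np.empty((HEIGHT, WIDTH), dtype=top_field.dtype)
    frame[0::2] = top_field      # even lines come from the top field
    frame[1::2] = bottom_field   # odd lines come from the bottom field
    return frame

def bob(field):
    """Upscale a single 540-line field to a full frame by line doubling.
    Needed when the two fields show different moments in time
    (fast motion at 60 fields/s); half the vertical detail is guessed."""
    return np.repeat(field, 2, axis=0)

# Toy usage: two 540-line fields standing in for the halves of a 1080i frame
top = np.random.randint(0, 256, (HEIGHT // 2, WIDTH), dtype=np.uint8)
bottom = np.random.randint(0, 256, (HEIGHT // 2, WIDTH), dtype=np.uint8)

print(weave(top, bottom).shape)  # (1080, 1920) -- full detail preserved
print(bob(top).shape)            # (1080, 1920) -- but half the lines are interpolated
```

Real sets use smarter motion-adaptive blends of these two extremes, but the trade-off is the same.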
As to your original question:
For movies: 1080i = 1080p. Film is encoded at 24fps and carried in 1080i via 3:2 pulldown, so there is no difference as long as your set de-interlaces (inverse-telecines) it properly.
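For anyone who wants to see why no information is lost, here is a toy sketch of the 3:2 pulldown cadence and its inverse, assuming the standard 2-3-2-3 pattern; the frames are just labels here, not real video data.

```python
# Sketch of 3:2 pulldown and inverse telecine for 24fps film carried in a
# 60-field 1080i stream. Purely illustrative, not any set's actual logic.

def pulldown_3_2(film_frames):
    """Spread 4 film frames across 10 fields (24fps -> 60 fields/s).
    Cadence per group of four frames: 2, 3, 2, 3 fields."""
    fields = []
    cadence = [2, 3, 2, 3]
    for frame, repeats in zip(film_frames, cadence * (len(film_frames) // 4)):
        for _ in range(repeats):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

def inverse_telecine(fields):
    """Recover the original film frames: every field still belongs to exactly
    one source frame, so dropping consecutive duplicates restores 24p."""
    recovered = []
    for frame, _parity in fields:
        if not recovered or recovered[-1] != frame:
            recovered.append(frame)
    return recovered

film = ["A", "B", "C", "D"]               # four 24fps film frames
fields = pulldown_3_2(film)               # ten interlaced fields
assert inverse_telecine(fields) == film   # nothing was lost in the round trip
print(len(fields), inverse_telecine(fields))  # 10 ['A', 'B', 'C', 'D']
```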
For ESPN HD: 720p is probably going to look better, but that depends on the broadcast source. Some say 1080i looks better "over the air". I don't think you will see much difference if the broadcast is a quality feed.
And finally, for games: for any game that runs at 60fps, 720p will rock. Sixty full progressive frames per second keeps up with fast-moving objects, whereas 1080i only delivers 60 half-resolution fields.
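Back-of-the-envelope pixel math, purely for illustration (perceived quality also depends on the de-interlacer, the source, and the panel's native resolution):

```python
# Rough per-second pixel counts for the two formats at game frame rates.

w_720p, h_720p, fps_720p = 1280, 720, 60              # full progressive frames
w_1080i, h_field_1080i, fields_1080i = 1920, 540, 60  # half-height fields

pixels_720p60 = w_720p * h_720p * fps_720p                # ~55.3 million/s, all fresh
pixels_1080i60 = w_1080i * h_field_1080i * fields_1080i   # ~62.2 million/s, in fields

print(f"720p60 : {pixels_720p60:,} progressive pixels per second")
print(f"1080i60: {pixels_1080i60:,} pixels per second, delivered as 540-line fields")
```

The raw counts are close; the difference for fast motion is that every 720p pixel arrives in a complete frame, while the 1080i pixels arrive in alternating fields that the set has to reassemble.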