Alby_da_Wolf said:
HappySqurriel said:
For fast-moving images 720p is superior; for slow-moving images 1080i is superior.
|
I confirm. Also, people with more sensitive eyes can perceive flicker in the outer visual field when looking at interlaced still images, because flicker stimulates the eye much like (**) a fast oscillating motion (*). Slow motion is where interlacing behaves best: the temporal integration performed by the eye smooths out interlacing flaws, while the motion isn't fast enough to generate undersampling artefacts.
(*) Emergency vehicles' beacons are almost always blue because the human outer visual field, although low-resolution, is more sensitive to blue and to fast motion, so you can perceive them approaching more readily.
(**) The appearance is similar, but flicker is much more tiring for the eye than watching true motion.
|
So much nonsense!
If your device can display 720p, then it's progressive, so there's no flicker at all: even interlaced sources are deinterlaced and rendered progressively, hence no flicker.
If your display is progressive and can display every line of 1080i, then it's a 1080p (Full HD) device. So, on such a device, which of 1080i or 720p is better? It depends entirely on the deinterlacing electronics in the device.
Most progressive devices (LCD, plasma, ...) can't render interlaced content correctly because of "cheap" deinterlacing components.
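A minimal sketch of why deinterlacing quality matters so much here. The two common cheap strategies are "weave" (interleave the two fields, which is perfect for still images but produces combing on motion) and "bob" (line-double one field, which avoids combing but halves vertical resolution). The frame data, function names, and 4x2 "image" below are all hypothetical, just to make the trade-off concrete:

```python
# Hypothetical model: a frame is a list of rows of pixel values.
# An interlaced signal sends even rows sampled at time t0 and
# odd rows sampled at a later time t1.

def split_fields(frame_t0, frame_t1):
    """One interlaced frame: even lines from t0, odd lines from t1."""
    even = [row for i, row in enumerate(frame_t0) if i % 2 == 0]
    odd = [row for i, row in enumerate(frame_t1) if i % 2 == 1]
    return even, odd

def weave(even, odd):
    """Weave deinterlacing: interleave both fields.
    Exact for still images, but mixes two instants in time,
    producing 'combing' artefacts on moving content."""
    out = []
    for e, o in zip(even, odd):
        out.append(e)
        out.append(o)
    return out

def bob(even):
    """Bob deinterlacing: line-double a single field.
    No combing, but only half the vertical resolution survives."""
    out = []
    for row in even:
        out.append(row)
        out.append(list(row))  # duplicate each line
    return out

still = [[1, 1], [2, 2], [3, 3], [4, 4]]
even, odd = split_fields(still, still)   # no motion between fields
assert weave(even, odd) == still         # weave recovers the frame exactly

moved = [[9, 9], [8, 8], [7, 7], [6, 6]]
even_m, odd_m = split_fields(still, moved)  # content moved between fields
combed = weave(even_m, odd_m)               # rows from two instants mixed
assert combed == [[1, 1], [8, 8], [3, 3], [6, 6]]
```

A display with good electronics does motion-adaptive deinterlacing (weave where the image is static, bob or interpolation where it moves), which is roughly why "rendered with good components" 1080i can beat 720p, while cheap fixed-strategy deinterlacers make it look worse.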
Rendered with good components, true 1080i will always be better than 720p, even with fast-moving images.
But the current reality is that for most progressive devices, 1080i is badly rendered.