@kn: I'm saying that, in an attempt to be understood by most people, he deliberately said things that were half-truths. I don't really believe he's confused, if that's what you wanted me to say. Most of what he says is right, but some of it is wrong, probably because some technical improvements came out around the time he wrote his article. For example, there are now 1080p@24 sets that display exactly the 24 progressive frames of a movie each second.
And I also clearly said that he said some things to justify backward-compatible connections like component cables (which use YUV). He's clever: he says that 1080p adds nothing to the image compared to 1080i, which is true in a vacuum but useless to the discussion. Most clueless readers will then take away that there's no difference at all between 1080p and 1080i, which is wrong.
All this confusion comes from the fact that most people talking about HD jump interchangeably between very different stages of the system without telling the uninformed which one they mean. There are several stages in the HDTV chain: source -> coded signal -> transmitted signal -> decoded signal -> signal scaled to the display's native resolution.
When the author says that 1080p adds nothing to the signal, look carefully at what he means:
Given a 1080p source, coded and transmitted as a 1080i signal, IF the coding doesn't mix frames, and IF the display can detect and decode the pulldown perfectly (it isn't always 3:2, at least with interlaced material), and IF the display is a native 1080p one, then it makes no difference whether the transmitted signal is 1080i or 1080p.
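To make that ideal case concrete, here's a minimal sketch (Python, purely illustrative; the numpy representation and the frame size are my own assumptions, not anything from the article) of a progressive frame being split into its two fields for 1080i transport and then woven back on a native 1080p display. Under the three IFs above, the reconstruction is bit-for-bit identical to the source:

import numpy as np

HEIGHT, WIDTH = 1080, 1920  # assumed frame size, for illustration only

def split_into_fields(frame):
    # Segment a progressive frame into its two fields, the way an ideal
    # 1080i transport would carry them without ever mixing frames.
    top_field = frame[0::2, :]     # lines 0, 2, 4, ...
    bottom_field = frame[1::2, :]  # lines 1, 3, 5, ...
    return top_field, bottom_field

def weave_fields(top_field, bottom_field):
    # Reassemble two fields of the SAME source frame into one progressive
    # frame, which is what a perfect pulldown-detecting 1080p set would do.
    frame = np.empty((top_field.shape[0] + bottom_field.shape[0],
                      top_field.shape[1]), dtype=top_field.dtype)
    frame[0::2, :] = top_field
    frame[1::2, :] = bottom_field
    return frame

# One synthetic 1080p frame standing in for a film frame.
source_frame = np.random.randint(0, 256, size=(HEIGHT, WIDTH), dtype=np.uint8)

top, bottom = split_into_fields(source_frame)   # "1080i transmission" of a 1080p source
reconstructed = weave_fields(top, bottom)       # ideal decoding on a native 1080p set

assert np.array_equal(source_frame, reconstructed)  # nothing was lost

Break any one of the IFs (fields from different frames get mixed, the set misdetects the cadence, or the panel isn't natively 1080p) and you're back to real deinterlacing, with its artifacts.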
Most people didn't understand that at all when reading him. And he "forgot" to say that, under those conditions, there is no point in using a 1080i signal in the first place. Consider the same scenario with a 1080p transmitted signal:
Given a 1080p source, coded and transmitted as a 1080p signal, IF the display is a native 1080p one, it will display exactly what the source contains. Even then, I'm simplifying a bit, as I didn't say whether the source is 24, 25, 30 or even 60 frames per second, each of which brings in different conversions.
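Those "other conversions" are really just cadence arithmetic. A rough sketch of what I mean (the field rates are the usual ones, rounded; the code itself is only illustrative):

# How many interlaced fields each progressive source frame must occupy.
def fields_per_frame(source_fps, field_rate):
    return field_rate / source_fps

print(fields_per_frame(24, 60))  # 2.5 -> the alternating 3:2 pulldown cadence (3 fields, then 2, ...)
print(fields_per_frame(25, 50))  # 2.0 -> clean 2:2 pulldown, trivial to undo
print(fields_per_frame(30, 60))  # 2.0 -> also a clean 2:2
# 60p simply doesn't fit in a 60i pipe at full resolution, which is one case
# where the 1080i-versus-1080p distinction really bites.

So 25p and 30p sources fit an interlaced pipe cleanly, 24p needs the 3:2 cadence the display has to detect, and 60p doesn't fit at all.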
I noticed that several people genuinely confuse the source, the transmitted signal and the display resolution.
Some people say their TV "does" 1080i. Right from there, you know they don't know what they're talking about: a fixed-pixel set only ever shows its own native resolution; at best it accepts a 1080i signal and converts it.
And all the confusion stems from this simple misunderstanding. Some people will tell you there is no difference between 1080i and 720p, not even realizing that they have never really seen 1080i, because their set simply can't display it.
Some will say the same about 720p versus 1080p, for exactly the same reason: their set can't display it.
To sum all this up, in my opinion there is ONLY ONE current configuration where 1080p is useful: you have access to 1080p sources, and you have a display bigger than 40".
To be even more specific, there's no point in getting 1080p unless you have a display bigger than 40" and an HD media player (be it HD-DVD or Blu-ray) connected through HDMI.