makingmusic476 said:
They mean on a 1080p tv. 1080p tvs deinterlace a 1080i signal and display it in full 1080p. However, if you only have a 720p/1080i TV, it will display in 1080i at max.
I don't think we're on the same sheet here, but for the record, LCD and plasma screens don't display interlaced images the way a conventional CRT does; they display a progressive image. 1080P delivers 1080 lines at 60fps. 1080i delivers those 1080 lines to the set in "halves" (fields), which get de-interlaced back into a single image and then displayed. That means it can, at most, show 1080 lines of resolution at 30fps.
Hence the reason that (quoted from the article and as can be found elsewhere):
"There is no additional or new information in a 1080p signal from movie based content."
What everyone seems to miss in the debate is that 1080i-capable sets or 720P sets -- whichever you choose to call them, since almost every new 720P set accepts 1080i -- do not actually display 1080 lines of information. They have to convert it down to 720 lines, because that is the native resolution. There is some loss there, no doubt. Therein lies a measurable difference with true 1080 sets -- they are 1920x1080, whereas 720P sets are 1280x720. It would be foolish to manufacture a TV capable of 1920x1080 and not give it the ability to display 60fps, or "P" format...
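Again, just my own illustrative numbers rather than anything from the article -- a short sketch of what that down-conversion costs in raw pixel count, assuming a 1280x720 native panel:

# What a 720P-native panel gives up when it scales down a 1080-line signal.
# The 1280x720 panel resolution is an assumption for illustration.

SOURCE = (1920, 1080)   # 1080i/1080P signal coming in
PANEL = (1280, 720)     # native resolution of a typical 720P set

source_pixels = SOURCE[0] * SOURCE[1]
panel_pixels = PANEL[0] * PANEL[1]

print(f"1080 source pixels per frame: {source_pixels:,}")
print(f"720P panel pixels per frame:  {panel_pixels:,}")
print(f"Fraction the panel can actually show: {panel_pixels / source_pixels:.2%}")

So a 720P-native panel can only show a bit under half the pixels in a 1080-line frame; that is the measurable difference I mean between it and a true 1920x1080 set.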
At the end of the day, there are far too many people doling out advice -- me included -- who aren't true experts and cannot give you all the different angles. If you do enough research, though, and determine what you will most likely use your set for, you will be informed enough to avoid making a mistake.
In my opinion, and the last note I'm going to add to this thread, the features of the television are just as important as the overall resolution. Zoom modes, PIP, contrast, types and number of inputs, and so on are just as important as the decision over whether you want 720P/1080i or 1080P...