ookaze said:
NO NO NO! OMG!!! People, stop, please! Can't you even read the article correctly, I mean, understand it?

@W29: there is no debate to have. Go 720p; that is the only and best solution for you given your situation.

@kber81: no, the limit for seeing the detail of 1080p is not 32", it's 40". 32" is the LCD TV size below which they can't cram in a 1920x1080 pixel matrix. They can make a 1920x1080 panel under 40", but it's no use, apart from taking your money and making a fool of the consumer.

@kn: the author of the other article is clearly confused. There is a huge difference between a 1080p and a 1080i signal between the player and the TV. A 1080i signal forces all kinds of unnecessary computation on most HDTVs (most HDTVs are progressive, as the article says), which can go wrong and will introduce a delay between image and audio. There is basically no point in outputting 1080i from the player; that part was surely written to (wrongly) justify YUV connections. And no frame is created in 3:2 pulldown either when everything is progressive from source to display: the frames are simply repeated, which is the best situation. The article is completely wrong on that point, and that's inexcusable.

Again, sorry to tell you, but if low-cost HDDVD players output 1080i at most, then I understand why they are low cost, and they're the worst thing you could buy. It basically means you end up in the worst configuration I described earlier: 1080p->1080i->1080p. So actually, the people pushing others to buy such faulty solutions are the fanboys with an agenda.
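To make the 3:2 pulldown point concrete, here is a minimal Python sketch of what I mean. It is only a toy illustration of the idea, not code from any actual player or TV: the all-progressive path just repeats whole frames in a 3:2 cadence, while a 1080i hand-off splits finished frames into fields that the TV has to reassemble.

```python
# A minimal sketch (my own toy example, not code from any player or the
# article): in an all-progressive chain, 24p film reaches a 60 Hz panel by
# repeating whole frames in a 3:2 cadence -- no new frame is ever synthesised.
# Forcing the player to output 1080i instead just splits each finished frame
# into two half-height fields that the TV then has to reassemble.

def pulldown_3_2(frames_24p):
    """24p -> 60p by repeating whole frames (3, 2, 3, 2, ...)."""
    out = []
    for i, frame in enumerate(frames_24p):
        out.extend([frame] * (3 if i % 2 == 0 else 2))
    return out

def split_into_fields(frame):
    """The 1080p -> 1080i step: even lines go to the top field, odd lines to
    the bottom field. The display then has to undo this (deinterlace)."""
    return frame[0::2], frame[1::2]

if __name__ == "__main__":
    # Toy "frames": each is a list of four labelled scan lines.
    frames = [[f"frame{i}_line{l}" for l in range(4)] for i in range(4)]
    print(len(pulldown_3_2(frames)))      # 10, i.e. 60 outputs for every 24 inputs
    top, bottom = split_into_fields(frames[0])
    print(len(top), len(bottom))          # 2 2: each field carries half the lines
```

The point is that the progressive path only repeats data it already has, while the 1080i hand-off throws the reassembly work onto whatever deinterlacer happens to be in the TV.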
720p, 1080i or 1080p was never an issue with the PS3, but with some (bad) TVs. It's the TV that must accept all of these signals. Trolls and anti-Sony fanboys repeated again and again that the fault lay with the PS3, when that wasn't the case at all.
@facher83: you got it all wrong, so don't go lecturing people. The 720 and 1080 in 720p and 1080p are not the width of the display, they are the height: the number of scan lines. If you don't even know that, don't lecture people. And no, interlacing does NOTHING better. Interlacing is always worse for image quality; it is a vestige of the past. It can fool you by using less bandwidth (even though it compresses less efficiently than progressive) to display the same still image. And for god's sake, no, interlacing was never better for moving images. I can't believe some people still believe that, when we ran away from interlacing partly because of all the artefacts it creates on moving images. Amazing!
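If it helps, here is a second toy sketch (again my own, purely to illustrate the argument) of why interlacing saves bandwidth on a still image but falls apart on motion:

```python
# Another toy sketch (mine, not from either article): why an interlaced field
# uses half the bandwidth of a full frame, and why weaving two fields back
# together combs anything that moved between them.

def weave_fields(top_field, bottom_field):
    """Simplest deinterlace: interleave alternating lines from the two fields."""
    frame = []
    for t, b in zip(top_field, bottom_field):
        frame.extend([t, b])
    return frame

if __name__ == "__main__":
    print(1080 // 2)  # 540: lines carried by one 1080i field

    # Static scene: both fields sample the same instant, so the weave is exact.
    still = ["x=0"] * 4
    print(weave_fields(still[0::2], still[1::2]) == still)   # True

    # Motion: the bottom field is captured 1/60 s later, after the object has
    # moved, so the woven frame alternates old and new positions line by line:
    # the classic combing ("mouse teeth") artifact of interlaced video.
    later = ["x=5"] * 4
    print(weave_fields(still[0::2], later[1::2]))   # ['x=0', 'x=5', 'x=0', 'x=5']
```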
OK, I said I wasn't going to post again, but I feel compelled to answer. This is but one of many technical articles I have read that discuss the interlace/de-interlace 3:2 pulldown issues. Some argue that all sorts of artifacts are introduced in the process; others say it is lossless, but only on quality equipment. That is certainly a valid argument to have. But a quick question, and I'm more than happy to stand corrected: are you saying that Geoffrey Morrison, who writes regularly for Home Theater Mag, doesn't know what he is talking about, and that his article is misinformation?
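For what it's worth, the "lossless, but only on quality equipment" claim can be made concrete with a small sketch of my own (the names and structures below are illustrative, not anything from Morrison's article): film-sourced 1080i can be reversed exactly, but only if the deinterlacer detects the 3:2 cadence and pairs fields that came from the same original frame.

```python
# Illustrative sketch only (my own names and structures, not from the article):
# film carried as interlaced video can be recovered exactly by inverse
# telecine, provided the deinterlacer spots the 3:2 cadence and pairs fields
# that came from the same original frame.

def telecine_3_2(frames_24p):
    """Encode 24p frames as a 3:2 cadence of fields. Each field is
    (frame_index, parity, half_of_the_lines); parity strictly alternates
    through the stream, as it does in a real 60i signal."""
    fields, next_is_top = [], True
    for i, frame in enumerate(frames_24p):
        halves = {"top": frame[0::2], "bottom": frame[1::2]}
        for _ in range(3 if i % 2 == 0 else 2):
            parity = "top" if next_is_top else "bottom"
            fields.append((i, parity, halves[parity]))
            next_is_top = not next_is_top
    return fields

def inverse_telecine(fields):
    """'Quality equipment' behaviour: match each frame's top and bottom field,
    drop the repeats, and weave the pair back into the original frame."""
    tops, bottoms = {}, {}
    for idx, parity, lines in fields:
        (tops if parity == "top" else bottoms)[idx] = lines
    recovered = []
    for idx in sorted(tops):
        frame = []
        for t, b in zip(tops[idx], bottoms[idx]):
            frame.extend([t, b])
        recovered.append(frame)
    return recovered

if __name__ == "__main__":
    src = [[f"f{i}_line{l}" for l in range(4)] for i in range(4)]
    print(inverse_telecine(telecine_3_2(src)) == src)   # True: lossless round trip
```

As far as I can tell, that is the dividing line in those articles: gear that locks onto the cadence gets the film back intact, while gear that falls back to video-style deinterlacing is where the artifacts people complain about come from.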