
Is there a huge difference between 720p or 1080p HDTV?

makingmusic476 said:
kn said:
Here you go.... all you need to know. Get the 720P and move on with your life.

http://blog.hometheatermag.com/geoffreymorrison/0807061080iv1080p/

That tells it like it is. Let me take a quote from it for you:

"When it comes to movies (as in HD DVD and Blu-ray) there will be no visible difference between the 1080i signal and the 1080p signal, as long as your TV correctly de-interlaces 1080i. So even if you could input 1080p, you wouldn't see a difference (because there is none)."

They mean on a 1080p tv. 1080p tvs deinterlace a 1080i signal and display it in full 1080p. However, if you only have a 720p/1080i TV, it will display in 1080i at max.


I don't think we're on the same sheet here, but for the record, LCD and Plasma screens don't display interlaced images like a conventional CRT.  They display a progressive image.  1080P does 1080 lines at 60fps.  1080i feeds 1080 lines into the set in "halves" (fields), which are de-interlaced into a single image and then displayed.  That means it can, at max, do 1080 lines of resolution at 30fps.
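If it helps to picture the "halves", here's a rough sketch in Python (purely illustrative, not how any particular set actually implements it) of how two 540-line fields weave back into one 1080-line frame, and why 1080i tops out at 30 full frames per second:

# Sketch of "weave" de-interlacing: two 540-line fields -> one 1080-line frame.
# Illustrative only; real de-interlacers also handle motion, cadence detection, etc.
HEIGHT = 1080

def make_field(frame, parity):
    # Take every other line of a full frame: parity 0 = lines 0,2,4..., parity 1 = lines 1,3,5...
    return frame[parity::2]

def weave(top_field, bottom_field):
    # Interleave two 540-line fields back into a single 1080-line frame.
    frame = [None] * (len(top_field) + len(bottom_field))
    frame[0::2] = top_field
    frame[1::2] = bottom_field
    return frame

source_frame = [f"line {row}" for row in range(HEIGHT)]   # a fake frame, one label per line

top = make_field(source_frame, 0)       # 540 lines sent in one 1/60 s field period
bottom = make_field(source_frame, 1)    # the other 540 lines in the next 1/60 s
rebuilt = weave(top, bottom)

assert rebuilt == source_frame          # for a static image the weave is lossless
print(len(top), len(bottom), len(rebuilt))   # 540 540 1080 -> 30 full frames/s from 60 fields/s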

Hence the reason that (quoted from the article and as can be found elsewhere):

"There is no additional or new information in a 1080p signal from movie based content."

What everyone seems to miss in the debate is that 1080i capable sets or 720P sets -- whichever you choose to call them, since almost every new 720P set does 1080i -- do not actually display 1080 lines of information.  They have to convert it down to 720 lines because that is the native resolution.  There is some loss there, no doubt.  Therein lies a measurable difference in true 1080 sets --  they are 1920x1080 whereas 720P sets are 1280x720.  It would be foolish to manufacture a TV capable of 1920x1080 and not give it the capability to display 60FPS or "P" format...
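Rough numbers on that loss, just simple pixel arithmetic (nothing brand- or model-specific):

# Raw pixel counts behind the "some loss there" point: a 1080-line source carries
# more detail than a 1280x720 panel can physically show, so it has to be scaled down.
full_hd = 1920 * 1080    # 2,073,600 pixels
hd_ready = 1280 * 720    #   921,600 pixels

print(f"{full_hd / hd_ready:.2f}x")       # the 1080 source has 2.25x the pixels
print(f"{hd_ready / full_hd:.0%}")        # a 720P panel keeps roughly 44% of them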

At the end of the day, there are far too many people doling out advice -- me included -- who aren't true experts and cannot give you all the different angles.  If you do enough research, though, and determine what you will most likely use your set for, you will be informed enough to avoid making a mistake.

In my opinion, and the last note I'm going to add to this thread, the features of the television are just as important as the overall resolution.  Zoom modes, PIP, contrast, types and number of inputs, and so on are just as important as the decision over whether you want 720P/1080I or 1080P... 



I hate trolls.

Systems I currently own:  360, PS3, Wii, DS Lite (2)
Systems I've owned: PS2, PS1, Dreamcast, Saturn, 3DO, Genesis, Gamecube, N64, SNES, NES, GBA, GB, C64, Amiga, Atari 2600 and 5200, Sega Game Gear, Vectrex, Intellivision, Pong.  Yes, Pong.

makingmusic476 said:


NO NO NO! OMG!!! People stop please! Can't you even read an article correctly, I mean, understand it?

@W29, there is no debate to have: go 720p, that is the only and best solution for you given your situation.

@kber81, no, the limit for seeing the detail of 1080p is not 32", it's 40". 32" is the size of LCD TV below which they can't cram in a 1920x1080 pixel matrix. They can make a 1920x1080 panel under 40", but it's no use apart from taking your money and making a fool of the consumer.
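If you want a feel for why size matters here, look at the pixel density (a rough sketch only; whether the extra density is actually visible also depends on viewing distance and your eyes, so treat 40" as a rule of thumb rather than a hard limit):

# Pixel density of a 1920x1080 panel at a few diagonal sizes. Illustrative only.
import math

def pixels_per_inch(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

for size in (32, 40, 46, 60):
    print(f"{size}-inch 1080p panel: {pixels_per_inch(1920, 1080, size):.0f} ppi")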

@kn, the author of the other article is clearly confused. There is a huge difference between a 1080p and a 1080i signal between the player and the TV. A 1080i signal will cause all kinds of unnecessary computations on most HDTVs (most HDTVs are progressive, as the article says) that can go wrong, and will induce a delay between image and audio. And there is basically no point in outputting 1080i from the player. This was surely written to (wrongly) justify YUV connections.

There is no frame created in 3:2 pulldown either when everything is progressive from source to display. The frames are simply repeated, which is the best situation. The article is completely wrong (and that's inexcusable) in that there IS a difference between sending a 1080i and a 1080p signal. The decoding just isn't the same, as the 3:2 pulldown is different too, and creates more artefacts in the 1080i case. That is, when the source->transmit->display chain is 1080p->1080i->1080p. It works well when the source is natively 1080i, as it's coded to work well (1080i->1080p->1080p). And FYI, movies are recorded in progressive on HD media, so that best case won't happen for movies.
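Here is what that frame repetition looks like in the all-progressive case (a toy sketch; the interlaced path splits these frames into fields first, which is exactly where cadence detection can go wrong):

# 3:2 pulldown as plain frame repetition: 24 film frames fill 60 display refreshes
# per second by being shown for 3 and 2 refreshes alternately. No new frames are created.
def pulldown_3_2(film_frames):
    shown = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2
        shown.extend([frame] * repeats)
    return shown

one_second_of_film = [f"F{i}" for i in range(24)]
refreshes = pulldown_3_2(one_second_of_film)

print(len(refreshes))          # 60 refreshes for 24 film frames
print(refreshes[:10])          # ['F0', 'F0', 'F0', 'F1', 'F1', 'F2', 'F2', 'F2', 'F3', 'F3']
print(len(set(refreshes)))     # still only 24 distinct frames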

Again, sorry to tell you that if low cost HD DVD players output 1080i max, then I understand why they are low cost, and they're the worst things you could buy. It basically means you'll be in the worst configuration I described earlier: 1080p->1080i->1080p. So actually, the people pushing others to buy such faulty solutions are the fanboys with an agenda.

 

720p, 1080i or 1080p was never an issue with the PS3, but an issue with some (bad) TVs.

It's the TV that must accept all these signals. Trolls and anti-Sony fanboys repeated again and again that the fault was with the PS3, when that wasn't the case at all.

 

@facher83: you got it all wrong, so don't go lecturing people. The 720 or 1080 in 720p and 1080p are not the width of the display, they are the height. If you don't even know that, don't go lecturing people. And no, interlaced does NOTHING better. Interlaced is always worse for image quality. Interlaced is a vestige from the past. It can fool you by using less bandwidth (even though it compresses less efficiently than progressive) to display the same still image. And for god's sake, no, interlaced was never better for moving images. I can't believe some people still believe that, when we ran away from interlaced partly because of all the artefacts that interlaced display creates on moving images. Amazing!
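To put rough numbers on the bandwidth point (raw, uncompressed pixel rates only; real broadcast bit rates depend entirely on compression):

# Raw (uncompressed) pixel rates: 1080i sends half the lines per refresh, which is
# where the bandwidth saving comes from. Relative figures only.
formats = {
    "720p60":  (1280, 720, 60),        # 60 full frames per second
    "1080i60": (1920, 540, 60),        # 60 fields per second, 540 lines each
    "1080p60": (1920, 1080, 60),       # 60 full frames per second
}

for name, (width, lines, rate) in formats.items():
    print(f"{name}: {width * lines * rate / 1e6:.1f} million pixels per second")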



Q1. Do all 720p TVs do 1080i?

Q2. If you have a TV that can do 1080i, is there any point in choosing 720p output? 1080i looks better than 720p, correct?



crappy old school NES games are more entertaining than next-gen games.

ookaze said:


Ok, I said I wasn't going to post again but I feel compelled to answer.  This is but one of many technical articles I have read that discuss the interlace/de-interlace 3:2 pulldown issues.  Some argue that there are all sorts of artifacts introduced in the process and some say that it is lossless but only on quality equipment.  That is certainly a valid argument.  But a quick question and I'm more than happy to stand corrected: Are you saying that Geoffrey Morrison, who writes regularly for Home Theater Mag, doesn't know what he is talking about and that his article is misinformation?



I hate trolls.

Systems I currently own:  360, PS3, Wii, DS Lite (2)
Systems I've owned: PS2, PS1, Dreamcast, Saturn, 3DO, Genesis, Gamecube, N64, SNES, NES, GBA, GB, C64, Amiga, Atari 2600 and 5200, Sega Game Gear, Vectrex, Intellivision, Pong.  Yes, Pong.

No. If you get a set that does 720p & 1080i, there is very little (if any) difference between it and one that also does 1080p.

I just picked up a new Plasma set that does everything - except 1080p - and have no regrets. Everything looks amazing on it.



Gesta Non Verba

Nocturnal is helping companies get cheaper game ratings in Australia:

Game Assessment website

Wii code: 2263 4706 2910 1099

shams said:
No. If you get a set that does 720p & 1080i - there is very little (if any) difference between one that also does 1080p.

I just picked up a new Plasma set that does everything - except 1080p - and have no regrets. Everything looks amazing on it.

You see, I've already made my decision: I'm getting the 720p HDTV, since there is little difference between the two. Just one question: when I hook my HDTV up to my 360, do I use the HDMI cable that came with my Elite or my regular component cables?



@kn: I'm saying that, in an attempt to be understood by most people, he deliberately said things that were half-truths. I don't really believe he's confused, if that's what you wanted me to say. Most of what he says is right, but some of it is wrong, probably because some technical improvements came out around the time he wrote his article. For example, there are 1080p@24 sets that display only the 24 progressive images of a movie every second.
And I also clearly said that he said some things to justify backward-compatible connections like component cables (which use YUV). He's clever: he says that 1080p adds nothing to the image compared to 1080i, which is true in a vacuum but useless to the discussion. Most clueless people will then wrongly conclude that there's no difference between 1080p and 1080i, which is wrong.

All this confusion comes from the fact that most people talking about HD talk interchangeably about very different points of the system, without telling the uninformed. There are several points in the HDTV process: source -> coded signal -> transmitted signal -> decoded signal -> signal scaled to the native resolution.

When the author says that 1080p adds nothing to the signal, look at what he means:
Given a 1080p source, coded and transmitted as a 1080i signal, IF the coding doesn't mix frames, and IF the display can detect and decode the pulldown used perfectly (it's not always 3:2 with interlaced, at least), and IF the display is a native 1080p one, then it makes no difference whether the transmitted signal is 1080i or 1080p.
Most people didn't understand that at all when reading him. And he "forgot" to say that there is no point in using a 1080i signal. Consider the same scenario with a 1080p transmitted signal:
Given a 1080p source, coded and transmitted as a 1080p signal, IF the display is 1080p, it will display exactly like the source. Even then, I'm simplifying a bit, as I didn't say whether the source was 24, 25, 30 or even 60 images per second, which requires other computations.
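To make those IFs concrete, here is a toy model of the chain (illustrative only, with made-up helper names): when the fields of the 1080i signal really come from one progressive frame and the display pairs them correctly, weaving gives back the 1080p source exactly; pair fields from different frames (a broken cadence) and no plain weave can fix it:

# Toy model of 1080p source -> 1080i transmission -> 1080p display.
def split_into_fields(frame):
    return frame[0::2], frame[1::2]            # top field, bottom field

def weave_fields(top, bottom):
    frame = [None] * (len(top) + len(bottom))
    frame[0::2], frame[1::2] = top, bottom
    return frame

source = [f"row {r}" for r in range(1080)]     # one 1080p frame

# Ideal case: both fields come from the same frame and are paired correctly.
displayed = weave_fields(*split_into_fields(source))
print(displayed == source)                     # True: no difference vs sending 1080p directly

# Broken case: fields from two different frames get paired (bad cadence detection).
next_frame = [f"row {r}*" for r in range(1080)]
mixed = weave_fields(split_into_fields(source)[0], split_into_fields(next_frame)[1])
print(mixed == source, mixed == next_frame)    # False False -> combing artefacts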

 

I noticed that several people are really confused between source, transmitted signal and display resolution.

Some people say their TV "does" 1080i. Right from there, you know they don't know what they're talking about.

And all the confusion stems from this simple misunderstanding. Some people will tell you there is no difference between 1080i and 720p, not even realizing that they have never really seen 1080i, because their set just can't display it.

Some will say the same between 720p and 1080p, and the reasons are exactly the same: their set can't display it.

 

To sum all this up, in my opinion there is ONLY ONE current configuration in which 1080p is useful: you have access to 1080p sources, and you have a display bigger than 40".

To be even more specific, there's no point in getting 1080p unless you have a display bigger than 40" and an HD media player (be it HD DVD or Blu-ray) connected through HDMI cables.



/looks around

/hugs 60" 1080p television



If you're going to use the TV for PC use sometimes, then 1080p is far better. I'm an a/v nut, so I like to be able to view things like high-def movies at their native 1:1 resolution, rather than viewing downscaled 1080i on a 720p native display. I think 1080p is worth it, but it's different for everyone. Many people are fine with 720p native displays.

 My 1080p display is an 'off brand' one (and does a great job for what I paid for it) - I would never pay the exorbitant prices that companies like Sony and Samsung are asking for their 1080p displays. *coughMARKUPcough*



W29 said:

I'm just asking cause I'm thinking about getting me a new TV. It's time for me to leave that SDTV format and quickly switch. I'm waiting for Black Friday to come, cause I know it's going to be cheaper to find a decent 720p or 1080p HDTV.

Is there a major difference? Will I miss out on some little graphic enhancements?


For movies? The difference is hardly perceptible, if at all, between 1080i and 1080p. 720p is very close to 1080i (but between 1080i and 720p I'd get 1080i).

http://www.hometheatermag.com/gearworks/1106gear/ - second paragraph.

For games? Yes, there is a difference, but only if the game actually supports 1080p. That is, even if you get a 1080p screen, if your games are limited to 720p or 1080i your image will still be upscaled to 1080p, which arguably is worse than simply displaying the game at its native resolution.

If you've ever seen a Wii upscaled from 480p to 720p, then you have a rough idea of what a 720p game will look like upscaled to a 1080p screen.
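The scale factors line up, too (a rough sketch; 854x480 is just a nominal widescreen 480p raster used for the comparison):

# How much each source has to be stretched to fill the target panel. The 480p->720p
# and 720p->1080p cases both work out to roughly the same 1.5x stretch (about 2.25x
# the pixels), which is why the Wii comparison gives a fair feel for it.
cases = {
    "480p (Wii) on a 720p panel":    ((854, 480),  (1280, 720)),
    "720p game on a 1080p panel":    ((1280, 720), (1920, 1080)),
    "1080p source on a 1080p panel": ((1920, 1080), (1920, 1080)),
}

for label, ((sw, sh), (tw, th)) in cases.items():
    print(f"{label}: {th / sh:.2f}x linear scale, {tw * th / (sw * sh):.2f}x the pixels")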

Movies are normally shot at 24fps, while games are normally made to support 60fps. An HDTV is capable of displaying progressive signals, but since the current crop of HDTVs operate at 60Hz, film content has to go through a process known as 3:2 pulldown to compensate for the difference in frame rates.  That's why, when all is said and done, the difference between 1080i and 1080p for film is negligible.

Really, it depends on whether your main purpose is games or movie viewing, and even then you have to consider the native resolution of the bulk of the media you will be viewing on the new HDTV.