
720p vs 1080i

thank you, you were helpful.

i didn't know 1080i was on a 1280x720 screen

so 1080p = 1920x1080

1080i = 1280x720??

720p = 1280x720, progressive scan




Technically, 720p is better. 1080i is a "bigger" resolution on paper, but because it is an interlaced format, it only displays half of its scanlines at any one time. Therefore,

720p: 720 scanlines
1080i: 540 scanlines

So while images may look a tiny bit sharper at 1080i, 720p offers noticeably better motion clarity and none of the subtle screen flicker and other annoying issues associated with interlaced formats.
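A minimal sketch of that arithmetic, assuming a 60 Hz refresh for both formats (worth noting that the wider 1920-pixel lines mean 1080i still pushes slightly more pixels per refresh, even with half the lines):

# Lines and pixels each format actually puts on screen per refresh,
# assuming a 60 Hz display. Purely illustrative.
FORMATS = {
    # name: (line width in pixels, lines shown per refresh)
    "720p":  (1280, 720),   # full 720-line frame every refresh
    "1080i": (1920, 540),   # one field = half of the 1080 lines per refresh
}

for name, (width, lines) in FORMATS.items():
    print(f"{name}: {lines} lines/refresh, {width * lines:,} pixels/refresh")
# 720p:  720 lines/refresh,   921,600 pixels/refresh
# 1080i: 540 lines/refresh, 1,036,800 pixels/refresh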



"'Casual games' are something the 'Game Industry' invented to explain away the Wii success instead of actually listening or looking at what Nintendo did. There is no 'casual strategy' from Nintendo. 'Accessible strategy', yes, but ‘casual gamers’ is just the 'Game Industry''s polite way of saying what they feel: 'retarded gamers'."

 -Sean Malstrom

 

 

holy moly, gracian smith answered my question!!

720p: 720 scanlines
1080i: 540 scanlines

sorry i didn't get it earlier



dgm6780 said:
holy moly, gracian smith answered my question!!

720p: 720 scanlines
1080i: 540 scanlines

sorry i didn't get it earlier

I have read this before too; I was just too lazy to dig up the article. I'll go ahead and give it to you now.

 http://www.alvyray.com/DigitalTV/Naming_Proposal.htm

 



We had two bags of grass, seventy-five pellets of mescaline, five sheets of high-powered blotter acid, a salt shaker half full of cocaine, a whole galaxy of multi-colored uppers, downers, screamers, laughers…Also a quart of tequila, a quart of rum, a case of beer, a pint of raw ether and two dozen amyls.  The only thing that really worried me was the ether.  There is nothing in the world more helpless and irresponsible and depraved than a man in the depths of an ether binge. –Raoul Duke

It is hard to shed anything but crocodile tears over White House speechwriter Patrick Buchanan's tragic analysis of the Nixon debacle. "It's like Sisyphus," he said. "We rolled the rock all the way up the mountain...and it rolled right back down on us...."  Neither Sisyphus nor the commander of the Light Brigade nor Pat Buchanan had the time or any real inclination to question what they were doing...a martyr, to the bitter end, to a "flawed" cause and a narrow, atavistic concept of conservative politics that has done more damage to itself and the country in less than six years than its liberal enemies could have done in two or three decades. -Hunter S. Thompson

Actually, this thread, as with the other, is filled with complete misinformation. Interlace "flicker" is a function of CRT displays, NOT LCD or Plasma. Why do people who know nothing about the technology feel compelled to give advice when it is so wrong?

LCD and Plasma inherently display ONLY progressive images -- in other words, they MUST de-interlace every interlaced image they receive, each and every frame. That means it boils down to the source frame rate. Go do some research... There is so much misinformation out there from people who don't know what they're talking about that it gets truly confusing.

At 30fps or less, 1080i is going to beat 720p on an LCD or Plasma, as the image is de-interlaced cleanly anyway. At frame rates greater than 30fps, the de-interlacer for the 1080i image has to do some "interpolation" when displaying, and that causes some loss of image quality.
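A rough sketch of the trade-off being described, with a frame modelled as a simple list of lines (just an illustration of the bookkeeping, not how any particular set implements it):

def split_into_fields(frame):
    """Split a progressive frame into its two fields (every other line)."""
    return frame[0::2], frame[1::2]

def weave(field_a, field_b):
    """Re-interleave two fields back into one full frame."""
    frame = []
    for line_a, line_b in zip(field_a, field_b):
        frame.extend([line_a, line_b])
    return frame

# Source at 30fps or below: both fields come from the same instant,
# so weaving them reconstructs the original frame exactly -- no loss.
original = [f"line {n}" for n in range(1, 1081)]
top, bottom = split_into_fields(original)
assert weave(top, bottom) == original

# Source at 60fps: consecutive fields come from different instants, so a
# simple weave would show combing on anything that moved; the de-interlacer
# has to interpolate the missing lines instead, which is where the quality
# loss comes from.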

As to your original question:

For Movies: 1080i = 1080p. Movies are encoded at 24FPS. No difference as long as your set de-interlaces properly.

For ESPN HD: 720P is probably going to look better but it is going to depend on the broadcast source. Some say 1080i looks better "over the air". I don't think you will see much difference if the broadcast is a quality feed.

And finally, for games :  For every game that is 60fps, 720P will rock.  It will keep up with the fast moving objects.



I hate trolls.

Systems I currently own:  360, PS3, Wii, DS Lite (2)
Systems I've owned: PS2, PS1, Dreamcast, Saturn, 3DO, Genesis, Gamecube, N64, SNES, NES, GBA, GB, C64, Amiga, Atari 2600 and 5200, Sega Game Gear, Vectrex, Intellivision, Pong.  Yes, Pong.

dgm6780 said:
holy moly, gracian smith answered my question!!

720p: 720 scanlines
1080i: 540 scanlines

sorry i didnt get it earlier


Technically there are still 1080 scanlines in 1080i, but each field is made up of 540 lines; in a 30fps TV show you would have 60 fields per second. The TV draws the odd-numbered scanlines first and then the even-numbered ones, top to bottom. If you have an LCD screen you will prefer 720p, as that is its native resolution (unless you have a true 1080p set, which I'll get into in a second). If you have a CRT, chances are 1080i will look a bit nicer, since a CRT draws interlaced scanlines natively.

Some TV sets that say "1080p ready" are actually 720p sets with the downconversion hardware built into the TV. So be careful what you buy.
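To make the native-resolution point concrete, here is a small illustrative sketch; the panel heights are just common examples, and a "1080p ready" set with a 720-line panel behaves like the first case:

# What a fixed-pixel panel does with each signal once it has been
# de-interlaced to a full frame. Panel heights are example values only.
def lines_on_screen(frame_lines, panel_lines):
    if frame_lines == panel_lines:
        return "shown 1:1 at the panel's native resolution"
    direction = "down" if frame_lines > panel_lines else "up"
    return f"{direction}scaled from {frame_lines} to {panel_lines} lines"

for panel in (720, 1080):                      # native panel height in lines
    print(f"Panel with {panel} native lines:")
    for name, frame_lines in (("720p", 720), ("1080i", 1080)):
        print(f"  {name}: {lines_on_screen(frame_lines, panel)}")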



Prepare for termination! It is the only logical thing to do, for I am only loyal to Megatron.

kn said:

Actually, this thread, as with the other, is filled with complete misinformation. Interlace "flicker" is a function of CRT displays, NOT LCD or Plasma. Why do people who know nothing about the technology feel compelled to give advice when it is so wrong?

LCD and Plasma inherently display ONLY progressive images -- in other words, they MUST de-interlace every interlaced image they receive, each and every frame. That means it boils down to the source frame rate. Go do some research... There is so much misinformation out there from people who don't know what they're talking about that it gets truly confusing.

At 30fps or less, 1080i is going to beat 720p on an LCD or Plasma, as the image is de-interlaced cleanly anyway. At frame rates greater than 30fps, the de-interlacer for the 1080i image has to do some "interpolation" when displaying, and that causes some loss of image quality.

As to your original question:

For Movies: 1080i = 1080p. Movies are encoded at 24FPS. No difference as long as your set de-interlaces properly.

For ESPN HD: 720P is probably going to look better but it is going to depend on the broadcast source. Some say 1080i looks better "over the air". I don't think you will see much difference if the broadcast is a quality feed.

And finally, for games : For every game that is 60fps, 720P will rock. It will keep up with the fast moving objects.


If it automatically deinterlaces then what is the penalty?  It obviously doesn't magically turn into 1080p.  Does it turn into 1080p with half the framerate? 

I appreciate that you are trying to correct misinformation, but you mostly seem to be saying that the previous explanation is wrong without offering the correct one (although you do explain which one of 1080i/720p is better in your opinion and why).   



Tag (courtesy of fkusumot): "Please feel free -- nay, I encourage you -- to offer rebuttal."
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
My advice to fanboys: Brag about stuff that's true, not about stuff that's false. Predict stuff that's likely, not stuff that's unlikely. You will be happier, and we will be happier.

"Everyone is entitled to his own opinion, but not his own facts." - Sen. Pat Moynihan
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
The old smileys: ; - ) : - ) : - ( : - P : - D : - # ( c ) ( k ) ( y ) If anyone knows the shortcut for , let me know!
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
I have the most epic death scene ever in VGChartz Mafia.  Thanks WordsofWisdom! 

This question is NONSENSICAL!
Normally, before anyone can answer this question, they should ask for a definition of better. So, define "better"!
This question is nonsensical because it's like asking: sugar or salt, which is better?

Well, except for games; there I can give an answer. There is no point in an interlaced signal for games, except for backward compatibility, a.k.a. duct-tape connectivity.



Final-Fan said:
kn said:

Actually, this thread, as with the other, is filled with complete misinformation. Interlace "flicker" is a function of CRT displays, NOT LCD or Plasma. Why do people who know nothing about the technology feel compelled to give advice when it is so wrong?

LCD and Plasma inherently display ONLY progressive images -- in other words, they MUST de-interlace every interlaced image they receive, each and every frame. That means it boils down to the source frame rate. Go do some research... There is so much misinformation out there from people who don't know what they're talking about that it gets truly confusing.

At 30fps or less, 1080i is going to beat 720p on an LCD or Plasma, as the image is de-interlaced cleanly anyway. At frame rates greater than 30fps, the de-interlacer for the 1080i image has to do some "interpolation" when displaying, and that causes some loss of image quality.

As to your original question:

For Movies: 1080i = 1080p. Movies are encoded at 24FPS. No difference as long as your set de-interlaces properly.

For ESPN HD: 720P is probably going to look better but it is going to depend on the broadcast source. Some say 1080i looks better "over the air". I don't think you will see much difference if the broadcast is a quality feed.

And finally, for games : For every game that is 60fps, 720P will rock. It will keep up with the fast moving objects.


If it automatically deinterlaces then what is the penalty?  It obviously doesn't magically turn into 1080p.  Does it turn into 1080p with half the framerate? 

I appreciate that you are trying to correct misinformation, but you mostly seem to be saying that the previous explanation is wrong without offering the correct one (although you do explain which one of 1080i/720p is better in your opinion and why).   

1080i draws the two halves of a frame in separate passes, and ghosting can occur because, in order to de-interlace the image, the set has to reconstruct the full frame. The problem is that films are played back at 24fps while 1080i runs at 30 frames per second (60 fields per second). The conversion is called a 3:2 pulldown and is used for both the ATSC and NTSC standards.

In tubes you get flicker because CRTs were originally designed to skip every other line, to allow the phosphors to cool down after being excited. It isn't as bad on an HDTV, since it still fills the screen and the increased resolution helps hide some of the adverse effects. Wikipedia has some great examples of the artifacts that can occur with interlacing, or just from converting 24fps film to 30fps (reverse telecine; a telecine converts film to other formats like digital, DAT, VHS and so on). The image will be ghosted, and with 1080i it is more apparent.

For me, I would use 720p or 1080p, but honestly most people wouldn't be able to tell the difference.
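For what it's worth, here is a small sketch of that 3:2 pulldown cadence (an illustration with made-up frame labels, not reference code from any standard):

from itertools import cycle

# 24 film frames per second are spread across 60 interlaced fields per
# second by holding each film frame for 2 fields, then 3, then 2, ...
def pulldown_fields(film_frames):
    """Yield (film_frame, field_parity) pairs in 3:2 cadence."""
    cadence = cycle([2, 3])
    parity = cycle(["top", "bottom"])
    for frame, repeats in zip(film_frames, cadence):
        for _ in range(repeats):
            yield frame, next(parity)

fields = list(pulldown_fields("ABCD"))   # 4 film frames -> 10 fields
print(len(fields), fields)               # 10 fields per 4 frames = 60 per 24

# Pairing the fields back into 30fps frames gives A+A, B+B, B+C, C+D, D+D;
# the mixed B+C and C+D frames are where the ghosted look comes from if the
# set doesn't detect the cadence and reverse it.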