shams said:

Here is a question for you all...

Is the Interlaced/Progressive something handled by the video card when generating the output, or something internal in the machine?

i.e. if a game is running at 480i, is the internal frame buffer 240 pixels high? Or 480 pixels high? (And is it the video card that samples every second line and builds the appropriate analog signal going out to the display device?)

Might also be interesting to discuss NTSC vs PAL. I actually run my Wii in non-progressive PAL mode (576i), as I think it looks a lot better (in some apps) than 480p does. Sort of less detail, but a higher resolution. 576p would be the best, though :)

(definitely use component cables though, the difference in image colours/quality/stability is nothing short of amazing).


 480i means the screen displays 240 lines (one field) per refresh, alternating between the odd and even lines of a 480-line frame, so a full frame takes two refreshes to appear.
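To answer shams's question directly: as far as I know the game renders a full-height frame buffer, and it's the video hardware that picks out every second line to build each field of the analog signal. A rough Python sketch of the idea (illustrative names only, nothing Wii-specific):

    # Rough sketch, not real Wii code: a full 480-line frame buffer
    # split into the two half-height fields an interlaced signal sends.
    frame_buffer = [f"line {n}" for n in range(480)]  # full 480-line frame

    odd_field  = frame_buffer[0::2]  # lines 0, 2, 4, ... -> 240 lines
    even_field = frame_buffer[1::2]  # lines 1, 3, 5, ... -> 240 lines

    # The display alternates: odd field on one refresh, even field on
    # the next, so the full 480 lines only appear across TWO refreshes.
    print(len(odd_field), len(even_field))  # 240 240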

To the naked eye this doesn't make a big difference to the apparent level of visual detail, especially when you are moving from 480i to 576i.

How this all looks depends on the quality of your television. LCDs and plasmas don't work the same way as CRTs, but since we're talking about interlaced modes I'll describe how CRTs handle them, because that's what interlacing was built for.

The lines that are displayed alternate back and forth. In fast-moving games you get an 'interlacing' effect; I found this especially apparent in games like Sonic Adventure on the Dreamcast (you can see an 'interlaced' shadow where the previous frame used to be).

You could probably still notice this on your Wii games if you look carefully.

TECHNICALLY 576i shows more lines of picture because more lines are being generated overall. When you switch to 480p, though, rather than showing 240 lines alternating at the TV's refresh rate, you get the FULL frame on every single refresh. This uses slightly more processing power, but not much, because the games are generally rendering the scene at 480 or 576 lines anyway; progressive output just sends more of it to the display each refresh.

The same trade-off applies between 480i and 576i. I don't like anything in 576i because I'm very sensitive to frame rate differences and I can notice the drop, especially if a game isn't well optimised for PAL.

480i runs at roughly 30 full frames per second (60 fields, or 59.94 to be exact), while 576i runs at 25 (50 fields). That's the sacrifice for the extra lines: you are getting more lines but fewer frames per second. However, there is also PAL 60Hz, which is a 480-line, 60 Hz signal with PAL colour encoding, giving you the NTSC refresh rate on a PAL set.
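The field/frame arithmetic, if it helps (rounding NTSC's 59.94 Hz to 60):

    # An interlaced signal sends one half-height field per refresh,
    # so two refreshes are needed for one full frame.
    NTSC_FIELDS_PER_SEC = 60   # 480i
    PAL_FIELDS_PER_SEC  = 50   # 576i

    print(NTSC_FIELDS_PER_SEC / 2)  # 30.0 full frames per second
    print(PAL_FIELDS_PER_SEC / 2)   # 25.0 full frames per second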

This was a big problem back in the older console days (especially with the Mega Drive, because it typically had a lot of faster games, i.e. Sonic) because the games weren't optimised for the PAL region. You got the now very apparent PAL borders at the top and bottom of the screen (something I HATE so much), and the games slowed down: if a game was running at 30 fps, for example, you would lose 5 frames from every 30 it rendered, which is in essence a 17% drop in speed because the TV is just 'skipping' those frames. I can notice the speed difference in Sonic on the Mega Drive.
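That slowdown as a one-liner (the ratio is the same whether you count it per 30 or per 60):

    # Unoptimised NTSC game on a PAL display: 60 Hz of game logic
    # squeezed into 50 Hz of refreshes.
    ntsc_hz, pal_hz = 60, 50
    slowdown = (ntsc_hz - pal_hz) / ntsc_hz
    print(f"{slowdown:.0%}")  # 17%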


I hope that helps you understand those. 480p is the best option because you are getting all of the detail all of the time at the best frame rate, even though it's slightly lower resolution than 576i. It would be nice if the Wii did 576p, but it's not a popular format.


Then you move on to the technicalities of 720p, 1080i and 1080p.

1080p is the absolute best HDTV resolution currently available, and it wasn't generally included in the original introduction of the digital formats, which were 480p, 720p and 1080i. 1080p is the most recent, has become the most prominent, and is the 'true HD' you hear about these days.

The reason this resolution was introduced as a main output is the interlaced nature of 1080i: in most cases 720p was visibly more appealing than 1080i. Put what I said before into perspective and realistically you only have 540 lines shown at any one instant in 1080i, whereas 720p is 720 lines all the time. The screen generally refreshes 60 times every second, so each 1/60th of a second of 1080i is 540 lines. Over the course of a second, 720p provides more lines:

720p  =  720 x 60 = 43,200 lines per second
1080i =  540 x 60 = 32,400 lines per second
1080p = 1080 x 60 = 64,800 lines per second

HOWEVER, if the game only renders 30 fps, the 720p figure should really be 720 x 30, because for two of the 60 refreshes on a 60 Hz TV the same frame will be shown; it depends on the output. An Xbox 360 outputting a game at 60 fps through 720p will generally look much better and smoother than the same game displaying at 1080i.
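If you want to check the arithmetic, here it is in Python (the helper name is just for illustration):

    # Lines of picture delivered per second on a 60 Hz display.
    def lines_per_second(lines_per_refresh, refresh_hz=60):
        return lines_per_refresh * refresh_hz

    print(lines_per_second(720))   # 720p  -> 43200
    print(lines_per_second(540))   # 1080i -> 32400 (each field is half of 1080)
    print(lines_per_second(1080))  # 1080p -> 64800

    # A 30 fps game on a 60 Hz set shows each frame twice, so the
    # count of *unique* lines is halved: 720 x 30.
    print(720 * 30)                # 21600 unique lines per second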

Coming back to the original question about whether it uses much extra processing on the Wii: the answer is no for this generation, because the scene is generally rendered at the 480-line resolution anyway (or stretched to 576).

I'm not going to talk like I know how much extra processing power the jump from 720p to 1080p costs, because I don't know how the output differs between these consoles, but I believe taking that jump should cost something. It's like changing your PC game's resolution from 1280x720 to 1920x1080: you need a beefy video card to do it without taking an fps hit. Because I don't know exactly how these consoles output the higher resolutions, though, I don't want to speak about that without further research. I understand most 360 games are created for the 720 resolution, with more emphasis on 1080 since the Elite became available, and I think the PS3 works better with 1080 as well, but don't quote me on all of that because I'm not sure about their differences in HD output.
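The raw pixel counts behind that PC analogy (plain arithmetic, nothing console-specific):

    # Why the jump to 1080p is expensive: pixels rendered per frame.
    pixels_720p  = 1280 * 720    #   921,600 pixels
    pixels_1080p = 1920 * 1080   # 2,073,600 pixels
    print(pixels_1080p / pixels_720p)  # 2.25x the pixels to fill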