
Forums - Sony - IGN Reviews PS3 upscaling games

The PS3 upscaling looks decent, and it does appear that they are rendering in a higher resolution (but it looks more like 640x480 vs 320x240). I can hardly tell a difference in the PS2 games.




Which HDTVs upscale? Just one person provide a source, please! My $2400 Sony SXRD (2007 model) doesn't, neither does a friend's 2007 Sharp Aquos, or another's Sony XBR3, or lastly a buddy's Vizio. ALL of our TVs are 2007 models with 1080p native displays. I'm not trying to be an ass; maybe these TVs aren't "high tech" enough? I should really tell my friend with the $6000 XBR3 to take it back.



KruzeS said:
ChichiriMuyo said:
The upscaled versions actually do look better.

So? You're one of the folks that "will like the cinematic look of games when upscaled and smoothed", or did you miss that part? Others may "prefer 480p when available or perhaps even upscaled but not smoothed" (as it seems they at IGN do), or do you find that so hard to believe? And I'm guessing you also missed that they never said no upscaling was better, and that they just don't like the smoothing filter on some/most games (and always prefer 480p).

Well, I know I for one like the 480p shots better, especially if the HDTV upscales them nicely. There's just no sense in losing detail before upscaling - we don't halve a photo's height before upscaling it in Photoshop, now do we?

This is still great for people with HDTVs with crappy scalers, as the PS3 does seem to do an above-average job at scaling. But really, really, really good would be re-rendering with upscaled textures.
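The "don't throw away lines before upscaling" point can be sketched with a toy nearest-neighbour scaler in Python (purely illustrative; no real TV or console scaler works this crudely):

```python
import numpy as np

def nearest_upscale(img, out_h, out_w):
    """Nearest-neighbour resize: each output pixel copies the closest source pixel."""
    h, w = img.shape
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    return img[rows][:, cols]

# A 480x640 "frame" with fine one-pixel detail: bright and dark rows alternate.
src = np.zeros((480, 640), dtype=np.uint8)
src[::2, :] = 255

# Upscale straight to 1080 lines: the alternating detail survives (unevenly).
direct = nearest_upscale(src, 1080, 1920)

# Halve the height first (dropping every other row), then upscale:
# the alternating detail is gone for good.
halved = src[::2, :]                    # 240x640, all bright rows
via_half = nearest_upscale(halved, 1080, 1920)

print(direct.min(), direct.max())       # both dark and bright rows remain
print(via_half.min(), via_half.max())   # only bright rows survive
```

Once the lines are discarded, no amount of upscaling gets them back, which is the whole argument for feeding the TV 480p and letting it scale.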


I have the same experience.  I already have a very nice HDTV.  No reason for me to have a PS3 to upscale my PS2 games.



vizunary said:
Which HDTVs upscale? Just one person provide a source, please! My $2400 Sony SXRD (2007 model) doesn't, neither does a friend's 2007 Sharp Aquos, or another's Sony XBR3, or lastly a buddy's Vizio. ALL of our TVs are 2007 models with 1080p native displays. I'm not trying to be an ass; maybe these TVs aren't "high tech" enough? I should really tell my friend with the $6000 XBR3 to take it back.

I really don't know why this is hard to understand, and I explained it in this thread to boot!

Your TV has a native resolution of 1920x1080, and since it's an LCD, that's the resolution it will always display. If you have a lower-res source, like 640x480 (from a PS2, for example), it has to upscale that source to match the resolution of the TV. Some TVs do a great job, others don't, but Sony's sets are pretty good.

Hey, whaddaya know, first hit for "hdtv upscale" on Google: http://answers.yahoo.com/question/index?qid=20070323084447AA7UJcy&show=7

Believe it or not, you can also search Google. And it's easy!



Leo-j said: If a dvd for a pc game holds what? Crysis at 3000p or something, why in the world cant a blu-ray disc do the same?

ssj12 said: Player specific decoders are nothing more than specialized GPUs. Gran Turismo is the trust driving simulator of them all. 

"Why do they call it the xbox 360? Because when you see it, you'll turn 360 degrees and walk away" 

shams said:

Here is a question for you all...

Is interlaced/progressive something handled by the video card when generating the output, or something internal to the machine?

i.e. if a game is running at 480i, is the internal frame buffer 240 pixels high, or 480 pixels high? (And is it the video card that samples every second line and builds the appropriate analog signal going out to the display device?)

Might also be interesting to discuss NTSC vs. PAL. I actually run my Wii in non-progressive PAL mode (576i), as I think it looks a lot better (in some apps) than 480p does. Sort of less detail, but a higher resolution. 576p would be the best though :)

(Definitely use component cables though; the difference in image colour/quality/stability is nothing short of amazing.)


480i means that the screen displays 240 lines for one field, and then the other 240 lines for the next field.

For what your eye can see this does not make a big difference to the apparent level of visual detail... especially when you are moving from 480i to 576i.

The way this all looks depends on the quality of your television... LCDs and plasmas don't work the same as CRTs, but we're talking about the interlaced versions at the moment, so I will talk about how CRTs handle interlacing, because that's what it was built for.

The lines that are displayed alternate back and forth. In fast-moving games you will get an 'interlaced' effect. I found this especially apparent in games like Sonic Adventure on Dreamcast (you can see an 'interlaced' shadow where the previous frame used to be).

You could probably still notice this on your Wii games if you look carefully.

TECHNICALLY 576i shows more lines of picture because more lines are being generated overall. However, when you switch to 480p, rather than showing 240 lines alternating at the TV's refresh rate, you get the FULL frame for each and every frame. This does use slightly more processing power, but not heaps, because the games are generally rendering the scene at 480 or 576 lines anyway; progressive output just shows more of it.
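The alternating-fields idea above can be illustrated with a couple of numpy slices (a toy model, not how any console's video hardware is actually wired):

```python
import numpy as np

# Toy 480-line frame: every pixel in a row holds that row's line number.
frame = np.arange(480).reshape(480, 1) * np.ones((1, 640), dtype=int)

# 480i: each refresh carries only every second line (one "field").
even_field = frame[0::2]   # lines 0, 2, 4, ... -> 240 lines per refresh
odd_field  = frame[1::2]   # lines 1, 3, 5, ... -> 240 lines per refresh

print(even_field.shape, odd_field.shape)   # (240, 640) (240, 640)
# 480p would send the full 480-line frame on every refresh instead.
print(frame.shape)                         # (480, 640)
```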

The same goes for 480i and 576i. I don't like anything in 576i because I'm very sensitive to frame rate differences and I can notice the drop, especially if a game isn't optimised for PAL very well.

480i runs at 29-30 fps (or so) while 576i runs at 25 fps, which is the sacrifice for the extra lines... so you are getting more lines but fewer frames per second. However, there is PAL 60Hz, which runs the interlaced signal at 30fps (60Hz is the field rate of the interlaced lines, twice the frame rate).

This was a big problem in the older console days (especially with the Mega Drive, because it typically had a lot of faster games, i.e. Sonic) because the games weren't optimised for the PAL scene. You got the (very apparent these days) PAL borders at the top and bottom of the screen (something I HATE so much), and the games slowed down: if a game was running at 30fps, for example, you would lose 5 frames from every 30 the game rendered, which is in essence a 17% drop in speed, because the TV is just 'skipping' those frames. I can notice the speed variance in Sonic on Mega Drive.
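That 17% figure is just the frame-rate arithmetic spelled out (assuming a 30fps NTSC game shown unmodified on a 25fps PAL display):

```python
# A 30fps NTSC game shown on a 50Hz PAL display (25 interlaced frames/sec)
# without re-optimisation simply loses the frames the TV can't show.
ntsc_fps = 30
pal_fps = 25

frames_lost_per_second = ntsc_fps - pal_fps    # 5
slowdown = frames_lost_per_second / ntsc_fps   # 5/30
print(f"{slowdown:.0%} slower")                # 17% slower
```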


I hope that helps you understand it? 480p is the best because you are getting all of the detail all of the time at the best frame rate, although it's slightly lower resolution. It would be nice if the Wii did 576p, however it's not a popular format.


Then you move onto the technicalities of 720p, 1080i and 1080p.

1080p is the absolute best HDTV resolution currently available, and it wasn't generally included in the original introduction of digital formats... that was 480p, 720p and 1080i. 1080p is the most recent and has become the most prominent: the 'true HD' you hear about these days.

The reason for introducing this resolution as a main output was the interlaced nature of 1080i: in most cases 720p was visibly more appealing than 1080i (put what I said before into perspective and realistically you only have 540 lines shown at any one time in 1080i, whereas 720p is 720 lines all the time). To the naked eye the screen is refreshing generally 60 times every second, so each 1/60th of a second of 1080i is 540 lines. Over the course of a second, 720p will provide more lines:

720p = 720 x 60 = 43,200 lines
1080i = 540 x 60 = 32,400 lines
1080p = 1080 x 60 = 64,800 lines
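The arithmetic checks out; here's a trivial sanity check of those lines-per-second numbers (assuming 60 refreshes per second throughout):

```python
# Lines delivered per second for each mode, at 60 refreshes/sec.
# Interlaced 1080i only carries half its lines (540) per refresh.
refreshes_per_sec = 60
lines_per_refresh = {"720p": 720, "1080i": 1080 // 2, "1080p": 1080}

lines_per_sec = {mode: n * refreshes_per_sec for mode, n in lines_per_refresh.items()}
for mode, total in lines_per_sec.items():
    print(mode, total)   # 720p 43200 / 1080i 32400 / 1080p 64800
```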

HOWEVER, it should really be 720 x 30 when the source is only 30fps, because for two of the 60 refreshes on a 60Hz TV the same frame will be shown; it depends on the output. An Xbox 360 outputting a game at 60fps through 720p will generally look much better and smoother than a game displaying at 1080i.

In the end, on the original question of whether it uses much extra processing on the Wii: the answer is no for this generation, because the scene is generally rendered at the 480-line (or stretched to 576) resolution anyway.

I'm not going to talk like I know how much extra processing power the jump from 720p to 1080p costs, because I don't know how the output differs for these consoles, but I believe that yes, that jump should cost something (it's like changing your PC game's resolution from 1280x720 to 1920x1080: you need a beefy video card to do so without taking any fps performance hit). Because I don't know how these consoles output the higher resolutions, I don't want to speak about that without further research. I understand most 360 games are created for the 720 resolution, with some more emphasis on 1080 since the Elite became available, and the PS3 I think works better with 1080 as well, but don't quote me on all of that because I'm not sure about their differences in HD outputs.




@sieanr I still don't think it's the same. Whenever I input a 480 source my TV says 480p/i; same goes for 720p and 1080p/i. Maybe I just need to do some (more) homework on it.



sieanr, I think you are confusing some things...

Not all input sources are upscaled... upscaling is an algorithm that scales the image to correctly match the native output resolution of that particular TV, and generally it's the best way to make things look good...

However, it's still possible for the input source to be a different resolution than the native resolution of the display and for that input to just be stretched to the right size... there is a difference between upscaling the source and stretching it... most of them just stretch.



vizunary said:
@sieanr I still don't think it's the same. Whenever I input a 480 source my TV says 480p/i; same goes for 720p and 1080p/i. Maybe I just need to do some (more) homework on it.

The source is 480p/i, but if it was displayed at that resolution it would be very small on the screen.

@OriGin - TVs use interpolation to upconvert/downconvert to native res; that's the same way most upscaling DVD players work, although some use more advanced algorithms to preserve detail. Some TVs do use straight interpolation to native res, but many more expensive sets have special image processors that handle scaling quite well (see Sony's Bravia engine).
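For a concrete picture of what "interpolation" means here, this is roughly the idea in one dimension (a plain linear interpolation sketch; real scaler chips like the Bravia engine are far more elaborate):

```python
import numpy as np

def linear_upscale_1d(row, out_len):
    """Linearly interpolate a 1-D row of pixel values up to out_len samples."""
    src_positions = np.arange(len(row))
    dst_positions = np.linspace(0, len(row) - 1, out_len)
    return np.interp(dst_positions, src_positions, row)

row = np.array([0.0, 100.0, 0.0])   # 3 source pixels
print(linear_upscale_1d(row, 5))    # interpolated: 0, 50, 100, 50, 0
```

Instead of duplicating pixels, the new samples land between the old ones and take blended values, which is why interpolation can hit any target resolution rather than just integer multiples.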




Yeah... I think that's what's confusing viz though.

Sometimes I don't know why resolution is such a hard thing for people to grasp!? (Just talking in general.)

I don't mean interpolation... that is still a form of upscaling... I mean stretching... as in a 4x4 block of screen pixels = 1 pixel of the stretched low-resolution source.

Some TVs have very bad upscaling (good example: Sharp) and look MUCH better when the image is just stretched.

Also, most TVs have this upscaling ability, however it produces a minute lag for gameplay and reaction times... the best way to test this, I find, is Mario for SNES (this test works best for me): you can feel Mario jump a split second after you press the button when you use the upscaling... however when you use the direct stretched source the same issue isn't present.

I had some bad experiences with upscaling on a Panasonic being slow, and I couldn't figure out how to disable it... a pity, because I like the picture of most Panasonics.



OriGin said: I don't mean interpolation... that is still a form of upscaling... I mean stretching... as in a 4x4 block of screen pixels = 1 pixel of the stretched low-resolution source.

Now I see where you're getting caught up.

The problem is you can't do stretching with HD. If you wanted to do your method, you would be limited to integer multiples of 480 and 720 on a 1080p set. So that would mean a 720x480 image "stretched" to 1440x960 vs the panel's 1920x1080 - you would have big black borders around your image. Virtually all fixed-pixel TVs do not use pixel doubling or nearest-neighbour scaling; they all use a form of software interpolation because it's easy and gives the best results.
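The integer-multiple limitation is easy to see with a bit of arithmetic (assuming a 720x480 source on a 1920x1080 panel, as above):

```python
# Pixel doubling / nearest-neighbour "stretching" can only grow an image by
# whole-number factors, so a 720x480 source can't fill a 1920x1080 panel.
src_w, src_h = 720, 480
panel_w, panel_h = 1920, 1080

factor = min(panel_w // src_w, panel_h // src_h)   # biggest integer zoom that fits: 2
out_w, out_h = src_w * factor, src_h * factor      # 1440 x 960

border_x = (panel_w - out_w) // 2                  # black bar on each side
border_y = (panel_h - out_h) // 2                  # black bar top and bottom
print(out_w, out_h, border_x, border_y)            # 1440 960 240 60
```

A 3x zoom (2160x1440) would overflow the panel, so 2x is the ceiling, leaving 240px pillars and 60px letterbox bars; interpolation sidesteps all of this by hitting the native resolution exactly.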

Sharps do more than just interpolation for SD images; that's why they are more expensive (most cheap LCDs use the same panels as higher-end models), and that's why they look better: they do things to preserve edge detail, etc.

There is no "direct stretching", as the TV is still using interpolation; however, you are likely just turning off features that enhance the scaled image, or other image enhancements. My Samsung has a similar mode that turns off DNIe and other enhancements.


