Vetteman94 said:
Xelloss said:
Vetteman94 said:
Xelloss said:
vlad321 said:
gameover said:
Full HD means 1080p, i.e. a widescreen 16:9 display.
1920x1200 is not Full HD.. it can display 1080p content, but it doesn't meet the 16:9 widescreen standard.. it's 16:10
I would avoid a screen like that for gaming (although I have one, I would not buy it again) because I don't like having borders on top and bottom while playing a 1080p game... stick with 1920x1080 displays
btw my LG monitor has an HD Ready sticker even though it's 1920x1200.. even if a monitor has a higher resolution it can't be called Full HD..
|
I have never had a game play with black bars on the monitor when it runs on 1920x1200, to be honest. I've been looking for these mostly because after I got used to a 1920x1200 the 1080s look kind of funny and small vertical-wise. Also, playing even WoW on a 1200 over a 1080 shows some differences (the spell icons look a bit more hand-drawn).
|
Are you sure you are referring to different monitors, not just different res settings on the same monitor?
LCD monitors have a native resolution and aspect ratio.. if you force a deviation from this in your settings, your results will always suck.
I.e. you can run 1920x1080 on a 1920x1200 monitor, but it's going to look like shit. There are plenty of examples, but always stick to native res for best results. Even worse than simply being in the wrong res is being in the wrong aspect ratio. Running 16:10 on a 16:9 and vice versa will always look like shit. Some game engines do not run natively on both aspect ratios though. Not sure about WoW, but I'm thinking you either were trying different resolutions on the same monitor, or have something set up incorrectly.
I've run a 16:9 and a 16:10 side by side quite a bit, and there really is no difference in quality. An extra 120 pixels really doesn't matter for overall quality; it only matters if you are trying to display something that thinks it has those 120 pixels.
|
You mean an extra 120 lines with 1920 pixels in each one, which is 230,400 more pixels. And it is a significant difference if the source is 1920x1200. If the source were only 1920x1080, it wouldn't look any different on either monitor.
|
Incorrect. If the source does not match the monitor's aspect ratio it has to be scaled, and it does not look right. The difference in pixel count sounds impressive on paper, but it is really not much of a difference in practice, not enough to be noticeable in the slightest. Scaling 16:10 to 16:9 looks like crap to the discerning viewer, and vice versa, whereas the difference in pixel count is more or less all in your head.
|
That's not true. In one case, if you were to view a 1920x1080 source on a 1920x1200 TV, it can easily be set up to just not use 60 lines at the top and bottom. The opposite shouldn't even happen; there would be no reason to set the source to 1920x1200 if you only have a monitor that supports 1920x1080. There would be no scaling needed in either case.
|
Sorry, but it just is.
The thing you're missing here is that it's not as simple as "adding or dropping" a few pixels. It's a conversion between 16:9 and 16:10, and this conversion happens quite often, depending on what you are running and how you have it set up. Some game engines can only natively output in one or the other, and the scaling is handled by your video drivers; other engines can natively output in both. When you watch video, plenty of noobs go fullscreen with a 16:9 encode when their aspect ratio is 16:10, and I lawl@them when I see it, because it doesn't look right.
The only thing you are correct about is that there are probably very few instances where something is hard-coded at x1200 and needs scaling to x1080... but there are things that are 16:10 that need scaling, though native output will generally be available for all 16:10 resolutions (1440x900, etc.).
Ofc there are myriad ways to configure it, and with a little effort you can get a good picture out of either. But I have seen enough people manage to screw it up that it's worth noting. Mostly in video; if a game looks off, it's usually the developer's fault.
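For anyone who wants to check the numbers being thrown around, here's a quick back-of-the-envelope sketch in Python (just the arithmetic from this thread, not tied to any particular game or driver):

```python
# Rough sanity check on the pixel counts and aspect ratios discussed above.
w = 1920
h_16_9, h_16_10 = 1080, 1200

extra = w * (h_16_10 - h_16_9)        # 120 extra lines of 1920 pixels each
print(extra)                          # 230400
print(100 * extra / (w * h_16_9))     # ~11.1% more pixels than 1920x1080

print(w / h_16_9)                     # 1.777... -> 16:9
print(w / h_16_10)                    # 1.6      -> 16:10
```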
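And to illustrate why the aspect ratio mismatch is the part that actually bites, here's a rough sketch of the fit calculation a scaler or driver ends up doing. The fit() helper is made up for illustration; it's not from any real driver API:

```python
def fit(src_w, src_h, panel_w, panel_h):
    """Scale a source into a panel while preserving its aspect ratio
    (letterbox/pillarbox) and report how much of the panel goes unused."""
    scale = min(panel_w / src_w, panel_h / src_h)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    return out_w, out_h, panel_w - out_w, panel_h - out_h

# 16:9 source on a 16:10 panel: the image itself needs no resampling at
# 1920 wide, just 120 unused lines (60 top, 60 bottom).
print(fit(1920, 1080, 1920, 1200))   # (1920, 1080, 0, 120)

# 16:10 source on a 16:9 panel: now the image has to shrink to 1728x1080
# and leave 192 unused columns -- or be stretched/cropped, which is the
# "looks like shit" case.
print(fit(1920, 1200, 1920, 1080))   # (1728, 1080, 192, 0)
```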