solidpumar said:
OMG... this thread is noob town.

The "Full HD" name is only a standard that the media studios and electronics makers came up with. The SD standard was outdated, so the electronics makers decided to build higher-resolution TVs, choosing 16:9 because that was already the movie standard. 1080 lines were chosen because that would give the same detail as film.

1920x1200 was created only for monitors, because monitors use the 16:10 standard. That standard exists because, unlike a TV, a monitor needs room for the tab bar/options bar/explorer/user interface, etc. So those extra 120 lines were for the UI, so it wouldn't cut into the 16:9 view.

There is no max resolution... 1080p is the standard now simply because there's no need for more yet.

There are already lots of monitors, in 30'' sizes, with 2560 x 1600. In fact, 2560 x 1600 is becoming the new rich-PC-enthusiast gamer standard, slowly going mainstream among core PC gamers. The triple-SLI GTX 200-series owners out there sure love maxing Crysis at that resolution.

Both Vlad and I know why they call it Full HD. We were commenting on how ambiguous the term "HD" is and how it was created in the first place just to make products sound good.

16:9 is not a movie standard. Sometimes movies will have black bars on the top and bottom even on a widescreen TV. That's because the movie was shot in an aspect ratio other than 16:9. If it were a standard, all movies would use the same ratio.

Not all monitors have a 16:10 aspect ratio. In fact, The Witcher was made for 16:9 monitors.
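If anyone wants to check the ratios themselves instead of arguing, it's just dividing out the common factor. A quick sketch (the function name is mine):

```python
from math import gcd

def aspect_ratio(width, height):
    # Reduce width:height by their greatest common divisor.
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

print(aspect_ratio(1920, 1080))  # 16:9
print(aspect_ratio(1920, 1200))  # 8:5, i.e. 16:10
print(aspect_ratio(2560, 1600))  # 8:5, i.e. 16:10
```

So 1920x1200 and 2560x1600 are both 16:10 (8:5 reduced), while 1080p is 16:9 — the "extra 120 lines" claim above checks out arithmetically, whatever the real reason for 16:10 was.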

There are benefits to a resolution higher than 1080p if you have a big enough TV. It's just that nobody makes movies/games above 1080p, since nobody makes TVs above 1080p.

I would get your facts straight before calling anyone else noob.