Short Answer.
720p resolution is still good for movies and games, and it should still be considered HD. The term HD was adopted to contrast older standard-definition (SD) formats such as 240p/480i/480p with higher-definition (HD) standards such as 720p and 1080i/1080p. Today, standards like 1080p FHD (Full HD) and 2160p (4K) UHD (Ultra HD) are gradually making 720p look like the new standard definition, but quality-wise, the real issue comes down to image perception...
Long Answer.
There are at least six factors that greatly affect image quality:
1 - Source Resolution (content such as games, movies, live TV, etc.)
2 - Native Display Resolution (monitor, TV panel, etc.)
3 - Compression
4 - Even Pixel Multiplier (x1, x2, x3, etc.)
5 - Pixel Density
6 - Relative Distance to Display Size
1 - Source Resolution: The content's resolution is the most important aspect. The higher the resolution, the more detail and data the content carries. But a higher resolution alone doesn't guarantee better image quality: a 1080p image will not look better on a 720p set, because down-scaling degrades image quality and motion.
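As a rough sketch of that data gap, here is the raw pixel-count arithmetic in Python (the 854-pixel width for 480p assumes a 16:9 frame):

```python
# Pixel-count arithmetic behind "higher resolution carries more data".
resolutions = {
    "480p": (854, 480),        # 16:9 assumption for the 480p width
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "2160p (4K)": (3840, 2160),
}

base = resolutions["720p"][0] * resolutions["720p"][1]
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:>11}: {px:>10,} pixels ({px / base:.2f}x the data of 720p)")
```

A 1080p frame carries 2.25x the pixels of a 720p one, and a 4K frame carries 9x, which is the "more data" the paragraph above refers to.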
2 - Native Display Resolution: All HDTVs have a fixed native resolution, the optimal resolution that produces a 1:1 pixel mapping when the content being fed in matches the resolution the set natively supports. So 720p content will look best on a 720p HDTV, and 1080p content will look better on a 1080p TV than on a 4K HDTV. Image degradation occurs when 720p (or lower) content is fed into a higher-resolution display, say a 1080p or 4K set, because the set has to fit and stretch the 720p image onto the 1080p or 4K grid it supports. In this scenario, the upscaling is responsible for the perceived image quality. It is very important to match content resolution and display resolution for the best viewing experience. Some HDTVs/monitors manage upscaling better than others, using different methods with variable results, so a lot of this depends on the quality of the TV's built-in scaler chip.
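Here is a minimal sketch of that stretching using Python's Pillow library (my choice of library, not something from the original post; "frame_720p.png" is a hypothetical file, and the resampling filters merely stand in for the kinds of methods a scaler chip might use):

```python
# Upscaling a 720p frame onto a 1080p panel's native grid with Pillow.
# Different resampling filters give visibly different results, much like
# cheap vs good TV scaler chips do.
from PIL import Image

src = Image.open("frame_720p.png")   # assumed 1280x720 source frame
native = (1920, 1080)                # the 1080p panel's native grid

blocky = src.resize(native, resample=Image.Resampling.NEAREST)   # crude
smooth = src.resize(native, resample=Image.Resampling.LANCZOS)   # better

blocky.save("upscaled_nearest.png")
smooth.save("upscaled_lanczos.png")
```

Comparing the two output files side by side makes the "variable quality" point above concrete: same source, same target resolution, very different results.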
3 - Compression: Self-explanatory. Usually a compressed 1080p image/video will look worse than an uncompressed 720p one. Online streaming works by compressing video information, and depending on the encoder and the options selected, that compression varies greatly in quality. With online streams, the higher the resolution the more detail is retained, but streaming compression is far behind offline (stored) compression. For example, in H.265 the CRF (Constant Rate Factor) determines the quality and size of the video file: a higher CRF yields a lower-quality but smaller file, while a lower CRF yields a higher-quality but larger file. For streaming, file size is very important (bandwidth limits), so a higher CRF is used, sacrificing picture quality, as opposed to offline video compression. A 720p movie stored in H.265 with the proper settings will outperform a 1080p YouTube stream any day.
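As a hedged illustration of that CRF trade-off, here is how an offline H.265 encode might be driven from Python using ffmpeg with libx265 ("input.mp4" and the specific CRF values are hypothetical examples, not settings from the original post):

```python
# Encoding the same source at two CRF values with ffmpeg + libx265.
# Requires an ffmpeg build with libx265 on the PATH.
import subprocess

def encode_h265(src: str, dst: str, crf: int) -> None:
    # Lower CRF = higher quality, larger file; higher CRF = smaller, worse.
    subprocess.run(
        ["ffmpeg", "-y", "-i", src,
         "-c:v", "libx265", "-crf", str(crf),
         "-preset", "medium", dst],
        check=True,
    )

encode_h265("input.mp4", "movie_hq.mp4", crf=18)      # near-transparent quality
encode_h265("input.mp4", "movie_small.mp4", crf=28)   # streaming-like size
```

Streaming services effectively live at the high-CRF end of this trade-off because of bandwidth limits, which is why a well-encoded stored 720p file can beat a 1080p stream.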
4 - Even Pixel Multiplier: Usually, if the upscaling is performed with a whole-number pixel multiplier (x1, x2, x3, etc.), the result is handled better: the pixels are evenly multiplied when stretched, retaining image quality. But when the multiplier is uneven (x1.5, x2.25, etc.), pixels are unevenly matched and deformed, and image degradation occurs, especially with moving images (pixelation and blurriness).
- 240p on a 1080p set: x4.5 multiplier
- 480p on a 1080p set: x2.25 multiplier
- 720p on a 1080p set: x1.5 multiplier
- 240p on a 2160p (4K) set: x9 multiplier
- 480p on a 2160p (4K) set: x4.5 multiplier
- 720p on a 2160p (4K) set: x3 multiplier
So technically, a 240p or 720p feed is handled better by 4K sets than by 1080p sets, since the multipliers come out as whole numbers. This is why 720p looks so bad on 1080p sets.
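A small sketch that reproduces the table above and flags which multipliers are whole numbers:

```python
# Vertical-resolution multiplier from source to display, and whether the
# upscale is a clean whole-number multiple.
def multiplier(src_h: int, display_h: int) -> float:
    return display_h / src_h

for src in (240, 480, 720):
    for disp in (1080, 2160):
        m = multiplier(src, disp)
        kind = "even" if m == int(m) else "uneven"
        print(f"{src}p on a {disp}p set: x{m:g} ({kind})")
```

Running it shows every 2160p entry is a whole number while 1080p gets x4.5, x2.25, and x1.5, matching the point above.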
5 - Pixel Density: Usually, a higher pixel density results in better perceived image quality. A 40-inch 4K TV will look sharper than a 65-inch 4K TV at the same distance, because the same number of pixels is packed into a smaller area.
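Pixel density (PPI) is just the diagonal pixel count divided by the diagonal size in inches. A quick sketch of the numbers behind that 40-inch vs 65-inch comparison:

```python
# PPI = sqrt(width_px^2 + height_px^2) / diagonal_inches
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(f'40" 4K TV: {ppi(3840, 2160, 40):.0f} PPI')   # ~110 PPI
print(f'65" 4K TV: {ppi(3840, 2160, 65):.0f} PPI')   # ~68 PPI
```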
6 - Relative Distance to Display Size: The distance to the TV plays a very important role in perceived image quality. Low-resolution sets are better viewed from farther away, and as resolution increases, you can sit proportionally closer before individual pixels become visible.
A 50-inch 720p HDTV viewed at 12 feet will show roughly the same perceived image quality as a 50-inch 2160p (4K) set viewed at 6 feet.
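A rough way to estimate such distances, assuming the common ~1 arcminute visual-acuity rule of thumb (actual thresholds vary with eyesight and content, so the exact feet values here are illustrative, not from the original post):

```python
# Distance beyond which individual pixels stop being resolvable, assuming
# the eye resolves detail down to about 1 arcminute.
import math

ONE_ARCMIN = math.radians(1 / 60)

def blend_distance_ft(width_px: int, height_px: int, diagonal_in: float) -> float:
    pixel_pitch_in = diagonal_in / math.hypot(width_px, height_px)
    return pixel_pitch_in / math.tan(ONE_ARCMIN) / 12   # inches -> feet

print(f'50" 720p: pixels blend past ~{blend_distance_ft(1280, 720, 50):.1f} ft')
print(f'50" 4K:   pixels blend past ~{blend_distance_ft(3840, 2160, 50):.1f} ft')
```

The absolute numbers depend on the acuity assumed, but the ratio is fixed: the 720p set needs roughly three times the viewing distance of the 4K set before its pixels blend.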
Putting the terms into context:
High Definition (HD): any resolution higher than the 480p/576p standard definitions.
Full High Definition (FHD): a resolution standard of 1920x1080 (1080p, often loosely called 2K).
Ultra High Definition (UHD): a resolution standard of 3840x2160 (2160p or 4K).
So... after all, it is true that a 720p image contains less data and detail than a 1080p or 4K one. But that alone doesn't matter if you ignore the basics behind it. Its High Definition label is still valid today.