So I've been reading the forum for the last couple of days, and some of the comments I see have been driving me nuts. Some people either need to get their eyes fixed, or they are just in denial. Regardless, I will try to explain some things from a technical perspective.
1080p has been the standard for TV manufacturers for many, many years now. The vast majority of TVs sold in retail display natively at 1920x1080. As a matter of fact, the standard has become so widely adopted that even PC monitor makers chose to implement it in their products due to economies of scale. 1080p is by no means new, nor is it expensive. A 40" TV can be had for 400 dollars or less, and some 1080p PC monitors go for as little as 100-150 bucks.
Now someone might argue: but 720p is HD too, so what's the big deal?
Well, here is the deal. If you have a 1080p monitor at home, try lowering the resolution to 1280x720. You will notice that the image on screen is very blurry. Downscaling or upscaling by a non-integer factor produces a blurry image. But let's explain that a little better. If I have an image of 1280x720 and I want to upscale it to 1920x1080, I need to increase the width and height of my image by a factor of 1.5. So every single source pixel has to be stretched across 1.5 output pixels, and that is not achievable without a loss in image quality. You can only upscale without quality loss if you increase your output by an integer factor. For example, 4K resolution is 3840x2160, exactly 2x the width and height of 1080p. Manufacturers opted for this new standard partly because current 1080p content can be viewed on it without image quality loss.
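
If you want to actually see why 1.5x falls apart, here is a quick Python sketch (plain nearest-neighbor scaling, just to illustrate the idea; real upscalers use fancier filters, and this little coverage function is just something made up for the example). It counts how many output pixels each source pixel has to cover:

def coverage(src_width, dst_width):
    # Nearest-neighbor scaling: each output pixel samples exactly one
    # source pixel, so count how many output pixels land on each source pixel.
    counts = [0] * src_width
    for x_out in range(dst_width):
        x_src = int(x_out * src_width / dst_width)
        counts[x_src] += 1
    return counts

# 720p -> 1080p is a 1.5x stretch: source pixels alternate between
# covering 1 and 2 output pixels.
print(coverage(8, 12))   # [2, 1, 2, 1, 2, 1, 2, 1]

# 1080p -> 2160p (4K) is an exact 2x stretch: every source pixel covers
# a clean block of 2 output pixels.
print(coverage(8, 16))   # [2, 2, 2, 2, 2, 2, 2, 2]

With 1.5x you get that uneven 2-1-2-1 pattern, so a real scaler has to blend neighboring pixels together to hide it, and that blending is exactly the blur you see. With a clean 2x (or 3x, 4x) every source pixel maps to a whole block of output pixels and nothing needs to be blended.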
That is why developers have been stressing the importance of rendering at native 1080p. The majority of monitors out there output at this resolution, and it will be a filthy mess if you try to render at a lower resolution and then upscale. And don't give me that "only gameplay matters" crap. If that's the case, then go play your NES.