
Forums - Gaming Discussion - 720p vs 1080p, Does Resolution matter? - A Utilitarian Perspective

 

First of all, this topic is not necessarily about the consoles, but a broader one that can be applied in several areas.

The main commonly accepted principles about resolution:

a) Resolution ALWAYS matters UNTIL it hits a technical or optical limit (such as the monitor's capacity or the human eye's limit).

b) Humans perceive resolutions not as absolute changes but as relative changes. For example, the jump from 640*480 to 1280*960 is a lot larger in absolute terms than the one from 320*240 to 640*480, but in relative terms they are equal (a 4x jump).

c) Utility received from resolution increases at a diminishing rate (diminishing returns). In other words, if the relative increases between 3 resolutions are equal, the first jump will be appreciated more (an increase from 320*240 to 640*480 is more recognizable than one from 640*480 to 1280*960).

d) Humans are adaptive by nature. This implies that the marginal utility is initially low but increases after a while (a shift up in the diminishing-returns curve). A simple explanation: going from 640*480 to 1280*960 is nice but not necessarily "revolutionary" at first, yet after getting used to the higher resolution, it becomes unthinkable to go back to 640*480, which implies that the utility you get from the higher resolution improved over time.

--------

Now a few mathematical notes... When we talk about resolution, we usually refer only to the number of lines along one direction. Assuming a constant aspect ratio, an "n" (fractional) increase in resolution along one direction automatically implies a "((1+n)^2 - 1)" increase in total resolution. For example, a 50% (0.5) increase along one direction implies a 125% increase in total pixel count (1.5^2 - 1). This is the technical increase. However, the law of diminishing returns dictates that the human mind perceives much less of an increase (or appreciates it less than the math suggests).

A common way of expressing these diminishing returns is to use a power less than 1, such as x^n, where 0 < n < 1.

640*480 to 1280*960 implies a 100% change in both directions and a 2x2 - 1 = 300% change in total.

Assuming utility is given by x^n (where n = 0.5):

Total Resolution => 1280*960 = 640*2 * 480*2 = 640*480*4 (4x the original)
New Utility => original utility * 2^0.5 * 2^0.5 = 2x the original utility (a 100% increase)

So basically, with n = 0.5 (the rate of returns), the perceived increase will match the increase along one direction.
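The model above is easy to sketch in a few lines of Python. This is just an illustration of the x^n utility assumption; the function name is my own and not from any standard library.

```python
def perceived_gain(old_w, old_h, new_w, new_h, n=0.5):
    """Return (technical % increase in pixels, perceived % increase)
    under the diminishing-returns model utility = (total pixels)^n."""
    ratio = (new_w * new_h) / (old_w * old_h)
    technical = (ratio - 1) * 100
    perceived = (ratio ** n - 1) * 100
    return technical, perceived

# The 640*480 -> 1280*960 example: 4x the pixels, but only 2x the utility.
print(perceived_gain(640, 480, 1280, 960))  # (300.0, 100.0)
```

With n = 0.5 the perceived gain collapses exactly to the one-direction ratio, which is why the OP's shortcut of "just compare the vertical lines" works.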

-----

Now let's come back to modern times. The difference between two resolutions such as 1280*720 and 1920*1080 is technically staggering (and just as demanding); however, the perception will be lower... Simply put, it will be roughly equal to the change along one direction, which is roughly 50% (1080/720 - 1), although the actual pixel count increases by 125% (1920*1080 / (1280*720) - 1). That is a BIG difference even with diminishing returns in place.

How about the difference between 900p and 1080p? Technically the difference is 44% ((1080/900)^2 - 1), although the perception will be around 20% (1080/900 - 1). Even 20% is quite large and hard to miss; humans can usually recognize differences above 10%. The difference between 720 and 640 (12.5% along one direction), for example, is a lot harder to spot.
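The two comparisons above can be checked with the same x^0.5 utility assumption from the earlier section (a sketch, not an exact model of perception):

```python
# Technical vs perceived increase for 720p -> 1080p and 900p -> 1080p,
# assuming perception scales with the square root of total pixels.
for (ow, oh), (nw, nh) in [((1280, 720), (1920, 1080)),
                           ((1600, 900), (1920, 1080))]:
    ratio = (nw * nh) / (ow * oh)
    print(f"{oh}p -> {nh}p: technical +{(ratio - 1) * 100:.0f}%, "
          f"perceived +{(ratio ** 0.5 - 1) * 100:.0f}%")
```

This reproduces the 125%/50% and 44%/20% figures in the text.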

---------

There are 3 more points I'd like to add though.

a) First of all, those suggested differences are usually the IDEAL numbers, and your perception may be lower unless your TV set can properly display the differences. One extreme example is a 720p TV set, for which any resolution over 720p will be useless (and sometimes worse). Also, not all TVs are created equal: it will be much easier to recognize such differences on larger and/or higher-quality TV sets.

b) The human mind and eyesight are big factors. Just like TVs, not all humans are created equal. Some can see better than others, not only physically but also perceptually.

c) The recognition and utility you'll get from higher resolution increases over time. As I said, the human mind is highly adaptive and is trained to recognize the differences after a while, and once it gets used to them, there is no turning back. Those ultra-clear resolutions may not be a big deal for you now, but in a few years, a brain adapted to 1080p will instantly recognize the loss of clarity in 720p and will never enjoy it as much.





People who are not technical but pay attention will notice the loss of clarity when going from 1080p to 720p; it's like getting prescription glasses when your eyesight is not amazing... However, you will never see it as evidently as when games went from VGA (320*200) to SVGA (640*480). That step was HUGE!

What we need to reach before resolution becomes a non-issue is the densities we have on phones and tablets now (around 300 PPI for computer screens would be amazing, and I guess 150 to 200 PPI for TVs would settle it). I mean, when we are on 4K it will be like paper; when an image is upscaled a bit, I don't think it will be too smudged...
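For context, pixel density is just the diagonal pixel count divided by the diagonal size in inches. A quick sketch (the 50" size is my own example, not from the post):

```python
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    """Pixel density: diagonal resolution in pixels / diagonal size in inches."""
    return hypot(width_px, height_px) / diagonal_in

# For a hypothetical 50" TV:
print(round(ppi(1920, 1080, 50)))  # 1080p -> ~44 PPI
print(round(ppi(3840, 2160, 50)))  # 4K    -> ~88 PPI
```

So even 4K on a typical living-room TV sits well below the 150-200 PPI range mentioned above; the "like paper" effect depends on viewing distance as much as resolution.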

Right now very few people have 4K TVs, and neither home console can output games at this resolution, at least not demanding games! So the discussion of 4K is pretty moot :-/

I think expecting 1080p in 2014 is just reasonable, as is expecting close to 60fps (I mean, we have Tomb Raider on the PS4; I would be able to live with games that only look like that but are silky smooth...)

I think spatial resolution is as important as temporal resolution.



For myself, I can't tell the difference between 1080p and 720p. Maybe I'm just too far away from the screen, or the screen is too small, but that's just how I am.

However, I can easily notice the difference between low-quality 720p and high-quality 720p. A big black square will look the same whether it is 720p or 1080p, without you being able to tell the difference, so you have to focus on the textures being presented. A good-quality 720p image will look nicer than a poor-quality 1080p image.



I think the difference between 1080p and 720p is incredibly easy to see.



I think the difference between 900p upscaled to 1080p and native 1080p is not important at all.
The only reason some people are counting pixels is that you CAN count them.

"Better graphics" always was and is a matter of taste. There are things like lighting effects, artstyle, animations, facial expressions, etc., and IMO all of these things are way more important than resolution. But all of these things are based on personal preferences. Resolution is not. That is why people use it today.

Resolution meant shit last gen. If you asked for the best-looking game last gen, very few would mention Wipeout just because it runs at 1080p.

I never get tired of mentioning that MY personal best-looking game of last gen is Fable II, because the artstyle and the atmosphere just felt special and it really was a beauty.

Maybe I am crazy, but I still think that The Last of Us, Halo 4, Gears of War 3 and Uncharted are great-looking games, and not one of them was able to get past 720p at all. I still think X for Wii U looks really, really good, but I am not into JRPGs.

Gameplay > visuals, and for visuals, resolution is the least important aspect.




DirtyP2002 said:
I think the difference between 900p upscaled to 1080p and native 1080p is not important at all.
The only reason some people are counting pixels is that you CAN count them.

"Better graphics" always was and is a matter of taste. There are things like lighting effects, artstyle, animations, facial expressions, etc., and IMO all of these things are way more important than resolution. But all of these things are based on personal preferences. Resolution is not. That is why people use it today.

Resolution meant shit last gen. If you asked for the best-looking game last gen, very few would mention Wipeout just because it runs at 1080p.

I never get tired of mentioning that MY personal best-looking game of last gen is Fable II, because the artstyle and the atmosphere just felt special and it really was a beauty.

Maybe I am crazy, but I still think that The Last of Us, Halo 4, Gears of War 3 and Uncharted are great-looking games, and not one of them was able to get past 720p at all. I still think X for Wii U looks really, really good, but I am not into JRPGs.

Gameplay > visuals, and for visuals, resolution is the least important aspect.

If you go with the argument that resolution is the least important aspect of visuals, then why are we not still at 640x480?

Native resolution is one of the most important factors when it comes to IQ. It's the first thing PC gamers try to hit to get the best visuals: native res. 1080p, 4K, 720p, sub-720p, it doesn't matter. It's the native resolution of the screen you have that is the most important.

I also don't see why you're bringing up Wipeout. I can play some really old PC games on my PC at 1080p and they look like shit. It's being able to hit the most common native resolution AND still do all the rest that's the most important for visuals. And considering that the next-gen consoles need to be around for another 5-7 years, I don't think it was too much to ask to have 1080p as a standard for all games, not just small undemanding ones. I'm pretty sure every console gamer who was talking about it expected 1080p to be the standard for this gen, given that we had 720p as the standard last gen.

The only reason this has become a talking point is that MS decided to go for an unusually weak GPU. Other than that, everybody around here was looking forward to glorious 1080p 60 FPS gaming... something that even the PS4 can't deliver.

So yeah, resolution absolutely does matter. Obviously you get some people who don't give a fuck and still play Pokemon on their Game Boy Color... but that's another case entirely.




Well TBH 640x480 is not a bad size. It's just a 4:3 480p image, and most of TV history is in this format. Obviously, since we have gone widescreen, the image would have to be 16:9 (720x480), so I'll just upload a screenshot from an HQ encode of Titanic in 480p:

[480p screenshot]

Not too shabby, is it? And this is indeed 480p, albeit at a 21:9 AR.

Now for two different 720p images:

[HQ 720p screenshot]

[Non-HQ 720p screenshot]
As can be seen, not all 720p is created equal, which supports the point that resolution isn't the most important factor in quality.



DirtyP2002 said:
I think the difference between 900p upscaled to 1080p and native 1080p is not important at all.
The only reason some people are counting pixels is that you CAN count them.

"Better graphics" always was and is a matter of taste. There are things like lighting effects, artstyle, animations, facial expressions, etc., and IMO all of these things are way more important than resolution. But all of these things are based on personal preferences. Resolution is not. That is why people use it today.

Resolution meant shit last gen. If you asked for the best-looking game last gen, very few would mention Wipeout just because it runs at 1080p.

I never get tired of mentioning that MY personal best-looking game of last gen is Fable II, because the artstyle and the atmosphere just felt special and it really was a beauty.

Maybe I am crazy, but I still think that The Last of Us, Halo 4, Gears of War 3 and Uncharted are great-looking games, and not one of them was able to get past 720p at all. I still think X for Wii U looks really, really good, but I am not into JRPGs.

Gameplay > visuals, and for visuals, resolution is the least important aspect.

Of course it's not important if you're an XBone fan.



It kinda depends. When I play on a console on my TV, I don't really notice it all that much. When I'm on my PC, though, sitting close to the monitor, the difference is MASSIVE, so when I game on my PC anything under 1080p is unacceptable.



I think it's more about the Xbone costing $100 more than the PS4 while having inferior hardware, and multiplats now being better on Sony's console.