Zkuq said:
Pemalite said:

1080P as a resolution is mostly CPU bound these days as GPUs have grown exceedingly capable.

Pardon my ignorance, but wouldn't that imply there being only minor differences, if any, between different GPUs, especially between generations? A quick look reveals that this is definitely true for some games, but in others, there is a noticeable and significant difference compared to older GPUs. (I searched for RTX 4090 reviews and looked at how it fared compared to older GPUs.)

If you were to compare an RTX 3090 Ti against an RTX 4090 at 1080P... Not only are you wasting money, but... There is still a performance increase.
Of 16% at 1080P. Or 20fps.
https://www.tomshardware.com/reviews/nvidia-geforce-rtx-4090-review/4

This is where the CPU bottleneck comes into play... As the jump between the 3090 Ti and the 4090 is actually larger than 16%.

At 2160P, where we are GPU bound rather than CPU bound, the 4090 will extend its lead to 50% or more.

So yes, there is a difference, but if you were to go from say... a Ryzen 5900 to a Ryzen 7900, you would see a larger gaming performance increase than 16% at 1080P... To the point where a Ryzen 7900 + RTX 3090 Ti would outperform a Ryzen 5900 + RTX 4090.
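To make the bottleneck point concrete, here is a minimal sketch of the usual frame-time model: every frame costs some CPU time and some GPU time, and whichever is slower sets the frame rate. All the millisecond figures below are invented purely for illustration, not measured from any of the hardware above.

```python
# Toy bottleneck model: the slower of CPU and GPU limits the frame rate.
# All per-frame costs are made-up illustrative values.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frame rate is capped by whichever processor takes longer per frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 7.0  # hypothetical CPU cost per frame; roughly constant across resolutions

# At 1080P the GPU finishes early, so a big GPU upgrade barely registers:
print(fps(cpu_ms, gpu_ms=6.0))   # ~142.9 fps on the slower GPU
print(fps(cpu_ms, gpu_ms=4.0))   # ~142.9 fps on the faster GPU: CPU bound

# At 2160P the GPU is the bottleneck, so the same upgrade shows in full:
print(fps(cpu_ms, gpu_ms=24.0))  # ~41.7 fps
print(fps(cpu_ms, gpu_ms=16.0))  # ~62.5 fps: the full ~50% uplift
```

The same model shows why a faster CPU (lower cpu_ms) lifts the 1080P numbers directly while leaving the 2160P numbers untouched.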

Zkuq said:
Pemalite said:

Even budget 27" monitors are coming in at 1440P now.

For GPU tests, 1080P is a redundant resolution.

This is my personal preference, but I'm currently quite happy with 1080p, and on the other hand not too happy about the performance cost associated with upgrading to 1440p (let alone 4K). This, of course, implies having to get a more expensive GPU to get the same performance, which is an idea I'm not too fond of. This is not affected by the price of 1440p monitors but by that of GPUs instead. I imagine not a lot of people think about it this way, but I would guess that, given a limited budget, many people would agree with me if they thought about it more. From this point of view, I don't think pushing for increased adoption of higher resolutions is really justified.

Pretty much the Radeon RX 6600 XT/GeForce RTX 3060 and better are fine for 1440P. At native.
With DLSS you could get away with even lower end hardware.

You no longer need high-end hardware for 1440P.

Where 1080P may hold an advantage (But is quickly losing it!) is in the high refresh rate market targeting esports, where a 240Hz-480Hz panel or better is readily available and affordable... But. You still need a stupidly fast CPU, RAM and GPU for that anyway, which makes the point of 1080P and low-end hardware redundant.

If you don't have the hardware for 1440P, running at a lower resolution is not the end of the world. Many games have dynamic resolution or internal resolution scaling, so you can render the game at a lower resolution but still have a really crisp 1440P HUD in-game and a generally better desktop/work environment.
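As a minimal sketch of what those render-scale sliders do (the resolutions below are just the standard native figures; the rest is arithmetic), note in particular that 75% render scale on a 1440P panel is exactly a 1080P internal image:

```python
# Map render-scale settings to internal resolutions and relative pixel cost.

NATIVE_1440P = (2560, 1440)

def internal_resolution(native, render_scale):
    """Scale is per axis: 0.75 renders at 75% of native width and height."""
    w, h = native
    return (round(w * render_scale), round(h * render_scale))

for scale in (1.0, 0.75, 0.5):
    w, h = internal_resolution(NATIVE_1440P, scale)
    cost = 100 * (w * h) / (2560 * 1440)
    print(f"{scale:.2f} scale -> {w}x{h} ({cost:.0f}% of native pixel cost)")
    # 1.00 -> 2560x1440 (100%), 0.75 -> 1920x1080 (56%), 0.50 -> 1280x720 (25%)
```

In games that support this, the HUD is composited after the upscale, which is why UI text stays sharp even when the scene itself renders at 1080P or 720P.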


I personally believe that 1440P is the minimum resolution going forward... Heck. Other than the Nintendo Switch, all my devices have a 1440P or better panel, my phone included.

Zkuq said:
Pemalite said:

The thing with 1080P testing is that review outlets are concentrating on desktop components, not notebook.

And there is a reason for that... Notebook hardware is often not equivalent to the desktop model.
I.E. A notebook RTX 3060 will perform worse than the desktop RTX 3060.

And what skews things even further is that different manufacturers impose different TDPs, clock speeds and memory configurations... There are notebooks where an RTX 3050 Ti will outperform the 3060 in another device because it has higher TDP headroom and/or more VRAM than that 3060.

Thus when it comes to 1080P and notebooks, we need to benchmark notebooks individually and judge each notebook on its individual merits.

This is certainly a fair point, and one I can't really argue against. I suspect 1080p is still a very popular desktop resolution as well, but unfortunately the Steam Hardware & Software Survey doesn't really seem to provide that information, and at least a really quick search doesn't give much better results either.

However, looking at Amazon's top sellers in monitors, a quick glance reveals 1080p to be an incredibly popular choice, even among gaming monitors; in fact, there's only one 1440p monitor on the list (well, there are probably more, but I couldn't see any among the top monitors, and Ctrl+F revealed only that one). A similar glance at some popular online retailers in my country also implies that 1080p is still a very popular resolution, although you see many more 1440p monitors on top sellers lists here. This is certainly a fairly narrow look at the situation, but it definitely seems like 1080p is still a very popular desktop resolution, and that's even excluding monitors for non-gaming purposes.

Keep in mind that display resolution popularity will vary from region to region as well.

Higher socio-economic areas of the world tend to run with better-quality hardware... Because the difference between $100 and $200 is insignificant.

I.E. At my work, every display is 1440P. - But go overseas and you will come across 720P/1080P panels.

There is no doubt that 1080P is still a popular desktop resolution, but keep in mind that Steam doesn't represent all gaming PCs. - A ton of devices that Steam is installed on are simply ancient. - Case in point: 44% of the PCs Steam is installed on have 4 CPU cores or fewer, and 25% have only 8GB of RAM or less.

It's safe to say that a massive portion of those 1080P users are not on new machines; they are using old and outdated devices, and simply running them until they fail, as they still run what they want.

Zkuq said:
Pemalite said:

Consequently... 1440P can be supersampled down to 1080P for a very crisp image.

If you are going for the low-end, then it's going to be 720P... For CPU testing, it's also going to be 720P as it removes all possible GPU bottlenecks.

Thus I would argue, even if you have a 1080P display like in my notebook, 1440P performance is still relevant.

Maybe if you have a top-tier GPU, but that seems extremely wasteful for anything else unless the game is CPU-bottlenecked. To each their own of course, but I imagine supersampling would be just about the last thing I would try in any game unless the game looked absolutely awful without it. There's usually better use for processing power than supersampling (although I've got to say that this is coming from a guy who would gladly sacrifice resolution in favour of just about anything else graphically).

My notebook only has a 3060 in it, and it supersamples 1440P content down to 1080P just fine; that is far from top-tier hardware.

I don't think people realise how accessible 1440P has become in the modern era.

I tend to replace AA with supersampling, as it benefits the entire image and rendering pipeline rather than just sampling edges or geometry to remove aliasing... Supersampling isn't just "resolution". It's using the data of a higher-resolution scene to benefit a lower-resolution output.
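To illustrate that last point with a rough sketch: rendering high and filtering down means every output pixel averages several rendered samples, which is what cleans up aliasing across the whole image rather than just at edges. The snippet assumes a clean 2x-per-axis ratio (4K resolved to 1080P) for simplicity; a 1440P-to-1080P resolve is a 1.5x ratio and would need a proper resampling filter rather than plain 2x2 averaging. The random array is just a stand-in for a rendered frame.

```python
# Ordered-grid supersampling resolve: average 2x2 blocks of a high-res
# render into each output pixel. NumPy only; no real renderer involved.
import numpy as np

def downsample_2x(image: np.ndarray) -> np.ndarray:
    """Box-filter an (H, W, 3) image to (H/2, W/2, 3) by averaging 2x2 blocks."""
    h, w, c = image.shape
    blocks = image.reshape(h // 2, 2, w // 2, 2, c)
    return blocks.mean(axis=(1, 3))

# Stand-in for a 4K render target (float32 keeps memory reasonable):
hi_res = np.random.rand(2160, 3840, 3).astype(np.float32)
lo_res = downsample_2x(hi_res)  # resolves to a 1080P output frame
print(lo_res.shape)             # (1080, 1920, 3)
```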



--::{PC Gaming Master Race}::--