Pemalite said:

If you were to compare an RTX 3090 Ti against an RTX 4090 at 1080P... Not only are you wasting money, but... There is still a performance increase.
Of 16% at 1080P. Or 20fps.
https://www.tomshardware.com/reviews/nvidia-geforce-rtx-4090-review/4

This is where the CPU bottleneck comes into play... As the jump between the 3090 Ti and 4090 is actually larger than 16%.

At 2160P, where we are GPU-bound rather than CPU-bound, the 4090 will extend its lead to 50% or more.

So yes, there is a difference, but if you were to go from say... A Ryzen 5900 to a Ryzen 7900, you would see a gaming performance increase larger than 16% at 1080P... To the point where a Ryzen 7900 + RTX 3090 Ti would outperform a Ryzen 5900 + RTX 4090.

That's still a significant performance increase. Sure, it's more or less wasteful, but it's still something. And for more affordable GPUs, I expect better bang for your buck. This is just my personal preference, but 1080p performance is probably what I'm going to be looking for when upgrading my GPU (well, the whole PC, really), despite CPU bottlenecks.
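To make the bottleneck arithmetic concrete, here's a toy "slowest stage wins" sketch. Every frame rate in it is a made-up round number chosen so the gaps come out at the quoted 16% (20fps) and roughly 50%; none of them are measurements from the linked review.

```python
# Toy CPU-bottleneck model: the delivered frame rate is capped by
# whichever stage (CPU or GPU) is slower. All numbers are hypothetical.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Delivered frame rate is limited by the slower stage."""
    return min(cpu_fps, gpu_fps)

CPU_CAP = 145.0  # hypothetical: frames per second the CPU can prepare
GPU_FPS = {      # hypothetical uncapped GPU throughput at each resolution
    "RTX 3090 Ti": {"1080p": 125.0, "2160p": 70.0},
    "RTX 4090":    {"1080p": 190.0, "2160p": 105.0},
}

for res in ("1080p", "2160p"):
    slow = effective_fps(CPU_CAP, GPU_FPS["RTX 3090 Ti"][res])
    fast = effective_fps(CPU_CAP, GPU_FPS["RTX 4090"][res])
    print(f"{res}: {slow:.0f} -> {fast:.0f} fps ({fast / slow - 1:+.0%})")

# 1080p: 125 -> 145 fps (+16%)  <- the 4090 runs into the CPU cap
# 2160p: 70 -> 105 fps (+50%)   <- both cards GPU-bound, full gap visible
```

Swap in a faster CPU (raise CPU_CAP) and the 1080p gap widens again, which is exactly the 5900-vs-7900 point from the quote.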

Pemalite said:

Pretty much the Radeon RX 6600 XT/GeForce RTX 3060 and better are fine for 1440P. At native.
With DLSS you could get away with even lower end hardware.

You no longer need high-end hardware for 1440P.

Even a 3060 is still fairly expensive though, considering what kind of GPUs you used to be able to get for that money, and its pricing was criticized even when it was released. Sure, you can get pretty far with a 3060, but is 1440p really the goal for everyone? I don't think so, because (in somewhat modern games) you have to sacrifice something else for the resolution. DLSS is an excellent point, though.

Pemalite said:

Where 1080P may hold an advantage (but is quickly losing it!) is in the high-refresh-rate market targeting esports, where a 240Hz-480Hz panel or better is readily available and affordable... But. You still need a stupidly fast CPU, RAM and GPU for that anyway, which makes the point of 1080P and low-end hardware redundant.

Probably so.

Pemalite said:

If you don't have the hardware for 1440P, running at a lower resolution is not the end of the world. Many games have dynamic resolution or internal resolution scaling, so you can run the game at a lower internal resolution but still have a really crisp 1440P HUD in-game, and a generally better desktop/work environment.

That's a good point in the games where it applies, although I can't really say how many games that is. I don't own a lot of computationally intensive games from recent years, but I don't think I've seen this option very commonly in the games I own. I imagine it's much more common in the kind of games I don't own, but hard to tell. Definitely sounds helpful in games that support this, but for games that don't... Well, I don't have good memories of trying to run games in 720p on a 1080p monitor. Looked absolutely awful. Probably not as bad to run a game in 1080p on a 1440p monitor though, although still possibly a bit blurry?
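For what it's worth, internal resolution scaling usually boils down to something like the sketch below: the 3D scene renders at a fraction of the display resolution and is upscaled, while the HUD is composited on top at native. This is a minimal illustration with made-up names, assuming a game that exposes a render-scale setting; it's not any particular engine's API.

```python
# Minimal sketch of internal resolution scaling (illustrative only):
# the 3D scene is rendered at scale * native and upscaled afterwards,
# while UI/HUD elements are drawn at full native resolution.

NATIVE = (2560, 1440)  # the monitor's native resolution

def render_target(native: tuple[int, int], scale: float) -> tuple[int, int]:
    """Resolution the 3D scene is actually rendered at before upscaling."""
    w, h = native
    return (round(w * scale), round(h * scale))

scene = render_target(NATIVE, 0.75)  # 3D scene at 75% scale: (1920, 1080)
hud = NATIVE                         # HUD/text stays crisp at 1440P
print(f"scene rendered at {scene}, upscaled to {hud}; HUD drawn at {hud}")
```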

Pemalite said:

I personally believe that 1440P is the minimum resolution going forward... Heck. Other than the Nintendo Switch, all my devices have a 1440P or better panel, my phone included.

Well, you're an enthusiast, willing to spend quite a bit on electronics, are you not? My phone has a 1080p screen, and that's by choice: higher resolutions offer diminishing returns on phones but reduce battery life. In fact, 1080p is the highest resolution in my household, and that includes my TV (which I'd like to replace at some point, mind you, but 4K is not a target for my new TV either, although I suspect I will end up with a 4K TV anyway). A higher resolution simply isn't worth the drawbacks to me. Judging by the abundance of 1080p screens, it seems like 1080p will remain a big thing going forward, although higher resolutions are definitely gaining ground.

Pemalite said:

Keep in mind that display resolution popularity will vary from region to region as well.

Higher socio-economic areas of the world tend to run better-quality hardware... Because the difference between $100 and $200 is insignificant.

E.g. at my work, every display is 1440P. - But go overseas and you will come across 720P/1080P panels.

Absolutely. I looked at US Amazon, and as for my local online retailers? Well, I live in Finland, which happens to be a fairly wealthy country with a sizeable PC gaming market, so I would expect the market here to be skewed towards higher-end equipment. Additionally, I work at a software company, and I believe most of our monitors are still 1080p. Granted, it's still a growth company at this point, but I believe we've left the toughest times behind financially. Can we draw conclusions from our anecdotal experiences? Probably not. We'd need more data, plus I'd argue that 1440p is actually more beneficial for a lot of work than it is for entertainment, so I would expect some extra interest in 1440p monitors in workplaces.

Pemalite said:

There is no doubt that 1080P is still a popular desktop resolution, but keep in mind that Steam doesn't represent all gaming PCs. - A ton of devices that Steam is installed on are simply ancient. - Case in point: 44% of the PCs Steam is installed on have 4 CPU cores or fewer, and 25% have only 8GB of RAM or less.

It's safe to say that a massive portion of those 1080P users are not on new machines; they are using old, outdated devices and simply running them until they fail, because they still do what they want.

Absolutely, but do you expect those people to be willing to pay more for 1440p when they eventually do have to upgrade? If they're using such ancient devices, it's probably for a reason, and I expect that reason to be closely related to their budget. When they've been happy with 1080p for a long time and have noticed other areas in need of improvement, a better monitor might not be the priority.

Pemalite said:

My notebook only has a 3060 in it. It super-samples 1440P content down to 1080P just fine, and that is far from top-tier hardware.

I don't think people realise how accessible 1440P has become in the modern era.

I tend to replace AA with super-sampling, as it benefits the entire image and rendering pipeline rather than just sampling edges or geometry to remove aliasing... Super-sampling isn't just "resolution". It's using the data of a higher-resolution scene to benefit a lower-resolution output.

Again, I can only speak from my experience, but I'd much rather suffer suboptimal anti-aliasing than sacrifice something that has a more profound impact on the graphics of a game. I can definitely tolerate some jaggies, so why wouldn't I choose to use the processing power for something else instead? Not everyone needs super-smooth image quality, as nice as it can be. It's probably easy to look at it from an enthusiast viewpoint and value the lack of jaggies very highly, but I'm not sure that's the case for people with more limited budgets (mine's not all that limited anymore, mind you, but it's simply not worth the cost to me).
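For reference, super-sampling in its simplest ordered-grid form (SSAA) is just rendering at a multiple of the output resolution and averaging each block of samples down to one output pixel, which is why it cleans up the whole image rather than only geometry edges. Below is a toy sketch, assuming a 2x2 sample grid and a grayscale image; real drivers and engines use fancier filters, but the principle is the same.

```python
# Toy 2x2 ordered-grid super-sampling: "render" at twice the output
# resolution in each axis, then box-filter each 2x2 block down to one
# output pixel. Grayscale values in [0, 1] stand in for real pixels.

def downsample_2x2(hi_res: list[list[float]]) -> list[list[float]]:
    """Average each 2x2 block of the high-res image into one pixel."""
    out = []
    for y in range(0, len(hi_res), 2):
        row = []
        for x in range(0, len(hi_res[0]), 2):
            total = (hi_res[y][x] + hi_res[y][x + 1]
                     + hi_res[y + 1][x] + hi_res[y + 1][x + 1])
            row.append(total / 4.0)
        out.append(row)
    return out

# A hard black/white edge at the 4x4 "render" resolution...
scene = [
    [1.0, 1.0, 0.0, 0.0],
    [1.0, 1.0, 0.0, 0.0],
    [1.0, 0.0, 0.0, 0.0],
    [1.0, 0.0, 0.0, 0.0],
]
# ...becomes a 2x2 output where the stair-step is a blend, not a jaggy.
print(downsample_2x2(scene))  # [[1.0, 0.0], [0.5, 0.0]]
```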