
Forums - PC Discussion - Carzy Zarx's PC gaming emporium - Catch up on all the latest PC Gaming related news

 

Poll: Zarx changed his avatar again. Thoughts?
- Noice: 248 (61.23%)
- So soon? I just got used to the last one: 14 (3.46%)
- it sucks: 22 (5.43%)
- Your cropping skills are lacking: 14 (3.46%)
- Too noisy, can't tell WTF it even is: 14 (3.46%)
- Meh: 32 (7.90%)
Total: 344
Captain_Yuri said:
Zkuq said:

According to the latest Steam Hardware & Software survey, about 65% of Steam users have 1080p as their resolution. 1440p comes next at 13%, followed by 1366x768 at 5%, and only then do we get 4K at a little over 2%. I'd say that if tests need to get rid of one resolution, it's something other than 1080p.

Remember that the Steam Survey is heavily skewed by laptops, which for the past decade have almost all shipped with 1080p displays. Hell, even flagship laptops with 3080 Ti mobile graphics are generally paired with 1080p displays even now. 1440p is only starting to take off on laptops.

Well, if that's the case, it's still 1080p performance that counts for those laptops. Anyway, I'd be wary of dropping 1080p tests, because 1080p seems to be in surprisingly wide use. I mean, I was expecting to find lots of 1080p usage on Steam, but the actual number just blew my mind.
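For what it's worth, the survey shares quoted above make the gap easy to eyeball in a few lines (numbers are the rough figures cited in this thread; the survey itself updates monthly, so treat them as illustrative):

```python
# Steam Hardware & Software Survey primary-display shares as quoted in
# this thread (approximate; the real survey shifts month to month).
shares = {
    "1920x1080": 65.0,
    "2560x1440": 13.0,
    "1366x768": 5.0,
    "3840x2160": 2.0,
}

# By these figures, 1080p users outnumber 1440p users five to one.
ratio = shares["1920x1080"] / shares["2560x1440"]
print(f"1080p to 1440p ratio: {ratio:.1f}")  # 5.0
```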



Zkuq said:
Pemalite said:

1080P shouldn't be a tested resolution these days; it's pretty pointless with 1440P being the mainstream resolution to target on PC.

According to the latest Steam Hardware & Software survey, about 65% of Steam users have 1080p as their resolution. 1440p comes next at 13%, followed by 1366x768 at 5%, and only then do we get 4K at a little over 2%. I'd say that if tests need to get rid of one resolution, it's something other than 1080p.

1080P as a resolution is mostly CPU bound these days, as GPUs have grown exceedingly capable.
Even budget 27" monitors are coming in at 1440P now.

For GPU tests, 1080P is a redundant resolution.

Zkuq said:

Well, if that's the case, it's still 1080p performance that counts for those laptops. Anyway, I'd be wary of dropping 1080p tests, because 1080p seems to be in surprisingly wide use. I mean, I was expecting to find lots of 1080p usage on Steam, but the actual number just blew my mind.

The thing with 1080P testing is that review outlets are concentrating on desktop components, not notebooks.

And there is a reason for that... Notebook hardware is often not equivalent to the desktop model.
For instance, a notebook RTX 3060 will perform worse than the desktop RTX 3060.

And what skews things even further is that different manufacturers impose different TDPs, clock speeds and memory configurations... There are notebooks where an RTX 3050 Ti will outperform the 3060 in another device because it has higher TDP headroom and/or more VRAM than the 3060.

Thus when it comes to 1080P and notebooks, we need to benchmark notebooks individually and judge each notebook on its individual merits.

Additionally... 1440P can be supersampled down to 1080P for a very crisp image.

If you are going for the low-end, then it's going to be 720P... For CPU testing, it's also going to be 720P as it removes all possible GPU bottlenecks.

Thus I would argue, even if you have a 1080P display like in my notebook, 1440P performance is still relevant.




--::{PC Gaming Master Race}::--

Pemalite said:
Zkuq said:

According to the latest Steam Hardware & Software survey, about 65% of Steam users have 1080p as their resolution. 1440p comes next at 13%, followed by 1366x768 at 5%, and only then do we get 4K at a little over 2%. I'd say that if tests need to get rid of one resolution, it's something other than 1080p.

1080P as a resolution is mostly CPU bound these days, as GPUs have grown exceedingly capable.

Pardon my ignorance, but wouldn't that imply there being only minor differences, if any, between different GPUs, especially between generations? A quick look reveals that this is definitely true for some games, but for others there is a noticeable and significant difference compared to older GPUs. (I searched for RTX 4090 reviews and looked at how it fared compared to older GPUs.)

Pemalite said:

Even budget 27" monitors are coming in at 1440P now.

For GPU tests, 1080P is a redundant resolution.

This is my personal preference, but I'm currently quite happy with 1080p and, on the other hand, not too happy about the performance cost associated with upgrading to 1440p (let alone 4K). This, of course, implies having to get a more expensive GPU to get the same performance, which is an idea I'm not too fond of. This is not affected by the price of 1440p monitors but by that of GPUs instead. I imagine not a lot of people think about it this way, but I would guess that given a limited budget, many people would agree with me if they thought about it more. From this point of view, I don't think pushing for increased adoption of higher resolutions is really justified.

Pemalite said:
Zkuq said:

Well, if that's the case, it's still 1080p performance that counts for those laptops. Anyway, I'd be wary of dropping 1080p tests, because 1080p seems to be in surprisingly wide use. I mean, I was expecting to find lots of 1080p usage on Steam, but the actual number just blew my mind.

The thing with 1080P testing is that review outlets are concentrating on desktop components, not notebooks.

And there is a reason for that... Notebook hardware is often not equivalent to the desktop model.
For instance, a notebook RTX 3060 will perform worse than the desktop RTX 3060.

And what skews things even further is that different manufacturers impose different TDPs, clock speeds and memory configurations... There are notebooks where an RTX 3050 Ti will outperform the 3060 in another device because it has higher TDP headroom and/or more VRAM than the 3060.

Thus when it comes to 1080P and notebooks, we need to benchmark notebooks individually and judge each notebook on its individual merits.

This is certainly a fair point, and one I can't really argue against. I suspect 1080p is still a very popular desktop resolution as well, but unfortunately the Steam Hardware & Software Survey doesn't really seem to provide that breakdown, and at least a really quick search doesn't give much better results either.

However, looking at Amazon's top sellers in monitors, a quick look reveals 1080p to be an incredibly popular choice, even among gaming monitors; in fact, there's only one 1440p monitor on the list (well, there are probably more, but I couldn't see any among the top monitors, and Ctrl + F revealed only that one). A similar glance at some popular online retailers in my country also implies that 1080p is still a very popular resolution, although you can see many more 1440p monitors on top sellers lists here. This is certainly a fairly narrow look at the situation, but it definitely seems like 1080p is still a very popular desktop resolution, and that's even excluding monitors for non-gaming purposes.

Pemalite said:

Additionally... 1440P can be supersampled down to 1080P for a very crisp image.

If you are going for the low-end, then it's going to be 720P... For CPU testing, it's also going to be 720P as it removes all possible GPU bottlenecks.

Thus I would argue, even if you have a 1080P display like in my notebook, 1440P performance is still relevant.

Maybe if you have a top-tier GPU, but that seems extremely wasteful for anything else unless the game is CPU-bottlenecked. To each their own of course, but I imagine supersampling would be just about the last thing I would try in any game unless the game looked absolutely awful without it. There's usually better use for processing power than supersampling (although I've got to say that this is coming from a guy who would gladly sacrifice resolution in favour of just about anything else graphically).





Zkuq said:
Pemalite said:

1080P as a resolution is mostly CPU bound these days, as GPUs have grown exceedingly capable.

Pardon my ignorance, but wouldn't that imply there being only minor differences, if any, between different GPUs, especially between generations? A quick look reveals that this is definitely true for some games, but for others there is a noticeable and significant difference compared to older GPUs. (I searched for RTX 4090 reviews and looked at how it fared compared to older GPUs.)

If you were to compare an RTX 3090 Ti against an RTX 4090 at 1080P... Not only are you wasting money, but... there is still a performance increase: 16% at 1080P, or 20fps.
https://www.tomshardware.com/reviews/nvidia-geforce-rtx-4090-review/4

This is where the CPU bottleneck comes into play... as the real jump between the 3090 Ti and 4090 is actually larger than 16%.

At 2160P, where we are GPU bound rather than CPU bound, the 4090 extends its lead to 50% or more.

So yes, there is a difference, but if you were to go from say... A Ryzen 5900 to a Ryzen 7900, you would see a larger gaming performance increase than 16% at 1080P... To the point where you would have better performance with a Ryzen 7900+RTX 3090 Ti than a Ryzen 5900+RTX 4090.
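The bottleneck logic above can be sketched in a few lines: the frame rate you actually see is capped by whichever component finishes its share of a frame last. All figures here are made up purely to show the shape of the effect, not taken from any review:

```python
# Toy model of a CPU bottleneck.
def effective_fps(gpu_fps: float, cpu_fps: float) -> float:
    """Displayed fps is the minimum of what the GPU can render and
    what the CPU can feed it."""
    return min(gpu_fps, cpu_fps)

# Hypothetical numbers: at 1080p the CPU caps both cards, so the
# faster GPU shows only a sliver of its real advantage.
cpu_cap = 150.0
print(effective_fps(140.0, cpu_cap))  # slower GPU: 140.0
print(effective_fps(220.0, cpu_cap))  # faster GPU: 150.0 (capped)

# At 2160p the workload is GPU bound, so the full gap reappears.
print(effective_fps(60.0, cpu_cap))   # slower GPU: 60.0
print(effective_fps(95.0, cpu_cap))   # faster GPU: 95.0
```

A faster CPU raises the cap, which is why pairing the older GPU with the newer CPU can beat the reverse combination at 1080P.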

Zkuq said:
Pemalite said:

Even budget 27" monitors are coming in at 1440P now.

For GPU tests, 1080P is a redundant resolution.

This is my personal preference, but I'm currently quite happy with 1080p and, on the other hand, not too happy about the performance cost associated with upgrading to 1440p (let alone 4K). This, of course, implies having to get a more expensive GPU to get the same performance, which is an idea I'm not too fond of. This is not affected by the price of 1440p monitors but by that of GPUs instead. I imagine not a lot of people think about it this way, but I would guess that given a limited budget, many people would agree with me if they thought about it more. From this point of view, I don't think pushing for increased adoption of higher resolutions is really justified.

Pretty much the Radeon RX 6600 XT/GeForce 3060 and better are fine for 1440P. At native.
With DLSS you could get away with even lower end hardware.

You no longer need high-end hardware for 1440P.

Where 1080P may hold an advantage (but is quickly losing it!) is in the high refresh rate market targeting esports, where a 240Hz-480Hz panel or better is readily available and affordable... But you still need a stupidly fast CPU, RAM and GPU for that anyway, which makes the point of 1080P and low-end hardware redundant.

If you don't have the hardware for 1440P, running at a lower resolution is not the end of the world: many games have dynamic resolution or internal resolution scaling, so you can run the game at a lower resolution but still have a really crisp 1440P HUD in-game and a generally better desktop/work environment.
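A quick sketch of what internal resolution scaling works out to in practice (the 75% figure is just an example; games expose different scale ranges):

```python
# Internal resolution scaling: the 3D scene renders at a fraction of
# the output resolution while the HUD stays at native 1440p.
def internal_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    return round(out_w * scale), round(out_h * scale)

# At a 75% render scale, a 1440p output renders the scene at 1080p:
print(internal_resolution(2560, 1440, 0.75))  # (1920, 1080)
```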


I personally believe that 1440P is the minimum resolution going forward... Heck. Other than the Nintendo Switch, all my devices have a 1440P or better panel, my phone included.

Zkuq said:
Pemalite said:

The thing with 1080P testing is that review outlets are concentrating on desktop components, not notebooks.

And there is a reason for that... Notebook hardware is often not equivalent to the desktop model.
For instance, a notebook RTX 3060 will perform worse than the desktop RTX 3060.

And what skews things even further is that different manufacturers impose different TDPs, clock speeds and memory configurations... There are notebooks where an RTX 3050 Ti will outperform the 3060 in another device because it has higher TDP headroom and/or more VRAM than the 3060.

Thus when it comes to 1080P and notebooks, we need to benchmark notebooks individually and judge each notebook on its individual merits.

This is certainly a fair point, and one I can't really argue against. I suspect 1080p is still a very popular desktop resolution as well, but unfortunately the Steam Hardware & Software Survey doesn't really seem to provide that breakdown, and at least a really quick search doesn't give much better results either.

However, looking at Amazon's top sellers in monitors, a quick look reveals 1080p to be an incredibly popular choice, even among gaming monitors; in fact, there's only one 1440p monitor on the list (well, there are probably more, but I couldn't see any among the top monitors, and Ctrl + F revealed only that one). A similar glance at some popular online retailers in my country also implies that 1080p is still a very popular resolution, although you can see many more 1440p monitors on top sellers lists here. This is certainly a fairly narrow look at the situation, but it definitely seems like 1080p is still a very popular desktop resolution, and that's even excluding monitors for non-gaming purposes.

Keep in mind that display resolution popularity will vary from region to region as well.

Higher socio-economic areas of the world tend to run better quality hardware... because the difference between $100 and $200 is insignificant.

For example, at my work every display is 1440P. But go overseas and you will come across 720P/1080P panels.

There is no doubt that 1080P is still a popular desktop resolution, but keep in mind that Steam doesn't represent all gaming PCs. A ton of devices that Steam is installed on are simply ancient. Case in point: 44% of PCs Steam is installed on have four CPU cores or fewer, and 25% have only 8GB of RAM or less.

It's safe to say that a massive portion of those 1080P users are not on new machines; they are using old, outdated devices and just keeping them until they fail, as they still run what they want.

Zkuq said:
Pemalite said:

Additionally... 1440P can be supersampled down to 1080P for a very crisp image.

If you are going for the low-end, then it's going to be 720P... For CPU testing, it's also going to be 720P as it removes all possible GPU bottlenecks.

Thus I would argue, even if you have a 1080P display like in my notebook, 1440P performance is still relevant.

Maybe if you have a top-tier GPU, but that seems extremely wasteful for anything else unless the game is CPU-bottlenecked. To each their own of course, but I imagine supersampling would be just about the last thing I would try in any game unless the game looked absolutely awful without it. There's usually better use for processing power than supersampling (although I've got to say that this is coming from a guy who would gladly sacrifice resolution in favour of just about anything else graphically).

My notebook only has a 3060 in it. It supersamples 1440P content down to 1080P just fine, and that is far from top-tier.

I don't think people realise how accessible 1440P has become in the modern era.

I tend to replace AA with supersampling, as it benefits the entire image and rendering pipeline rather than just sampling edges or geometry to remove aliasing... Supersampling isn't just "resolution"; it's using the data of a higher-resolution scene to benefit a lower-resolution output.
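A crude sketch of what supersampling does in its simplest form (here a clean 2x factor with a plain box filter, purely for illustration; 1440P down to 1080P is actually a 1.5x ratio, and drivers use better filters):

```python
import numpy as np

# Average each 2x2 block of the high-resolution render into one output
# pixel. Every output pixel then carries information from four samples,
# which smooths the whole image, not just geometry edges.
def box_downsample_2x(img: np.ndarray) -> np.ndarray:
    h, w, c = img.shape
    return img.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

rng = np.random.default_rng(0)
hi_res = rng.random((8, 8, 3))    # stand-in for a rendered frame
lo_res = box_downsample_2x(hi_res)
print(lo_res.shape)               # (4, 4, 3)
```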



--::{PC Gaming Master Race}::--


One thing to consider is that not everyone cares for 1440p. I'd even argue that most people don't. When I suggested it, my brother was like 'not paying that for a PC screen lol, do you know how many Alphabet shares I could buy instead?'. And he's a fairly well-off doctor, so not exactly hard-pressed for money.

Case in point, in US Amazon the best-selling monitors are virtually all 1080p. Hell, 900p ones pop up almost as frequently as >1080p displays. We can't forget that we're in an enthusiast bubble here and the lay of the land can be quite different.



Well, by that point, do those people even care about DF and their content, considering DF's GPUs are likely well above anything that a 1080p office monitor buyer is running?



             

                   PC Specs: CPU: 5950X || GPU: Strix 4090 || RAM: 32GB DDR4 3600 || Main SSD: WD 2TB SN850

So Zen 4 flopped, which resulted in a pretty good price discount, even if it's for a limited time. Now it's the 4080's turn. While I suspect the 4090 won't get a price drop anytime soon, the 4080 should get a price drop to $1000 or $900. Imo that is still shit pricing, but Nvidia is not gonna price lower than AMD, as it's very likely that the 7900XT has very similar raster performance to the 4080. The only way the 4080 won't get a price cut imo is if the 7900XTX reviews so badly that it is within 5% of the 4080. So if the 7900XTX is 5% faster than a 4080 for $1000, then people in that price range will think to themselves about all the goodies they can get for that additional $200. But it's very unlikely that will be the case.

Imo it will look like this:
4090: 155%
7900XTX: 140%
4080/7900XT: 125%
3090 Ti: 100%
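Since those percentages are indexed to the 3090 Ti at 100%, they convert straight into frame rates once you assume a baseline. Both the guesses and the 60fps baseline below are hypothetical, just to show the arithmetic:

```python
# Relative-performance guesses from the post above, 3090 Ti = 100%.
guesses = {"4090": 155, "7900XTX": 140, "4080/7900XT": 125, "3090 Ti": 100}

baseline_fps = 60.0  # hypothetical 3090 Ti figure, not a measurement
for gpu, index in guesses.items():
    print(f"{gpu}: {baseline_fps * index / 100:.0f} fps")
# 4090: 93 fps, 7900XTX: 84 fps, 4080/7900XT: 75 fps, 3090 Ti: 60 fps
```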



             

                   PC Specs: CPU: 5950X || GPU: Strix 4090 || RAM: 32GB DDR4 3600 || Main SSD: WD 2TB SN850

haxxiy said:

Case in point, in US Amazon the best-selling monitors are virtually all 1080p. Hell, 900p ones pop up almost as frequently as >1080p displays. We can't forget that we're in an enthusiast bubble here and the lay of the land can be quite different.

I don't disagree, but for the most part, isn't looking at benchmarks something that's mostly contained to the enthusiast market? What percentage of people have ever looked at any of these benchmark charts before buying a GPU? I bet it's very low. What percentage of the people that bought those 1080P or 900P monitors have ever gone to any tech website or YouTube channel and looked at performance numbers before buying?

I bet it's way lower than for the people buying higher-end monitors, so when sites decide what they're going to test, it makes sense that it's going to lean toward an enthusiast audience, because they're the ones that are going to bother reading or watching.

I just know it would be impossible for me to convince my non-tech friends or family to do research on the typical tech websites/YouTube before buying, and there's nothing I could do to change that. They're either going to buy the cheapest option, or whatever the salesperson recommends (most of the time someone that also has no clue what they're talking about), or they just buy what I tell them to buy within the price they gave me.

What's the point of focusing on the mainstream market if the target audience has zero interest in reading or watching it?



Pemalite said:

If you were to compare an RTX 3090 Ti against an RTX 4090 at 1080P... Not only are you wasting money, but... there is still a performance increase: 16% at 1080P, or 20fps.
https://www.tomshardware.com/reviews/nvidia-geforce-rtx-4090-review/4

This is where the CPU bottleneck comes into play... as the real jump between the 3090 Ti and 4090 is actually larger than 16%.

At 2160P, where we are GPU bound rather than CPU bound, the 4090 extends its lead to 50% or more.

So yes, there is a difference, but if you were to go from say... A Ryzen 5900 to a Ryzen 7900, you would see a larger gaming performance increase than 16% at 1080P... To the point where you would have better performance with a Ryzen 7900+RTX 3090 Ti than a Ryzen 5900+RTX 4090.

That's still a significant performance increase. Sure, it's more or less wasteful, but it's still something. And for more affordable GPUs, I expect better bang for your buck. This is just my personal preference, but 1080p performance is probably what I'm going to be looking for when upgrading my GPU (well, the whole PC, really), despite CPU bottlenecks.

Pemalite said:

Pretty much the Radeon RX 6600 XT/GeForce 3060 and better are fine for 1440P. At native.
With DLSS you could get away with even lower end hardware.

You no longer need high-end hardware for 1440P.

Even a 3060 is still fairly expensive though, considering what kind of GPUs you used to be able to get for that money, and it was criticized even when it was released. But sure, you can get pretty far even with a 3060, but is 1440p really the goal for everyone? I don't think so, because (in somewhat modern games) you have to sacrifice something else for the resolution. DLSS is an excellent point though.

Pemalite said:

Where 1080P may hold an advantage (but is quickly losing it!) is in the high refresh rate market targeting esports, where a 240Hz-480Hz panel or better is readily available and affordable... But you still need a stupidly fast CPU, RAM and GPU for that anyway, which makes the point of 1080P and low-end hardware redundant.

Probably so.

Pemalite said:

If you don't have the hardware for 1440P, running at a lower resolution is not the end of the world: many games have dynamic resolution or internal resolution scaling, so you can run the game at a lower resolution but still have a really crisp 1440P HUD in-game and a generally better desktop/work environment.

That's a good point in the games where it applies, although I can't really say how many games that is. I don't own a lot of computationally intensive games from recent years, but I don't think I've seen this option very commonly in the games I own. I imagine it's much more common in the kind of games I don't own, but hard to tell. Definitely sounds helpful in games that support this, but for games that don't... Well, I don't have good memories of trying to run games in 720p on a 1080p monitor. Looked absolutely awful. Probably not as bad to run a game in 1080p on a 1440p monitor though, although still possibly a bit blurry?

Pemalite said:

I personally believe that 1440P is the minimum resolution going forward... Heck. Other than the Nintendo Switch, all my devices have a 1440P or better panel, my phone included.

Well, you're an enthusiast, willing to spend quite a bit on electronics, are you not? My phone has a 1080p screen, and that's by choice: higher resolutions offer diminishing returns on phones but reduce battery life. In fact, 1080p is the highest resolution in my household, and that's including my TV (which I'd like to replace at some point, mind you, but 4K is not a target for my new TV either, although I suspect I will end up with a 4K TV anyway). That's because a higher resolution is simply not worth it for me considering the drawbacks. Judging by the abundance of 1080p screens, it seems like 1080p will still be a big thing even going forward, although higher resolutions are definitely gaining ground.

Pemalite said:

Keep in mind that display resolution popularity will vary from region to region as well.

Higher socio-economic areas of the world tend to run better quality hardware... because the difference between $100 and $200 is insignificant.

For example, at my work every display is 1440P. But go overseas and you will come across 720P/1080P panels.

Absolutely. I looked at US Amazon, and as for my local online retailers? Well, I live in Finland, which happens to be a fairly wealthy country with a sizeable PC gaming market, so I would expect the market here to be skewed towards higher-end equipment. Additionally, I work in a software company, and I believe most of our monitors are still 1080p ones. Granted, it's still a growth company at this point, but I believe we've left behind the toughest times financially. Can we draw conclusions from our anecdotal experiences? Probably not. We'd probably need more data, plus I'd argue that 1440p actually is more beneficial for a lot of work than it is for entertainment, so I would expect there to be some extra interest in 1440p monitors in workplaces.

Pemalite said:

There is no doubt that 1080P is still a popular desktop resolution, but keep in mind that Steam doesn't represent all gaming PCs. A ton of devices that Steam is installed on are simply ancient. Case in point: 44% of PCs Steam is installed on have four CPU cores or fewer, and 25% have only 8GB of RAM or less.

It's safe to say that a massive portion of those 1080P users are not on new machines; they are using old, outdated devices and just keeping them until they fail, as they still run what they want.

Absolutely, but do you expect those people to be willing to pay more for 1440p when they eventually do have to upgrade? If they're using such ancient devices, it's probably for a reason, and I expect that reason to be closely related to their budget. When they've likely been happy with 1080p or so for a long time but have noticed other areas in need of improvement, going for a better monitor might not always be a priority.

Pemalite said:

My notebook only has a 3060 in it. It supersamples 1440P content down to 1080P just fine, and that is far from top-tier.

I don't think people realise how accessible 1440P has become in the modern era.

I tend to replace AA with supersampling, as it benefits the entire image and rendering pipeline rather than just sampling edges or geometry to remove aliasing... Supersampling isn't just "resolution"; it's using the data of a higher-resolution scene to benefit a lower-resolution output.

Again, I can only speak from my experience, but I'd much rather suffer suboptimal anti-aliasing than sacrifice something that has a more profound impact on the graphics of a game. I can definitely tolerate some jaggies, so why wouldn't I choose to use the processing power for something else instead? Not everyone needs super-smooth image quality, as nice as it can be. It's probably easy to look at it from an enthusiast viewpoint and value the lack of jaggies very highly, but I'm not sure that's the case for people with more limited budgets (mine's not all that limited anymore, mind you, but it's simply not worth the cost to me).