
Only 1.33% of Steam users play in 4K

Azzanation said:
Ganoncrotch said:

This survey and peoples enjoyment of the standard X1 and the Switch suggest that far from everyone has moved to 1080p just yet.

30%+ of the surveyed users are sub 1080p

I see the Switch as an exception to the rule, as it's also considered a portable device and gamers are happy to accept 720p on that system. If the X1 or PS4 ran at 720p, however, it would cause a huge uproar; gamers go crazy over pixels. It's quite fair to say that once you go from 720p to 1080p it's hard to go back, and going from 1080p to 4K is the same thing. I'm now so used to 4K that when I lower the resolution to 1080p I instantly notice the change in clarity. Everyone will have a different opinion on it, as some gamers simply don't care. I played Zelda BOTW on my 4K TV at 900p, and considering how good that game is, I ignored the pixels... but in saying that, I definitely noticed the lower resolution; it stuck out like a sore thumb.

It's like upgrading your glasses: once you get a new pair, using the old pair becomes noticeable and starts to strain your eyes, especially once you're used to wearing the new ones.

Well, I didn't mention the PS4 when it comes to 1080p, as it hits that mark fairly frequently. But the base X1 runs less of its game library at 1080p than the Switch does when docked. A few of the dynamic resolution titles like Titanfall can drop under load to match the Switch's Doom lows of 480p, and some of the higher-end (terribly optimised) games run below 1080p even on the other systems; Ark runs at 720/30 on a PS4 Pro, if I recall correctly.
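For anyone curious, here's roughly how those dynamic resolution systems tend to work: the engine watches the GPU frame time and scales the render resolution to hold a target budget. A minimal sketch in Python; the constants and the simple step sizes are made up for illustration, not pulled from any actual engine.

```python
# Minimal sketch of dynamic resolution scaling (illustrative numbers only).
TARGET_MS = 16.67                  # frame budget for 60 fps
MIN_SCALE, MAX_SCALE = 0.45, 1.0   # ~480p up to ~1080p vertical, roughly

def adjust_scale(scale: float, last_frame_ms: float) -> float:
    if last_frame_ms > TARGET_MS * 1.05:    # over budget: render fewer pixels
        scale *= 0.95
    elif last_frame_ms < TARGET_MS * 0.90:  # headroom: claw resolution back
        scale *= 1.02
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for frame_ms in (22.0, 21.0, 19.0, 15.0):  # a heavy scene, then it eases off
    scale = adjust_scale(scale, frame_ms)
    print(f"frame took {frame_ms} ms -> next frame renders at {int(1080 * scale)}p")
```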

But yeah, my point was that the survey, and the systems people game on as we round off 2018, show that at least a third of gamers are still sub-1080p.

I know what you mean, though; once you upgrade hardware it's tough to go back. My main display is a projector. For a few months my setup had an "HD ready" one because I wasn't sure if I would make use of it. It was grand for gaming and watching things, but as a PC monitor it obviously sucked having that lower resolution. Since I splashed out on my 1080p Epson TW650, I can genuinely say I would never be able to go back to the 768p one I had been using. I've always been a screen-size whore more than a resolution one, but really, the extra features and brightness of the "screen" I'm using now would be tough to go without.

I guess when you look at the survey data, though, some people don't mind that they're using sub-HD resolutions, simply because they've never experienced better than what they have. Ignorance in this case seems to be bliss.



Why not check me out on youtube and help me on the way to 2k subs over at www.youtube.com/stormcloudlive

Cerebralbore101 said:
Pemalite said:

That 4k TV will likely have a ton of input lag.

How much is a ton in milliseconds? 

For a 4k TV...
20ms is considered "great".
15ms is considered "high end".
And sometimes depending on processing... A TV can have 100ms or more.

For computer monitors...
5ms is considered "Great".
3ms is considered "High-End".

Twisted Nematic computer monitors can have a big advantage in this aspect... You can get panels with as low as 0.7ms... But I would rather set my house on fire than have a TN panel in my home.

1 frame at 60fps is 16.67ms, to put things into perspective... And this is why I wanted the Xbox One X to have 1440P support at launch: because my 1440P monitor is a much better gaming-oriented display than my television.

And this all results in advantages, especially in competitive online games.
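To put those numbers side by side, here's the quick arithmetic as a Python sketch; the lag values are just the rough tiers quoted above, not measurements of any particular display.

```python
# Convert display lag into frames of delay at a given framerate.
def frames_of_delay(lag_ms: float, fps: float = 60.0) -> float:
    frame_time_ms = 1000.0 / fps   # 16.67 ms per frame at 60 fps
    return lag_ms / frame_time_ms

for label, lag_ms in [("'great' 4K TV", 20.0), ("'high-end' 4K TV", 15.0),
                      ("heavy TV processing", 100.0), ("'great' monitor", 5.0),
                      ("fast TN panel", 0.7)]:
    print(f"{label}: {lag_ms} ms = {frames_of_delay(lag_ms):.2f} frames at 60 fps")
```

So a TV with heavy processing can eat six whole frames at 60fps, which is exactly where the competitive disadvantage comes from.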



--::{PC Gaming Master Race}::--

This is a big reason why I don't think consoles are holding back anything with regards to graphics in games. The vast majority of gamers, whether PC Master Race or console peasants (like myself), don't play games at the highest fidelity settings.

Look at Crysis and how far ahead of its time it was. It barely made its money back after selling a few million copies, because most people couldn't run the game. Hence the sequels were streamlined to be playable on (and portable to) less capable machines.

If anything, consoles were ahead earlier on and gave PC gaming a foothold to become popular. Yeah, everything is better on PC now (unless the developer is lazy and doesn't take advantage of the system), but most developers don't want to take full advantage of that power, not because consoles are holding stuff back, but because there isn't a big enough market.



Cerebralbore101 said:
Pemalite said:

That 4k TV will likely have a ton of input lag.

How much is a ton in milliseconds? 

Realistically, when it comes to decent brands, you can use the old adage of "you get what you pay for". If you're paying that little for a screen of that size and resolution, you've got to think about where the cutbacks have come from: sound will likely suffer, as will upscaling from non-native resolutions (which on a 4K set covers a ton of content); the contrast ratio will be far smaller on a cheap screen; and you'll see no, or very poor, implementations of features such as HDR.

Obviously some brands of TV are more expensive from the off, so paying more doesn't always mean getting more. But if the same manufacturer offers a TV that costs $300 and one that costs $900, and they're listed at the same size and resolution, you have to consider that something is missing from the cheaper model.



Why not check me out on youtube and help me on the way to 2k subs over at www.youtube.com/stormcloudlive

danasider said:
This is a big reason why I don't think consoles are holding back anything with regards to graphics in games. The vast majority of gamers, whether PC Master Race or console peasants (like myself), don't play games at the highest fidelity settings.

Look at Crysis and how far ahead of its time it was. It barely made its money back after selling a few million copies, because most people couldn't run the game. Hence the sequels were streamlined to be playable on (and portable to) less capable machines.

If anything, consoles were ahead earlier on and gave PC gaming a foothold to become popular. Yeah, everything is better on PC now (unless the developer is lazy and doesn't take advantage of the system), but most developers don't want to take full advantage of that power, not because consoles are holding stuff back, but because there isn't a big enough market.

Worth considering indeed. When you think about it, if a company creates 4K assets for a game going onto Steam, they're creating those assets to be utilised by 1 in every 100 gamers who buy the game. The percentage of Pro-model PS4s compared to the standard one was somewhere in the region of 20%, I think, so there's a far higher chance of those 4K assets being displayed if you make them for a console game.
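Quick back-of-the-envelope in Python to make that concrete; the Steam share is the survey figure from the thread title, the Pro share is my rough 20% guess above, and the sales number is purely hypothetical.

```python
# Rough audience for 4K assets on each platform (illustrative inputs).
steam_4k_share = 0.0133   # Steam hardware survey figure from the OP
ps4_pro_share = 0.20      # my rough guess above, not an official number
copies_sold = 1_000_000   # hypothetical sales on either platform

print(f"Steam buyers who'd see the 4K assets: ~{copies_sold * steam_4k_share:,.0f}")
print(f"PS4 buyers playing on a Pro:          ~{copies_sold * ps4_pro_share:,.0f}")
```

That's roughly 13,000 versus 200,000 players per million copies, which is the gap I'm on about.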



Why not check me out on youtube and help me on the way to 2k subs over at www.youtube.com/stormcloudlive

danasider said:
If anything, consoles were ahead earlier on and gave PC gaming a foothold to become popular. Yeah, everything is better on PC now (unless the developer is lazy and doesn't take advantage of the system), but most developers don't want to take full advantage of that power, not because consoles are holding stuff back, but because there isn't a big enough market.

PC has always been ahead of consoles. Always.

Whilst console gamers were just getting used to rudimentary pixel shaders, texture and lighting during the PS2/GameCube/Xbox era, the PC was starting to experiment with the likes of tessellation, which wouldn't become common until this console cycle.

The PC also had 1080p back in the mid-'90s.

And 4K back in the early 2000s.



--::{PC Gaming Master Race}::--

Ganoncrotch said:

Worth considering indeed. When you think about it, if a company creates 4K assets for a game going onto Steam, they're creating those assets to be utilised by 1 in every 100 gamers who buy the game. The percentage of Pro-model PS4s compared to the standard one was somewhere in the region of 20%, I think, so there's a far higher chance of those 4K assets being displayed if you make them for a console game.

I have a Pro myself and think the games look fantastic. But compared to a powerful rig, it's still weak sauce in terms of fidelity. Sure, the art can be great. Heck, Super Mario Odyssey is a fine-looking game, and it's nowhere near the fidelity of what can be achieved on the Pro, much less the Xbox One X or an expensive PC.

If the majority of developers wanted to put the resources into making really good-looking games, they could. They just don't have the motivation when a huge minority (oxymoron there) is the demographic they'd be selling to. And as much as I enjoy the perks of the PS4 Pro over the standard (I owned the original but upgraded after giving mine to my nephew), we're not actually getting native 4K. So we're still probably lower in percentage than even that 1% of PC gamers.



Pemalite said:
danasider said:
If anything, consoles were ahead earlier on and gave PC gaming a foothold to become popular. Yeah, everything is better on PC now (unless the developer is lazy and doesn't take advantage of the system), but most developers don't want to take full advantage of that power, not because consoles are holding stuff back, but because there isn't a big enough market.

PC has always been ahead of consoles. Always.

Whilst console gamers were just getting used to rudimentary pixel shaders, texture and lighting during the PS2/GameCube/Xbox era, the PC was starting to experiment with the likes of tessellation, which wouldn't become common until this console cycle.

The PC also had 1080p back in the mid-'90s.

And 4K back in the early 2000s.

That's some slow adoption rate then, when you think that in 18 years a whole 1% of PC gamers have moved to that resolution. Heck, a third of them today are still using sub-"mid-'90s" resolutions.

Just points even more to how it's only enthusiasts who have PCs in the "master race" category; most have toasters.

One point regarding that resolution and PCs, though: if a Steam user has their account on multiple PCs, I wonder whether the survey counts each machine, or each user with their best hardware. My account is on my gaming PC but also on a laptop that's only capable of indie games, or 3D titles at 720p-ish settings. Do I count as one user at 1080p and another at 720p? That messes with the numbers somewhat, when obviously many gamers have their Steam account on things like HTPCs for indie games on a TV.
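A toy example of how much the counting method matters; the accounts and resolutions below are made up.

```python
# If the survey samples machines, a desktop-plus-laptop account counts
# twice and drags the averages down; counting each user's best hardware
# would not. All data here is invented for illustration.
height = {"720p": 720, "1080p": 1080, "2160p": 2160}

accounts = [
    {"user": "gaming_pc_plus_laptop", "machines": ["1080p", "720p"]},
    {"user": "desktop_only",          "machines": ["1080p"]},
]

per_machine = [res for a in accounts for res in a["machines"]]
per_user = [max(a["machines"], key=height.__getitem__) for a in accounts]

print("counting machines:", per_machine)  # ['1080p', '720p', '1080p'] -> 1/3 sub-1080p
print("counting users:   ", per_user)     # ['1080p', '1080p'] -> none sub-1080p
```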



Why not check me out on youtube and help me on the way to 2k subs over at www.youtube.com/stormcloudlive

And this is why we won't see a mass transition to 4K/60fps gaming in the very near future, which has been one of the go-to thread topics as of late :P



danasider said:
Ganoncrotch said:

Worth considering indeed. When you think about it, if a company creates 4K assets for a game going onto Steam, they're creating those assets to be utilised by 1 in every 100 gamers who buy the game. The percentage of Pro-model PS4s compared to the standard one was somewhere in the region of 20%, I think, so there's a far higher chance of those 4K assets being displayed if you make them for a console game.

I have a Pro myself and think the games look fantastic. But compared to a powerful rig, it's still weak sauce in terms of fidelity. Sure, the art can be great. Heck, Super Mario Odyssey is a fine-looking game, and it's nowhere near the fidelity of what can be achieved on the Pro, much less the Xbox One X or an expensive PC.

If the majority of developers wanted to put the resources into making really good-looking games, they could. They just don't have the motivation when a huge minority (oxymoron there) is the demographic they'd be selling to. And as much as I enjoy the perks of the PS4 Pro over the standard (I owned the original but upgraded after giving mine to my nephew), we're not actually getting native 4K. So we're still probably lower in percentage than even that 1% of PC gamers.

Aye, of course; resolution is just one thing. You had games that ran 1080/60 on the X360, but they were titles like Super Meat Boy and N+, basic 2D games.

Bring a game like Horizon to the PS4, though, and you know that each person who plays it sees the game as the devs created it and wanted it to be experienced. No PlayStation 4 user is playing it with the foliage turned off at sub-720p to get a "playable" framerate. The devs created the game with the hardware in mind, building each part of that lovely world knowing that the people who play it would get to see and experience it, not in a gimped manner because of weak hardware. (Sure, the Pro exists, but that mostly adds additional features to titles; I think the base system is still very much where devs spend most, if not all in some cases, of their time.)

 

Interesting thread, though! Gotta catch a nap here, but I'll check in later; some cool replies going about!



Why not check me out on youtube and help me on the way to 2k subs over at www.youtube.com/stormcloudlive