
The "30fps campaign, 60fps multiplayer" approach

 

What do you think of this method?

I like it: 95 (52.20%)
I dislike it: 36 (19.78%)
Don't feel strongly either way: 51 (28.02%)
Total: 182
Conina said:

The Steam statistics tell us nothing about the settings the "average PC gamer" chooses for their games. Some will reduce resolution and post-processing/effects in favor of fps. Others will sacrifice fps and resolution for the best post-processing/effects. Others will sacrifice fps and post-processing/effects to play at higher resolutions (native or downsampled). Others will sacrifice fps and post-processing/effects to play in stereoscopic 3D or super-widescreen (triple-monitor setups). Others will make compromises while gaming on their laptop but set the sliders to the max when they switch to their gaming PC. Others will adjust their settings for every game or genre individually. Others won't care about settings at all and will just play at the defaults or the settings chosen by Nvidia GeForce Experience.

Different PC gamers play different PC games and genres. Some even favor older games (where maxing out the settings is much easier) or browser games. They play on different form factors with very different performance levels (netbooks, Windows tablets, laptops, gaming laptops, office PCs, "normal" gaming PCs, SLI/CF rigs...).

The concept of the "average PC gamer" is very strange, because the PC gaming community is widely diversified, much more than the community of any other gaming platform. And that's why many PC gamers hate it when a developer limits their options with 30 or 60 fps locks, mandatory V-sync, fixed resolutions and similar stuff.

Looking at the hardware survey, a sizeable bunch are running GTX 650s, 550 Tis, and worse GPUs. I had a 650 until recently and it couldn't even hit 30 fps in most next-gen games at 720p on low. A lot of people there can't reach 60 fps even with everything turned off.

While I understand your point that the community is diverse, since it really is and the survey proves that, this data is still enough to invalidate the initial post's claim that 60 fps is a minimum for all PC gamers. To me, saying that something is "the minimum required" means that everyone (or almost everyone) has it, so he has a point here.



torok said:
Conina said:

The Steam statistics tell us nothing about the settings the "average PC gamer" chooses for their games. Some will reduce resolution and post-processing/effects in favor of fps. Others will sacrifice fps and resolution for the best post-processing/effects. Others will sacrifice fps and post-processing/effects to play at higher resolutions (native or downsampled). Others will sacrifice fps and post-processing/effects to play in stereoscopic 3D or super-widescreen (triple-monitor setups). Others will make compromises while gaming on their laptop but set the sliders to the max when they switch to their gaming PC. Others will adjust their settings for every game or genre individually. Others won't care about settings at all and will just play at the defaults or the settings chosen by Nvidia GeForce Experience.

Different PC gamers play different PC games and genres. Some even favor older games (where maxing out the settings is much easier) or browser games. They play on different form factors with very different performance levels (netbooks, Windows tablets, laptops, gaming laptops, office PCs, "normal" gaming PCs, SLI/CF rigs...).

The concept of the "average PC gamer" is very strange, because the PC gaming community is widely diversified, much more than the community of any other gaming platform. And that's why many PC gamers hate it when a developer limits their options with 30 or 60 fps locks, mandatory V-sync, fixed resolutions and similar stuff.

Looking at the hardware survey, a sizeable bunch are running GTX 650s, 550 Tis, and worse GPUs. I had a 650 until recently and it couldn't even hit 30 fps in most next-gen games at 720p on low. A lot of people there can't reach 60 fps even with everything turned off.

While I understand your point that the community is diverse, since it really is and the survey proves that, this data is still enough to invalidate the initial post's claim that 60 fps is a minimum for all PC gamers. To me, saying that something is "the minimum required" means that everyone (or almost everyone) has it, so he has a point here.

I have a feeling the post was referring more to enthusiast PC gamers, not the kind who only play low-spec games. Conina beat me to it, but if we could somehow measure the average specs of those who play contemporary AAA games on PC, I'd guess most players would be using better-than-console hardware.



curl-6 said:

I have a feeling the post was referring more to enthusiast PC gamers, not the kind who only play low-spec games. Conina beat me to it, but if we could somehow measure the average specs of those who play contemporary AAA games on PC, I'd guess most players would be using better-than-console hardware.

I agree. However, this would be a biased number. Basically all modern AAA games require a GTX 660 at minimum, which is roughly equivalent to a PS4. So everyone playing those games will have better hardware than the consoles, because if they didn't, they wouldn't be able to play them at all.

Also, the original post was about 60 fps. In those games it isn't easy to reach 60 fps, at least not without serious compromises to image quality.



torok said:

I agree. However, this would be a biased number. Basically all modern AAA games require a GTX 660 at minimum, which is roughly equivalent to a PS4. So everyone playing those games will have better hardware than the consoles, because if they didn't, they wouldn't be able to play them at all.

Counting only PS4 gamers instead of all PlayStation gamers (PS1, PS2, PS3, PS4, PSP, PS Vita) is also a biased number by that argument. Everyone playing those PS4 games will have better hardware than a PS3, because if they didn't, they wouldn't be able to play them.



Conina said:

Counting only PS4 gamers instead of all PlayStation gamers (PS1, PS2, PS3, PS4, PSP, PS Vita) is also a biased number by that argument. Everyone playing those PS4 games will have better hardware than a PS3, because if they didn't, they wouldn't be able to play them.

A bit of a stretch, isn't it? As I said, even considering the subset of users you are proposing, you can't guarantee they will run the game at 60 fps, which is the initial question of the whole discussion.



torok said:
Conina said:

Counting only PS4 gamers instead of all PlayStation gamers (PS1, PS2, PS3, PS4, PSP, PS Vita) is also a biased number by that argument. Everyone playing those PS4 games will have better hardware than a PS3, because if they didn't, they wouldn't be able to play them.

A bit of a stretch, isn't it? As I said, even considering the subset of users you are proposing, you can't guarantee they will run the game at 60 fps, which is the initial question of the whole discussion.

I don't guarantee that they will run the game at 60 fps... I'm even quite sure that many of them won't hold a solid 60 fps, and I don't agree with Running_Matt's statement that 60 fps is the minimum for PC gamers.

My main points are that the PC community is too diverse to find any "average group" within it, and that the current Steam stats are a bad basis for discussions about what percentage of a game's players will hit 60 fps and how many won't.



Conina said:

I don't guarantee that they will run the game at 60 fps... I'm even quite sure that many of them won't hold a solid 60 fps, and I don't agree with Running_Matt's statement that 60 fps is the minimum for PC gamers.

My main points are that the PC community is too diverse to find any "average group" within it, and that the current Steam stats are a bad basis for discussions about what percentage of a game's players will hit 60 fps and how many won't.

I agree that it's really hard to find an average unless we have hardware data for the owners of individual games. My only gripe was with the original post by Running_Matt; the rest is spot on.



Don't care.

I grew up playing games on an outdated computer. Anything above 15 fps is playable to me.




torok said:
curl-6 said:

I have a feeling the post was referring more to enthusiast PC gamers, not the kind who only play low-spec games. Conina beat me to it, but if we could somehow measure the average specs of those who play contemporary AAA games on PC, I'd guess most players would be using better-than-console hardware.

I agree. However, this would be a biased number. Basically all modern AAA games require a GTX 660 at minimum, which is roughly equivalent to a PS4. So everyone playing those games will have better hardware than the consoles, because if they didn't, they wouldn't be able to play them at all.

Also, the original post was about 60 fps. In those games it isn't easy to reach 60 fps, at least not without serious compromises to image quality.

If we include everyone who's ever played a game on PC then yes, I agree, 60fps is clearly not the minimum, as plenty of casual players really don't care about framerates as long as they are playable.

Among enthusiast PC gamers though, 30fps is practically as dirty a word as "720p" was for console gamers back in the ResolutionGate days.

Running the majority of console ports at 60fps on PC doesn't require that high end a rig these days.



curl-6 said:

If we include everyone who's ever played a game on PC then yes, I agree, 60fps is clearly not the minimum, as plenty of casual players really don't care about framerates as long as they are playable.

Among enthusiast PC gamers though, 30fps is practically as dirty a word as "720p" was for console gamers back in the ResolutionGate days.

Running the majority of console ports at 60fps on PC doesn't require that high end a rig these days.

Only if you turn down some settings, at least. I have an overclocked GTX 970 and it won't get the job done for a good number of games.