Mr Puggsly said:
Random_Matt said: 60 all the way. Gaming at 60 is a minimum standard for PC these days; how long are consoles going to be stuck at sub-30?
The average PC gamer is not playing X1/PS4-quality games at 60 fps.
That's apparent from the Steam statistics.
The Steam statistics tell you nothing about the settings the "average PC gamer" chooses for their games:

- Some will reduce resolution and post-processing/effects in favor of fps.
- Others will sacrifice fps and resolution for the best post-processing/effects.
- Others will sacrifice fps and post-processing/effects to play at a higher resolution (native or downsampled).
- Others will sacrifice fps and post-processing/effects to play in stereoscopic 3D or super-widescreen (triple-monitor setups).
- Others will make compromises while gaming on their laptop but set the sliders to the max when they switch to their gaming PC.
- Others will adjust their settings for every game or genre individually.
- Others won't care about settings at all and will just play at the defaults or the settings chosen by Nvidia GeForce Experience.
Different PC gamers play different games and genres. Some even favor older games (where maxing out the settings is much easier) or browser games. They play on form factors with very different performance levels (netbooks, Windows tablets, laptops, gaming laptops, office PCs, "normal" gaming PCs, SLI/CF rigs...).
The concept of the "average PC gamer" makes little sense, because the PC gaming community is widely diversified, far more so than the community of any other gaming platform. And that's why many PC gamers hate it when a developer limits their options with 30 or 60 fps locks, mandatory V-sync, fixed resolutions, and similar restrictions.