The "30fps campaign, 60fps multiplayer" approach


What do you think of this method?

I like it: 95 (52.20%)
I dislike it: 36 (19.78%)
Don't feel strongly either way: 51 (28.02%)

Total: 182
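For anyone unfamiliar with how the split usually works in practice, here's a minimal sketch of a per-mode frame cap (hypothetical Python, not taken from any actual game or engine):

import time

# Hypothetical per-mode frame cap (illustrative only, not from any real engine):
# the campaign targets 30 fps, multiplayer targets 60 fps, and the loop simply
# sleeps away whatever is left of each frame's time budget.
FRAME_CAPS = {"campaign": 30, "multiplayer": 60}

def run_frame(mode, render_frame):
    budget = 1.0 / FRAME_CAPS[mode]      # seconds available per frame
    start = time.perf_counter()
    render_frame()                       # simulate and render one frame
    elapsed = time.perf_counter() - start
    if elapsed < budget:
        time.sleep(budget - elapsed)     # wait out the rest of the budget

In a real engine the cap is usually tied to vsync rather than a sleep, but the idea is the same: the campaign gets a bigger per-frame budget to spend on visuals, while multiplayer trades visuals for responsiveness.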
Conina said:

I don't guarantee that they will run the game at 60 fps... I'm even quite sure that many of them won't have a solid 60 fps, and I don't agree with Running_Matt's statement that 60 fps is the minimum for PC gamers.

My main points are that the PC community is too diverse to find any "average group" within it, and that the current Steam stats are a bad basis for discussions about what percentage of a game's players will hit 60 fps and how many won't.

I agree that it's really hard to find an average unless we have the hardware data for owners of individual games. My only gripe was really with the original post by Running_Matt; the rest is spot on.




Don't care.

I grew up playing games on an outdated computer. Anything above 15 fps is playable to me.




torok said:
curl-6 said:

I have a feeling the post was referring more to enthusiast PC gamers, not the kind who only play low-spec games. Conina beat me to it, but if we could somehow measure the average specs of those who play contemporary AAA games on PC, I'd guess most players would be using better-than-console hardware.

I agree. However, this would be a biased number. Basically all modern AAA games require a GTX 660 at minimum, which is roughly equivalent to a PS4. So everyone playing those games will have better hardware than the consoles, because if they didn't, they wouldn't be able to play them at all.

Also, the original post was about 60 fps. In those games, it isn't exactly easy to reach 60 fps, at least not without serious compromises to quality.

If we include everyone who's ever played a game on PC then yes, I agree, 60fps is clearly not the minimum, as plenty of casual players really don't care about framerates as long as they are playable.

Among enthusiast PC gamers though, 30fps is practically as dirty a word as "720p" was for console gamers back in the ResolutionGate days.

Running the majority of console ports at 60fps on PC doesn't require that high end a rig these days.



curl-6 said:

If we include everyone who's ever played a game on PC then yes, I agree, 60fps is clearly not the minimum, as plenty of casual players really don't care about framerates as long as they are playable.

Among enthusiast PC gamers though, 30fps is practically as dirty a word as "720p" was for console gamers back in the ResolutionGate days.

Running the majority of console ports at 60fps on PC doesn't require that high end a rig these days.

Only if you at least turn down some settings. I have an OC GTX 970 and it won't get the job done for a good bunch of games.



torok said:
curl-6 said:

If we include everyone who's ever played a game on PC then yes, I agree, 60fps is clearly not the minimum, as plenty of casual players really don't care about framerates as long as they are playable.

Among enthusiast PC gamers though, 30fps is practically as dirty a word as "720p" was for console gamers back in the ResolutionGate days.

Running the majority of console ports at 60fps on PC doesn't require that high end a rig these days.

Only if you at least turn down some settings. I have an OC GTX 970 and it won't get the job done for a good bunch of games.

In the case of a few poorly optimized PC ports perhaps, but an OC GTX970 outperforms consoles by a large margin; Digital Foundry's tests got an 87fps average on Battlefield 4 at 1080p and Ultra settings, while on PS4 the game runs at 900p with a mix of low, medium and high settings. Heck, the 970 will even let you run the game at about 40fps in 4K at high settings.

DF were also able to get Star Wars Battlefront running at 1080p/60fps on a GTX 780, a substantially weaker card, with everything but shadows set to Ultra.
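Just to put that gap in rough numbers, here's a quick back-of-envelope comparison (only a sketch; it assumes the PS4 version holds its 60fps target and ignores that the PC figures are also at much higher settings):

# Back-of-envelope pixel throughput, based on the DF figures quoted above.
# Assumption: the PS4 version holds its 1600x900 / 60 fps target; settings
# differences are ignored, so this understates the real gap.
def pixels_per_second(width, height, fps):
    return width * height * fps

gtx970_bf4 = pixels_per_second(1920, 1080, 87)   # ~180 million pixels/s
ps4_bf4    = pixels_per_second(1600, 900, 60)    # ~86 million pixels/s
print(f"ratio: {gtx970_bf4 / ps4_bf4:.2f}x")     # roughly 2.1x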



Around the Network
curl-6 said:

In the case of a few poorly optimized PC ports perhaps, but an OC GTX970 outperforms consoles by a large margin; Digital Foundry's tests got an 87fps average on Battlefield 4 at 1080p and Ultra settings, while on PS4 the game runs at 900p with a mix of low, medium and high settings. Heck, the 970 will even let you run the game at about 40fps in 4K at high settings.

DF were also able to get Star Wars Battlefront running at 1080p/60fps on a GTX 780, a substantially weaker card, with everything but shadows set to Ultra.

Yes, I know it does. BF4 and Battlefront aren't exactly the games I'm complaining about, because they are impressively easy to run at high fps with great settings. The games where it's really hard to reach 60 fps without compromises are mainly The Witcher 3, Ryse, Watch Dogs, Mortal Kombat X, and others.

I won't count Arkham Knight and the new AC games for obvious reasons.



I can't see why it's a problem. Uncharted 4 did it and it was amazingly well received. At the end of the day, if the game is truly great and the framerate is consistent, I don't think anyone could or should complain.



It depends. In hack and slash games, 60 fps is a great help, but a steady 30 fps is playable too. So YES, 30 fps for the campaign isn't bad at all.



 


torok said:

Yes, I know it does. BF4 and Battlefront aren't exactly the games I'm complaining about, because they are impressively easy to run at high fps with great settings. The games where it's really hard to reach 60 fps without compromises are mainly The Witcher 3, Ryse, Watch Dogs, Mortal Kombat X, and others.

Of course you can choose settings in these games where a GTX 970 / GTX 780 Ti / R9 290/X / R9 390/X / RX 480 or even a GTX 980 can't keep a solid 60 fps at 1920x1080. But these maxed out / Ultra settings aren't comparable to the settings a PS4 or XBO use.

When you use medium to high settings instead, the graphical fidelity should be closer to the PS4 settings and result in double the framerate of the console version (or above).
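To put rough numbers on why settings matter so much, the frame-time budgets are simple arithmetic (nothing engine- or game-specific here):

# Plain frame-time arithmetic, not tied to any particular game or engine:
# doubling the framerate halves the time available to finish each frame,
# which is why stepping down from Ultra toward console-level settings helps.
def frame_budget_ms(fps):
    return 1000.0 / fps

for fps in (30, 60):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
# 30 fps -> 33.3 ms per frame
# 60 fps -> 16.7 ms per frame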



Conina said:

Of course you can choose settings in these games where a GTX 970 / GTX 780 Ti / R9 290/X / R9 390/X / RX 480 or even a GTX 980 can't keep a solid 60 fps at 1920x1080. But these maxed out / Ultra settings aren't comparable to the settings a PS4 or XBO use.

When you use medium to high settings instead, the graphical fidelity should be closer to the PS4 settings and result in double the framerate of the console version (or above).

Yes, with those settings I can achieve that. I just find it a bit underwhelming that I spent the price of my PS4 on this GPU and it gives me just that. Again, I'm referring to the original discussion about how 1080p/60 is the minimum on PC. My GPU isn't exactly cheap or weak enough to be considered the minimum, especially when the OC version of the 970 actually comes close to stock 980 territory, so it's a pretty beefy GPU. Also note that I'm not just turning everything up to max without looking; I'm using some common sense: cutting down a bit on the AA because it's demanding, not using supersampling, and avoiding HairWorks. If I turned everything on at 1080p I wouldn't reach 30 fps.

I want to explain to people reading this that, unlike the original comment that started it all, things aren't that easy to achieve. Some PC gamers try to sell the idea that any mid-end GPU like a 960 will do miracles while a 970-class one will give you ultra and 60 fps at 1080 and that 4K with a single GPU is easy to achieve. After upgrading my GPU, I saw the real picture. I'm not saying that I regret buying it, specially because it's a damn great GPU, but people have to be a bit more realistic. What they are trying to sell us is that you will get console-crushing visuals @ 60fps and maybe even get those at 1440p or more. At least with reasonable financial investiments, this is BS for a sizeable bunch of games.