
WHEN did FPS become important!!

Depends heavily on the type of game. If the game requires quick reactions, such as a fighting game or a hack and slash, a high FPS is crucial. One of a myriad of reasons why the original release of the DmC reboot is such a step down from previous Devil May Cry games, in my opinion, is the 30 FPS cap; not only is your window to react effectively cut in half compared to 60 FPS, but the game just doesn't look as smooth, which is a big contributor to the feeling of free-flowing movement.

Stable 30 FPS is acceptable (though not preferable) for most titles, but there have been way too many recent titles with issues reaching even that: Arkham Knight on PC, the original Dark Souls, Dead Rising 3, Ryse, Titanfall, etc. Going anywhere below 30 severely hampers player reaction and compromises immersion.
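To put rough numbers on that (just a back-of-the-envelope sketch, not measurements from any particular game): the time each frame sits on screen roughly doubles going from 60 to 30 fps, and a single dropped frame costs far more at 30 than at 60 or 120.

# Rough frame-time math, purely illustrative: how long each frame is displayed
# at a given framerate, and how long you wait if one frame is dropped.
for fps in (30, 60, 120, 144):
    frame_ms = 1000.0 / fps      # time one frame stays on screen
    hitch_ms = 2 * frame_ms      # a single dropped frame doubles that wait
    print(f"{fps:>3} fps: {frame_ms:5.1f} ms per frame, {hitch_ms:5.1f} ms on a dropped frame")

# Prints roughly:
#  30 fps:  33.3 ms per frame,  66.7 ms on a dropped frame
#  60 fps:  16.7 ms per frame,  33.3 ms on a dropped frame
# 120 fps:   8.3 ms per frame,  16.7 ms on a dropped frame
# 144 fps:   6.9 ms per frame,  13.9 ms on a dropped frame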



mZuzek said:
CGI-Quality said:

For me, I really can't go back below 120fps. I'll tolerate 60 if the game can't go above that (though on PC, there are usually ways around it). However, if I can hit 120 (or 144 in the case of my latest monitor @ 1440p), that's what I'll always strive for. It's why I haven't upgraded to 4K on PC. The smoothness of 144fps is just too great.

On console, I tolerate 30-60 because there isn't a choice, but since my main gaming platform is PC, 120+ or bust! :P

Now that's what I call a whole new level of elitist standards.

I mean, disliking 30fps I get, but I really can't see how anyone wouldn't enjoy a game at 60fps.

Elitist?



Before I say anything, it should be stated that a game is meant to be fun. If one feature matters so much to someone that its absence makes the game unplayable, or close to it, that's fair. It's their money they're spending, and it should go toward something they deem acceptable.

Having said that, I think the real problem lies with performance. I personally don't care if the game is 30, 60 or 120 fps, as long as the performance is smooth; in other words, as long as the experience stays consistent. Much like resolution, the more the merrier, but I don't mind as long as the game plays like it should.

I do find it a bit ridiculous when people say "oh, it's not 60fps (even for genres that benefit more from it), so it's unplayable/sucks/whatever reason for it to be a bad game." Games for the longest time didn't achieve that, and many of them are considered the best of all time, because they (or most of them, anyway) performed well. The same holds true nowadays, in my opinion. As a side note, I also find it silly when people put too much weight on things like trophies and such frivolities (they're a nice feature, but come on).

But to each their own.



My (locked) thread about how difficulty should be a decision for the developers, not the gamers.

https://gamrconnect.vgchartz.com/thread.php?id=241866&page=1

Qwark said:
Actually, it isn't all that important, really. If you ask this site which games are the best 8th gen games, popular answers would be Bloodborne, Horizon, Uncharted 4, Zelda BotW, The Witcher 3, etc. You know what all those games have in common besides a high Metacritic score? They all run at 30fps.

Good games are good games, am I right? I understand wanting a higher framerate for fast and competitive games, but as long as it's locked at 30 I'm good to go.



Zach808 said:
When you were a kid with your NES and SNES, did you even know or care what FPS was? Exactly. Now, even with these classic games, there was a difference. For example, Ghosts 'n Goblins on the NES. I never liked that game, mostly due to the stiff, unresponsive controls. Replaying it later, I now understand why the controls are so unresponsive: the framerate is god-awful. Compare it to something like Super Mario Bros. and its smooth 60fps. It's like night and day.

But even Super Mario Bros. has its fair share of slowdowns. And let's not forget, for example, Mega Man 2: Metal Man's stage comes to mind, in the part with the "wall driller" where the game slows to a crawl.

 

But low FPS isn't a deal breaker if it's reasonably consistent (slowdowns included); wildly fluctuating FPS is a bigger problem. Heck, I remember being greatly helped by the slow-motion button on my NES Advantage (basically a quick toggle of the Start button) in games like Gradius and Life Force. The effect is the same as really low FPS, but the added reaction time to bullets, obstructions, etc. made some parts of those games a lot easier.




Anything better is better. I don't think it's a matter of necessity for most, but rather an easily quantifiable comparison to build a decision or argument around.

You do realize, though, that from the Atari days through even the GCN days, not many people knew what fps was. The internet wasn't as vibrant a place for discussion as it is now, and everyone was just impressed that they could even play some of these marvels. It wasn't until CD skips and, mainly, online play that fps became such a huge issue, due to the differences in user experience. Now, with 4K, the emphasis keeps growing.



The gaming industry always needs something to love and something to hate. Every generation holds onto something it cherishes and something it despises. This gen it was all about loving resolution and hating loot boxes.



^This too. 



Once upon a time, before LCD monitors, PC benchmark sites used 40fps as the cutoff point for acceptable performance. I find even 60fps somewhat slow for keyboard/mouse, but I'm perfectly fine with 30fps when I play games designed to be played with a controller.



What you don't know can't hurt you. Or rather in this case what you don't know/pay attention to can't hurt you.


