
Forums - Gaming - 60Fps - is it really that important?

 

Is 60 fps just placebo?

Yes, you won't notice a big difference: 98 (28.32%)
No!!!!!!: 248 (71.68%)
Total: 346

As long as it is a rock solid frame rate, I don't care what the actual rate is. It is only when it jerks around that I start to notice it.



Switch Code: SW-7377-9189-3397 -- Nintendo Network ID: theRepublic -- Steam ID: theRepublic

Now Playing
Switch - Super Mario Maker 2 (2019)
3DS - Phoenix Wright: Ace Attorney (Trilogy) (2005/2014)
Mobile - Yugioh Duel Links (2017)
Mobile - Super Mario Run (2017)
PC - Borderlands 2 (2012)
PC - Deep Rock Galactic (2020)

vivster said:
Eddie_Raja said:
It depends on the game, and to say framerate is sacrificed when a game is 1080p is completely ignorant and idiotic.

1) Framerate is often tied to how fast the CPU is, and these consoles have a disproportionately weak CPU relative to their GPU (when compared to previous gens). Just because a game is 1080p 30 FPS does not mean it would have been 60 FPS at a lower resolution.

Sorry, I had to stop reading after this. Where the hell did you get that one from?

While it is true that the new consoles have a potato where the CPU should be, everything is still bottlenecked at the GPU, which is also part of the same potato. Tell me one game that is limited by CPU and I will tell you a thousand that are limited by GPU.

I guess these X1 developers must be complete idiots. They actually lower the resolution on multiplat games even though the X1 has a slightly stronger CPU. What a bunch of morons.


Hahaha shows how much you know.

 

For instance, the PS3 at the time had the equivalent of an i5 and a high-end GPU. When the PS4 launched it had the equivalent of an i3 and a high-end GPU (however, it has far more RAM for its time than the PS3 did).

 

But all of this doesn't matter because you clearly didn't read what I said - The bulk of my post talked about how developers are sacrificing resolution for extra effects (Effects you can't see well because the resolution is too low).

 

Furthermore, I am not sure how you have missed that some of the more poorly optimized games, like AC Unity and Project Cars, do in fact keep a better framerate on X1.



Prediction for console Lifetime sales:

Wii:100-120 million, PS3:80-110 million, 360:70-100 million

[Prediction Made 11/5/2009]

3DS: 65m, PSV: 22m, Wii U: 18-22m, PS4: 80-120m, X1: 35-55m

I guarantee the PS5 comes out only 5-6 years after the launch of the PS4.

[Prediction Made 6/18/2014]

Eddie_Raja said:
vivster said:

Sorry, I had to stop reading after this. Where the hell did you get that one from?

While it is true that the new consoles have a potato where the CPU should be, everything is still bottlenecked at the GPU, which is also part of the same potato. Tell me one game that is limited by CPU and I will tell you a thousand that are limited by GPU.

I guess these X1 developers must be complete idiots. They actually lower the resolution on multiplat games even though the X1 has a slightly stronger CPU. What a bunch of morons.


Hahaha shows how much you know.

For instance, the PS3 at the time had the equivalent of an i5 and a high-end GPU. When the PS4 launched it had the equivalent of an i3 and a high-end GPU (however, it has far more RAM for its time than the PS3 did).

But all of this doesn't matter because you clearly didn't read what I said - The bulk of my post talked about how developers are sacrificing resolution for extra effects (Effects you can't see well because the resolution is too low).

Furthermore, I am not sure how you have missed that some of the more poorly optimized games, like AC Unity and Project Cars, do in fact keep a better framerate on X1.

That still does not make your statements true. These "effects" you are mentioning are calculated mostly by the GPU as well. The comparison with the PS3 is moot: the theoretical strength of the PS3 was never used, and it's not even close to that of a current i5. The current console CPUs can't even keep up with a current i3, but that is still more than enough to run the games.

Current games are not bottlenecked by the CPU of the consoles, period. That's why developers are offloading the GPU by reducing the resolution. The CPU rarely even comes into play. No matter how weak the CPU is, the GPU is comparatively even weaker, since it's on the same chip and can't even dream of reaching the power of a dedicated GPU. There is enough proof in the myriad of games that run better on PS4 at the same resolution, or about the same at an even higher resolution, despite the PS4 having a slightly weaker CPU.

Again, care to explain why reducing the resolution gives the games a performance boost if they are actually bottlenecked by the CPU? If what you say is true, reducing the resolution would do nothing.

Every game that is not reliant on heavy physics or countless individual units, and is not poorly optimized (i.e. 99% of all games), will run into the GPU bottleneck before the CPU gives up. It doesn't matter how weak the CPU of the consoles is; the GPU is comparatively many times weaker than it.
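The argument above can be sketched with a toy model (illustrative only, with made-up timings): per-frame cost is roughly max(CPU time, GPU time), and GPU time scales roughly with pixel count while CPU time does not. That is exactly why dropping resolution helps a GPU-bound game and does nothing for a CPU-bound one.

```python
def frame_rate(cpu_ms, gpu_ms_at_1080p, width, height):
    """Estimate fps for a hypothetical game at a given resolution.

    The slower of the two stages (CPU or GPU) sets the frame time;
    GPU cost is assumed to scale linearly with pixel count.
    """
    pixels = width * height
    gpu_ms = gpu_ms_at_1080p * pixels / (1920 * 1080)  # GPU cost ~ pixels
    frame_ms = max(cpu_ms, gpu_ms)                     # slower stage wins
    return 1000.0 / frame_ms

# GPU-bound game: 10 ms of CPU work, 33 ms of GPU work at 1080p.
print(round(frame_rate(10, 33, 1920, 1080)))  # ~30 fps at 1080p
print(round(frame_rate(10, 33, 1600, 900)))   # noticeably higher at 900p

# CPU-bound game: 33 ms of CPU work. Lowering resolution changes nothing.
print(round(frame_rate(33, 20, 1920, 1080)))  # ~30 fps
print(round(frame_rate(33, 20, 1280, 720)))   # still ~30 fps
```

The numbers are invented; the point is only the shape of the model: resolution drops buy frames precisely when the GPU is the limiting stage.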



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

SvennoJ said:
LudicrousSpeed said:
I would take 60 fps over a higher resolution every time.

It's sad when you have a situation like Uncharted 4, where the dev understands the importance of 60fps in one section of the game but is too hesitant to make the full game 60fps, out of fear of backlash or whatever it is that's keeping the single player at 30fps.

A good looking game with a smooth 60fps doesn't create as much sizzle as a better looking game with a meh framerate and that sucks. There's virtually no backlash when next gen games have framerate issues, but a 900p resolution... oh my god!

I'm ready for a new gen of consoles where we can get 60fps and maybe devs won't have to sacrifice for it.

http://gamrconnect.vgchartz.com/thread.php?id=210040&page=1#
Just one of many. Every time a new game comes out and drops a couple of frames it's time to crucify the developers.
Much more than the 900p vs 1080p MGS 5 discussion.

How did 60 fps work out for Halo 5? Seems it's not that much of a selling point after all. There was much more backlash over leaving out splitscreen co-op. I started TLOU Remastered at 60fps yet switched to 30fps after a while. The improved shadows were more immersive than a faster framerate. And after a minute at 30fps I actually didn't notice the drop anymore.

I have nothing against them adding a 720p60 mode next to 1080p30. That's probably still not enough to straight up double the fps, but with some LOD tweaks it shouldn't be hard to include. It should be even easier for cross-gen titles. Why not have the option in Rise of the Tomb Raider to play the 360 version profile at 720p60 on Xbox One, next to the 1080p30 version that drops to 25fps? It should even be easy for Fallout 4 with its multiplat engine: drop to 720p60 at minimum details. I guess doubling the QA for two performance profiles takes too much time, plus it's not exactly bug free with just the one...
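A quick back-of-envelope check on the 720p60 idea above (my arithmetic, not the poster's): 720p60 actually pushes slightly fewer pixels per second than 1080p30, since each 720p frame has well under half the pixels of a 1080p frame. Per-frame CPU and geometry work still doubles at 60fps, though, which is one reason "just halve the resolution" doesn't straightforwardly double the framerate.

```python
# Pixels per frame for each resolution.
p1080 = 1920 * 1080  # 2073600 pixels
p720 = 1280 * 720    # 921600 pixels

# Pixels per second for each profile.
throughput_1080p30 = p1080 * 30
throughput_720p60 = p720 * 60

print(throughput_1080p30)                      # 62208000
print(throughput_720p60)                       # 55296000
print(throughput_720p60 / throughput_1080p30)  # ~0.89
```

So in raw pixel throughput the 720p60 profile is about 11% cheaper, which supports the idea that it is the per-frame (CPU-side) work, not fill rate, that makes the mode nontrivial to offer.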

Right, because all the FUD we saw on this specific forum over how "broken and unplayable" Fallout 4 is, is an accurate representation of how the online gaming community in general reacts when a game has framerate issues, versus the audacity of releasing a game below 1080p.

Not sure what the heck Halo 5, which has sold and reviewed quite well since you're asking, has to do with how important 60fps is to me but I'm just gonna stop reading there.



Most of the games I play are 30fps and I am ok with it. So I prefer better gfx.

720p60 and 1080p30 options would be great, but I am not sure devs can do that on consoles so easily, sadly.



 

UltimateUnknown said:
MikeRox said:


This was pretty much how I first started to notice the difference. Well I more could see a "double vision" effect on the movement on some of my games, while others flew by silky smooth. (Metropolis Street Racer and F355 Challenge were the click moment when I tried to find out what the difference was).

The thing is, not everybody CAN see the difference. However, I can guarantee that if you give me a blind test of 30fps vs 60fps, I can pick them out every time. This was how I knew Zelda WW was 30fps despite all the reviews claiming 60fps. What a disappointing day that was.

See that's the thing. To me the difference is so obvious that it's hard to believe others may not see it.

I think this website does a very good comparison between the two: http://www.30vs60.com/bf4driving.php


Those comparisons were the ones I was thinking of when I said in my original post that I can't passively/visually see the difference between stable 30 FPS and 60 FPS. I tried to do a blind test when I first found the website and kept guessing that 30 FPS was 60 FPS (perhaps because that's what my eyes are more used to).

This makes me think that my personal ability to visually discern more frames ends somewhere in the 24-40 range (I did notice the difference in The Hobbit, and do notice it on 60 FPS YouTube streams that show the streamers as well).

It's not a lie, and I'm not deceiving myself; I just honestly can't really tell the difference visually.

Now while playing it's another story, I can definitely 'feel' higher framerates, and for some games it helps a lot.



SuperNova said:
UltimateUnknown said:

See that's the thing. To me the difference is so obvious that it's hard to believe others may not see it.

I think this website does a very good comparison between the two: http://www.30vs60.com/bf4driving.php


Those comparisons were the ones I was thinking of when I said in my original post that I can't passively/visually see the difference between stable 30 FPS and 60 FPS. I tried to do a blind test when I first found the website and kept guessing that 30 FPS was 60 FPS (perhaps because that's what my eyes are more used to).

This makes me think that my personal ability to visually discern more frames ends somewhere in the 24-40 range (I did notice the difference in The Hobbit, and do notice it on 60 FPS YouTube streams that show the streamers as well).

It's not a lie, and I'm not deceiving myself; I just honestly can't really tell the difference visually.

Now while playing it's another story, I can definitely 'feel' higher framerates, and for some games it helps a lot.

As someone else pointed out, 60FPS is something that enhances gameplay more than visuals. So being able to actually "see" it in a video is not so important compared to feeling it when you're playing, which, as you said, is quite apparent.



 

Quality - Does it even matter?



Hiku said:

/thread.



Hiku said:


I can't see the difference.

Also, my eyes are closed.