
Why 30FPS looks worse on PC than on Consoles

 

...

I understand, thanks - 6 votes (25.00%)
I don't understand, no thanks - 10 votes (41.67%)
<3<3<3 - 8 votes (33.33%)
Total: 24

PC gamers say that 30fps is like playing with a pile of shit. Now don't get me wrong, I prefer 60 always; even if I have to sacrifice some effects, 60fps is a must. My rig is good enough, but some games are not well optimized, so I have to sacrifice stuff :P But I can also play a console game at 30fps just fine, because I'm more of a console gamer than a PC gamer. To the point...

On consoles, games that run at 30fps are programmed to run at 30fps. On PC, most games, almost every game, have an unlocked framerate or the option for vertical synchronization. What's vsync? The basic idea is that it synchronizes your FPS with your monitor's refresh rate. The majority of PC gamers have monitors with a 60Hz refresh rate or higher, which means 60fps or higher.

When your rig is not good enough and your framerate drops below 60, or when you try to force-lock a 60fps game or a game with an unlocked framerate to 30fps with external software, the image is going to look like a stuttering mess, because the frames are arriving at half your monitor's refresh rate without being lined up with it. To avoid that, some games like GTA have an option called half vsync (or double vsync). What's half vsync? It synchronizes to half of your refresh rate, so it caps at 30fps on a 60Hz monitor, and it looks much better than a game with an unlocked framerate simply locked at 30fps. NVIDIA (adaptive vsync, frame limiter) and AMD (AMD Settings, it's shit tho) also have their own options to do this.
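To make the frame-pacing part concrete, here's a rough Python sketch (just my own illustration with made-up timings, not code from any game or driver). On a 60Hz monitor a frame only becomes visible at a vertical refresh, so what matters is how many refreshes each frame stays on screen: with half-refresh vsync every frame is held for exactly two refreshes, while a plain 30fps cap that isn't aligned to the refresh cycle gives you a mix of one, two and three refreshes, and that mix is the stutter.

```python
import math

# My own rough sketch (not real driver code) of frame pacing on a 60 Hz monitor.
# A new frame only becomes visible at a vertical refresh ("vblank"), one every
# ~16.7 ms, so time below is measured in refresh periods.

def displayed_refreshes(present_times):
    """Snap each present to the next vblank and return how many refreshes
    (16.7 ms each) every frame actually stays on screen."""
    vblanks = [math.ceil(t - 1e-9) for t in present_times]
    return [later - earlier for earlier, later in zip(vblanks, vblanks[1:])]

# Half-refresh vsync: the driver flips the image exactly every 2nd refresh.
half_vsync = [2 * i for i in range(10)]

# Plain 30 fps limiter: ~33 ms apart on average, but the presents are not
# aligned to the vblank and jitter by a few ms (made-up numbers).
intervals = [1.6, 2.4, 1.7, 2.3, 1.6, 2.5, 1.8, 2.2, 1.7]   # in refresh periods
plain_cap, t = [0.25], 0.25
for step in intervals:
    t += step
    plain_cap.append(t)

print("half-refresh vsync:", displayed_refreshes(half_vsync))  # [2, 2, 2, ...] -> smooth
print("plain 30 fps cap  :", displayed_refreshes(plain_cap))   # mixes 1s and 3s -> judder
```

Same average framerate in both cases, but only one of them paces the frames evenly.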

Now, I'm not saying this is the only reason; most PC gamers really can't stand a game running at 30fps, period. But I'm giving a different point of view. Sometimes it's not as simple as "30fps looks like shit": sometimes PC gamers have the wrong configuration, and sometimes console gamers need to understand that 30fps is not as smooth for everyone as it is on consoles.

I'm not the best at explaining stuff, plus my English is very basic, so I understand if you don't understand. So here's a video:



CGI-Quality said:
Could be because they're used to better framerates, overall. So when going from games at 60-120fps down to 30, it just looks (and feels).....well.....bad! Then again, it depends on the game. For example, LA Noire's devs claimed that anything above 30fps would mess up the experience (on PC, they were actually right - the cutscenes had some trouble).

Yeah, that's the case too, but this could also be a reason; I'm talking about bad configurations. Not every PC gamer has a good rig, and sometimes their framerates are below 60fps. In fact, I would say that a lot of PC gamers are not playing on a beast PC, and then they have a bad gaming experience because they don't know how to configure some stuff.



CGI-Quality said:

Could be because they're used to better framerates, overall. So when going from games at 60-120fps down to 30, it just looks (and feels).....well.....bad! Then again, it depends on the game. For example, LA Noire's devs claimed that anything above 30fps would mess up the experience (on PC, they were actually right - the cutscenes had some trouble).

It's the opposite of what the OP said, honestly, to some extent. You understand perfectly. If I play a game at 30fps first, I can tolerate it. Hence why I'm getting the Neo; I'm buying it for FFXV. When the PC version comes out, the Neo version of FFXV will be unplayable to me. It's a night and day difference. When you're on a console, for the most part there isn't much room for customization. I can't play below 1080p on PC; I'd rather lower the settings to play at 60fps. It's intolerable because I have the option, and I know the difference between 900p and 1080p is night and day. On a console the FOV, framerate and graphical fidelity are locked, and if there's no PC version there's no unsatisfied feeling, because I'd rather play a game at 30 than not at all. Even with Bloodborne, I noticed stuttering the way mid-range rigs will. If you use a mouse and keyboard at 30fps..... mouse input is terrible at 30, that's definitely more noticeable.

However, the OP is right to an extent. Console games are generally optimized for one setting, 2-3 with the Neo; the game is built from the ground up to accommodate the PS4. I don't like motion blur, most PC guys don't, but if someone does, the framerate will look less jarring when playing a console game at 30fps. It comes down to not knowing any better. There's no option on consoles to up the FPS, and since there is on PC, one is left with an unsatisfied feeling, because even 35-45 FPS is nicer than 28-30, no joke. 30FPS isn't terrible, but if you have a great rig you won't want to play games at 60fps. That's why PC dudes are hyped about G-Sync monitors, to circumvent V-Sync entirely.




Two things.
G-Sync removes stutter even with an unlocked framerate.
Consoles on a TV will have even more input lag added on top of the already horrible 30fps input lag, thanks to the TV's processing. Plus the lag of V-Sync!

So no, not even close. Right now I can't even play RL on my TV with my PC at 60fps because of the input lag. So consoles are even way worse at 60fps.
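To put rough numbers on that input lag point, here's a quick back-of-the-envelope sketch. Every figure in it is an assumed ballpark for illustration (one frame of render time, one frame of vsync buffering, and some display processing), not a measurement of any particular console, TV or monitor.

```python
# Back-of-the-envelope input-lag sketch; all numbers are assumed ballparks,
# not measurements of any specific hardware.

def total_lag_ms(fps, vsync_buffered_frames, display_processing_ms):
    frame_ms = 1000 / fps
    # roughly one frame of simulation/render time, plus frames queued by vsync
    pipeline_ms = frame_ms * (1 + vsync_buffered_frames)
    return pipeline_ms + display_processing_ms

# PC at 60 fps with vsync on a low-lag gaming monitor (assumed ~5 ms processing)
print("PC, 60 fps, monitor:", round(total_lag_ms(60, 1, 5), 1), "ms")
# Console at 30 fps with vsync on a TV outside game mode (assumed ~50 ms processing)
print("Console, 30 fps, TV:", round(total_lag_ms(30, 1, 50), 1), "ms")
```

Double the frame time plus TV processing adds up fast, which is why the TV's game mode matters so much.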




V-Sync actually synchronizes the framerate with the display device's refresh rate so that the frames get displayed at proper times. In practice, this means that if, for example, your refresh rate is 60 Hz but you can't hit 60 FPS reliably, it'll cap your FPS to 30 FPS. And if you can't even hit 30, it'll cap it to 15. Half V-Sync supposedly does exactly the same, except that the cap is at maximum half of the display device's refresh rate, so it shouldn't solve any problems, I think.

I'd also like an explanation as to why the same problem doesn't apply to consoles because to me it seems like it should. It's exactly the same technology as far as I know.

Also, 30 FPS looks bad on both PC and consoles. More importantly though, it feels bad.
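That 60 → 30 → 15 capping is just what plain double-buffered vsync does when a frame takes longer than one refresh interval: the finished frame has to wait for the next vblank, so every frame ends up occupying two (or three, or four) refreshes. A tiny sketch of that math, assuming simple double buffering with no triple buffering or adaptive vsync:

```python
import math

# Effective framerate under plain double-buffered vsync (illustration only):
# a finished frame waits for the next vblank, so if rendering takes longer
# than one refresh interval, every frame occupies 2 (or 3, 4, ...) refreshes.

def vsync_fps(render_ms, refresh_hz=60):
    refresh_ms = 1000 / refresh_hz
    refreshes_per_frame = max(1, math.ceil(render_ms / refresh_ms))
    return refresh_hz / refreshes_per_frame

for render_ms in (15, 17, 20, 34, 51):
    print(f"{render_ms} ms per frame -> {vsync_fps(render_ms):.0f} fps with vsync")
# 15 ms -> 60 fps, 17 ms -> 30 fps, 20 ms -> 30 fps, 34 ms -> 20 fps, 51 ms -> 15 fps
```

(Between 30 and 15 there's technically a 20fps step too; triple buffering or adaptive vsync soften those hard steps at the cost of other trade-offs.)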



Zkuq said:
V-Sync actually synchronizes the framerate with the display device's refresh rate so that the frames get displayed at proper times. In practice, this means that if, for example, your refresh rate is 60 Hz but you can't hit 60 FPS reliably, it'll cap your FPS to 30 FPS. And if you can't even hit 30, it'll cap it to 15. Half V-Sync supposedly does exactly the same, except that the cap is at maximum half of the display device's refresh rate, so it shouldn't solve any problems, I think.

I'd also like an explanation as to why the same problem doesn't apply to consoles because to me it seems like it should. It's exactly the same technology as far as I know.

Also, 30 FPS looks bad on both PC and consoles. More importantly though, it feels bad.

Try to cap Witcher 3 with the game's own framerate limiter: the game will be capped at 30fps, but it will run like shit even without motion blur, because it's just a framerate limiter. Then use RadeonPro's double vsync or NVIDIA's control panel instead, and the game will be smooth even at 30fps. I did the same with various games; nobody can tell me it's not true, because I did it myself.

I'm not talking about whether 30fps looks bad or not. I'm talking about the configuration, and about why it looks worse on PC compared to a console when you use a plain framerate limiter or when your rig is not powerful enough.
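For anyone wondering what the actual difference is: an in-game or external framerate limiter basically just waits so that frames come out roughly 33ms apart, but it has no idea when the monitor's vblank happens, so the presented frames still land at arbitrary points in the refresh cycle. Half-refresh vsync instead flips the image only on every second refresh. Here's a minimal sketch of that kind of limiter loop; it's my own illustrative example, not code from The Witcher 3, RadeonPro or any driver:

```python
import time

TARGET_FRAME_TIME = 1 / 30          # ~33.3 ms between frames

def limited_game_loop(render_one_frame, keep_running):
    """Plain framerate limiter: spaces frames ~33.3 ms apart, but never looks
    at the monitor's vblank, so frames are still presented at arbitrary points
    in the refresh cycle. Half-refresh vsync would instead block until every
    2nd vblank, so each frame is shown for exactly two refreshes."""
    while keep_running():
        start = time.perf_counter()
        render_one_frame()                      # simulate + render + present
        leftover = TARGET_FRAME_TIME - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)                # burn the rest of the 33.3 ms budget

if __name__ == "__main__":
    frames = [0]
    def fake_frame():
        time.sleep(0.005)                       # pretend rendering takes 5 ms
        frames[0] += 1
    limited_game_loop(fake_frame, keep_running=lambda: frames[0] < 10)
    print("rendered", frames[0], "frames at ~30 fps pacing")
```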



DannyDesario said:
CGI-Quality said:

Could be because they're used to better framerates, overall. So when going from games at 60-120fps down to 30, it just looks (and feels).....well.....bad! Then again, it depends on the game. For example, LA Noire's devs claimed that anything above 30fps would mess up the experience (on PC, they were actually right - the cutscenes had some trouble).

It's the opposite of what the OP said, honestly, to some extent. You understand perfectly. If I play a game at 30fps first, I can tolerate it. Hence why I'm getting the Neo; I'm buying it for FFXV.

I guess we'll both be weeping then once SE announces that the Pro version will be much prettier but still run like shit.




I play my PC on my television sometimes, and I don't have the newest hardware anymore, but I have played some games at good settings at 30 to 45fps and it looks fine. I also play old games on my monitor at 144fps, and it's just a smooth-train.

I have noticed the difference on console too: one night shift I was playing The Last of Us Remastered, and after that we played Bloodborne, and everything, moving around, just shifting the camera, felt so much less smooth because of the FPS.

Obviously, when you get used to a better setting, it doesn't matter what your PC is hooked up to; even if you use the same television and controller, you are going to miss the better settings, because FPS is one of the most noticeable things.





BasilZero said:
I had vsync on when playing Bioshock 2 - it capped it at 60 FPS on my gaming laptop.

But it felt weird and didn't feel like proper 60 FPS.

When I turned off vsync, my FPS went from 60 to like the 80s and 90s, and it felt much, much smoother.

Ever since I got into PC gaming back in 2012, I've noticed FPS drops when playing games, even on consoles.

Doesn't really bother me personally, but I find it funny that I'm able to notice FPS drops now even in console games that I played 15+ years ago.

Me too, I have the same problem :C

Thanks to PC gaming, now I always notice everything. The other day I was playing Wind Waker HD, and there are some FPS drops, and I was like ugggghhh, why do I have to live with this curse lol



I very often have to settle for 40-45fps in PC games, and I'm happy with it. Games like Dragon Age: Inquisition, The Witcher 3 and Fallout 4 are impossible for me to run at 60fps.

And honestly, a stable 30fps is quite okay for me. I remember I put almost 2,000 hours into Oblivion back in 2006 and 2007; it ran at an unstable 18-27fps, and yet I was the happiest man in the world.

I'm so amazed that Battlefield 1 is able to run at a stable 60fps on my PC while looking almost photorealistic. It has the best graphics of any game, and yet it performs extremely well.