
Killzone 2 framerate can drop as low as 20fps

My fact checking:

  • The refresh rate of the TV indicates its fps capability: 1 Hz = 1 fps. Most digital TVs range from 60 Hz (the US standard, i.e. 60 fps) up to 240 Hz (240 fps). (See the quick sketch at the bottom of this post.)
  • Digital Foundry is an extremely reputable site with no platform allegiances. If they said it, it's true. They are the source of most video comparisons and complete framerate analyses of recent games. They have videos of all of the big games with their dirty frame rate secrets (GeoW2 with its 24 fps dips). They also do a nice job of explaining why the dip occurred and why you really shouldn't worry about it.
  • The dips in Killzone 2 are not a problem visually because the game is able to conceal them through the use of motion blur. This may be an issue in terms of controller responsiveness, though.
  • Nice optical illusion gif:

Thanks for the input, Jeff.
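
A quick back-of-the-envelope sketch in Python for the Hz/fps point above (purely illustrative; the refresh rates are just the common ones mentioned, nothing measured):

    # A display refreshing at N Hz can show at most N distinct frames per second,
    # and each refresh stays on screen for 1/N seconds.
    for hz in (60, 100, 120, 240):          # common TV refresh rates
        refresh_ms = 1000.0 / hz            # how long one refresh lasts
        print(f"{hz} Hz -> at most {hz} fps displayed, {refresh_ms:.1f} ms per refresh")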

 

 

dbot said:
  • Nice optical illusion gif:

QFT







VGChartz♥♥♥♥♥FOREVER

Xbone... the new "N" word   Apparently I troll MS now | Evidence | Evidence
NJ5 said:
Neptune said:
So it drops to about what normal games run at. No biggie. I'd be more worried if it dropped to 15 or 10 FPS.

This thread gets funnier and funnier. Since when do "normal games" (whatever that means) run at 20 fps?

I know that framerate standards got worse in this gen (for the PS3 and 360, not Wii), but even then 20 fps is never a normal framerate.

 

 

It runs at 30 fps; you see one screenshot that shows a dip to 20, and now the game runs at 20 fps? You're obnoxious.



PSN ID= bigdaddymoo

 

MSI GT725-074 owner..... TRUE BEAST.. COD4 is a different game on PC.

vthokiesrmoo said:
NJ5 said:
Neptune said:
So it drops to about what normal games run at. No biggie. I'd be more worried if it dropped to 15 or 10 FPS.

This thread gets funnier and funnier. Since when do "normal games" (whatever that means) run at 20 fps?

I know that framerate standards got worse in this gen (for the PS3 and 360, not Wii), but even then 20 fps is never a normal framerate.

 

 

It runs at 30 fps; you see one screenshot that shows a dip to 20, and now the game runs at 20 fps? You're obnoxious.

I would be obnoxious if I said that. However, I didn't, so it's fine.

 



My Mario Kart Wii friend code: 2707-1866-0957

Wait, this thread is still alive? I feel sorry for people who care so much about a game on a console they don't own...



Griffin said:
Squilliam said:
One of the reasons why COD IV sells so well could be that it's running at 60 FPS with excellent visuals.

COD4 drops way below 20 FPS if you know where to go and what to do. And its visuals are far from excellent.

 

 

COD4 runs at 60 FPS on the X360, around 55 FPS on the PS3, and the worst it gets is around 40 FPS on the X360 version. You can see it on the very same site this thread is based on.



 

 

 

 

 

Cinematic films run at 24p: 24 frames per second, progressive scan, with motion blur to smooth out the judder on moving objects. The great thing about a cinema screen is its highly reflective surface, which leaves a distinct after-image in your eye so you don't notice the transition between frames where the screen is blank.

On old 60 Hz CRTs or 60 Hz LCD televisions this transition is much more apparent, especially with interlaced footage. Any writing on 540i @ 60 Hz flickers like hell and looks blurry and low quality. Even if a game runs at 30 fps you'll want a nice 100 Hz LCD so you don't notice the transition between the TV's frames; the faster the image refreshes, the clearer it looks. That's why I prefer to game at 720p instead of 1080i when using a 60 Hz TV.

COD4 looks great at 60 fps and Killzone 2 looks great at 30 fps because of their cinematic motion blur. Both games benefit just as much from a TV that refreshes its image 100 times per second.

If you think the human eye can only "see" 20 fps, or even a ridiculously low 10 fps, then please download this video.
http://kimpix.net/2006/12/03/60fps-vs-24fps/
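
If you want to put rough numbers on this, here's a small, purely illustrative Python sketch (the fps/Hz pairs are just the ones mentioned; a non-integer refreshes-per-frame ratio only means frames can't all be held for the same number of refreshes):

    # Purely illustrative: how long one game frame is meant to last, and how many
    # display refreshes it spans at a given refresh rate.
    pairs = [(24, 60), (30, 60), (60, 60), (30, 100), (60, 100)]
    for fps, hz in pairs:
        frame_ms = 1000.0 / fps      # duration of one rendered frame
        per_frame = hz / fps         # refreshes spent showing each frame
        print(f"{fps} fps on a {hz} Hz set: {frame_ms:.1f} ms per frame, "
              f"{per_frame:.2f} refreshes per frame")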



As someone who cares a great deal more about frame rate than graphics, this would bother me, although it isn't my type of game anyway.

I think it just depends on what your perspective is. If you're a skill-oriented gamer, whose goal is to beat the hardest content and play against the top-level players, frame rate dips are absolutely unacceptable. They are a significant detriment to reaction time, with 15-20 FPS reducing reaction time by at least 0.05 seconds in my case.

For "experience" players, who really just want to experience the story and the world they create ("immersion," if you'd prefer that term), frame rate dips are much less significant. 0.05 seconds off your response time is entirely negligible and unnoticeable.

I fall into the former group, but there's nothing wrong with being in the latter.
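
For what it's worth, here's a rough way to put numbers on that in Python. It treats the extra lag from a dip as roughly one frame interval, which is a big simplification of real input latency, so take it as a sketch rather than a measurement:

    # Simplified model: compare frame intervals at the steady rate vs. during a dip.
    def frame_interval_ms(fps):
        return 1000.0 / fps

    steady = frame_interval_ms(30)                  # the game's target 30 fps
    for dip in (20, 15):
        dipped = frame_interval_ms(dip)
        extra = dipped - steady
        print(f"dip to {dip} fps: {dipped:.0f} ms per frame vs {steady:.0f} ms, "
              f"roughly {extra:.0f} ms of added delay per frame")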




Bodhesatva said:

As someone who cares a great deal more about frame rate than graphics, this would bother me, although it isn't my type of game anyway.

I think it just depends on what your perspective is. If you're a skill-oriented gamer, whose goal is to beat the hardest content and play against the top-level players, frame rate dips are absolutely unacceptable. They are a significant detriment to reaction time, with 15-20 FPS reducing reaction time by at least 0.05 seconds in my case.

For "experience" players, who really just want to experience the story and the world they create ("immersion," if you'd prefer that term), frame rate dips are much less significant. 0.05 seconds off your response time is entirely negligible and unnoticeable.

I fall into the former group, but there's nothing wrong with being in the latter.

OK, but to say frame rate drops in this game will affect your performance is not accurate. I experienced none in multiplayer, and they have had half a year to tune that code.

 



windbane said:
Bodhesatva said:

As someone who cares a great deal more about frame rate than graphics, this would bother me, although it isn't my type of game anyway.

I think it just depends on what your perspective is. If you're a skill-oriented gamer, whose goal is to beat the hardest content and play against the top-level players, frame rate dips are absolutely unacceptable. They are a significant detriment to reaction time, with 15-20 FPS reducing reaction time by at least 0.05 seconds in my case.

For "experience" players, who really just want to experience the story and the world they create ("immersion," if you'd prefer that term), frame rate dips are much less significant. 0.05 seconds off your response time is entirely negligible and unnoticeable.

I fall into the former group, but there's nothing wrong with being in the latter.

OK, but to say frame rate drops in this game will affect your performance is not accurate. I experienced none in multiplayer, and they have had half a year to tune that code.

 

I've never played the game, mind you, so I don't have personal experience. If you're saying the OP is incorrect, and that there are no frame rate drops, then that's great. 

If you're saying that there may be frame rate drops (from 30 to 20 FPS), then you're simply not noticing the effect because you aren't as picky about this particular concern as I am. It is not possible for a 33% loss in FPS to not cause reflex latency.

 


