
Should 900p and 60fps be the baseline now instead of 1080p?

I think the standard should be consistency, and the rest should be left up to developers.

If realizing the developer's vision means 1080/60, 900/30, 720/60, whatever, it's fine as long as the game is good. The important part is that if a game targets a certain level of performance, then it should deliver that level of performance.





For me it's: 900p + 60FPS > 1080p + 30FPS

60FPS should be the target for all game developers imo.



The Pro is not designed to double frame rates, just to bump resolutions and a few graphical settings. I don't know why people keep harping on about it when we've known what's in the box and what its marketing angle is for months now. In the 9th gen, 60fps had better be more common than 30, though. I'd say 1440-1500p at 60fps with high settings would be the sweet spot. Some will push for native 4K and won't have to sacrifice frame rate to do it, but others inevitably will.



Even at 900p you have to take into consideration that there are a lot of games that are CPU intensive, and even the PS4 Pro may not hit 60fps. 900p/60fps should be the norm, but it's not going to be possible in all games.



This is a common misconception. A game that's 30fps won't magically shoot to 60fps if you drop it to 900p. In reality it would maybe gain 4-8 frames on average. Dropping resolution is typically used when a game is just shy of its targeted fps.
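To put rough numbers on that, here is a minimal sketch of the arithmetic, assuming (purely for illustration) that about half of a 30fps game's frame time is resolution-dependent pixel work and the rest is CPU, simulation, and geometry cost that a resolution drop doesn't touch:

```cpp
#include <cstdio>

int main() {
    const double frameMs      = 1000.0 / 30.0;  // 33.3 ms per frame at 30fps
    const double pixelBoundMs = frameMs * 0.5;  // assumed resolution-dependent share
    const double otherMs      = frameMs - pixelBoundMs;

    // 1080p -> 900p cuts the pixel count to ~69% (1600*900 vs 1920*1080).
    const double pixelScale = (1600.0 * 900.0) / (1920.0 * 1080.0);

    const double newFrameMs = otherMs + pixelBoundMs * pixelScale;
    std::printf("~%.1f ms per frame (~%.0f fps)\n", newFrameMs, 1000.0 / newFrameMs);
    // Prints roughly 28.2 ms (~35 fps): only a handful of frames gained,
    // nowhere near the jump from 30 to 60.
    return 0;
}
```

Under that assumed split, the drop to 900p buys roughly 5 fps, which lines up with the 4-8 frame figure; to actually reach 60fps, the non-pixel work has to shrink as well.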

The entire game has to be designed around a 60fps target if that's what they want. MGSV is a good example: a 1080p, locked-60fps game that runs great and has great IQ, but visually you can tell that the world is pretty barren and empty, with only small bases and patrolling guards.

The games you mentioned, BF1 and Nier, both targeted 60fps from the beginning, and even then they still drop frames on the OG PS4, even at 900p.



Chevinator123 said:
This is a common misconception. A game that's 30fps won't magically shoot to 60fps if you drop it to 900p. In reality it would maybe gain 4-8 frames on average.

The entire game has to be designed around a 60fps target if that's what they want. MGSV is a good example: a 1080p, locked-60fps game that runs great and has great IQ, but visually you can tell that the world is pretty barren and empty, with only small bases and patrolling guards.

The games you mentioned, BF1 and Nier, both targeted 60fps from the beginning, and even then they still drop frames on the OG PS4, even at 900p.

Exactly. The hardware is simply too anemic to push 60fps and retain a decent degree of image quality.

The majority of games that claim to be "60fps" do so with poor graphics, or they don't actually hit 60fps all the time.
Halo 5 achieved it with terrible graphics, Battlefield 1 doesn't achieve 60fps all the time, and Overwatch manages it with poor graphics that are hidden by great art.
Battlefield 1, despite how pretty it looks at times, also has instances where a texture would sit right at home on a last-generation console, especially when looking at distant areas of a map.

jason1637 said:
Why can't the standard be 1080p 60fps on the regular PS4 and a higher resolution on the Pro?

Because the hardware isn't high-end.

Dr.Vita said:

For me it's: 900p + 60FPS > 1080p + 30FPS

60FPS should be the target for all game developers imo.

Depends on the game, in my opinion.
I thoroughly enjoyed Dragon Age: Inquisition on console, even though it was 900p and 30fps. Being a non-twitch shooter, it was fine at 30fps, but something more competitive like Battlefield 1, Call of Duty, Halo, or Overwatch? That has to be 60fps.

Framerate was one of the main aspects that put the PC in a league of its own last generation; console gamers never knew what they were missing, so it's nice to see it come into the limelight as something people want these days.

TallSilhouette said:
The Pro is not designed to double frame rates, just to bump resolutions and a few graphical settings.


That is a load of horse radish.
It is completely up to the developer.





Solid-Stark said:
jason1637 said:
Why can't the standard be 1080p 60fps on the regular PS4 and a higher resolution on the Pro?

The standard PS4 is not powerful enough to push 1080p60 with the usual targeted level of detail.

But there are games running at 1080p 60fps. Also, why can't devs just optimize their games even more to reach 1080p 60fps?



jason1637 said:
Solid-Stark said:

The standard PS4 is not powerful enough to push 1080p60 with the usual targeted level of detail.

But there are games running at 1080p 60fps. Also, why can't devs just optimize their games even more to reach 1080p 60fps?

Compare the graphics between, say, Uncharted 4 at 1080p30 and Call of Duty: Black Ops III at 1080p60 (games I've played).

Some games push the hardware more than others, in many dimensions. There are also far less complex games that can easily render at 1080p60. Just because those can doesn't mean they all can; it's not a linear comparison.




jason1637 said:
Why can't the standard be 1080p 60fps on the regular PS4 and a higher resolution on the Pro?

Because resolution and visuals sell more copies than framerate does.



jason1637 said:
Solid-Stark said:

The standard PS4 is not powerful enough to push 1080p60 with the usual targeted level of detail.

But there are games running at 1080p 60fps. Also, why can't devs just optimize their games even more to reach 1080p 60fps?

You can't just endlessly "optimize"; there is only so much fuel in the tank.

On console it is a game of sacrifice. You sacrifice one detail/effect in the hope you can get enough resources to bolster two others.

For example, take the move from Halo 3 to Halo Reach. You lost tessellated water, HDR lighting, and double buffering; HDR is especially intensive but really nice to have, and the same goes for double buffering.
Halo Reach then implemented a type of post-processing filter to achieve a similar effect to morphological anti-aliasing, which is cheap, plus impostoring for distant details (so that far-away 3D objects are rendered as 2D sprites) and texture streaming to save memory.
They then used all this freed-up graphics power for better texturing, more on-screen characters, and longer draw distances, which overall improved the image substantially, even though some assets took a massive quality hit.
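As a rough illustration of the impostoring idea mentioned above, here is a minimal sketch; the structure names, the distance cutoff, and the value used are hypothetical, not taken from any Halo engine:

```cpp
// Distance-based impostoring: past a cutoff, a full 3D mesh is swapped for a
// camera-facing 2D sprite that is cheaper to draw and can be refreshed less
// often. All names and values below are illustrative only.
enum class RenderPath { FullMesh, Impostor };

struct SceneObject {
    float distanceToCamera;  // hypothetical distance in world units
};

RenderPath selectRenderPath(const SceneObject& obj, float impostorCutoff = 150.0f) {
    // Close objects get the full mesh; distant ones get the pre-rendered sprite.
    return (obj.distanceToCamera < impostorCutoff) ? RenderPath::FullMesh
                                                   : RenderPath::Impostor;
}
```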

Halo 4 took this even further, using simpler models hidden by better artistic assets, plus pre-calculated lighting and shadowing, to free up resources for other things.

They all have one thing in common, though: throw away one expensive graphics technique to use several cheaper (and often lower-quality) ones.

Halo 5 took a similar approach. In order for the game to run at 60fps, they had textures updating at 15fps, distant Spartans replaced with sprites that also update at 10-15fps, massive object, shadow, and texture pop-in due to asset streaming, simple models covered by good art, and a big fat hit to the resolution.

Halo 6 will likely build on that.



