
Forums - Gaming Discussion - 1080p standard needs to stop.

 

1080p or 60 frames?

Without 1080p games aren't playable: 90 (23.62%)
Without 60 frames games aren't playable: 154 (40.42%)
Both (PC Master Race): 137 (35.96%)
Total: 381
BraLoD said:
HollyGamer said:
Tachikoma said:

The framerate issues on PS4 are down to how the engine handles layered alpha transparencies; this would still be an issue at a lower resolution, and will most likely be fixed by updating the shader transient ordering.

The framerate issues on XBO are acceptable except for the audio stuttering and freezes. These are a direct result of the engine's cache streaming process on the Xbox One, and would happen even at a lower resolution. They also happen on the PS4, but are much less pronounced because the memory pools used for the asset buffer are faster on the PS4.

Thus the primary framerate issues on both consoles are down to engine issues, not resolution.

Also, dropping the resolution to 900p isn't anywhere near enough to bridge the gap between 30fps and 60fps.

Hime-sama has spoken, this thread should have ended a while ago. 

Wonder if Tamron approves of this kind of treatment... :P

Ups.......Run..........!!!!!



Lawlight said:
Chrizum said:
People act like 720p was the standard last gen, where in reality most HD games rendered at 600p or similar resolutions.


Yeah, I doubt that.

He's telling the truth. Not many games, especially multiplats, actually ran at a native 720p. Several ran at 600p or 640p; the Call of Duty games are the highest-profile examples. Most games were upscaled. 



Tachikoma said:
Azzanation said:

Ok, so here is the problem. We all know by now that consoles cannot achieve 1080p and 60 frames without major sacrifices; in fact they can't even do 1080p with a steady frame rate unless the games are extremely linear or nerfed to the ground (apart from a couple of exceptions).

Fallout 4 is the main problem here. Why would a dev release a game like Fallout 4 on consoles at 1080p when they know the frame rate suffers? I'll tell you why: because 1080p has become the talking point this gen, to the point that devs are afraid to lower it because of the backlash they might get for their games. Would you rather play Fallout 4 at 1080p and 30 frames with a lot of dips, or would you rather play it at 900p and closer to 60 frames, or a rock-solid 30 frames? As an old-school gamer my answer is clear: framerate. I was going to buy Fallout 4 on consoles, but seeing that both consoles struggle to keep it at 30 made my decision very quickly, and PC it is for me.

In my own opinion, this 1080p standard needs to stop; these consoles just can't do it without sacrificing not only frame rate but many other effects and features just to achieve this pixel count. There is barely any difference between 1080p and 900p, but dips below 30 frames are noticeable. If the gaming community can get off the 1080p hype train and start looking at what's more important for games, devs will finally realise that lowering resolution isn't all that bad if it makes the game more fun to play and stable.

http://gearnuke.com/fallout-4-pete-hines-defends-solid-30-fps-comment-talks-frame-rate-ps4-xbox-one/#

The framerate issues on PS4 are down to how the engine handles layered alpha transparencies; this would still be an issue at a lower resolution, and will most likely be fixed by updating the shader transient ordering.

The framerate issues on XBO are acceptable except for the audio stuttering and freezes. These are a direct result of the engine's cache streaming process on the Xbox One, and would happen even at a lower resolution. They also happen on the PS4, but are much less pronounced because the memory pools used for the asset buffer are faster on the PS4.

Thus the primary framerate issues on both consoles are down to engine issues, not resolution.

Also, dropping the resolution to 900p isn't anywhere near enough to bridge the gap between 30fps and 60fps.

Nowhere in my post did I say 900p will achieve 60 frames. If you took the time to read what I wrote, I stated "would you rather play it at 900p and closer to 60 frames, or a rock-solid 30 frames?" There's a much bigger jump to hit 60 frames than lowering to 900p, and that's nothing new. The problem with games today is that we still get games with screen tearing, inconsistent frame rates and bad pop-in, and I can go on and on with this. Consoles aren't perfect and are very limited, yet in 2015 we get games like Fallout 4 that give us last-generation problems. Why would they release this game knowing there are going to be stutters and dips closer to 20 FPS? Are we gaming on premium consoles? Why is it that when I play something on the Wii U I get no screen tearing or bad dips in frames in their top-tier games?

Lowering the resolution to 900p was an example, and it would help the console achieve other things to improve the gameplay and performance. Instead we are now buying a game with many day-one issues and last-generation problems, and we are all waiting on patches and updates; even then it won't hit a rock-solid 30 frames, because Fallout 4 is aiming too high for what it needs to do. I've seen the countless articles when a game doesn't hit 1080p; I've read the feedback multiple times of gamers being upset because, for some odd reason, 1080p is the make-or-break decision for a purchase.

Don't get me wrong, there are a couple of great exceptions devs have delivered this gen; games like Infamous (apart from the lack of enemy AI) and Forza Horizon 2 are great examples. But most of the market is filled with games that don't offer the real standard for next gen, and that's games with no dips etc.

Simple: don't aim for 1080p if the frame rate can't keep up, or at least focus on reducing screen tearing and pop-in. There's a lot more you can do at 900p; every bit counts.



IamAwsome said:
Lawlight said:


Yeah, I doubt that.

He's telling the truth. Not many games, especially multiplats, actually ran at a native 720p. Several ran at 600p or 640p; the Call of Duty games are the highest-profile examples. Most games were upscaled. 


Nope, you're wrong too. There's a difference between "most games running at 600p" and "a handful of games running at 600p".

 

I'm looking at the list and yeah, most are 720p.



Azzanation said:

Nowhere in my post did I say 900p will achieve 60 frames

Except you did.

Azzanation said:

or would you rather play it at 900p and closer to 60 frames, or a rock-solid 30 frames?

"Closer to" isn't going to happen, and neither is hitting 60. A rock-solid 30 is possible by fixing the ENGINE issues that result from transparencies and cache loading; beyond those, the game sticks to 30 pretty damn well even at 1080p.

Azzanation said:

The problem with games today is that we still get games with screen tearing, inconsistent frame rates and bad pop-in, and I can go on and on with this. 

Fallout 4 on both XBO/PS4 does not have screen tearing. Some games do, but in those cases the developers made the decision to allow it in order to push additional frames that would otherwise have been culled, usually for games that benefit from a higher framerate but are too demanding to attain it consistently.

Azzanation said:

Consoles aren't perfect and are very limited, yet in 2015 we get games like Fallout 4 that give us last-generation problems.

Of course they are; the entire console is cheaper than the minimum suggested GPU for most 2015 games. Expecting similar performance, EVEN IF they dropped the resolution all the way down to 720p, is unrealistic.

Azzanation said:

Why would they release this game knowing there are going to be stutters and dips closer to 20 FPS?

Because the stutters are engine bugs, bugs that can be fixed in patches. But more importantly, the people who have been dying to play Fallout 4 all these years are likely too busy having fun playing the game to whine about the otherwise fairly solid framerate dropping in gunfights, or the once-in-a-blue-moon occasion you end up in a steam-filled corridor fighting ghouls.

Azzanation said:

Are we gaming on premium consoles?

No. What in the world gave you the impression that three consoles all using entry-level hardware would ever be "premium"?

Azzanation said:

Why is it that when I play something on the Wii U I get no screen tearing or bad dips in frames in their top-tier games?

Because Nintendo prioritize that consistency over complexity, and use art style to cover up the fact that the models and textures are all fairly low quality. Low-quality meshes and textures mean fewer surfaces to calculate light on, fewer vertices to track, simpler collision boxes, less video memory, and therefore less data to move around.

To put it another way, their games are built from the ground up to stay below a certain threshold. For XBO/PS4/PC, games are built to push the machine as much as possible and deliver much higher quality meshes and textures. This is in part why you don't see third-party games on Wii U anymore: the engine used for XBO/PS4/PC is just the same engine with platform-specific tweaks, and to have the game run on the Wii U would take not only modifying the engine significantly, but downscaling all of the game's assets.

Azzanation said:

Lowering the resolution to 900p was an example, and it would help the console achieve other things to improve the gameplay and performance. Instead we are now buying a game with many day-one issues and last-generation problems, and we are all waiting on patches and updates; even then it won't hit a rock-solid 30 frames, because Fallout 4 is aiming too high for what it needs to do.

Fallout 4 sticks to 30fps most of the time, and reducing its resolution would make little to no impact, for the reasons I stated in my first reply. As for "900p was an example and will help the console achieve other things to help improve the gameplay and performance", this cop-out roughly translates to "Fine, you got me there, but it could help something else; I'm not sure what, because I don't have a clue, but if I keep repeating it maybe people will believe me".

Ask anyone, ANYONE who has bought the game, if they regret buying it because it occasionally dips in framerate or freezes briefly when loading in more of the map. Go on, I'll wait.

Azzanation said:

I've seen the countless articles when a game doesn't hit 1080p; I've read the feedback multiple times of gamers being upset because, for some odd reason, 1080p is the make-or-break decision for a purchase.

1080p improves overall presentation, and given that last-generation games were mostly between 720p and 900p, most people associate those resolutions with last gen; developers, on the other hand, shoot for a consistent setup across the board where possible. As for articles, it's mostly sites like GearNuke and similar who love running stories about resolution, because it gets idiots out in their droves fighting each other, which in turn makes them money.

What it boils down to is the target framerate. If Fallout 4 runs fairly consistently at 30fps (with the inconsistency NOT caused by the resolution), and they test out a lower resolution and the game still has the same inconsistencies, and even with vsync unlocked the drop doesn't have enough impact to make a difference, then lowering the resolution makes no sense, none at all. If an engine renders the world at a fairly stable 30fps and only drops in certain situations that are down to issues with the engine itself, resolution will not make one ounce of difference.

Azzanation said:

Simple: don't aim for 1080p if the frame rate can't keep up, or at least focus on reducing screen tearing and pop-in. There's a lot more you can do at 900p; every bit counts.

1080p and 900p on Fallout 4 would have the exact same issues.
Fallout 4 does not have screen tearing.
Pop-in has nothing to do with resolution, and reducing resolution does not magically mean you can extend object LOD load ranges.
There isn't "a lot" more you can do at 900p, because it isn't the GPUs being strained here; it's the CPUs and the system bus pushing data from storage to memory that cause the issues.
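For a rough sense of why a 900p drop can't buy a 30-to-60 jump, the pixel arithmetic is easy to check. A back-of-the-envelope sketch; the resolutions are just the standard 1080p and 900p render-target sizes, not measurements from the game:

```python
# Back-of-the-envelope pixel arithmetic for the 1080p vs 900p debate.
# These are the standard render-target sizes, not measurements from the game.

def pixels(width, height):
    """Total pixels rendered per frame."""
    return width * height

p1080 = pixels(1920, 1080)    # 2,073,600 pixels per frame
p900 = pixels(1600, 900)      # 1,440,000 pixels per frame

# Dropping from 1080p to 900p cuts per-frame pixel work by about 31%...
saving = 1 - p900 / p1080

# ...but going from 30fps to 60fps means finishing every frame in half
# the time, a 50% cut, and only the GPU's pixel-bound share of the frame
# shrinks with resolution at all.
print(f"900p renders {saving:.0%} fewer pixels than 1080p")
```

A ~31% cut in pixel work only shortens the GPU's share of the frame; if the frame time is dominated by CPU and streaming work, as argued above, it buys almost nothing.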




I vote for having a solid frame rate over 1080p. Native resolution is always best, but if dropping to 900p gives you 60 fps then do so.

Sounds like they need to tweak the engine some more. That's pretty common with this developer, and things should be running great in a few months.

I wonder if this is also why they have pushed back console mods till next year?



 

Really not sure I see any point of consoles over PCs since Kinect, Wii and other alternative ways to play have been abandoned. 


 


I think that 1080p/30 needs to be the standard this gen, like 720p/30 was last gen, and that games should aim for 1080p/60, but only if that is achievable and doesn't affect the smooth experience of the game itself.
If a game has problems at 1080p/30 then it's better to go for 900p/30fps.



Shadow1980 said:
BraLoD said:

Even though Halo is a shooter, which can get good feedback from a higher framerate, if compromises like cut game features have to be made to achieve it, one can really ask oneself whether it was the best route to take.

I honestly have doubts that the higher frame rate actually benefits gameplay. In my own experience, I noticed no real difference in personal performance going from the original Xbox & 360 Halo games to the Master Chief Collection. Granted, that's just anecdotal evidence, but still. About the only real change the jump to 60 fps had on my personal experience was causing me to go through a long acclimation period, as 60 fps, especially in shooters, always felt weird and unnatural to me after years of playing shooters that ran at 30 fps. I couldn't stand to play CoD for long because the high frame rate was so off-putting. Even after having mostly adapted to Halo running at 60 fps, it still looks off-kilter. The "Soap Opera/Hobbit effect" can apply to video games as well.

Especially after Halo 5 dropped split-screen in all modes, I honestly don't think the jump to 60 fps was worth it. We dealt with Halo running at 30 fps for over a decade. Everybody was fine with it. The games played perfectly smooth. I may be the exception, but I don't think shooters & third-person action games really need to run at 60 fps. Fighting games and even platformers might be another matter, but a rock-solid 30 fps for a shooter has always been more than sufficient.

I used to think like you when this gen started; 60FPS felt unnaturally smooth for some reason. Then I played Gears UE multiplayer... going from 60FPS to the 30FPS campaign was horrible, and even previous Gears titles feel bad now. Same with first-person shooters: it takes a while to get used to 30FPS, and playing Halo 5 at 60FPS is a joy. So today my opinion is that 60FPS is clearly superior and really worth the other compromises. I also tried the Forza 6 demo, and boy was that 60FPS sweet for a racing game.

That said, I'm fine with games like Fallout, Witcher 3 etc. running at 30FPS, and I'm not really bothered by some dips either. I also don't see much difference between 1080p and 900p, so I'd say a solid framerate is more important than resolution. I've absolutely no tech knowledge tho, so I don't know if lowering resolution helps with framerate issues, but didn't they do the dynamic resolution thing for Halo to make framerates solid? I think it was a brilliant idea.



DakonBlackblade said:

The problem with Fallout 4 is that Bethesda don't know how to make games run; they're great at gameplay, but they don't really know how to make things work. The game is a mess on PC as well. If people stopped giving Bethesda a pass every damn time they release a game and it's inevitably a technical mess, maybe they'd improve.

It's hardly in the same messy position as it is on consoles. Going by Steam forums, reviews, and of course watching streamers on YT, PC players are having more fun and far fewer issues than the major stuttering, frame hangs, and bad textures the current-gen systems are struggling with. No need to lump PC into this just to pile on a dev you dislike.




KiigelHeart said:

I used to think like you when this gen started; 60FPS felt unnaturally smooth for some reason. Then I played Gears UE multiplayer... going from 60FPS to the 30FPS campaign was horrible, and even previous Gears titles feel bad now. Same with first-person shooters: it takes a while to get used to 30FPS, and playing Halo 5 at 60FPS is a joy. So today my opinion is that 60FPS is clearly superior and really worth the other compromises. I also tried the Forza 6 demo, and boy was that 60FPS sweet for a racing game.

That said, I'm fine with games like Fallout, Witcher 3 etc. running at 30FPS, and I'm not really bothered by some dips either. I also don't see much difference between 1080p and 900p, so I'd say a solid framerate is more important than resolution. I've absolutely no tech knowledge tho, so I don't know if lowering resolution helps with framerate issues, but didn't they do the dynamic resolution thing for Halo to make framerates solid? I think it was a brilliant idea.

GT4 already did the dynamic resolution thing by dropping half the scan lines under load; it looked a bit odd. Wipeout HD did it too, although not aggressively enough with the Fury expansion, still causing some drops and screen tearing. Halo does a lot more scaling than just resolution to stay on target: half-rate animation, aggressive dynamic LOD for shadows, alpha effects, models, and debris.

Looks like CoD Black Ops 3 is trying resolution scaling too, and fails:

http://www.eurogamer.net/articles/digitalfoundry-2015-call-of-duty-black-ops-3-performance-analysis

PS4 ranges between 1360x1080 and 1920x1080.
Xbox One sits at a sustained 1280x900 (up to 1600x900).
PS4 hands in an average frame rate of 51fps; Xbox One is at 49fps (with drops to 28).

It seems capping at 30 would have been the better choice.
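The resolution scaling being described boils down to a feedback loop on frame time. A toy sketch of the idea, not the controller any of these engines actually use; the budget, step size, and width bounds are made-up illustrative numbers:

```python
# Toy dynamic-resolution controller: widen or narrow the render target
# based on how the last frame compared to the 60fps budget.
# All constants here are illustrative, not taken from any real engine.

TARGET_MS = 1000 / 60          # 60fps frame budget (~16.7 ms)
MIN_W, MAX_W, STEP = 1280, 1920, 64

def next_width(width, last_frame_ms):
    """Pick the next frame's horizontal resolution from the last frame time."""
    if last_frame_ms > TARGET_MS and width > MIN_W:
        return width - STEP    # over budget: shed pixel work
    if last_frame_ms < TARGET_MS * 0.85 and width < MAX_W:
        return width + STEP    # comfortable headroom: claw quality back
    return width

# Feed in a few made-up frame times and watch the width track the load.
w = 1920
for ms in [18.0, 19.5, 17.0, 12.0]:
    w = next_width(w, ms)
print(w)  # 1920 -> 1856 -> 1792 -> 1728 -> back up to 1792
```

The Black Ops 3 numbers above look like exactly this kind of controller at work, with the Xbox One spending most of its time pinned near its minimum width.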

Maybe devs should not aim to max out the system all the time; aim a bit lower and keep some overhead available, so the frame rate is steady and/or the image quality doesn't have to change constantly. I thought this gen would bring less pop-in, not new ways to add more.