| Azzanation said: No where in my post did i say 900p will achieve 60 frames |
Except you did.
| Azzanation said: or would you rather play it at 900p and closer to a 60 frames or a rock solid 30 frames? |
"Closer to" isn't going to happen, and neither is hitting 60. A rock solid 30 is possible by fixing the ENGINE issues caused by transparencies and cache loading; beyond those, the game sticks to 30 pretty damn well even at 1080p.
| Azzanation said: The problem with games today is that we still get games with screen tearing, unconsistent frame rates and bad pop ins, and i can go on and on with this. |
Fallout 4 on both XBO/PS4 does not have screen tearing. Some games do, but in those cases the developers made the decision to allow it in order to push additional frames that would otherwise have been culled, usually in games that benefit from a higher framerate but are too demanding to attain it consistently.
| Azzanation said: Consoles arent perfect and are very limited however in 2015 we get games like Fallout 4 that gives us last generation problems. |
Of course they are limited: the entire console is cheaper than most 2015 games' minimum suggested GPU. Expecting similar performance, EVEN IF they dropped the resolution all the way down to 720p, is unrealistic.
| Azzanation said: Why would they release this game knowing theres going to be stutters and dips closer to 20 FPS? |
Because the stutters are engine bugs, bugs that can be fixed in patches. More importantly, the people who have been dying to play Fallout 4 all these years are likely too busy having fun playing the game to whine about the otherwise fairly solid framerate dropping in gunfights, or about the once-in-a-blue-moon occasion where you end up in a steam-filled corridor fighting ghouls.
| Azzanation said: Are we gaming on premium consoles? |
No, what in the world gave you the impression that three consoles all using entry level hardware would ever be "premium"?
| Azzanation said: Why is it when i play something on the WiiU i get no screen tearing, bad dips in frames for there top tier games? |
Because Nintendo prioritize that consistency over complexity, and use art style to cover up the fact that the models and textures are all fairly low quality. Lower quality meshes and textures mean fewer surfaces to calculate light on, fewer vertices to track, simpler collision boxes, and less video memory in use, and therefore less data to move around.
To put it another way, their games are built from the ground up to stay below a certain threshold. For XBO/PS4/PC, games are built to push the machine as much as possible and deliver much higher quality meshes and textures. This is in part why you don't see third party games on WiiU anymore: the engine used for XBO/PS4/PC is the same engine with platform-specific tweaks, and getting the game to run on the WiiU would take not only modifying the engine significantly, but downscaling all of the game's assets.
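To put rough numbers on the asset-size point above (these figures are illustrative, not taken from any actual game): halving a texture's resolution quarters the memory an uncompressed copy of it needs, which is a big part of why low-detail assets move so much less data around.

```python
def texture_bytes(width, height, bytes_per_pixel=4):
    """Memory for one uncompressed RGBA texture, ignoring mipmaps and compression."""
    return width * height * bytes_per_pixel

hi = texture_bytes(2048, 2048)  # a typical "push the machine" texture
lo = texture_bytes(1024, 1024)  # a typical downscaled texture
print(hi // 2**20, "MiB vs", lo // 2**20, "MiB")  # 16 MiB vs 4 MiB
```

Multiply that difference across every texture in a scene and the gap in video memory traffic between the two approaches becomes obvious.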
| Azzanation said: Lowering the resolution to 900p was an example and will help the console achieve other things to help improve the gameplay and preformance, but instead we are now buying a game with many day one issues and last generation problems which we are all now waiting on patches and updates, even then it wont hit a rock solid 30 frames because Fallout 4 is aiming to high to achieve what it needs to do. |
Fallout 4 sticks to 30fps most of the time, and reducing its resolution would make little to no impact for the reasons I stated in my first reply. As for "900p was an example and will help the console achieve other things to help improve the gameplay and preformance", this copout roughly translates to "Fine, you got me there, but it could help somewhere else. I'm not sure where because I don't have a clue, but if I keep repeating it maybe people will believe me."
Ask anyone, ANYONE who has bought the game, whether they regret buying it because it occasionally dips in framerate or freezes briefly when loading in more of the map. Go on, I'll wait.
| Azzanation said: LIv seen the countless articles when a game doesnt hit 1080p, iv read the feed back multiple times of gamers being upset because for some odd reason 1080p is the make or break decision for a purchase. |
1080p improves overall presentation, and given that last generation games were mostly between 720p and 900p, most people associate those resolutions with last gen; developers, on the other hand, shoot for a consistent setup across the board where possible. As for articles, it's mostly sites like GearNuke and similar that love running stories about resolution, because it gets idiots out in their droves fighting each other, which in turn makes them money.
What it boils down to is the target framerate. If Fallout 4 runs fairly consistently at 30fps (with the inconsistency NOT caused by the resolution), and testing a lower resolution leaves the same inconsistencies in place, and even with vsync unlocked the change still doesn't have enough impact to make a difference, then dropping the resolution makes no sense, none at all. If an engine renders the world at a fairly stable 30fps and only drops in certain situations caused by issues in the engine itself, resolution will not make one ounce of difference.
| Azzanation said: Simple, dont aim for 1080p if the frame rate cant keep up, or atleast focus on reducing screen tearing and pop ups. Theres alot more you can do at 900p, every bit counts. |
1080p and 900p on Fallout 4 would have the exact same issues.
Fallout 4 does not have screen tearing.
Pop-in has nothing to do with resolution, and reducing resolution does not magically mean you can extend object LOD load ranges.
There isn't "a lot" more you can do at 900p, because it isn't the GPUs being strained here; it's the CPUs, and the system bus pushing data from storage to memory, that cause the issues.
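A back-of-the-envelope way to see why resolution can't fix a CPU/bus-bound game (the numbers below are made up for illustration, not measured from Fallout 4): a frame can't finish before its slowest stage does, and lowering resolution only shrinks the GPU stage.

```python
def frame_time_ms(cpu_ms, gpu_ms):
    # Simplified model: CPU and GPU work overlap, so the frame
    # is gated by whichever stage takes longer.
    return max(cpu_ms, gpu_ms)

# Hypothetical CPU-bound frame: simulation and asset streaming take 40 ms.
cpu_ms = 40.0
gpu_1080p = 25.0  # GPU cost scales roughly with pixel count
gpu_900p = gpu_1080p * (1600 * 900) / (1920 * 1080)  # ~17.4 ms

print(frame_time_ms(cpu_ms, gpu_1080p))  # 40.0 ms, the CPU is the bottleneck
print(frame_time_ms(cpu_ms, gpu_900p))   # still 40.0 ms, the drop changed nothing
```

In this model the GPU finishes early either way and then sits idle waiting on the CPU, which is exactly the situation described above.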