lucidium said:

600p, 792p and 900p can all be upscaled to 1080p with every rendered pixel contributing to the final image as intended. Rendering at 1440p for a 1080p display, on the other hand, wastes a large percentage of the framebuffer on what amounts to nothing more than an extremely costly AA method.

We are not talking small amounts here: 1440p is close to 3.7 million pixels compared to 1080p's 2.1 million, so you would be spending resources rendering a large portion of the frame that is never displayed directly and is simply blended down into the available viewing area to make edges look nicer.
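To put rough numbers on that, here is a back-of-the-envelope sketch using the standard 16:9 resolutions (purely illustrative arithmetic, not anything engine-specific):

```python
# Back-of-the-envelope pixel counts for the standard 16:9 resolutions.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
print(pixels["1080p"])  # 2,073,600 (~2.1m)
print(pixels["1440p"])  # 3,686,400 (~3.7m)

# Rendering at 1440p for a 1080p panel means shading ~78% more pixels
# than the display can actually show, purely to feed the downsample.
extra = pixels["1440p"] / pixels["1080p"] - 1
print(f"{extra:.0%} extra pixels rendered")  # 78%
```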

When you consider that with the right hardware pipeline a console can apply AA with little to no performance hit, over-rendering beyond the target display just for AA seems like an utterly senseless option.

As for squeezing out more effects and better lighting, they will indeed, just as developers will keep doing on the Xbox One and PS4 for their entire lifetimes on the market. That said, the hardware jump needed over the Xbox One and PS4 to count as a worthwhile technological leap is significant enough that the effort and cost of piling more and more into a game just to justify a drop in average framerates won't see the light of day.

You both seem to think developers pick a resolution and framerate, then put in as much content as they can before it breaks. In reality the game is developed and choices are made depending on how the engine performs. If it's early days and the framerate is terrible, they don't think "okay, 30fps is our target"; they optimize and tweak content. If it looks like they can get away with a 60fps target, they go for it; if it looks like they won't reach it, the next logical step is to optimize for whatever target they then decide on.

The exception to this is games where 60fps is a mandate from the outset, mainly shooters. The optimization process is the same, but instead of deciding which framerate to target, they drop the resolution to a point where they hit that target and gradually raise it as they optimize, milking the hardware for the best they can.
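The logic is roughly what a hypothetical resolution-scaling helper would do: given a 16.7ms budget for 60fps and the measured frame time, pull the render resolution down until the budget is met, then creep it back up as optimization frees headroom. This is only a sketch under my own assumptions (the names and the square-root heuristic of frame time scaling roughly with pixel count), not any particular engine's code:

```python
import math

TARGET_FRAME_MS = 1000.0 / 60.0  # 16.7ms budget for a 60fps mandate

def pick_render_scale(measured_ms: float, current_scale: float,
                      min_scale: float = 0.5, max_scale: float = 1.0) -> float:
    """Adjust the per-axis render-resolution scale to fit the 60fps budget.

    Assumes GPU frame time is roughly proportional to pixel count, so the
    per-axis scale moves with the square root of the time ratio.
    """
    ratio = TARGET_FRAME_MS / measured_ms
    new_scale = current_scale * math.sqrt(ratio)
    return max(min_scale, min(max_scale, new_scale))

# e.g. a frame costing 22ms at full 1080p drops to ~0.87 scale (~1670x940);
# as optimization brings frame times down, the scale creeps back toward 1.0.
print(round(pick_render_scale(22.0, 1.0), 2))
```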

With new hardware, which by next generation will most likely consist entirely of low-power, high-performance sub-20nm components, the initial performance results will be high, so in almost all cases developers are going to optimize to keep them.

Additionally, developers don't sit in a darkened room disconnected from the world. We have green rooms, lounges, bar meetings and open communication with teams at other studios, and one of the bigger talking points this generation is the fuss around resolutions and framerates. In the past that talk only ever came up when a multiplatform game performed significantly worse on PS3; now we have studio heads biting their nails, scared to announce official resolutions and framerates because of the backlash it may cause the project. As a direct result, virtually everyone in the development community is either directly or indirectly on a mission to get as close to the golden 1080p/60 standard as possible, so you can bet your backside that when the new consoles come around, anyone trying to push below that is in for a seriously rough ride with not only the public but their own QA staff and management.

We're talking next gen here, 2021/2022. 4K TVs are already more affordable than 1080p sets were in 2005. Consoles will be upscaling to 4K next gen, not to 1080p. 1080p in 2022 will be what 720p feels like now: old hat.

1080p60 is entirely possible this gen, but 60fps doesn't show off in screenshots. Next gen will bring the same pressure to produce significantly better-looking visuals, and it will only get harder each gen to make a big impact. Forza 5 got as much backlash for the compromises it made to reach 1080p60 as games going sub-1080p60 did to deliver the looks. The only lesson to learn is not to raise expectations too high again with demo builds on powerful PCs.