
Forums - Microsoft Discussion - Stardock CEO - DirectX 12 to end Xbox One Resolution Woes

lucidium said:
SvennoJ said:

No, next gen PS5/XB0 will be 1440p upscaled to 4K, with most games running at 30fps.
The Wii W might be doing 1080p60 next gen.

1440p is not a TV standard; it's a monitor resolution. As such they wouldn't target it, because it would be wasted on a large portion of users, and we're a decade or more away from consoles even considering a normal title at 4K.

Neither are 600p, 792p or 900p. Next gen (2021), 4K upscaling will be standard. Native 4K will be too much, so we'll see anything from 1200p up to maybe a few titles running at native 4K. Most will fall in the middle: 2560x1440 at 30fps.

Even if it stays at 1080p, 30fps will still be used to squeeze out more effects and better lighting.



SvennoJ said:
lucidium said:
SvennoJ said:
 

No, next gen PS5/XB0 will be 1440p upscaled to 4K, with most games running at 30fps.
The Wii W might be doing 1080p60 next gen.

1440p is not a TV standard; it's a monitor resolution. As such they wouldn't target it, because it would be wasted on a large portion of users, and we're a decade or more away from consoles even considering a normal title at 4K.

Neither are 600p, 792p or 900p. Next gen (2021), 4K upscaling will be standard. Native 4K will be too much, so we'll see anything from 1200p up to maybe a few titles running at native 4K. Most will fall in the middle: 2560x1440 at 30fps.

Even if it stays at 1080p, 30fps will still be used to squeeze out more effects and better lighting.

600p, 792p and 900p can all be upscaled to 1080p with each of those pixels shown as intended. Rendering at 1440p for a 1080p display wastes a large percentage of the framebuffer on what amounts to nothing more than an extremely costly AA method.

We are not talking small amounts here: 1440p is close to 3.7m pixels compared to 1080p's 2.1m, so you would be spending resources rendering a large portion of the frame that would never be displayed directly; it would simply be blended into the available viewing area to make edges nicer.

When you consider that with the right hardware pipeline a console can apply AA with little to no performance hit, over-rendering beyond the target display just for AA seems like an utterly wasteful option.
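For scale, the pixel counts being argued over are easy to check with a few lines of arithmetic (a quick sketch; the dimensions are the standard 16:9 frame sizes for each resolution name):

```python
# Pixel counts for common 16:9 render resolutions.
RESOLUTIONS = {
    "720p": (1280, 720),
    "900p": (1600, 900),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

def pixels(name):
    """Total pixels per frame for a named resolution."""
    w, h = RESOLUTIONS[name]
    return w * h

# 1440p is ~3.69m pixels vs ~2.07m for 1080p: roughly 78% more
# rendering work per frame if the output is only a 1080p set.
ratio = pixels("1440p") / pixels("1080p")
print(f"1440p/1080p pixel ratio: {ratio:.2f}")  # -> 1.78
```

That 78% overhead is the cost lucidium is describing when he calls 1440p-to-1080p downsampling an expensive AA method.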

As for squeezing out more effects and better lighting, they will indeed, just as developers will be doing on the Xbox One and PS4 for their entire lifetimes on the market. That said, the hardware jump from the Xbox One and PS4 required for a worthwhile technological leap is significant enough that piling more and more into a game just to justify a drop in average framerate won't see the light of day.

You both seem to think developers pick a resolution and framerate, then put in as much content as they can before it breaks. In reality, the game is developed and choices are made depending on how the engine performs. If it's early days and the framerate is terrible, they don't think "okay, 30fps is our target"; they optimize and tweak content. If it looks like they can get away with a 60fps target, they go for it; if it looks like they won't reach it, the next logical step is to optimize for whatever target they then decide on.

The exception is games where 60fps is a mandate from the outset, mainly shooters. The optimization process is the same, but instead of deciding which framerate to target, they drop the resolution to a point where they hit that target and gradually increase it as they optimize, milking it for the best they can.
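That "lock the framerate, trade resolution against it" approach can be sketched as a toy controller. This is only an illustration with made-up numbers and thresholds (real engines react to per-frame GPU timings, not a single sample), but it shows the shape of the trade-off:

```python
def adjust_height(current_height, frame_ms, target_ms=16.7,
                  min_height=720, max_height=1080, step=36):
    """Toy dynamic-resolution step: drop the render height when a frame
    misses the 60fps budget (16.7ms), creep back up when there is
    comfortable headroom. All numbers here are illustrative."""
    if frame_ms > target_ms:          # over budget: render fewer pixels
        return max(min_height, current_height - step)
    if frame_ms < target_ms * 0.85:   # headroom: claw resolution back
        return min(max_height, current_height + step)
    return current_height             # close to budget: hold steady

# A frame that takes 20ms pushes a 1080-line target down toward 1044.
print(adjust_height(1080, 20.0))  # -> 1044
```

The optimization cycle lucidium describes is the human version of this loop, run across a whole development schedule rather than per frame.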

With new hardware, which by next generation will most likely consist of low-power, high-performance sub-20nm process components, the initial performance results will be high, so in almost all cases developers are going to optimize to keep it that way.

Additionally, developers don't sit in a darkened room disconnected from the world. We have green rooms, lounges, bar meetings, and open communication with other teams in other studios, and one of the bigger talking points this generation is the fuss around resolutions and framerates. In the past that talk only came up when a multiplatform game performed significantly worse on PS3; now we have studio heads biting their nails, scared to announce official resolutions and framerates because of the backlash it might cause the project. As a direct result, virtually everyone in the development community is either directly or indirectly on a mission to get as close to the golden 1080p/60 standard as possible. So you can bet your backside that when the new consoles come around, anyone trying to push below that is in for a seriously rough ride with not only the public but their own QA staff and management.



VanceIX said:
Children, slightly better software will never make up for a 30-50% hardware difference.

The end.





SvennoJ said:

But that's talking it up even more, since you know the difference between 900p and 1080p isn't noticeable, right?

You have a good point.



lucidium said:

600p, 792p and 900p can all be upscaled to 1080p with each of those pixels shown as intended. Rendering at 1440p for a 1080p display wastes a large percentage of the framebuffer on what amounts to nothing more than an extremely costly AA method.

We are not talking small amounts here: 1440p is close to 3.7m pixels compared to 1080p's 2.1m, so you would be spending resources rendering a large portion of the frame that would never be displayed directly; it would simply be blended into the available viewing area to make edges nicer.

When you consider that with the right hardware pipeline a console can apply AA with little to no performance hit, over-rendering beyond the target display just for AA seems like an utterly wasteful option.

As for squeezing out more effects and better lighting, they will indeed, just as developers will be doing on the Xbox One and PS4 for their entire lifetimes on the market. That said, the hardware jump from the Xbox One and PS4 required for a worthwhile technological leap is significant enough that piling more and more into a game just to justify a drop in average framerate won't see the light of day.

You both seem to think developers pick a resolution and framerate, then put in as much content as they can before it breaks. In reality, the game is developed and choices are made depending on how the engine performs. If it's early days and the framerate is terrible, they don't think "okay, 30fps is our target"; they optimize and tweak content. If it looks like they can get away with a 60fps target, they go for it; if it looks like they won't reach it, the next logical step is to optimize for whatever target they then decide on.

The exception is games where 60fps is a mandate from the outset, mainly shooters. The optimization process is the same, but instead of deciding which framerate to target, they drop the resolution to a point where they hit that target and gradually increase it as they optimize, milking it for the best they can.

With new hardware, which by next generation will most likely consist of low-power, high-performance sub-20nm process components, the initial performance results will be high, so in almost all cases developers are going to optimize to keep it that way.

Additionally, developers don't sit in a darkened room disconnected from the world. We have green rooms, lounges, bar meetings, and open communication with other teams in other studios, and one of the bigger talking points this generation is the fuss around resolutions and framerates. In the past that talk only came up when a multiplatform game performed significantly worse on PS3; now we have studio heads biting their nails, scared to announce official resolutions and framerates because of the backlash it might cause the project. As a direct result, virtually everyone in the development community is either directly or indirectly on a mission to get as close to the golden 1080p/60 standard as possible. So you can bet your backside that when the new consoles come around, anyone trying to push below that is in for a seriously rough ride with not only the public but their own QA staff and management.

We're talking next gen here, 2021/2022. 4K TVs are already more affordable than 1080p sets were in 2005. Consoles will be upscaling to 4K next gen, not to 1080p. 1080p in 2022 will be what 720p feels like now: old hat.

1080p60 is entirely possible this gen, but 60fps doesn't show off in screenshots. Next gen will have the same pressure to produce significantly better-looking visuals, and it will only get harder each gen to make a big impact. Forza 5 got as much backlash for the compromises it made to reach 1080p60 as the games that went sub-1080p60 to deliver the looks. The only lesson to learn is not to raise expectations too high again with demo builds on powerful PCs.



SvennoJ said:

We're talking next gen here, 2021/2022. 4K TVs are already more affordable than 1080p sets were in 2005. Consoles will be upscaling to 4K next gen, not to 1080p. 1080p in 2022 will be what 720p feels like now: old hat.

4K monitors are cheaper; 4K TVs are not.
That has been the case since the price of LCD monitors first dropped.

4K will not be the norm by 2021/2022. Many people still, even now, buy 720p LCD and plasma TVs, and most manufacturers still produce them (as of January 2011, 46% of new TV sets were 720p or lower).
Adoption of LCD/plasma TVs was forced in many areas by the switchover of broadcasting from analog to digital, with it being widely misrepresented to customers that an LCD/plasma TV was REQUIRED for the digital switchover, when in reality only a set-top box was needed.

There is no such forced transition pushing people to upgrade to a 4K set.



lucidium said:
SvennoJ said:

We're talking next gen here, 2021/2022. 4K TVs are already more affordable than 1080p sets were in 2005. Consoles will be upscaling to 4K next gen, not to 1080p. 1080p in 2022 will be what 720p feels like now: old hat.

4K monitors are cheaper; 4K TVs are not.
That has been the case since the price of LCD monitors first dropped.

4K will not be the norm by 2021/2022. Many people still, even now, buy 720p LCD and plasma TVs, and most manufacturers still produce them (as of January 2011, 46% of new TV sets were 720p or lower).
Adoption of LCD/plasma TVs was forced in many areas by the switchover of broadcasting from analog to digital, with it being widely misrepresented to customers that an LCD/plasma TV was REQUIRED for the digital switchover, when in reality only a set-top box was needed.

There is no such forced transition pushing people to upgrade to a 4K set.

http://www.pcmag.com/article2/0,2817,2429270,00.asp (sub-$1000 50" 4K TV)
http://www.businessinsider.com/the-rise-of-4k-tv--the-new-technology-will-be-in-a-majority-of-us-households-surprisingly-soon-2014-3

Hardly anyone having an HDTV in 2005 didn't stop the 360 from going to 720p and 1080p. Why even bother with 1080p now if not everyone has a 1080p set yet? Next gen is to run from 2022 to 2030-ish. 4K (scaled) output is guaranteed.



SvennoJ said:

Hardly anyone having an HDTV in 2005 didn't stop the 360 from going to 720p and 1080p.

It stopped Microsoft from including an HDMI port until later models.

SvennoJ said:

Why even bother with 1080p now if not everyone has a 1080p set yet? Next gen is to run from 2022 to 2030-ish. 4K (scaled) output is guaranteed.

Because an overwhelming portion of games didn't actually use 1080p last gen. As for "why bother supporting 1080p", it was one of many resolutions supported for upscaling. The 360 primarily made use of selectable monitor resolutions because, in place of an HDMI port, you could buy an AV-to-VGA cable with an optical/LR breakout unit for the console, thus supporting irregular resolutions.

Early last generation, 1080p was all but abandoned for most games because the machines could not realistically handle it on most engines, so most aimed for 720p or slightly higher/lower, with framerates of 25-30 in most cases. The hardware jump we have currently is enough to get us a higher resolution and a higher framerate, but not a consistent 1080/60, so most are opting to trade framerate for resolution, or resolution for framerate.

The next logical step for us developers will be pushing as much as we can at a consistent 60fps at full 1080p, because by that time it will be the most compatible format. Adoption of 4K TVs will of course be pretty good, but by no means the substantial share of the market at the point the next consoles release, and while manufacturers can add ports to support new displays that were niche at launch, they can't just up sticks and modify the hardware to suddenly run at higher resolutions.

You ruined your own argument with one of your own points.

Going back to this one.

SvennoJ said:

Hardly anyone having an HDTV in 2005 didn't stop the 360 from going to 720p and 1080p.

The very fact that the 360 did not have an HDMI port at the time, despite HDMI-capable TVs being on the rise and picking up pace, underlines that manufacturers are playing a balancing act between unit cost and performance. That balancing act results in consoles that are affordable but by no means cutting edge. In 2005-2006, consoles managed 720/30; in 2013-2014, consoles manage 900/30 - 1080/30, or 720/60 - 900/60. The occasional title will push 1080/60, or at least try to, just as the occasional title last gen pushed higher than 720p, but that is not the norm.

They will invariably support upscaling to resolutions beyond 1080p, but the render pipelines are going to stay with 1080p targets.

Unless there is a significant and groundbreaking advancement in graphics processing technology between now and their release, and said technology is cheap enough to mass-produce (and that is a very big IF), the next generation of consoles will invariably target 1080/60 and push the envelope of content within those constraints, rather than release titles that punch above their target display hardware's parameters.

You can insist until you're blue in the face that the next generation is going to be higher than 1080p at a low framerate, or any other higher-resolution scenario; that doesn't mean for a second that it's going to happen.

Personally, I've been there since the days of rendering in SD through CRT scan timings, all the way to DisplayPort, HDMI and G-Sync, so you can believe whatever you want about what resolutions they'll use, and I'll stick with tried and tested development practices.



^ Many of the reasons you list are for this gen. This gen didn't start with HDMI 2.0, just as the 360 started without HDMI. Later models might get HDMI 2.0 output, but games won't use it. Next gen will have HDMI 2.0 from the get-go.

Next gen you might be able to use an HDMI 2.0-compatible monitor for those odd resolutions, just as the 360 could. 4K will be a supported format for games (most likely not native, but higher than 1080p). Just because you don't want to go beyond 1080p doesn't mean other people won't. Promotional screenshots are already released at higher-than-1080p resolutions. I'll stick to my point that 1080p in 2021 will be the new Wii.



Promotional material is generally pre-rendered, in which case the resolution is whatever the designer rendering the scene sets it to. In-engine production shots are usually taken on development hardware at whatever PC monitor resolution the team is working at; there's a big difference between the target resolution and the resolution of the working environment. The idea of a promotional screenshot is to show off the game in the best possible light, and what many people fail to realize is that many of these releases are deliberately done at a much higher resolution. Questionable, maybe, but with today's print technologies a higher-resolution image provided as PR may well end up in a magazine's print run, hence why PR agencies usually supply logos in vector format or as insanely high-resolution CMYK images.

In all likelihood, yes, next gen consoles may well ship with HDMI 2.0 and DisplayPort connections, but again, this isn't because developers are aiming to run games at as high a resolution as humanly possible. The upper limit is still going to be 1080p, because regardless of how cheap 4K units are at the time, the majority will still be using 1080p sets. Moving on from that, as I said, the hardware jump would have to be the biggest in the history of videogame generations to make the most of 4K, and that just isn't going to happen.

The 1080p-4K range is far too wide a gap for developers to push targets within it when they know full well the people who would benefit are the minority, and the gap is far too wide for a single-generation upgrade to bridge. Look at 7th to 8th gen: it barely made the jump from 720p to 1080p, a pixel difference of 1,152,000; to jump even to JUST 1440p, they would have to make a pixel jump of 1,612,800.
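Those pixel-difference figures check out (a quick arithmetic sketch using the standard 16:9 frame sizes):

```python
def px(w, h):
    """Total pixels in a w-by-h frame."""
    return w * h

# 7th gen -> 8th gen target jump: 720p to 1080p.
jump_720_to_1080 = px(1920, 1080) - px(1280, 720)
# The next step being argued about: 1080p to 1440p.
jump_1080_to_1440 = px(2560, 1440) - px(1920, 1080)

print(jump_720_to_1080)   # -> 1152000
print(jump_1080_to_1440)  # -> 1612800
```

So the 1080p-to-1440p step alone adds more raw pixels than the entire 720p-to-1080p generational jump did, which is the gap-bridging point being made here.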

It's like the situation before the PS3 release, when they were talking about how we would have 1080p 60fps games with thousands of players and so on, when in reality we got 720p or less and 30fps or less in most games. Sony had the same overblown estimation you seem to have. The reality is that even TWO generations since that claim, we aren't at a point where games are "averaging" 1080p/60; hell, we're barely averaging 1080p/30.
7th gen to 8th gen = higher resolution, more stable framerates
8th gen to 9th gen = finally stable 60fps at target resolution.

Once we get there, though, 4K will definitely be the next big step, as by the end of the 9th gen 4K or higher will be close to or already the standard; the industry is going to have a major issue at that point if technology in home consoles continues at its current speed of progression.

Basically, your estimation of tech progress in consoles is vastly overblown.

Before the PS4/XBONE launched, my wife wrote a fairly detailed post about this, basically saying "if you're expecting 1080p/60 to be the norm, you're in for a shock", and the thread got bashed by people, just like you, who refused to believe the new consoles would be so underpowered, insisting that the magic of a closed box would make up for the low-end components. Yet here we are, months after release: 1080p/60 is a bullet point in a small handful of titles, with the average far from that, and with the way engines are progressing and the overall push, 1080p/60 is going to be the goal everyone would LIKE to hit, but won't lose any sleep over if they have to drop the res and/or framerate.