
Xbox One S performance boost revealed

DonFerrari said:
Zekkyou said:

Hitman is an 'ideal case test' because of its fairly unique set-up (separate sections clearly push the GPU and CPU, an optional frame-rate cap, and no graphical variations), but it's not representative of the average 8th gen game. If developers treated the CPU and GPU equally we would see manifestations of the X1's CPU advantage more often, but given the PS4 and X1 themselves are GPU focused, it's somewhat rare (almost all 8th gen titles are GPU focused, in line with the hardware they're built for). The X1's CPU advantage could be as large as the PS4's GPU advantage, and the latter would still be more significant.

Anyway, while it's nice MS are making the effort (and developers will certainly appreciate it), as DF mention, it'll likely be fairly inconsequential in most instances. Many people don't even consider the standard PS4's ~40% GPU advantage significant.

Yes... a very good article indeed... we deny a 40-50% GPU difference makes any real impact, but a 7% increase in the GPU makes all the difference in the world... and cherry-picking examples (while being clear that even then the gains were small) is a clear sign this guy was trying too hard.

Are you referring to DF? Because they've always been on the side that believes the PS4's GPU advantage is significant (at least when properly utilised).
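For anyone wondering where the ~40% and ~7% figures come from, here's a rough sketch based on the publicly quoted shader counts and clocks (18 CUs at 800MHz for the PS4, 12 CUs at 853MHz for the Xbox One, 914MHz for the One S); the 64-ALU CU width and 2 FLOPs per clock are standard GCN assumptions, and real-game scaling obviously won't track theoretical FLOPS exactly:

```python
# Rough theoretical-FLOPS comparison (GCN: 64 ALUs per CU, 2 FLOPs per clock via FMA).
# Real-world performance won't scale 1:1 with these numbers.
def gpu_tflops(cus, clock_mhz, alus_per_cu=64, flops_per_clock=2):
    return cus * alus_per_cu * flops_per_clock * clock_mhz * 1e6 / 1e12

ps4 = gpu_tflops(18, 800)   # ~1.84 TFLOPS
x1  = gpu_tflops(12, 853)   # ~1.31 TFLOPS
x1s = gpu_tflops(12, 914)   # ~1.40 TFLOPS

print(f"PS4 vs X1: {ps4 / x1 - 1:+.0%}")   # roughly +40%
print(f"X1S vs X1: {x1s / x1 - 1:+.0%}")   # roughly +7%
```

Of course, a ~40% edge in theoretical throughput only shows up on screen when a game is actually GPU-bound, which is Zekkyou's point about most 8th gen titles.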



Zekkyou said:
DonFerrari said:

Yes... a very good article indeed... we deny a 40-50% GPU difference makes any real impact, but a 7% increase in the GPU makes all the difference in the world... and cherry-picking examples (while being clear that even then the gains were small) is a clear sign this guy was trying too hard.

Are you referring to DF? Because they've always been on the side that believes the PS4's GPU advantage is significant (at least when properly utilised).

Nope, to some analysts and especially the fanbase =]

And there are some DF guys that are fierce X1 defenders.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

SvennoJ said:
Intrinsic said:
I am curious...... so all it took to support 4K media playback was a 7% boost in GPU, a bump in ESRAM bandwidth and probably the inclusion of HDMI 2.0?

And even with all those inclusions it's still below the PS4 in performance all round. So does this mean that the only thing preventing the PS4 from doing HDR and 4K media playback is the lack of HDMI 2.0?

And another thing: there has been a lot of talk about the existence of the Neo, or even the Scorpio, being due to an inability to shrink the APUs found in the PS4/XB1 down from 28nm to 14/16nm.

But here we see a 16nm APU in the XB1s? Interesting........

The 7% boost, or higher clock speed and memory bandwidth, was needed to absorb the extra cost of rendering to an HDR output buffer, 10 bit instead of 8 bit.

HDMI 1.4 also supports 10 bit (and 12 bit color), but not HDR10, the new format used by 4K TVs for HDR. The PS4 and original Xbox One could already output games in 10 bit (and 4K 30fps), yet without the extra brightness range that HDR allows. HDMI 2.0 is needed for HDR. (And HDCP 2.0 DRM for video content)

The 20nm planar die shrink failed. Xbox One S is now using 16nm FinFET, which keeps the cost up for now. The release of the 500GB model is likely delayed due to very low to no profit margins; get the hardcore to buy the more profitable 2TB model first. Perhaps the slight increase in clock speed is also to compensate for any low-level differences between planar and FinFET architecture, but that's pure speculation on my part.

Anyway it's good news for the price of the Neo. Without the ESRAM its GPU shouldn't be much more expensive than the 16nm redesign in the Xbox One S. Maybe even cheaper, as it's a standard component. Neo is partly due to the die shrink problems; we should have had cheaper slims already, and the relatively higher cost of the delayed slims worked in favor of a mid-gen refresh.

Oh ok, thanks, that makes much more sense now.

But this brings me to another question.....

If Sony is going first with the Neo, does that mean they keep making the base PS4 at 28nm? And for how long? 'Cause I'm guessing they go with the Neo for now, then eventually do a 16nm version of the core PS4 APU further down the road.
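To put some rough numbers on the HDMI 1.4 vs 2.0 point above, here's a back-of-envelope sketch comparing raw active-pixel data rates with the approximate usable video bandwidth of each link. It ignores blanking intervals and signalling overhead, so treat the cut-offs as indicative only; HDR10 also needs the newer HDMI signalling regardless of raw bandwidth, as SvennoJ notes:

```python
# Back-of-envelope HDMI payload check (active pixels only; real links also carry
# blanking intervals, so actual requirements are somewhat higher than this).
def gbps(width, height, fps, bits_per_channel, channels=3):
    return width * height * fps * bits_per_channel * channels / 1e9

HDMI_1_4_DATA = 8.16   # ~Gbps of video data (10.2 Gbps link, 8b/10b encoded)
HDMI_2_0_DATA = 14.4   # ~Gbps of video data (18 Gbps link)

modes = {
    "1080p60  8-bit": gbps(1920, 1080, 60, 8),
    "4K30     8-bit": gbps(3840, 2160, 30, 8),
    "4K60     8-bit": gbps(3840, 2160, 60, 8),
    "4K60    10-bit": gbps(3840, 2160, 60, 10),
}
for name, rate in modes.items():
    print(f"{name}: {rate:5.1f} Gbps  "
          f"fits HDMI 1.4: {rate <= HDMI_1_4_DATA}  "
          f"fits HDMI 2.0: {rate <= HDMI_2_0_DATA}")
```

The last row is why 4K60 HDR video is typically delivered 4:2:0 chroma-subsampled even over HDMI 2.0.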



SvennoJ said:

[...]HDMI 1.4 also supports 10 bit (and 12 bit color), but not HDR10, the new format used by 4K TVs for HDR. The PS4 and original Xbox One could already output games in 10 bit (and 4K 30fps), yet without the extra brightness range that HDR allows. HDMI 2.0 is needed for HDR. (And HDCP 2.0 DRM for video content)


The 20nm planar die shrink failed. Xbox One S is now using 16nm FinFET, which keeps the cost up for now.[...]

In addition to what you mentioned, the Xbox One S's optical drive was also upgraded to handle triple-layer UHD BDs.  I'm not sure if the PS4 is capable of dealing with those out of the box or not.  I did a quick web search but got some conflicting information about its BD-XL support.

Planar die shrink vs. FinFET? I read that in the Digital Foundry article, and now here as well. I'm not 100% sure what the significance of planar vs. FinFET is, honestly. Time for another web search I guess, thank you for your comments. I found them interesting and educational. :)

EDIT TO ADD:  How much of the GPU clock increase was simply because moving to a smaller process addressed heat issues to the point that they could increase it without penalty?  Was the entire increase needed for HDR, or did they go a little further than that just because they could?



scrapking said:

EDIT TO ADD:  How much of the GPU clock increase was simply because moving to a smaller process addressed heat issues to the point that they could increase it without penalty?  Was the entire increase needed for HDR, or did they go a little further than that just because they could?

Both clocks could have been increased, probably by 20%, while keeping the power below the X1. That would obviously have pissed off around 20M Xbox One users, so they just upped the GPU clock to allow for the extra HDR load on the GPU and stay roughly at X1 levels of performance.
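For what it's worth, the usual first-order rule for CMOS dynamic power is P ∝ C·V²·f, so a clock bump at constant voltage costs power roughly linearly, and more than that once the voltage has to rise to sustain the higher frequency. A toy sketch (the +5% voltage figure is made up purely for illustration):

```python
# First-order CMOS dynamic power: P ~ C * V^2 * f (relative numbers only).
def relative_power(freq_scale, volt_scale=1.0):
    return freq_scale * volt_scale ** 2

bump_7  = relative_power(914 / 853)              # X1S-style GPU bump, same voltage assumed
bump_20 = relative_power(1.20, volt_scale=1.05)  # hypothetical 20% clock bump + 5% voltage

print(f"+7% clock, same voltage : {bump_7 - 1:+.0%} dynamic power")    # ~ +7%
print(f"+20% clock, +5% voltage : {bump_20 - 1:+.0%} dynamic power")   # ~ +32%
```

Which fits SvennoJ's reading: a small GPU-only bump is easy to absorb alongside the die shrink and new cooling, while a 20% across-the-board overclock would cost noticeably more power and heat, and open a gap with the existing install base.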



Intrinsic said:
SvennoJ said:

The 7% boost, or higher clock speed and memory bandwidth, was needed to absorb the extra cost of rendering to an HDR output buffer, 10 bit instead of 8 bit.

HDMI 1.4 also supports 10 bit (and 12 bit color), but not HDR10, the new format used by 4K TVs for HDR. The PS4 and original Xbox One could already output games in 10 bit (and 4K 30fps), yet without the extra brightness range that HDR allows. HDMI 2.0 is needed for HDR. (And HDCP 2.0 DRM for video content)

The 20nm planar die shrink failed. Xbox One S is now using 16nm FinFET, which keeps the cost up for now. The release of the 500GB model is likely delayed due to very low to no profit margins; get the hardcore to buy the more profitable 2TB model first. Perhaps the slight increase in clock speed is also to compensate for any low-level differences between planar and FinFET architecture, but that's pure speculation on my part.

Anyway it's good news for the price of the Neo. Without the ESRAM its GPU shouldn't be much more expensive than the 16nm redesign in the Xbox One S. Maybe even cheaper, as it's a standard component. Neo is partly due to the die shrink problems; we should have had cheaper slims already, and the relatively higher cost of the delayed slims worked in favor of a mid-gen refresh.

Oh ok, thanks, that makes much more sense now.

But this brings me to another question.....

If Sony is going first with the Neo, does that mean they keep making the base PS4 at 28nm? And for how long? 'Cause I'm guessing they go with the Neo for now, then eventually do a 16nm version of the core PS4 APU further down the road.

Sony said they would keep the base PS4 around for the rest of the gen; dunno if there's any point in making a 16nm version of it. I imagine the Neo will quickly become cheaper to produce than the older tech and slowly replace the base PS4. Why make things harder by introducing yet another hardware spec for QA?

My guess is all this talk of the base PS4 sticking around, and of the Xbox One S having no influence on performance, is PR so people won't stop buying the base consoles until the successors are freely available.



SvennoJ said:

Sony said they would keep the base PS4 around for the rest of the gen; dunno if there's any point in making a 16nm version of it. I imagine the Neo will quickly become cheaper to produce than the older tech and slowly replace the base PS4. Why make things harder by introducing yet another hardware spec for QA?

My guess is all this talk of the base PS4 sticking around, and of the Xbox One S having no influence on performance, is PR so people won't stop buying the base consoles until the successors are freely available.

Yeah, I'm beginning to think so too. The way I see it, Sony would never manufacture a base PS4 at a 14/16nm process. So you are right, at some point manufacturing the Neo will actually be cheaper than making the base PS4. Sony probably figures that around 20M of the 40M+ PS4 owners will eventually upgrade to the Neo, which in turn will mean there will be lots of used PS4s out there for people that want that.

I'm guessing that from the time the Neo is released, Sony won't make any more than 10M new base PS4s. The base PS4 will also probably be dropped to $299, with the Neo priced at $399 and maybe even a $499 elite version with 2TB of storage. Eventually all that would be left is a $299 Neo... probably around the time the Scorpio is coming to market.



scrapking said:

SvennoJ said:

[...]HDMI 1.4 also supports 10 bit (and 12 bit color), but not HDR10, the new format used by 4K TVs for HDR. The PS4 and original Xbox One could already output games in 10 bit (and 4K 30fps), yet without the extra brightness range that HDR allows. HDMI 2.0 is needed for HDR. (And HDCP 2.0 DRM for video content)


The 20nm planar die shrink failed. Xbox One S is now using 16nm FinFET, which keeps the cost up for now.[...]

In addition to what you mentioned, the Xbox One S's optical drive was also upgraded to handle triple-layer UHD BDs.  I'm not sure if the PS4 is capable of dealing with those out of the box or not.  I did a quick web search but got some conflicting information about its BD-XL support.

Planar die shrink vs. FinFET? I read that in the Digital Foundry article, and now here as well. I'm not 100% sure what the significance of planar vs. FinFET is, honestly. Time for another web search I guess, thank you for your comments. I found them interesting and educational. :)

EDIT TO ADD:  How much of the GPU clock increase was simply because moving to a smaller process addressed heat issues to the point that they could increase it without penalty?  Was the entire increase needed for HDR, or did they go a little further than that just because they could?

The drive in the PS4 is, in theory, fast enough to read UHD discs; however, it doesn't have hardware-level AACS 2.0, the drive-level DRM needed to access UHD discs. Dunno whether it can see the 3rd layer either. People have tried UHD discs in standard BD-XL drives to no effect. Without AACS 2.0 it simply can't access the disc.

Maybe they could go further with the clock, yet it also has a smaller fan and the power brick inside (saves a lot of cost yet adds heat), and I doubt MS wants to risk more overheating issues. It's hard to predict what extra juice you need for 10 bit output. The final output buffer is 25% larger memory-wise, but plenty of 3D games render in fp16 anyway; it's really only the final step that needs extra work. The increase in memory speed was likely more important than the extra GPU power: more data to shift around in the final stage.
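To make the '25% larger' remark concrete, here's the arithmetic as I read it, counting 10 bits × 4 channels = 40 bits per pixel for the HDR output buffer versus 8 × 4 = 32 bits. (In practice packed formats like RGB10A2 still fit in 32 bits, and engines that already render internally at fp16 pay most of the cost anyway, so treat this as the worst-case framing.)

```python
# Per-frame output buffer footprint at 1920x1080 (1 MB = 1024*1024 bytes).
def framebuffer_mb(width, height, bits_per_pixel):
    return width * height * bits_per_pixel / 8 / (1024 * 1024)

res = (1920, 1080)
sdr_8bit  = framebuffer_mb(*res, 32)  # RGBA8: 8 bits x 4 channels
hdr_10bit = framebuffer_mb(*res, 40)  # 10 bits x 4 channels, the 25% figure above
fp16      = framebuffer_mb(*res, 64)  # RGBA16F intermediate render target

print(f"8-bit output buffer : {sdr_8bit:.1f} MB")   # ~7.9 MB
print(f"10-bit output buffer: {hdr_10bit:.1f} MB")  # ~9.9 MB (+25%)
print(f"fp16 render target  : {fp16:.1f} MB")       # ~15.8 MB
```

Either way the absolute numbers are small next to the 32MB of ESRAM, let alone the 8GB DDR3 pool, which supports the point that the final output step is cheap and the extra bandwidth matters more than the GPU clock.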



SvennoJ said:

[...]Maybe they could go further with the clock, yet it also has a smaller fan and the power brick inside (saves a lot of cost yet adds heat), and I doubt MS wants to risk more overheating issues. It's hard to predict what extra juice you need for 10 bit output. The final output buffer is 25% larger memory-wise, but plenty of 3D games render in fp16 anyway; it's really only the final step that needs extra work. The increase in memory speed was likely more important than the extra GPU power: more data to shift around in the final stage.

Oh, interesting. I didn't consider that moving the power adapter inside the case might have been a significant production cost saving. Interesting! Thanks for sharing that.



DonFerrari said:

And there are some DF guys that are fierce X1 defenders.

No, there are not. DF are like a mirror; PS4 defenders think they're pro-Xbox, Xbox defenders think they're pro-Sony, Nintendo detractors think they're Nintendo fanboys, etc. They simply reflect the bias of the observer.