Zappykins said:
Arkaign said:
Zappykins said:
Fusioncode said:
Zappykins said:
ICStats said:
VitroBahllee said:
I'm watching a liveblog of it now. This is going to really boost the Xbox One.

How will it do that?

They say DX12 is a console-like API.  XB1 is already a console.  If it sees a boost, it would only be because their current SDK is bad.

Nah, it's not like that.  Think of it as a super-optimizing driver.  Only it will give you one platform that does what Steam wants their new OS to do, but with the years of experience that developers have working with DirectX.  It's like a new OS.

Xbox One should get at least a 20% GPU improvement according to what we saw today.  Perhaps significantly greater.

Unreal Engine 4 will get upgraded to work with DirectX 12.

It will be the new standard for flexibility, scalability and power.  PC and Xbox One games will both benefit.

 

Can you please explain this? I'm curious as to why you think that will happen. 

Well, I'm assuming Xbox One will get at least some of the benefit of a 50% optimization, though it's probably already ahead of PCs there.  But even if it doesn't, they stated a 20% reduction in GPU render time per frame, or something to that effect.

Mantle is supposed to give a 20% improvement as well.  It's a bit like Mantle.

So I went with a conservative estimate of 20% as the baseline for the GPU.  However, scaling across the 8-core CPU might result in greater improvements.
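For what it's worth, the arithmetic behind a "20% reduction in GPU render time per frame" is easy to check (the 20% figure is the claim from the discussion, not a measurement; the 16.7 ms baseline is just an assumed 60 fps starting point):

```python
# Assumed baseline: ~60 fps, i.e. ~16.7 ms per frame.
frame_time_ms = 16.7
reduced_ms = frame_time_ms * 0.8   # 20% less GPU time per frame (the claim)

fps_before = 1000 / frame_time_ms
fps_after = 1000 / reduced_ms

# A 20% cut in frame time works out to a 25% gain in frame rate.
print(round(fps_before, 1), round(fps_after, 1))  # 59.9 74.9
```

So if the frame rate was GPU-limited to begin with, a 20% frame-time reduction would be roughly a 25% fps improvement, not 20%.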


The improvements are relevant to configurations (as shown in the slides) where you are hamstrung by slow high-level API draw calls that depend on CPU to a high degree.
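The kind of bottleneck being described can be sketched very loosely. This is a toy model, not real DirectX code; the function names and overhead numbers are made up purely to show why per-draw-call CPU cost matters and why batching (e.g. instancing) sidesteps it:

```python
# Hypothetical illustration: each high-level draw call pays a fixed
# CPU cost (validation, state setup) before the GPU sees any work.
PER_CALL_OVERHEAD_OPS = 50  # made-up stand-in for driver/runtime work

def submit_naive(num_objects):
    """One 'draw call' per object: CPU overhead scales with object count."""
    cost = 0
    for _ in range(num_objects):
        cost += PER_CALL_OVERHEAD_OPS  # per-call driver/runtime work
        cost += 1                      # the actual draw submission
    return cost

def submit_instanced(num_objects):
    """One instanced 'draw call' for all objects: overhead paid once."""
    return PER_CALL_OVERHEAD_OPS + num_objects

print(submit_naive(10_000))      # 510000 'ops' of CPU work
print(submit_instanced(10_000))  # 10050 'ops' of CPU work
```

The point being: a lower-overhead API mostly helps the left-hand case. If the platform's API already keeps per-call cost low (as claimed for XB1's custom DX variant), there's much less to win back.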

XB1 architect already confirmed that XB1 has a custom variant of DX that already has low-level API for efficient draw calls.

Translation: this will speed up PCs, but will do very little for XB1 (if anything). Now in PR/marketing-speak, they would LOVE to be able to say 'this will massively improve XB1 graphics'. However, they can't, for two reasons:

(1)- It's not true. If it were, they'd be screaming it from the rooftops.

and

(2)- If it were true, it would mean that the XB1 dev team were complete idiots (obviously untrue), and massive liars (also obviously untrue).

XB1 graphics WILL improve, just like always, as people get more comfortable with the SDKs and learn how to maximize how things look without hitting areas that cause large performance hits.

At the end of the day, the choices to go with DDR3 and a considerably weaker GPU are just going to be something we all have to live with. There is no secret magic to make things substantially better, but at the same time there is no reason we can't have games that look and play great on XB1.

For that, I propose we try to minimize talk about these things, as none of this plays to XB1's strengths. We get it, it's a rung or two down on power. If people have fun with the games they play on it, who cares?

Games are supposed to be fun, and wherever someone has a big grin on their face, whether it be at 1080p on the screen, 2600p with some PC nut, or 792p on the XB1, more power to em.

That's why I didn't go off any CPU improvement, even though I would assume it will also become more effective and efficient.  I was just looking at the GPU claim.  Any idea how much the 360 improved?  If you compare Crackdown to Halo 4, there is quite a bit of improvement.

Microsoft didn't want to go with DDR3; if you look at their old notes and plans, it was always DDR4 memory.  But unfortunately, it wasn't ready yet.  They could have delayed the Xbox One for a year, but I think they went when they felt they really had to.  You can always wait and tech will get better.

True, at some point you just have to get the thing out the door.

The 360 improved quite a lot, for different reasons. At the time, the tech they used was pretty exotic (shared memory, multi-core CPU, oddball architecture that needed all new APIs devved).

This time around things are very well documented and understood from the start, which is why we probably shouldn't expect too much magic down the line. Look at Forza 5. On a PC, a DDR3-based 7790 variant (which in reality would be about 7750 levels at best) wouldn't be able to play anywhere near that well.  It's a pretty fair example of how well optimized the API already was for the XB1 compared to PC DX11.x, as a PC would have needed something in the range of a 7870 or 7950 to hit those visuals for a similar game.