
[DF] Metro Redux: what it's really like to develop for PS4 and Xbox One

Intrinsic said:
walsufnir said:

I guess most people don't really care about the bits discussed in the interview, and I guess even fewer understand what the guy is talking about. I guess many are just happy with the console they have; at least I hope so.

But still, not everything just adds up, as I said in my post, and perhaps in real-life application it is "just a bit more powerful", but who knows? I am not a game dev and neither are you, so we are still left in the dark.

Yeah, I guess so too. And yes, we are still in the dark. I just find it strange, though.

It's not like we are talking about radically different hardware here. The only major differences are memory type, bandwidth and APIs. But at the heart of it, both consoles are built on the exact same processor architecture. And this is why I become slightly skeptical when I hear a dev somewhat downplay the differences.

Just looking at the shader cores in the GPU: the XB1 has 12 GCN compute units. This is the heart of the GPU. The PS4 has 18 of the exact same GCN compute units. That's 6 more GCN compute units than the XB1. To say that isn't a lot more would be like saying it would make no difference if, instead of having 12 GCN cores, the XB1 had 6. Half an orange more is exactly that: half an orange more.


Ah, the inevitable hard facts rise to the surface... Again, we all know the differences, but we don't know how much of this spec advantage will translate to real-life performance.




Pretty good interview. Love the work from the 4A guys and I'm getting Redux.



walsufnir said:


Ah, the inevitable hard facts rise to the surface... Again, we all know the differences, but we don't know how much of this spec advantage will translate to real-life performance.

That's exactly why I say I need a more in-depth analysis.

It's funny how double-sided the "hard facts" spec analysis seems to be. Indulge me a little.

You have one system with 12 GPU cores and another with 18 GPU cores. In a perfect world, that would mean the one with 18 always has 50% more GPU performance. Since we are saying this isn't a perfect world, let's say the most you can stably get out of the one with 18 cores is 70%, which means those 18 cores realistically give you only 12.6 cores' worth of real-world performance.

Now it gets interesting. Shouldn't this real-world assessment apply to the system with 12 cores too? Shouldn't that also mean that "realistically" they are only eking out 70% of the performance of those 12 cores? Shouldn't that then mean those 12 cores give a real-world performance equivalent of 8.4 cores? Even factoring in "real world" performance, we should still see a near 50% boost. Unless, of course, the world the PS4's GPU lives in is realer than the one the XB1's GPU lives in, and the XB1 somehow manages to run at peak performance all the time.
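To put rough numbers on that, a quick back-of-the-envelope sketch (Python; the 70% "real-world efficiency" is just the assumption from the paragraph above, not a measured figure):

    # Illustrative only: apply the same hypothetical 70% efficiency to both GPUs.
    XB1_CUS = 12
    PS4_CUS = 18
    EFFICIENCY = 0.70                        # assumed "real-world" utilisation, same for both

    xb1_effective = XB1_CUS * EFFICIENCY     # 8.4 "effective" CUs
    ps4_effective = PS4_CUS * EFFICIENCY     # 12.6 "effective" CUs

    print(ps4_effective / xb1_effective)     # 1.5 -> still a 50% difference

Scaling both sides by the same factor leaves the relative gap untouched, which is the whole point being made here.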

Speaking of some worlds being realer than others, it's common knowledge that the PS4 so far has better and more efficient APIs, so it should at least be living in a real world that takes less of a toll on its overall performance than the XB1's does.

People always seem to say that the PS4 realistically wouldn't run as well as advertised. That is true (at least without some serious optimization), but my point is, neither will the XB1. Which brings me back to: so what are the real benefits of the PS4 having more power across the board? Or have I missed a report somewhere that says having more of everything makes the PS4 less efficient?



Intrinsic said:
walsufnir said:


Ah, the inevitable hard facts rise to the surface... Again, we all know the differences, but we don't know how much of this spec advantage will translate to real-life performance.

That's exactly why I say I need a more in-depth analysis.

It's funny how double-sided the "hard facts" spec analysis seems to be. Indulge me a little.

You have one system with 12 GPU cores and another with 18 GPU cores. In a perfect world, that would mean the one with 18 always has 50% more GPU performance. Since we are saying this isn't a perfect world, let's say the most you can stably get out of the one with 18 cores is 70%, which means those 18 cores realistically give you only 12.6 cores' worth of real-world performance.

Now it gets interesting. Shouldn't this real-world assessment apply to the system with 12 cores too? Shouldn't that also mean that "realistically" they are only eking out 70% of the performance of those 12 cores? Shouldn't that then mean those 12 cores give a real-world performance equivalent of 8.4 cores? Even factoring in "real world" performance, we should still see a near 50% boost. Unless, of course, the world the PS4's GPU lives in is realer than the one the XB1's GPU lives in, and the XB1 somehow manages to run at peak performance all the time.


Oh yes, if you compare it that way, sure. But GPUs alone can't do anything. They have to be "fed", and that involves the CPUs, which are the same on both systems. What muddies the imaginary comparison even more is the fact that nowadays consoles run a fully fledged multitasking OS, which eats up 2 cores and 3 GB of RAM on both systems. So the scheduler on both systems (which also takes CPU time) has to handle the game (which of course runs at the highest priority) and has to deal with other programs that eat up CPU time. Furthermore, there are caches and so on which no one takes into consideration when talking about real-life performance of the consoles.

People only mention the GPU and GDDR5 and conclude fixed percentages of advantage, when that is only part of the whole picture that all those other factors feed into.
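As a rough sketch of that resource picture (Python; using the 2-core / 3 GB reservation figures quoted above, which are ballpark numbers rather than official ones):

    # Ballpark resource picture, per the figures quoted above (not official numbers).
    TOTAL_CORES = 8        # Jaguar CPU cores in each console
    TOTAL_RAM_GB = 8
    OS_CORES = 2           # roughly reserved for the OS and apps
    OS_RAM_GB = 3

    game_cores = TOTAL_CORES - OS_CORES      # 6 cores left for the game
    game_ram_gb = TOTAL_RAM_GB - OS_RAM_GB   # ~5 GB left for the game

    print(game_cores, game_ram_gb)           # 6 cores and ~5 GB on both consoles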



"Actually, the real pain comes not from ESRAM but from the small amount of it. As for ESRAM performance - it is sufficient for the GPU we have in Xbox One."

I've been saying that for months. Now every time I need to explain it to someone, I can just quote this guy as a reference.

It would be interesting to know what sort of real world performance advantage developers see between the PS4 and X1, assuming perfect conditions. They tend to dance around the specifics of that issue. The figure is definitely somewhere between 30 - 60%, but past that it's incredibly hard to tell.

Personally, I'd bet on about 40%. A very significant amount, especially for 1st parties, but small enough (relative to equal development time) that the PS4's 3rd party advantage should remain as IQ and stability boosts.
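For what it's worth, that ballpark matches the raw shader throughput implied by the published specs. A quick sketch (Python; the CU counts and clock speeds are the widely reported figures, and peak FLOPS is only a crude proxy for real-world performance):

    # Theoretical single-precision throughput from the publicly listed specs.
    SHADERS_PER_CU = 64     # shaders per GCN compute unit
    FLOPS_PER_SHADER = 2    # one fused multiply-add per clock

    def gflops(cus, clock_mhz):
        return cus * SHADERS_PER_CU * FLOPS_PER_SHADER * clock_mhz / 1000.0

    ps4 = gflops(18, 800)   # ~1843 GFLOPS (1.84 TFLOPS)
    xb1 = gflops(12, 853)   # ~1310 GFLOPS (1.31 TFLOPS)

    print(ps4 / xb1)        # ~1.41 -> roughly 40% more raw GPU throughput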



walsufnir said:


Oh yes, if you compare it that way, sure. But GPUs alone can't do anything. They have to be "fed", and that involves the CPUs, which are the same on both systems. What muddies the imaginary comparison even more is the fact that nowadays consoles run a fully fledged multitasking OS, which eats up 2 cores and 3 GB of RAM on both systems. So the scheduler on both systems (which also takes CPU time) has to handle the game (which of course runs at the highest priority) and has to deal with other programs that eat up CPU time. Furthermore, there are caches and so on which no one takes into consideration when talking about real-life performance of the consoles.

People only mention the GPU and GDDR5 and conclude fixed percentages of advantage, when that is only part of the whole picture that all those other factors feed into.

Again, this is true, but...

Both consoles have the same CPU. And as far as games are concerned, CPUs do very specific things, especially when designing for hardware this similar. But going off what he said in the interview, it's clear that the APIs and drivers that govern the function and efficiency of the CPU are further along on the PS4 than they are on the XB1, as MS seems to be trying really hard to bring its APIs up to at least parity with the PS4.

As I said, every single factor, if looked at honestly, points to the PS4 not being just marginally more powerful. Everything that could be said about the game development process or environment leans in favour of the PS4. And all these little things, while great on their own, have to add up in some way, not somehow detract overall. The PS4 has:

  • Better APIs (CPU-related) and a better, closer-to-the-metal shader language
  • More GPU cores (50% more) and higher GPU compute capability (300% more)
  • While both consoles give devs access to 5 GB of RAM, the PS4 has an overall larger pool of faster RAM, all under one address space
  • 100% more render output units (ROPs)
  • Over 30% more texture mapping units (TMUs)

Any one of those things should give the PS4 a slight advantage, and some of them should give it a really big advantage. But somehow, everyone seems to want us to believe that all of these things combined don't amount to much? I am not saying that the PS4 gives 100% performance 100% of the time, but neither would the XB1.

All I really care to know is what stuff like what I mentioned above means for the respective dev platforms. I dunno, I just think that the PS4 should be capable of a lot more than a resolution bump and more framerate stability.
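To put some clock-adjusted numbers on that list (a rough sketch in Python; the ROP/TMU counts and clocks are the commonly published figures, and peak fill rates are only a crude proxy for real performance):

    # Clock-adjusted peak fill rates from the commonly published specs.
    def pixel_fill_gps(rops, clock_mhz):
        return rops * clock_mhz / 1000.0       # gigapixels per second

    def texel_fill_gps(tmus, clock_mhz):
        return tmus * clock_mhz / 1000.0       # gigatexels per second

    ps4_pix, xb1_pix = pixel_fill_gps(32, 800), pixel_fill_gps(16, 853)
    ps4_tex, xb1_tex = texel_fill_gps(72, 800), texel_fill_gps(48, 853)

    print(ps4_pix / xb1_pix)   # ~1.88 -> close to double the pixel fill rate
    print(ps4_tex / xb1_tex)   # ~1.41 -> roughly 40% more texture fill rate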



"Well, the issue is slightly more complicated - it is not like 'here, take that ten per cent of performance we've stolen before', actually it is variable, like sometimes you can use 1.5 per cent more, and sometimes seven per cent and so on"
"You forgot to mention the ROP count, it's important too - and let's not forget that both CPU and GPU share bandwidth to DRAM [on both consoles]."
There goes the magical 10% of power freed from the bad, bad Kinect! And let's say DX12 makes everything equal on the API side of things; the PS4 still has 40% more GPU performance and the extra ROP units! And let's not forget the much better GPGPU capabilities of the PS4. Can't wait any longer for the likes of Uncharted 4, The Order and Driveclub!




Zekkyou said:

"Actually, the real pain comes not from ESRAM but from the small amount of it. As for ESRAM performance - it is sufficient for the GPU we have in Xbox One."

I've been saying that for months. Now every time I need to explain it to someone, I can just quote this guy as a reference.

It would be interesting to know what sort of real world performance advantage developers see between the PS4 and X1, assuming perfect conditions. They tend to dance around the specifics of that issue. The figure is definitely somewhere between 30 - 60%, but past that it's incredibly hard to tell.

Personally, I'd bet on about 40%. A very significant amount, especially for 1st parties, but small enough (relative to equal development time) that the PS4's 3rd party advantage should remain as IQ and stability boosts.


This isn't the first dev to say this. I've seen it said at least twice before. It seems to be the most common explanation for why devs are having trouble reaching 1080p. Something about needing at least 45 MB for 1080p but ESRAM only having 32 MB, blah blah blah. I don't remember the exact details, but that's the rough idea. What I don't get is, if that's true, why have they hit 1080p sometimes?
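For context on those numbers, here's a rough sketch of how quickly render targets add up at 1080p (Python; the four-target deferred layout is just a hypothetical example, not what any particular engine actually uses):

    # Rough render-target budget at 1080p (illustrative layout, not any real engine's).
    WIDTH, HEIGHT = 1920, 1080
    BYTES_PER_PIXEL = 4                     # e.g. a 32-bit RGBA or depth target

    def target_mb(count=1, bpp=BYTES_PER_PIXEL):
        return WIDTH * HEIGHT * bpp * count / (1024 * 1024)

    gbuffer = target_mb(count=4)            # four deferred-shading targets, ~31.6 MB
    depth = target_mb(count=1)              # one depth/stencil target,      ~7.9 MB

    print(round(gbuffer + depth, 1))        # ~39.6 MB -> already past the 32 MB of ESRAM

Which also hints at why 1080p is still possible: use fewer or smaller targets, or spill some of them to DDR3, and the budget fits again.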



Intrinsic said:
walsufnir said:


Oh yes, if you compare it that way, sure. But GPUs alone can't do anything. They have to be "fed", and that involves the CPUs, which are the same on both systems. What muddies the imaginary comparison even more is the fact that nowadays consoles run a fully fledged multitasking OS, which eats up 2 cores and 3 GB of RAM on both systems. So the scheduler on both systems (which also takes CPU time) has to handle the game (which of course runs at the highest priority) and has to deal with other programs that eat up CPU time. Furthermore, there are caches and so on which no one takes into consideration when talking about real-life performance of the consoles.

People only mention the GPU and GDDR5 and conclude fixed percentages of advantage, when that is only part of the whole picture that all those other factors feed into.

Again, this is true, but...

Both consoles have the same CPU. And as far as games are concerned, CPUs do very specific things, especially when designing for hardware this similar. But going off what he said in the interview, it's clear that the APIs and drivers that govern the function and efficiency of the CPU are further along on the PS4 than they are on the XB1, as MS seems to be trying really hard to bring its APIs up to at least parity with the PS4.

As I said, every single factor, if looked at honestly, points to the PS4 not being just marginally more powerful. Everything that could be said about the game development process or environment leans in favour of the PS4. And all these little things, while great on their own, have to add up in some way, not somehow detract overall. The PS4 has:

 

  • Better APIs (CPU-related) and a better, closer-to-the-metal shader language
  • More GPU cores (50% more) and higher GPU compute capability (300% more)
  • While both consoles give devs access to 5 GB of RAM, the PS4 has an overall larger pool of faster RAM, all under one address space
  • 100% more render output units (ROPs)
  • Over 30% more texture mapping units (TMUs)

 

Any one of those things should give the PS4 a slight advantage, and some of them should give it a really big advantage. But somehow, everyone seems to want us to believe that all of these things combined don't amount to much? I am not saying that the PS4 gives 100% performance 100% of the time, but neither would the XB1.

All I really care to know is what stuff like what I mentioned above means for the respective dev platforms. I dunno, I just think that the PS4 should be capable of a lot more than a resolution bump and more framerate stability.


I know what you want to say, but the numbers don't just add up; they highly depend on each other. If all of this applied, then everything would be way better for the PS4 than it currently is. If your game is, let's say, CPU-limited, then having more GPU power is of no use, obviously. That is what the guy is telling us when he says they were profiling. I'm not saying that nothing adds up, but you can't judge a whole system by comparing spec against spec. And even if you measure, you have to say what exactly you are measuring.
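A tiny sketch of that bottleneck argument (Python; the millisecond figures are made up purely to illustrate the point):

    # If the CPU is the bottleneck, extra GPU headroom doesn't shorten the frame.
    def frame_ms(cpu_ms, gpu_ms):
        # Simplified model: CPU and GPU work overlap, so the slower side sets the pace.
        return max(cpu_ms, gpu_ms)

    cpu_ms = 20.0                            # hypothetical CPU-limited workload
    print(frame_ms(cpu_ms, gpu_ms=14.0))     # 20.0 ms on the weaker GPU
    print(frame_ms(cpu_ms, gpu_ms=10.0))     # still 20.0 ms on a ~40% faster GPU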

Remember the CPU benchmark that was supposed to show us which CPU is better? That test was totally useless because we didn't know exactly what was being measured.

If you want to test only the CPU, then you have to test everything that belongs to the CPU, so at most you saturate the caches and then you measure. If your data gets bigger, say 32 MB, then it should favour the Xbox One CPU because it is higher clocked and the data can fit into ESRAM. If your data exceeds 32 MB, then most probably the PS4 would win again, because the Xbox One would need to put data into DDR3 RAM. And this is just a CPU test.
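A sketch of the kind of measurement that would be needed (a toy working-set sweep in Python; real profiling would be done natively, and which memory pool a given size lands in depends entirely on the platform):

    # Toy working-set sweep: the measured "CPU speed" changes with data size,
    # because different sizes fit in different levels of the memory hierarchy.
    import time
    import array

    def sweep(size_bytes, passes=10):
        data = array.array('q', range(size_bytes // 8))   # 8-byte signed integers
        start = time.perf_counter()
        total = 0
        for _ in range(passes):
            total += sum(data)                            # streams the whole working set
        elapsed = time.perf_counter() - start
        return (size_bytes * passes) / elapsed / 1e9      # rough GB/s touched

    for mb in (1, 8, 32, 128):                            # sizes below and above 32 MB
        print(mb, "MB:", round(sweep(mb * 1024 * 1024), 2), "GB/s")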

So, after all, that is exactly the question: why we see some games at only 720p on the Xbone while the PS4 reaches 1080p (besides ease of development, of course), and why the likes of Crytek could pull off Ryse on the Xbone at launch.

Data is handled differently, CPU load differs, memory usage differs; everything differs and depends on the whole system, and data doesn't care about specs, only about how efficiently it can be handled.



method114 said:
Zekkyou said:

"Actually, the real pain comes not from ESRAM but from the small amount of it. As for ESRAM performance - it is sufficient for the GPU we have in Xbox One."

I've been saying that for months. Now every time I need to explain it to someone, I can just quote this guy as a reference.

It would be interesting to know what sort of real world performance advantage developers see between the PS4 and X1, assuming perfect conditions. They tend to dance around the specifics of that issue. The figure is definitely somewhere between 30 - 60%, but past that it's incredibly hard to tell.

Personally, I'd bet on about 40%. A very significant amount, especially for 1st parties, but small enough (relative to equal development time) that the PS4's 3rd party advantage should remain as IQ and stability boosts.


This isn't the first dev to say this. I've seen it said at least twice before. It seems to be the most common explanation for why devs are having trouble reaching 1080p. Something about needing at least 45 MB for 1080p but ESRAM only having 32 MB, blah blah blah. I don't remember the exact details, but that's the rough idea. What I don't get is, if that's true, why have they hit 1080p sometimes?


Read my answer to Intrinsic - different games have different loads and different demands, and also different approaches to rendering.