
Forums - Microsoft Discussion - TRUE XB1 vs PS4 spec comparison

Adinnieken said:
dahuman said:
Adinnieken said:
drkohler said:

That is not how memory controllers work. I have explained (to the best of my insight into the technology) how the gpu mmu crossbar works in another thread. I'm not going to do that again.

The 204GB/s number thrown around by ms is completely bogus (and probably corresponds to creative accounting for some rmw cycles inside the gpu caches). At this time, I stick with a maximum achievable bandwidth of approx. 150GB/s (for some peculiar access patterns). Until a REAL ms engineer comes forth and explains the REAL functions of the gpu crossbar, we should take the numbers forwarded by ms PR speak as rumours (or extremely creative accounting).

http://www.eurogamer.net/articles/digitalfoundry-vs-the-xbox-one-architects

He's not really wrong, Adinn. While his probable real number is guesswork, he's talking about achievable bandwidth, not the theoretical maximum at peak, which neither console will be able to hit anyways.

I think though in my original comment I acknowledged this. The real-world bandwidth that Microsoft has achieved is 70-80% of the theoretical maximum. And they state the reason why no system can achieve the theoretical maximum, not even Sony. I don't disagree that theoretical maximums aren't necessarily a figure we should use, but you can't say the PS4 will achieve 176GB/s and then say the Xbox One won't achieve its theoretical max when neither can reach it.
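For what it's worth, the figures being argued over line up like this in a quick back-of-envelope sketch (peak numbers as published at the time; the 70-80% range is the efficiency claim above, not a measurement):

```python
# Published peak bandwidth figures (GB/s); the 70-80% "achievable"
# range is the efficiency claim quoted above, applied to each peak.
peaks = {
    "PS4 GDDR5": 176.0,
    "XB1 DDR3": 68.3,
    "XB1 eSRAM (MS combined r/w figure)": 204.0,
}

for name, peak in peaks.items():
    lo, hi = 0.7 * peak, 0.8 * peak
    print(f"{name}: peak {peak} GB/s -> ~{lo:.0f}-{hi:.0f} GB/s achievable")
```

By that claim, even the PS4's 176GB/s would land somewhere around 123-141GB/s in practice.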


Hence why I said it's how you manage the RAM in my individual post: however fast the eSRAM is or however low its latency might be, it's still just 32MB of it, with 8GB of DDR3 that's much further away. The same can be said about the PS4's 8GB, as it's by default further away from the GPU and CPU, even though you have more of it at that speed. Either way, as far as overall graphical processing ability goes, the PS4 currently does have the edge, not that I think it really matters, since you have something like the Wii U (weaker) on the market and more powerful hardware in PCs and the upcoming SteamOS-powered PCs (AKA Steam Box).




you forgot to multiply all of xbone's stats by 3x for teh cloud.



Adinnieken said:
g911turbo said:
drkohler said:
g911turbo said:

So 0.4% of the Xbox One's memory is higher bandwidth than the PS4's, while the other 99.6% of the memory pool is much higher bandwidth on the PS4.

That is something most people misunderstand. The esram may only be a small fraction of the total ram, but if the gpu can do 99% of the work it has to do in esram, then you are obviously ok. How much edram/esram you need has been tested endlessly by whoever makes chips. It seems that 32MB is some kind of "sweet spot" if you look at <=1080p images. MS has built a very complex infrastructure around the esram bottleneck to ensure they get as close as possible to 100% efficiency.

Trust me.  I am a BSEE.  I understand this stuff isn't always black and white.

This anandtech article says it pretty well. I think at the end of the day, GDDR5 is the better choice for a gaming-centric console, hands down. Microsoft is most certainly aiming for a media hub, and there could be some advantages to DDR3 there. But from a games perspective, it's hard to argue against GDDR5 being better, especially AT 1080P.

http://www.anandtech.com/show/6972/xbox-one-hardware-compared-to-playstation-4/3

Also, 32MB was probably more of a "cheap" spot too in terms of cost.
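A back-of-envelope check of the two figures in this exchange: the 0.4% comes from 32MB out of 8GB, and the "sweet spot" claim amounts to a 1080p render target fitting in eSRAM (assuming 4 bytes/pixel color and 4 bytes/pixel depth, no MSAA; those buffer formats are my assumption, not anything MS has stated):

```python
# 32MB of eSRAM as a fraction of the 8GB total pool
esram = 32 * 1024 * 1024          # bytes
total = 8 * 1024 ** 3             # bytes
print(f"{esram / total:.1%}")     # prints 0.4%

# Does a 1080p frame fit? Assume 32-bit color plus 32-bit depth/stencil.
width, height, bpp = 1920, 1080, 4
color = width * height * bpp      # ~7.9 MB color target
depth = width * height * bpp      # ~7.9 MB depth target
print((color + depth) <= esram)   # True: both targets fit with room to spare
```

At resolutions much above 1080p, or with fat G-buffers, that margin disappears quickly, which is presumably why 32MB reads as a sweet spot only "if you look at <=1080p images".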


I don't believe that is the argument.

I think the argument Microsoft is making is that while Sony has the performance specs, it doesn't necessarily equate to a significant advantage due to the limitations of the choices made.

And the limitations of the PS4 are a unified memory, an easy to use and program architecture, and a stronger console. I would love to have those limitations.



kitler53 said:
you forgot to multiply all of xbone's stats by 3x for teh cloud.


In case you didn't know, Xbox One games can be played offline. The devs can't have the game looking beast for those with online and like crap for those without.



Ashadian said:
Dark_Feanor said:
5 weeks to go, still no games running on the PS4 with 50% more resolution or frame rate.

I can't understand why people with ZERO hardware/software architecture knowledge speak with so much confidence and passion about memory buses, gigaflops and the like.

It's the devs stating the power difference! Look the articles up!

I have seen only two or three original articles that tap into these details. They say both are powerful, they say both are close, one could be "faster", whatever that means, one could be easier to develop for, and just that. Every other article is derived from those or pure speculation based on the early leaks.

Again, where are the games? That is simple.

They just have to show one game running at 50% more resolution or frame rate. I'm not even asking for shadows or particle effects.

People with no technical background whatsoever are fighting to deny whatever a Microsoft engineer (the best and most well paid) has to say.

And they just believe that Mark Cerny is a god, a guy that no one had heard of before February 2013.

That is ridiculous.



Dark_Feanor said:
Ashadian said:
Dark_Feanor said:
5 weeks to go, still no games running on the PS4 with 50% more resolution or frame rate.

I can't understand why people with ZERO hardware/software architecture knowledge speak with so much confidence and passion about memory buses, gigaflops and the like.

It's the devs stating the power difference! Look the articles up!

I have seen only two or three original articles that tap into these details. They say both are powerful, they say both are close, one could be "faster", whatever that means, one could be easier to develop for, and just that. Every other article is derived from those or pure speculation based on the early leaks.

Again, where are the games? That is simple.

They just have to show one game running at 50% more resolution or frame rate. I'm not even asking for shadows or particle effects.

People with no technical background whatsoever are fighting to deny whatever a Microsoft engineer (the best and most well paid) has to say.

And they just believe that Mark Cerny is a god, a guy that no one had heard of before February 2013.

That is ridiculous.

you're not setting the bar high there; 30 frames vs 45 frames, 720p vs 900p, those are 50% margins
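Those margins check out roughly, assuming the usual resolutions (720p = 1280x720, 900p = 1600x900):

```python
# 30 -> 45 fps and 720p -> 900p, expressed as multipliers
fps_gain = 45 / 30                      # 1.5x, i.e. 50% more frames
pixels_720p = 1280 * 720
pixels_900p = 1600 * 900
pixel_gain = pixels_900p / pixels_720p  # ~1.56x, i.e. ~56% more pixels
print(fps_gain, round(pixel_gain, 2))
```

So 720p to 900p is actually a bit more than a 50% pixel-count jump.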



Whitefire said:
Adinnieken said:
g911turbo said:
drkohler said:
g911turbo said:

So 0.4% of the Xbox One's memory is higher bandwidth than the PS4's, while the other 99.6% of the memory pool is much higher bandwidth on the PS4.

That is something most people misunderstand. The esram may only be a small fraction of the total ram, but if the gpu can do 99% of the work it has to do in esram, then you are obviously ok. How much edram/esram you need has been tested endlessly by whoever makes chips. It seems that 32MB is some kind of "sweet spot" if you look at <=1080p images. MS has built a very complex infrastructure around the esram bottleneck to ensure they get as close as possible to 100% efficiency.

Trust me.  I am a BSEE.  I understand this stuff isn't always black and white.

This anandtech article says it pretty well. I think at the end of the day, GDDR5 is the better choice for a gaming-centric console, hands down. Microsoft is most certainly aiming for a media hub, and there could be some advantages to DDR3 there. But from a games perspective, it's hard to argue against GDDR5 being better, especially AT 1080P.

http://www.anandtech.com/show/6972/xbox-one-hardware-compared-to-playstation-4/3

Also, 32MB was probably more of a "cheap" spot too in terms of cost.


I don't believe that is the argument.

I think the argument Microsoft is making is that while Sony has the performance specs, it doesn't necessarily equate to a significant advantage due to the limitations of the choices made.

And the limitations of the PS4 are a unified memory, an easy to use and program architecture, and a stronger console. I would love to have those limitations.


According to Ethomaz:

The PS4 can't swap between games and apps. You can't suspend Knack and go play a multiplayer match of COD with your friends.



Dark_Feanor said:
Whitefire said:
Adinnieken said:
g911turbo said:
drkohler said:
g911turbo said:

So 0.4% of the Xbox One's memory is higher bandwidth than the PS4's, while the other 99.6% of the memory pool is much higher bandwidth on the PS4.

That is something most people misunderstand. The esram may only be a small fraction of the total ram, but if the gpu can do 99% of the work it has to do in esram, then you are obviously ok. How much edram/esram you need has been tested endlessly by whoever makes chips. It seems that 32MB is some kind of "sweet spot" if you look at <=1080p images. MS has built a very complex infrastructure around the esram bottleneck to ensure they get as close as possible to 100% efficiency.

Trust me.  I am a BSEE.  I understand this stuff isn't always black and white.

This anandtech article says it pretty well. I think at the end of the day, GDDR5 is the better choice for a gaming-centric console, hands down. Microsoft is most certainly aiming for a media hub, and there could be some advantages to DDR3 there. But from a games perspective, it's hard to argue against GDDR5 being better, especially AT 1080P.

http://www.anandtech.com/show/6972/xbox-one-hardware-compared-to-playstation-4/3

Also, 32MB was probably more of a "cheap" spot too in terms of cost.


I don't believe that is the argument.

I think the argument Microsoft is making is that while Sony has the performance specs, it doesn't necessarily equate to a significant advantage due to the limitations of the choices made.

And the limitations of the PS4 are a unified memory, an easy to use and program architecture, and a stronger console. I would love to have those limitations.


According to Ethomaz:

The PS4 can't swap between games and apps. You can't suspend Knack and go play a multiplayer match of COD with your friends.


I think you can only have a game and the OS open, and switch between them. It's not like you can have a movie going, then jump to a game that's already running in the background, with something downloading at the same time. You can have Knack playing and then enter the PSN Store, and that is all you can do from what I know. Also, the Xbox One has a menu hub that shows what you are doing across all types of media; the PS4 doesn't do that, so there are a lot more resource-heavy things going on with the Xbone.



kitler53 said:
you forgot to multiply all of xbone's stats by 3x for teh cloud.

The reality is we don't know what impact off-system computing will have.

"The Cloud" is nothing more than Client/Server computing in some respects.  To assume there isn't potential in that is to be ignorant of reality.  After the PC was released, businesses started creating client/server applications where PCs were used essentially for UI and mainframes and UNIX servers did all the heavy processing.  Eventually, with the advent of the web, you had lightweight applications that ran on the client that accessed the Web server, which did all the heavy processing. 

Heck, multiplayer games are nothing more than client/server computing.  The only difference is that the rendering takes place on the local client.  Regardless of whether it's a peer-to-peer or server-to-peer arrangement, one system does all the leg work in determining where everyone in the game is and how they're all meshing together. 

People, like yourself, may like to play down the impact off-system computing will have in the cloud, but the reality is it can be extremely powerful. We are beginning to run into a wall in terms of computing. They've developed single-atom circuits. They can't get any smaller than single atoms. Unless computing moves away from silicon, we will eventually hit a wall. Devices won't get smaller, computing power won't grow exponentially every few years, and the only way to get more power into a machine is to either build a bigger box or offload computational work.

Quantum computing is still decades away. Other mediums that have better cooling properties than silicon and would allow smaller dies are still being developed. Even then, they won't be able to offer a circuit size smaller than the atomic level; they'll only allow engineers to develop dies with less space between circuits that still run cool. Not to mention operate without shorting out.

So yeah.  If for cost reasons you can't increase the size of your device, then in order to expand the capabilities of what it can do you will need to offload some of the computation onto servers.  For consumers, that's the cloud.  Businesses have the option of in-house servers as well as cloud-based servers.
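The client/server split described above can be sketched in a few lines. This is a toy illustration only (a made-up sum-offloading protocol over a loopback socket, not anything either console actually does):

```python
import socket
import threading
import time

def server(port):
    """Toy 'cloud' server: receives comma-separated ints, returns their sum."""
    with socket.socket() as s:
        s.bind(("127.0.0.1", port))
        s.listen(1)
        conn, _ = s.accept()
        with conn:
            data = conn.recv(1024).decode()   # single recv is fine for tiny messages
            conn.sendall(str(sum(int(x) for x in data.split(","))).encode())

def offload(port, nums):
    """'Console' client: ships the work out and waits for the computed result."""
    with socket.socket() as s:
        s.connect(("127.0.0.1", port))
        s.sendall(",".join(map(str, nums)).encode())
        return int(s.recv(1024).decode())

t = threading.Thread(target=server, args=(5050,))
t.start()
time.sleep(0.2)                               # let the server start listening
result = offload(5050, [1, 2, 3, 4])
t.join()
print(result)  # 10
```

The client never computes the sum itself; that is the whole idea, just at a trivially small scale. Real offloading only pays off when the computation dwarfs the network round-trip cost.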



Truthfact: People bought 360s over the PS3 because of the graphics powah! /sarcasm

Truthfact: No one ultimately cares about graphics, look at Wii and 3DS sales compared to 360/PS3 and Vita sales.



It's just that simple.