
Forums - Microsoft Discussion - TRUE XB1 vs PS4 spec comparison

ethomaz said:

fallen said:

The PS3 GPU can texture from either the XDR or the GDDR pool, effectively 48 GB/s for the GPU.

 

Sony themselves actually count this in slides, I have one saved...

 

Please don't be irrational, anyway: if the 360 only needed 24 GB/s, why would MS have included 10MB of EDRAM in the 360?

 

It's the same as people who act as if the ESRAM in the X1 isn't there and say the X1 has only 68 GB/s. They are basically saying MS put 1.6 billion transistors of ESRAM into the system, at significant cost, for no reason at all.

No.

+ The PS3 GPU (RSX) has direct access to the GDDR3 @ 22.4 GB/s.
+ The PS3 GPU (RSX) can use the Cell's FlexIO to access the XDR @ 20 GB/s (read) / 15 GB/s (write).
+ The PS3 CPU (Cell) has direct access to the XDR @ 25.6 GB/s.

That's it... when the RSX accesses the XDR it uses 20 GB/s (read) or 15 GB/s (write) out of the Cell's 25.6 GB/s of bandwidth... it is not direct access... it needs to go through the Cell to reach the XDR.
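Those peak numbers fall straight out of bus width times effective transfer rate. A quick sketch (the 1.4 GT/s GDDR3 and 3.2 GT/s XDR effective rates are my assumptions for illustration, not from this thread):

```python
def peak_bandwidth_gbs(bus_width_bits: int, transfer_rate_gt: float) -> float:
    """Peak bandwidth (GB/s) = bus width in bytes x effective transfer rate (GT/s)."""
    return bus_width_bits / 8 * transfer_rate_gt

# PS3 GDDR3: 128-bit bus, ~1.4 GT/s effective -> 22.4 GB/s
print(round(peak_bandwidth_gbs(128, 1.4), 1))
# PS3 XDR: 64-bit bus, ~3.2 GT/s effective -> 25.6 GB/s
print(round(peak_bandwidth_gbs(64, 3.2), 1))
```

These are theoretical peaks; real sustained bandwidth is always lower once refresh, bus turnaround and access patterns are counted.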

About the eDRAM... MS used it for the framebuffer and for applying post-processing filters... 10MB is enough for a 720p framebuffer.
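The 720p claim is easy to check with back-of-envelope numbers (assuming a 32-bit colour buffer plus a 32-bit depth/stencil buffer, no MSAA; the exact formats are my assumption):

```python
def buffer_mib(width: int, height: int, bytes_per_pixel: int, samples: int = 1) -> float:
    """Size of one render buffer in MiB."""
    return width * height * bytes_per_pixel * samples / 2**20

colour = buffer_mib(1280, 720, 4)       # 32-bit colour
depth = buffer_mib(1280, 720, 4)        # 32-bit depth/stencil
print(round(colour + depth, 2))         # ~7.03 MiB: fits in 10MB of eDRAM
print(round((colour + depth) * 4, 2))   # with 4x MSAA: ~28 MiB, no longer fits
```

That is also why 720p with MSAA on the 360 needed Xenos's predicated tiling: the multisampled buffers no longer fit in the 10MB all at once.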

The eSRAM in the Xbone is not only for the framebuffer and post-processing filters... so the DMEs and all the bus access paths are needed... this is the first time MS has needed embedded memory to work around low main-memory bandwidth... on the 360 the 22.4 GB/s of bandwidth was enough.

The PS3 and 360 both have the same bandwidth for the GPU... the difference is that on the 360 the GPU can directly access all 512MB of RAM, while on the PS3 the GPU can directly access only 256MB... when the PS3 GPU tries to access the other 256MB it is slow, which caused all the issues with games not programmed to work around that slow path.

Absolutely FALSE.

 

The PS3 GPU, as you point out, whether going through the Cell or not, accessed XDR and GDDR at once.

 

In fact, this is the ONLY reason Sony even used two pools of RAM in the PS3: to get double the bandwidth, since one pool is simpler. Use your head!

 

Sony used two pools of RAM, MS used a single pool+EDRAM, to solve the same bandwidth problem. Had Sony only used one pool of RAM with no EDRAM, the PS3 would have had severe bandwidth constraints, because 24 GB/s is not nearly enough to feed RSX.

 

Choice last gen was this:

 

Two pools

One pool+EDRAM

 

The only reason Sony used one pool + no EDRAM this gen (though they considered EDRAM, as Cerny said) was because they went to a 256-bit bus.

 

Last gen a 256-bit bus was considered too expensive. They were forced into either two 128-bit buses to two pools of RAM (PS3), or one 128-bit bus plus EDRAM (360).




fallen said:

I already explained this to you. X1 10% is NOT for Kinect, or anything intrinsic to the system.

 

The 360 had a GPU reserve (~10% from what I heard) and it did not have Kinect at the beginning, or snap, or anything people use to justify the 10% on X1. These are red herrings used to claim that somehow "only" the X1 needs GPU reserves.

 

ALL SYSTEMS RESERVE GPU FOR THE OS. INCLUDING PS4.

BTW the PS4 has a camera system too, the PS Eye. When you plug one in, don't you think it takes GPU time if Kinect supposedly does? Otherwise, does your game slow down when you plug the PS Eye in?

If you get a "Trophy Unlocked" message on PS4, do you think it happens by magic? No, the GPU draws it out of the reserve! Same as on X1.

 

Without knowing how much the PS4 reserves, it's pointless to speculate or give one console the edge. IN FACT, given Sony's history (the PS3, for example, reserved more RAM for its OS than the X360 did), the PS4 may very well reserve MORE GPU!

It is a confirmed fact.

"One thing to keep in mind when looking at comparative game resolutions is that currently the Xbox One has a conservative 10 per cent time-sliced reservation on the GPU for system processing. This is used both for the GPGPU processing for Kinect and for the rendering of concurrent system content such as snap mode. The current reservation provides strong isolation between the title and the system and simplifies game development (strong isolation means that the system workloads, which are variable, won't perturb the performance of the game rendering). In the future, we plan to open up more options to developers to access this GPU reservation time while maintaining full system functionality."

http://www.eurogamer.net/articles/digitalfoundry-the-complete-xbox-one-interview
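For scale, here is what a 10 per cent time-slice costs in raw throughput, assuming the commonly quoted figure of 768 shader ALUs at 853 MHz with 2 ops per ALU per cycle for the Xbox One GPU (those figures are my assumption, not from the article):

```python
# Assumed figures: 768 ALUs x 853 MHz x 2 ops/cycle ~= 1.31 TFLOPS peak.
alus, clock_hz, ops_per_cycle = 768, 853e6, 2
peak_tflops = alus * clock_hz * ops_per_cycle / 1e12
reservation = 0.10  # the 10% time-sliced system reservation from the quote
print(round(peak_tflops, 2))                       # -> 1.31 TFLOPS peak
print(round(peak_tflops * (1 - reservation), 2))   # -> 1.18 TFLOPS left for games
```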

20% for Kinect and Metro UI (snap mode).

You can spin that.



fallen said:

Absolutely FALSE.

 

The PS3 GPU, as you point out, whether going through the Cell or not, accessed XDR and GDDR at once.

 

In fact, this is the ONLY reason Sony even used two pools of RAM in the PS3: to get double the bandwidth, since one pool is simpler. Use your head!

 

Sony used two pools of RAM, MS used a single pool+EDRAM, to solve the same bandwidth problem. Had Sony only used one pool of RAM with no EDRAM, the PS3 would have had severe bandwidth constraints, because 24 GB/s is not nearly enough to feed RSX.

 

Choice last gen was this:

 

Two pools

One pool+EDRAM

 

The only reason Sony used one pool + no EDRAM this gen (though they considered EDRAM, as Cerny said) was because they went to a 256-bit bus. Last gen a 256-bit bus was considered too expensive. They were forced into either two 128-bit buses to two pools of RAM, or one 128-bit bus plus EDRAM for the 360.

You need to study first... everything I said is true... come back later.



ethomaz said:

fallen said:

I already explained this to you. X1 10% is NOT for Kinect, or anything intrinsic to the system.

 

The 360 had a GPU reserve (~10% from what I heard) and it did not have Kinect at the beginning, or snap, or anything people use to justify the 10% on X1. These are red herrings used to claim that somehow "only" the X1 needs GPU reserves.

 

ALL SYSTEMS RESERVE GPU FOR THE OS. INCLUDING PS4.

BTW the PS4 has a camera system too, the PS Eye. When you plug one in, don't you think it takes GPU time if Kinect supposedly does? Otherwise, does your game slow down when you plug the PS Eye in?

If you get a "Trophy Unlocked" message on PS4, do you think it happens by magic? No, the GPU draws it out of the reserve! Same as on X1.

 

Without knowing how much the PS4 reserves, it's pointless to speculate or give one console the edge. IN FACT, given Sony's history (the PS3, for example, reserved more RAM for its OS than the X360 did), the PS4 may very well reserve MORE GPU!

It is a confirmed fact.

"One thing to keep in mind when looking at comparative game resolutions is that currently the Xbox One has a conservative 10 per cent time-sliced reservation on the GPU for system processing. This is used both for the GPGPU processing for Kinect and for the rendering of concurrent system content such as snap mode. The current reservation provides strong isolation between the title and the system and simplifies game development (strong isolation means that the system workloads, which are variable, won't perturb the performance of the game rendering). In the future, we plan to open up more options to developers to access this GPU reservation time while maintaining full system functionality."

http://www.eurogamer.net/articles/digitalfoundry-the-complete-xbox-one-interview

20% for Kinect and Metro UI (snap mode).

You can spin that.


Wow. The article says 10% RIGHT THERE.

 

Again, can you show me where Sony revealed how much GPU the PS4 reserves for the OS?

 

If not, we're back to square one.



fallen said:

Wow. The article says 10% RIGHT THERE.

 

Again, can you show me where Sony revealed how much GPU the PS4 reserves for the OS?

 

If not, we're back to square one.

"I already explained this to you. X1 10% is NOT for Kinect, or anything intrinsic to the system."

See how you don't make sense at all... this claim is completely wrong.

Does the PS4 have Kinect? No. Does the PS4 have a Metro UI with Snap? No... Does the PS4 need that GPU reservation? No.

The PS3 didn't use the GPU for the OS... the 360 didn't use the GPU for the OS... the Wii U doesn't use it for the OS (it uses it for the GamePad)... the framebuffer output doesn't need GPU processing at all... it only receives the image and sends it to the display... it uses only the ROPs... the GPU doesn't need to create/render the image.



ethomaz said:

fallen said:

Absolutely FALSE.

 

The PS3 GPU, as you point out, whether going through the Cell or not, accessed XDR and GDDR at once.

 

In fact, this is the ONLY reason Sony even used two pools of RAM in the PS3: to get double the bandwidth, since one pool is simpler. Use your head!

 

Sony used two pools of RAM, MS used a single pool+EDRAM, to solve the same bandwidth problem. Had Sony only used one pool of RAM with no EDRAM, the PS3 would have had severe bandwidth constraints, because 24 GB/s is not nearly enough to feed RSX.

 

Choice last gen was this:

 

Two pools

One pool+EDRAM

 

The only reason Sony used one pool + no EDRAM this gen (though they considered EDRAM, as Cerny said) was because they went to a 256-bit bus. Last gen a 256-bit bus was considered too expensive. They were forced into either two 128-bit buses to two pools of RAM, or one 128-bit bus plus EDRAM for the 360.

You need to study first... everything I said is true... come back later.


Seems like you got owned and don't know how to respond to my points. I understand.

 

I notice on this board there are a few people who know a little about system design, but nobody knows very much. It's not like B3D or even, horror to admit it, NeoGAF (which has a few people with tech knowledge, even though the ignorant make the most comments by far).

 

I'm not an expert, far from it, but I feel like one on this board, LOL. I may have the most system design knowledge here.

 

Not knocking this board; every board has its character, I suppose. Most boards do not have technically informed posters...



fallen said:
Xenostar said:
fallen said:
Xenostar said:
You need to remove 10% for Kinect, but keep that damage control coming. And one core for SHAPE, lol; again, SHAPE is mostly there for Kinect.


No, there is nothing for Kinect. Rumors say MS reserves ~10% of the GPU for OS use; however, contrary to popular belief, all systems, including (gasp) the PS4, 360, and PS3, reserve some amount of GPU for this. It is needed to pop up, say, "Trophy unlocked" or a message from a friend.

MS also told Digital Foundry they are working on reducing the 10%. Regardless, without knowing how much the PS4 reserves, it's pointless to speculate about this or declare that one side has the advantage.

It reminds me how it was claimed as fact that the PS4 reserved only 1GB of RAM for the OS, until we found out it's actually 3GB, same as the X1 (a fact some people naturally still deny).


It's no rumour; this was from the chief engineer of the console. You even quoted the article yourself.

And yes, I'm sure they will get that 10% down, just like I'm sure Sony will get their OS footprint down as well; they did on the PS3 and on the PSP.

But you can't label an article a TRUE comparison, claim reductions in PS4 specs because you believe it has no audio processor when it does, and not mention known facts about X1 spec reductions; it's just disingenuous.

But everyone knows the spec differences anyway; they are what they are, so I'm not going to get into any more debates about them, as I'm sure there will be many "but but but the X1 is as fast, honest" threads to come.

MS fans, start talking about the games and stop trying to massage the tech specs; we all know the difference now.

I already explained this to you. X1 10% is NOT for Kinect, or anything intrinsic to the system.

 

The 360 had a GPU reserve (~10% from what I heard) and it did not have Kinect at the beginning, or snap, or anything people use to justify the 10% on X1. These are red herrings used to claim that somehow "only" the X1 needs GPU reserves.

 

ALL SYSTEMS RESERVE GPU FOR THE OS. INCLUDING PS4.

BTW the PS4 has a camera system too, the PS Eye. When you plug one in, don't you think it takes GPU time if Kinect supposedly does? Otherwise, does your game slow down when you plug the PS Eye in?

If you get a "Trophy Unlocked" message on PS4, do you think it happens by magic? No, the GPU draws it out of the reserve! Same as on X1.

 

Without knowing how much the PS4 reserves, it's pointless to speculate or give one console the edge. IN FACT, given Sony's history (the PS3, for example, reserved more RAM for its OS than the X360 did), the PS4 may very well reserve MORE GPU!

This is supposed to be a TRUE comparison, and you're assuming far too much, which makes it biased.

You assume PS4 CPU speed, you assume PS4 reserves, you assume PS4 audio processing.

Are you also assuming the HDD information? I'm genuinely interested in that, as I'm toying with the idea of just getting a 1TB drive to put in the PS4 day one, but I wouldn't want to put in a slower one.



Pemalite said:
fatslob-:O said:

Just a question. How useful do you think that 32mb ESRAM will be in trying to close the gap ? 


Both Sony and Microsoft would have done the math and simulations to determine the optimal amount of bandwidth (For the cost) that is required.
The Xbox One, by its very nature of having less GPU hardware than the Playstation 4, needs less memory bandwidth for the hardware to become fully saturated in the first place.

However, if you take the quoted bandwidth numbers in the OP's post and expect to get that kind of bandwidth 100% of the time whilst rendering a game, then you're dreaming; it won't and cannot happen, as you cannot fit everything into it.

Regardless of the bandwidth, at the end of the day it's not going to make much difference when you have significantly fewer compute resources actually rendering and displaying the pretty pictures on-screen; that limitation should become incredibly apparent by the end of the generation.

I honestly wish that Microsoft and Sony would take CPU performance more seriously for once; it was incredibly apparent how CPU-limited RTS games and Battlefield 3 were this generation, with the cut-down player counts, limited physics and such.

Agreed, but I think physics can be done on the GPU, with things like TressFX, PhysX, Bullet, etc.

Edit: I'm willing to bet that the ESRAM is used for caching purposes, to save on DDR3 bandwidth, and not for enormous data accesses.
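For a sense of how tight 32MB is, here are 1080p render-target sizes (assuming 4 bytes per pixel; the particular buffer layout is purely illustrative):

```python
def rt_mib(width: int, height: int, bytes_per_pixel: int) -> float:
    """Size of one render target in MiB."""
    return width * height * bytes_per_pixel / 2**20

one_target = rt_mib(1920, 1080, 4)
print(round(one_target, 2))      # ~7.91 MiB per 32-bit 1080p target
print(round(one_target * 2, 2))  # colour + depth: ~15.82 MiB, fits in 32MB
print(round(one_target * 5, 2))  # 4-target G-buffer + depth: ~39.55 MiB, does not
```

So whether the eSRAM is "enough" depends heavily on the renderer: a forward renderer's buffers fit comfortably, while a fat deferred G-buffer has to be split between eSRAM and DDR3.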



fallen said:

Seems like you got owned and don't know how to respond to my points. I understand.

 

I notice on this board there are a few people who know a little about system design, but nobody knows very much. It's not like B3D or even, horror to admit it, NeoGAF (which has a few people with tech knowledge, even though the ignorant make the most comments by far).

 

I'm not an expert, far from it, but I feel like one on this board, LOL. I may have the most system design knowledge here.

 

Not knocking this board; every board has its character, I suppose. Most boards do not have technically informed posters...

I explained it to you and you continue to post wrong tech info here... so why would I try to discuss it when you know nothing about the subject? If you want to learn, I can help you... so, to start: what I wrote is right... it is how the PS3 works.

So instead of trying to say others are wrong and creating more bullshit info about eDRAM and memory pools... do some research... you will see that what you are saying is totally wrong.

You are debating something already detailed by Sony, MS and hackers... it is not like the Xbone and PS4, which may still have unknown tech info to be found.

I will sleep now... good night



I don't think all that bandwidth gibberish will bring any major changes. At least when I upgrade my PC with a better graphics card, the differences are massive. But if I buy some faster RAM, I don't see any difference at all.

I just don't see how that tiny eSRAM should make any difference. Sure, it's not a bad solution, but neither is GDDR5; both are good enough. But the PS4 still has the better GPU. I mean, come on. Even if the Xbone might have the upper hand in some weird areas nobody gives a crap about, overall the PS4 is still faster. Does that simple fact really hurt so much?

I know that I didn't care that the Xbox was faster than my GameCube back in the day. They were on the same level, and that's all that matters to me; I couldn't care less about a few more fps, a higher resolution or some details I will never look at anyway. Sure, I'm buying the PS4, but certainly not because of the hardware. It's all about the games, and Sony just has more games that interest me.

Man, I'm beginning to understand Ninty fans. Just screw all that hardware crap, get a console everyone knows is slower and just play some games. Memory bandwidth, dGPU, secret sauce, NDA, what the hell. You're not stopping until every person on the internet says that the Xbox One is faster, are you? I'll ask again: why is it so goddamn important to you?


