
Why Xbox One Will Catch Up To PlayStation 4 in Performance

I highly doubt the Xbone will catch the PS4 in performance; the gap is just too large. If it were minimal it would have a chance, but from the initial games we have, it's pretty noticeable.




Oh. The PS4 has GDDR5

Has anybody noticed that there is NO current computer which uses ONLY GDDR - not even the machines where "money doesn't count"?

So - WHY is that? DOES GDDR HAVE SOME SERIOUS DISADVANTAGES?

Yes. Because it is a special type of RAM TAILORED for GPUs - high latency, but you can read and write at once.

DDR RAM is made for CPUs - low latency, and you can read OR write at once.

Now let's investigate further why GDDR-only is a bad idea on a CPU like AMD's Jaguar - those cores were designed with netbooks in mind. Of course, they have doubled the number of cores to a total of eight, so its raw compute power doesn't look that bad - this year...

But AMD's Jaguar lacks an L3 cache, and its L2 cache isn't that big either. If a core has a cache miss, chances are good that the core has to wait longer for data on a PS4 than on the Xbox One. That's where RAM latency comes into play.

CPUs favor small chunks of data, and they need them fast.

GPUs need big chunks of data in parallel, but they can wait a bit longer...

This might be the reason why Killzone fails to deliver true 1080p in multiplayer: if the network code trashes the caches too often, the cores have to wait, and so these parts can't be as optimized as the single-player code...

People are tempted to overlook such technical details and favour a simplified "expensive is better". But as already written - if that were the case, ALL high-end servers (POWER, SPARC) would ship with GDDR instead of DDR. The opposite is true: DDR is better for CPUs and GDDR is best for GPUs. And to "glue" both together, it's best to use something ultra-fast like eDRAM or eSRAM.
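To make the cache-miss argument concrete, here is a back-of-the-envelope sketch of average memory access time (AMAT). All numbers are hypothetical placeholders, not measured figures for either console:

```python
# AMAT = hit time + miss rate * miss penalty. With no L3 to catch
# L2 misses, every miss pays the full DRAM latency, so a higher-
# latency DRAM directly inflates the average.
# All numbers below are hypothetical, not measured console figures.

def amat_ns(l2_hit_ns, l2_miss_rate, dram_latency_ns):
    return l2_hit_ns + l2_miss_rate * dram_latency_ns

L2_HIT_NS = 8.0     # assumed L2 hit latency
MISS_RATE = 0.05    # assumed 5% of accesses miss L2

for label, dram_ns in [("lower-latency DRAM", 60.0),
                       ("higher-latency DRAM", 90.0)]:
    print(f"{label}: AMAT = {amat_ns(L2_HIT_NS, MISS_RATE, dram_ns):.1f} ns")
```

Whether the real-world DRAM latencies actually differ that much is exactly what gets disputed later in the thread.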



drkohler said:
Michael-5 said:

Picture a highway, with RAM bandwidth as the number of lanes. If you have 16 lanes, great, but do you need them for 100 cars a day? Having only 1 lane, however, would significantly slow down traffic, as everyone would be stuck behind 1 car. GDDR5, in this analogy, would be 5 lanes, where DDR3 is 3 lanes.

If you are trying to make analogies, try to make correct ones. Otherwise it just confuses the hell out of people. So your analogy actually goes like this:

1. DDR3 in the Xbox One has a highway with 256 lanes. GDDR5 in the PS4 has a highway with 256 lanes.

2. The PS4 has a much more modern highway that allows each car to drive much closer to the car in front of it than on the X1 highway.

Now eSRAM is somewhat more complicated. Technically, it is a software-managed cache/address area. In our analogy, it could be described as a 1024-lane highway with a twist: each of the four groups of 256 lanes can end at a destination that is not fixed in advance. You can manage where the lanes end. That works because there is an extremely clever highway manager that takes care of all the possibilities.

1. DDR3 is 64 bit, not 256, so DDR3 on XB1 would be like a 64 lane highway in your analogy.

2. Dunno about that.

Plus, eSRAM would be needlessly complicated (which is the case in real life) and offers little benefit.




Michael-5 said:

1. DDR3 is 64 bit, not 256, so DDR3 on XB1 would be like a 64 lane highway in your analogy.

DDR3 chips are 16 bits wide. The Xbox One has 16 chips in parallel for a 256-bit bus.

GDDR5 chips are 32 bits wide, but two chips can be paired as 16 bits wide each (clamshell mode). The PS4 has 16 chips in clamshell mode for a 256-bit bus.
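The arithmetic behind those bus widths, as a quick sketch. The transfer rates used here are the commonly cited figures for each console (DDR3-2133 and 5.5 Gbps GDDR5), so treat them as approximate:

```python
# Peak bandwidth = (bus width in bits / 8) * transfer rate.
# 16 chips x 16 bits = a 256-bit bus on both consoles; the
# difference is the transfer rate per pin, not the lane count.

def peak_bandwidth_gb_s(bus_bits, megatransfers_per_s):
    return bus_bits / 8 * megatransfers_per_s / 1000  # GB/s

bus_bits = 16 * 16                          # 256-bit bus on both
print(peak_bandwidth_gb_s(bus_bits, 2133))  # ~68.3 GB/s (DDR3-2133)
print(peak_bandwidth_gb_s(bus_bits, 5500))  # ~176.0 GB/s (GDDR5 @ 5.5 Gbps)
```

Same number of "lanes", roughly 2.5x the traffic per lane - which is the point being made above.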



mine said:
*snip*


Nice post, only you overlooked one thing: these are games consoles, not PCs - "totally different in almost every department".

I'm also sure the PS4 is using AMD's HSA technology. Either way, your theory works on paper, but the reality is completely different: eSRAM is not making the XB1 better.




r3tr0gam3r1337 said:
I couldn't be bothered to read through all the comments; I was just laughing at the stupidity from jega.

The games are primarily being developed for the PS4, as that is the console with the biggest user base, just like last gen when the 360 was the primary console for development because it had the biggest user base (in terms of HD gaming; the Wii had the biggest install base, but it wasn't HD, among other things).

The XB1 will not receive a hardware upgrade, as that would render all currently released games unplayable (they were developed for the current hardware inside the XB1). Any future games would also not work on older XB1s, meaning current XB1 owners would be forced to upgrade their consoles. Microsoft are not stupid enough to make such a daft move, which is why the HD-DVD add-on for the 360 was never used for games.

The other thing that has soured this thread is that it's pretty much expecting a repeat of the 7th gen, when the PS3 eventually overtook the 360. Much like the PS3, the XB1 on paper looks to be the better spec, but much like the PS3, developers are having a hard time with the XB1, which is why the PS4 has the better versions (multiplats etc.); it is easier to develop for, much like last gen when the 360 was easier to develop for. And the hardware in both consoles is effectively the same: both have APUs developed by AMD, both have AMD GPUs, and yes, both will have some variant of the DirectX API.

6/10, post almost seemed convincing.

That bolded statement is incorrect however. The XB1 is not the new PS3, just like the PS4 is not the new 360.

That's just wish fulfillment.

mine said:
*snip*

4/10, nice effort, but the lack of fact-checking is annoying.

The PS4 has 256 MB of DDR3 for system tasks such as background data and networking.

It doesn't only use GDDR5.

KZ:SF is 1080p native; they just used temporal reprojection to increase the frame rate.
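For anyone unfamiliar with the term, here is a minimal sketch of the idea behind temporal reprojection - not Guerrilla's actual implementation. Each frame natively renders only half of the columns of a full-resolution image, and the missing columns are reused from the previous frame (here naively, with no motion compensation):

```python
# Toy temporal reprojection: merge a half-width render into the
# previous full frame. Real implementations reproject the old
# pixels along motion vectors instead of copying them in place.

def reproject(prev_full, half_frame, odd_columns):
    full = [row[:] for row in prev_full]      # start from history
    for y, row in enumerate(half_frame):
        for i, pixel in enumerate(row):
            x = 2 * i + (1 if odd_columns else 0)
            full[y][x] = pixel                # overwrite fresh columns
    return full

prev = [[0] * 4 for _ in range(4)]            # last full frame
half = [[1] * 2 for _ in range(4)]            # freshly rendered half
print(reproject(prev, half, odd_columns=False)[0])  # [1, 0, 1, 0]
```

Every output frame has the full pixel count, but only half of the pixels are freshly rendered each frame - which is why "native 1080p" was disputed.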




mine said:

Yes. Because it is a special type of RAM TAILORED for GPUs - high latency, but you can read and write at once.

1. DDR RAM is made for CPUs - low latency, and you can read OR write at once.

Now let's investigate further why GDDR-only is a bad idea on a CPU like AMD's Jaguar

2. But as already written - if that were the case, ALL high-end servers (POWER, SPARC) would ship with GDDR instead of DDR.

3. The opposite is true: DDR is better for CPUs and GDDR is best for GPUs. And to "glue" both together, it's best to use something ultra-fast like eDRAM or eSRAM.

You are somewhat confused about the technologies you are trying to explain, so you pretty much get everything wrong. Just a few notes:
1. DDR RAM is made for everything that needs memory. No memory in the world can "read and write at once". You always have to tell your memory which cell(s) you want to read OR write. If you really wanted low latency, you'd go with SRAM; otherwise you take other memory types.

DDR RAM chip: cheap (per MB), high capacity hence small, fast (low to medium latencies), low bandwidth, relatively simple to address, difficult motherboard layout

SRAM chip: expensive (per MB), low capacity hence large, very fast (very low latencies), high bandwidth, simple to address, difficult motherboard layout

GDDR RAM chip: soon cheap? (per MB), medium capacity hence large, fast (low to high latencies depending on program mode), high bandwidth, complex to address, simple motherboard layout.

GDDR memory chips have higher latencies because they need more clock cycles to set up and ready the data. However, GDDR operates on two forward clocks which are significantly higher than DDR clocks, hence the latencies hardly differ in the end.

2. Servers operate somewhere away from you. Memory latency is completely irrelevant compared to the time it takes to send/receive data. Since servers need a lot of preferably cheap (and densely packed) memory, all servers obviously use DDR.

3. Up to now, DDR and GDDR have never been "glued". The PC has DDR on the main board and GDDR on the graphics card, because these memories are logically separated from one another. There is no "glue" intended to use both types together. On full HSA systems, expect to see GDDR-only in the future (for better systems) and DDR-only (for average systems).
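To put "the latencies hardly differ in the end" into numbers: absolute latency is cycles divided by clock. The cycle counts and clocks below are illustrative round numbers, not datasheet values for either console's memory:

```python
# CAS latency in nanoseconds = CAS cycles / command clock (MHz) * 1000.
# GDDR5 needs more cycles but runs a much faster clock, so the
# wall-clock latencies land in the same ballpark.
# Illustrative round numbers, not datasheet values.

def cas_latency_ns(cas_cycles, clock_mhz):
    return cas_cycles / clock_mhz * 1000

print(cas_latency_ns(11, 1066))   # DDR3-2133-class: ~10.3 ns
print(cas_latency_ns(15, 1375))   # 5.5 Gbps GDDR5-class: ~10.9 ns
```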



Dr.Henry_Killinger said:
r3tr0gam3r1337 said:
*snip*

6/10, post almost seemed convincing.

That bolded statement is incorrect however. The XB1 is not the new PS3, just like the PS4 is not the new 360.

That's just wish fulfillment.

*snip*


I didn't say the XB1 was the new PS3. What I was trying to put across was that it currently sits in a similar position to the PS3 back in 2007: on paper the PS3 had the better specs, but the reality was that the 360 performed better, was easier for game devs to work on, and had the bigger user base. Now do you see what I was trying to say?




r3tr0gam3r1337 said:

*snip*

5/10; unclear

The only similarities between the PS3 and XB1 are the emphasis on media features, the higher price than the competitor, and a similar sales position (both are trailing their competitor, but while the PS3 came in a year late, the XB1 is facing problems with demand).

The PS4 has better specs and is easier to develop for than the XB1. eSRAM adds to the complexity, but without it the gap would just be larger. Also, the 360 had a larger user base than the PS3 when the PS3 came to market because the 360 had a year's head start.




r3tr0gam3r1337 said:
*snip*

I didn't say the XB1 was the new PS3. What I was trying to put across was that it currently sits in a similar position to the PS3 back in 2007: on paper the PS3 had the better specs, but the reality was that the 360 performed better, was easier for game devs to work on, and had the bigger user base. Now do you see what I was trying to say?

Dude, the PS4 just has better specs and performs better than the Xbox One, so your theory just... doesn't make sense.