
Wii U vs PS4 vs Xbox One FULL SPECS (January 24, 2014)

Not sure if someone mentioned it, but:
Afaik 4Gb (512MB) GDDR5 chips will be released to the mass market this or next quarter. See http://www.skhynix.com/products/support/databook.jsp




What "ed"RAM?"es"RAM,esRAM are expensive than edRAM

And who told you 720 not gonna use high-end GPU?



Sirius87 said:
Not sure if someone mentioned it, but:
Afaik 4Gb (512MB) GDDR5 chips will be released to the mass market this or next quarter. See http://www.skhynix.com/products/support/databook.jsp


Ah, that changes things a bit. Still, I doubt Sony would go for more than 4GB even with 4Gb chips available. I imagine the new chips will be very expensive in the short term; costs should come down as production ramps up, so I guess they could take a short-term loss with the expectation that costs will drop over the next couple of years.



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk and The Witcher 3!

zarx said:
Sirius87 said:
Not sure if someone mentioned it, but:
Afaik 4Gb (512MB) GDDR5 chips will be released to the mass market this or next quarter. See http://www.skhynix.com/products/support/databook.jsp


Ah, that changes things a bit. Still, I doubt Sony would go for more than 4GB even with 4Gb chips available. I imagine the new chips will be very expensive in the short term; costs should come down as production ramps up, so I guess they could take a short-term loss with the expectation that costs will drop over the next couple of years.

Well, I think 512MB chips are needed to reach the 4GB target. 16 chips would be insane.



Sirius87 said:
zarx said:
Sirius87 said:
Not sure if someone mentioned it, but:
Afaik 4Gb (512MB) GDDR5 chips will be released to the mass market this or next quarter. See http://www.skhynix.com/products/support/databook.jsp


Ah, that changes things a bit. Still, I doubt Sony would go for more than 4GB even with 4Gb chips available. I imagine the new chips will be very expensive in the short term; costs should come down as production ramps up, so I guess they could take a short-term loss with the expectation that costs will drop over the next couple of years.

Well, I think 512MB chips are needed to reach the 4GB target. 16 chips would be insane.

Yeah, I agree that 16 chips would be pushing the limits; 8 is much more reasonable.
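(For anyone following along, the chip math in this exchange works out as below; a quick back-of-the-envelope sketch in Python, using only the densities from the posts above, with 4Gb = 512MB per chip.)

# Chip counts for a 4GB GDDR5 pool, per the discussion above.
# 4Gb (gigabits) per chip = 512MB (megabytes) per chip.

def chips_needed(target_gb, chip_density_gbit):
    # gigabits -> megabytes per chip, then divide the target by it
    chip_mb = chip_density_gbit * 1024 // 8
    return target_gb * 1024 // chip_mb

print(chips_needed(4, 4))  # 4Gb (512MB) chips: 8 chips
print(chips_needed(4, 2))  # 2Gb (256MB) chips: 16 chips, the "insane" option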



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk and The Witcher 3!

BlueFalcon said:
sergiodaly said:
question @ALL
if the GPUs on PS4 and nextbox are from AMD... what makes people believe that Xbox will have eDRAM or eSRAM and PS4 will not? If Wii U uses it, it's obvious AMD thinks it's a good idea for the GPU, and I believe AMD will use it in Sony's machine too. Right?

In the Wii U, "there are four 4Gb (512MB) Hynix DDR3-1600 devices surrounding the Wii U's MCM (Multi Chip Module). Memory is shared between the CPU and GPU, and if I'm decoding the DRAM part numbers correctly it looks like these are 16-bit devices giving the Wii U a total of 12.8GB/s of peak memory bandwidth." (http://www.anandtech.com/show/6465/nintendo-wii-u-teardown)

Because MS halved the memory bandwidth of the R500 GPU in the Xbox 360, and the Wii U GPU's memory bandwidth is likewise crippled by shared DDR3-1600 instead of dedicated GDDR5, both of those designs try to mask the memory-bandwidth penalty by including eDRAM. If you want to retain the full power of the GPU, you go with GDDR5 for the GPU and drop the eDRAM. If you want to save costs, you go with cheaper DDR memory plus eDRAM. If the PS4 uses dedicated GDDR5 for the GPU, that's actually the optimal approach (which is why no AMD/NV GPUs have eDRAM on the PCB). Despite sounding fancy, eDRAM/eSRAM is really a cost-saving measure to minimize the performance penalty of forgoing a wider memory bus plus GDDR5. The lack of eDRAM in the PS4, combined with the inclusion of GDDR5, would actually be a good thing, not a disadvantage: it would imply the graphics subsystem is not compromised. If Sony halves the 256-bit bus of the HD7970M, then sure, eDRAM is possible.

Ummm, thanks for the info... I now see the reason. My confusion came from an article I read a few years back about the eDRAM in the Xbox 360 and its "ability" to do AA together with the processor it is embedded on.

From what I can tell, the step from eDRAM to eSRAM is a step from dynamic to static memory... it's cheaper but has a larger die size. I don't really know if it's slower. The increase in size (from 10 to 32MB) might help a lot now that games are going for 1080p resolution. We'll have to wait and see.
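(To put numbers on this post and the AnandTech quote above: a small Python sketch of the peak-bandwidth arithmetic, the 1080p buffer sizes, and the SRAM-vs-DRAM die-size point. The 5.5 Gbps GDDR5 rate is an illustrative assumption, not a confirmed spec.)

# GB/s = effective transfer rate (MT/s) * bus width (bits) / 8 / 1000
def peak_bw_gbs(mts, bus_bits, chips=1):
    return mts * bus_bits / 8 / 1000 * chips

# Wii U: four 16-bit DDR3-1600 devices shared between CPU and GPU
print(peak_bw_gbs(1600, 16, chips=4))  # 12.8 GB/s, matching the teardown

# For contrast, 256-bit GDDR5 at an assumed 5.5 Gbps effective rate,
# the kind of dedicated bandwidth that eDRAM/eSRAM tries to stand in for:
print(peak_bw_gbs(5500, 256))          # 176.0 GB/s

# Why 10MB vs 32MB matters at 1080p: one 1920x1080 buffer at 4 bytes per
# pixel is ~7.9MB, so colour + depth alone (~15.8MB) already overflows the
# 360's 10MB eDRAM without tiling, while 32MB still has headroom.
buf_mb = 1920 * 1080 * 4 / 1024**2
print(round(buf_mb, 1), round(2 * buf_mb, 1))

# The die-size trade-off: SRAM needs roughly 6 transistors per bit versus
# 1 transistor + 1 capacitor for eDRAM, so 32MB of eSRAM alone is on the
# order of 1.6 billion transistors.
print(32 * 2**20 * 8 * 6)  # 1610612736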



Proudest Platinums - BF: Bad Company, Killzone 2 , Battlefield 3 and GTA4

superchunk said:
sergiodaly said:
@superchunk
Where is the info on those 4 move engines the nextbox uses to move data around? I'd like to know more about that.

question @ALL
if the GPUs on PS4 and nextbox are from AMD... what makes people believe that Xbox will have eDRAM or eSRAM and PS4 will not? If Wii U uses it, it's obvious AMD thinks it's a good idea for the GPU, and I believe AMD will use it in Sony's machine too. Right?

I picked the move engines info up from GAF. Basically, all I could get out of it is that they have to do with assisting data movement between the various components. This is supposedly what will make up for the transfer-rate difference between the DDR3 RAM and the eSRAM. IDK, it's really an unknown at this point and could very well be for something completely different. But looking at the diagram VGLeaks supplied, it does make sense.

Nintendo and MS both must have decided that it was more important to save money on the main RAM and supplement its lack of speed with embedded RAM. Sony seems to have wanted to fix its developer complications and stick to something very similar to PCs, so it went for the costlier GDDR5. Since that is already a very fast memory, embedded RAM isn't needed.

so those "move engines" must be the "secret sauce" they mention before. i really don't know what those engines could be, until we know more, its a wild guess... i will make some just for fun... could be Physics co processors to ease the CPU work, could also be something like the hydra engine by lucid to do scaled multi GPU rendering using the GPU in the APU with the dedicated GPU or the  crossfire protocol if it works here... some more ideas?



Proudest Platinums - BF: Bad Company, Killzone 2 , Battlefield 3 and GTA4

BlueFalcon said:

The specs say 8 ROPs and 18 TMUs. That makes no sense unless they forgot to multiply those by 4. Also, the idea of splitting the GPU's compute units into a 14+4 setup makes no sense whatsoever, because all compute units inside GCN are equal.

Each ROP and TMU back-end has 4 units.

4x texture filter units per CU... so 18 CUs... 18 TMU back-ends... 72 TMUs... that's how GCN works.

And sorry for my mistakes about the RAM, but I'm pretty sure Orbis uses 8 chips of 512MB GDDR5.
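(The "multiply by 4" point works out as below. The per-CU and per-back-end factors are how GCN is organized; the 18 and 8 are the leaked figures under discussion.)

# In GCN the leaked 18/8 figures read as unit-group counts, not totals.
TMUS_PER_CU = 4        # texture filter units per compute unit in GCN
ROPS_PER_BACKEND = 4   # ROPs per render back-end in GCN

compute_units = 18     # leaked Orbis figure
render_backends = 8    # leaked figure, read as back-ends rather than ROPs

print(compute_units * TMUS_PER_CU)         # 72 TMUs
print(render_backends * ROPS_PER_BACKEND)  # 32 ROPs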



D-Joe said:

What "ed"RAM?"es"RAM,esRAM are expensive than edRAM

And who told you 720 not gonna use high-end GPU?

A NeoGAF insider mentioned this yesterday:

"If these numbers are right, the Durango GPU is actually better when it comes to pure graphics"

http://www.neogaf.com/forum/showpost.php?p=47002439&postcount=936



Nsanity said:
D-Joe said:

What "ed"RAM?"es"RAM,esRAM are expensive than edRAM

And who told you 720 not gonna use high-end GPU?

A NeoGAF insider mentioned this yesterday:

"If these numbers are right, the Durango GPU is actually better when it comes to pure graphics"

http://www.neogaf.com/forum/showpost.php?p=47002439&postcount=936

the term "better", is suposed to be folowed by the term "than"...

and then by "because"....



Proudest Platinums - BF: Bad Company, Killzone 2 , Battlefield 3 and GTA4