
Two flaws:
1. The eSram takes up a lot of die area, which makes the One's APU quite a bit larger than the PS4's even though the PS4 has the better GPU. The One's APU will always be bigger and more expensive to manufacture because of the eSram.
2. Cramming 5 gigs of data, say 2 gigs of which is actively being drawn from at any given time, into 32 megs of fast ram is a headache-inducing problem. This will cause a lot of issues and will actually lower performance, since the slower DDR3 will end up being used more often than not.

Holodust pretty much covered everything I typed. Still, there's no magic API Microsoft can create for memory allocation. The best they've come up with is a stack scheme (data is allocated via a memory stack by the O.S.); there's very little you can automate when deciding what should or shouldn't go in the eSram. You can document tips for getting the most out of it, but there's no easy way around it. Most devs probably already know what to do, though, since the PS2 only had eDram for a framebuffer, and the Gamecube, Wii and 360 used embedded memory too. But the eSram is tiny compared to the system ram it's paired with, so the situation is actually worse this time: 32 MB system ram to 4 MB eDram on PS2, 40 MB to 3 MB on GC, 512 MB to 10 MB eDram on 360, versus 5 GB to 32 MB eSram on Xbox One.