
Xbox One Dev: GDDR5 Is Uncomfortable [...] ESRAM Provides High Bandwidth At Low Power

z101 said:
Ashadian said:
Lol! Lol! I wonder why PC GPUs use GDDR5 RAM? Maybe MS should send Nvidia and AMD an email telling them to use eSRAM instead. Lmfao! An MS shill talks up his own product while dismissing the competition without a shred of evidence to back it up!

But eDRAM is best when the programmer knows it is there and can fully utilize it. Using eDRAM in PCs is difficult: even when a developer wants to use it, they must write code for PCs both with and without eDRAM. But eDRAM is perfect for consoles with their standardized hardware.

Modern graphics cards use eDRAM too: http://www.theinquirer.net/inquirer/news/2265428/intel-haswell-chips-will-have-onpackage-dram-for-cpu-and-gpu

Even Sony's lead system architect for the PS4 admits that eDRAM gives a performance boost: http://techon.nikkeibp.co.jp/english/NEWS_EN/20130401/274313/?P=2
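To illustrate z101's point about dual code paths: a PC engine can't assume a fast embedded pool exists, so its placement logic needs a fallback. A minimal hypothetical sketch; the pool names, sizes, and placement policy are all made up for illustration, not any real driver API:

```python
from typing import Optional

# Hypothetical sketch of the dual code path z101 describes; this is not
# a real API, just the shape of the branching a PC engine must carry.

EDRAM_BYTES = 32 * 1024 * 1024  # assume a 32 MB embedded pool where present

def place_render_target(rt_bytes: int, edram_free: Optional[int]) -> str:
    """Pick a memory pool for a render target.

    edram_free is None on hardware without embedded memory, so a PC
    engine must support both paths; a console engine can hard-code the
    fast one because the hardware is standardized.
    """
    if edram_free is not None and rt_bytes <= edram_free:
        return "edram"       # fast path: the target fits in the embedded pool
    return "system_ram"      # fallback path: spill to ordinary DRAM

# Same call, different hardware:
print(place_render_target(8 * 1024 * 1024, EDRAM_BYTES))  # -> edram
print(place_render_target(8 * 1024 * 1024, None))         # -> system_ram
```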

Whilst what you say is true (about the programmer knowing what's available), it's almost irrelevant to the majority of developers in this day and age, when most games are multiplatform. Most developers don't have enough time to spend optimising their software for only one platform, and the result will be most 3rd-party devs doing the bare minimum to get the game to run smoothly.

Look at this gen: most developers used the 360's eDRAM for a small subset of tasks. It was worse for the PS3, with its Cell processor, split RAM, and a GPU with fixed pixel and vertex shaders. It was such a nightmare for most devs that a number of 3rd-party titles look and perform worse than their 360 equivalents. Early games didn't even make use of the SPEs in the Cell processor.

It's only the exclusive devs that really have the time to optimise their code thoroughly, which only really comes with working on a single platform.



DJEVOLVE said:
Ashadian said:
z101 said:
Ashadian said:
Lol! Lol! I wonder why PC GPUs use GDDR5 RAM? Maybe MS should send Nvidia and AMD an email telling them to use eSRAM instead. Lmfao! An MS shill talks up his own product while dismissing the competition without a shred of evidence to back it up!

But eDRAM is best when the programmer knows it is there and can fully utilize it. Using eDRAM in PCs is difficult: even when a developer wants to use it, they must write code for PCs both with and without eDRAM. But eDRAM is perfect for consoles with their standardized hardware.

Modern graphics cards use eDRAM too: http://www.theinquirer.net/inquirer/news/2265428/intel-haswell-chips-will-have-onpackage-dram-for-cpu-and-gpu

Even Sony's lead system architect for the PS4 admits that eDRAM gives a performance boost: http://techon.nikkeibp.co.jp/english/NEWS_EN/20130401/274313/?P=2

It isn't perfect for consoles, as there's only 32MB of it. It then becomes a puzzle to solve and a bottleneck. GDDR5 RAM is the better overall solution!


So where's your proof? Oh, it's all opinion at this point.

There are two links there. I suggest you read them. Read what Cerny is saying and you'll have your answer!



Ashadian said:
DJEVOLVE said:
Ashadian said:
z101 said:
Ashadian said:
Lol! Lol! I wonder why PC GPUs use GDDR5 RAM? Maybe MS should send Nvidia and AMD an email telling them to use eSRAM instead. Lmfao! An MS shill talks up his own product while dismissing the competition without a shred of evidence to back it up!

But eDRAM is best when the programmer knows it is there and can fully utilize it. Using eDRAM in PCs is difficult: even when a developer wants to use it, they must write code for PCs both with and without eDRAM. But eDRAM is perfect for consoles with their standardized hardware.

Modern graphics cards use eDRAM too: http://www.theinquirer.net/inquirer/news/2265428/intel-haswell-chips-will-have-onpackage-dram-for-cpu-and-gpu

Even Sony's lead system architect for the PS4 admits that eDRAM gives a performance boost: http://techon.nikkeibp.co.jp/english/NEWS_EN/20130401/274313/?P=2

It isn't perfect for consoles, as there's only 32MB of it. It then becomes a puzzle to solve and a bottleneck. GDDR5 RAM is the better overall solution!


So where's your proof? Oh, it's all opinion at this point.

There are two links there. I suggest you read them. Read what Cerny is saying and you'll have your answer!

Come on, the proof is that AMD and Nvidia exclusively run GDDR5 setups; a software company has not just discovered that these two hardware giants have been doing it wrong for the last 10 years. They said in the EG interview that they chose DDR3 because they needed to balance the needs of three operating systems. Their choice had nothing to do with raw game performance; it was about sticking with what they know, as they're not really a hardware company. I'm sure the guys there are very intelligent (it's not just anyone who can architect a computer), but they should have got more outside help from AMD.

Choosing DDR3 made it pointless to have more than 16 ROPs, which in turn made it pointless to have more than 12 CUs; without GDDR5, they chose the best graphics chip they could.
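For what it's worth, the ROP claim can be sanity-checked with back-of-envelope numbers. The clock, ROP count, and bus figures below are the commonly reported Xbox One specs; the RGBA8 render target and the blending workload are my assumptions:

```python
# Back-of-envelope: can the DDR3 bus feed 16 ROPs at full rate?
# Figures are the commonly reported Xbox One specs; the 4-bytes-per-pixel
# RGBA8 render target and the blending workload are assumptions.

rops = 16
gpu_clock_hz = 853e6           # Xbox One GPU clock
bytes_per_pixel = 4            # RGBA8 render target (assumption)

write_bw = rops * gpu_clock_hz * bytes_per_pixel  # pixel writes only
blend_bw = write_bw * 2                           # blending reads and writes

ddr3_bw = 68.3e9               # 256-bit DDR3-2133 peak
esram_bw = 102.4e9             # quoted eSRAM peak in one direction

print(f"ROP writes alone:   {write_bw / 1e9:.1f} GB/s")  # ~54.6 GB/s
print(f"ROPs with blending: {blend_bw / 1e9:.1f} GB/s")  # ~109.2 GB/s
print(f"DDR3 peak:          {ddr3_bw / 1e9:.1f} GB/s")
print(f"eSRAM peak:         {esram_bw / 1e9:.1f} GB/s")
# Blending at full ROP rate already outruns the DDR3 bus, which is why
# render targets are expected to live in the eSRAM on Xbox One, and why
# the PS4 pairs its 32 ROPs with 176 GB/s of GDDR5.
```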



Why are people here even downplaying the benefits of GDDR5? Newsflash, people: GDDR5 takes a dump on DDR3 with eSRAM, and I'm being serious here. What are devs going to do with only 32MB of fast cache when the rest of the memory is slow? There will be a lot of cases where the GPU needs to access main system memory; in fact, that's what should happen most of the time, since the eSRAM is likely used for locality of reference, holding frequently accessed data, but the GPU will still need to read from main memory whenever the eSRAM doesn't have that piece of data. 32MB is tiny, and it will only give a minuscule bump in bandwidth. BTW, the PR team has run out of everything else to talk about, so they're probably cornered into talking up the advantages of the eSRAM, when the engineers will realize that such a small amount of memory will mean jack for performance.
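The "small fast pool in front of a slow one" argument above boils down to a weighted average. A crude model of it follows; the hit rates are free parameters, not measurements, and the linear blend ignores that the two buses can in principle be used in parallel:

```python
# Crude blended-bandwidth model: a fraction of GPU traffic is served by
# the 32 MB eSRAM, and the rest falls through to DDR3. Hit rates are
# free parameters; this ignores overlap between the two buses.

ESRAM_BW = 102.4  # GB/s, quoted eSRAM peak in one direction
DDR3_BW = 68.3    # GB/s, 256-bit DDR3-2133 peak

def effective_bandwidth(hit_rate: float) -> float:
    """Traffic-weighted average of the fast and slow pools."""
    return hit_rate * ESRAM_BW + (1.0 - hit_rate) * DDR3_BW

for hit in (0.0, 0.25, 0.50, 0.75):
    print(f"eSRAM hit rate {hit:.0%}: ~{effective_bandwidth(hit):.0f} GB/s")
# Even at a 75% hit rate this lands around ~94 GB/s, short of the PS4's
# 176 GB/s GDDR5, which is the poster's point: the fast pool only helps
# the part of the working set that actually fits in 32 MB.
```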



Ashadian said:
z101 said:
Ashadian said:
Lol! Lol! I wonder why PC GPUs use GDDR5 RAM? Maybe MS should send Nvidia and AMD an email telling them to use eSRAM instead. Lmfao! An MS shill talks up his own product while dismissing the competition without a shred of evidence to back it up!

But eDRAM is best when the programmer knows it is there and can fully utilize it. Using eDRAM in PCs is difficult: even when a developer wants to use it, they must write code for PCs both with and without eDRAM. But eDRAM is perfect for consoles with their standardized hardware.

Modern graphics cards use eDRAM too: http://www.theinquirer.net/inquirer/news/2265428/intel-haswell-chips-will-have-onpackage-dram-for-cpu-and-gpu

Even Sony's lead system architect for the PS4 admits that eDRAM gives a performance boost: http://techon.nikkeibp.co.jp/english/NEWS_EN/20130401/274313/?P=2

It isn't perfect for consoles, as there's only 32MB of it. It then becomes a puzzle to solve and a bottleneck. GDDR5 RAM is the better overall solution!

 

The PS2 had embedded RAM; the 360 has embedded RAM; so do the GameCube, Wii, Wii U, and even the PSP, according to Wikipedia. Seems the puzzle was solved by devs long ago.



Xenostar said:

Come on, the proof is that AMD and Nvidia exclusively run GDDR5 setups; a software company has not just discovered that these two hardware giants have been doing it wrong for the last 10 years. They said in the EG interview that they chose DDR3 because they needed to balance the needs of three operating systems. Their choice had nothing to do with raw game performance; it was about sticking with what they know, as they're not really a hardware company. I'm sure the guys there are very intelligent (it's not just anyone who can architect a computer), but they should have got more outside help from AMD.

Choosing DDR3 made it pointless to have more than 16 ROPs, which in turn made it pointless to have more than 12 CUs; without GDDR5, they chose the best graphics chip they could.


It actually all comes down to cost.
nVidia and AMD *can* have dedicated caches equivalent to eSRAM on their GPUs, in combination with GDDR5 or DDR3, for an even larger performance boost. However, when you have transistor counts already in the billions, it really doesn't make sense to either blow the die size up further (i.e. more expensive to make, lower yields) or sacrifice compute performance.

AMD could have done it with integrated graphics but never did; instead they provided side-port memory, as it was cheaper. (IGPs are cost sensitive!)
The IGP could even access the side-port and system memory in tandem.

Intel, however, does provide eDRAM with its Iris IGPs, but that's a premium solution and not available on all of its graphics parts.

As for your opinion on DDR3 making more than 16 ROPs and 12 CUs pointless: yes and no.
Microsoft really couldn't throw more GPU hardware at the problem even if it wanted to; the APU is already giant and expensive to manufacture, and yields would indeed be low.
The eSRAM makes up for a lot of the deficiencies in bandwidth anyway, not to mention the Xbox One has lower bandwidth requirements to achieve peak utilisation in the first place, thanks to its smaller graphics processor.

Plus, you can do non-bandwidth-sensitive tasks on the extra CUs anyway.

Personally though, both machines are a bit of a let-down in the hardware stakes. Compared to last generation, the GPUs are only mid-range, and already we are seeing concessions in games on both platforms with framerates and/or resolutions. That said, it's still early days, but I wouldn't be surprised if, later in the generation, a chunk of the PS4's games were only 720p. Which is potentially sad, but that's the nature of a console's fixed hardware.

The other issue is: I won't accept 720p on a 5-inch phone in 2013, so on a gaming system with a 7+ year lifespan, with the image stretched over 60+ inches of real estate? What were they thinking!?



--::{PC Gaming Master Race}::--

Pemalite said:
Xenostar said:

Come on, the proof is that AMD and Nvidia exclusively run GDDR5 setups; a software company has not just discovered that these two hardware giants have been doing it wrong for the last 10 years. They said in the EG interview that they chose DDR3 because they needed to balance the needs of three operating systems. Their choice had nothing to do with raw game performance; it was about sticking with what they know, as they're not really a hardware company. I'm sure the guys there are very intelligent (it's not just anyone who can architect a computer), but they should have got more outside help from AMD.

Choosing DDR3 made it pointless to have more than 16 ROPs, which in turn made it pointless to have more than 12 CUs; without GDDR5, they chose the best graphics chip they could.


It actually all comes down to cost.
nVidia and AMD *can* have dedicated caches equivalent to eSRAM on their GPUs, in combination with GDDR5 or DDR3, for an even larger performance boost. However, when you have transistor counts already in the billions, it really doesn't make sense to either blow the die size up further (i.e. more expensive to make, lower yields) or sacrifice compute performance.

AMD could have done it with integrated graphics but never did; instead they provided side-port memory, as it was cheaper. (IGPs are cost sensitive!)
The IGP could even access the side-port and system memory in tandem.

Intel, however, does provide eDRAM with its Iris IGPs, but that's a premium solution and not available on all of its graphics parts.

As for your opinion on DDR3 making more than 16 ROPs and 12 CUs pointless: yes and no.
Microsoft really couldn't throw more GPU hardware at the problem even if it wanted to; the APU is already giant and expensive to manufacture, and yields would indeed be low.
The eSRAM makes up for a lot of the deficiencies in bandwidth anyway, not to mention the Xbox One has lower bandwidth requirements to achieve peak utilisation in the first place, thanks to its smaller graphics processor.

Plus, you can do non-bandwidth-sensitive tasks on the extra CUs anyway.

Personally though, both machines are a bit of a let-down in the hardware stakes. Compared to last generation, the GPUs are only mid-range, and already we are seeing concessions in games on both platforms with framerates and/or resolutions. That said, it's still early days, but I wouldn't be surprised if, later in the generation, a chunk of the PS4's games were only 720p. Which is potentially sad, but that's the nature of a console's fixed hardware.

The other issue is: I won't accept 720p on a 5-inch phone in 2013, so on a gaming system with a 7+ year lifespan, with the image stretched over 60+ inches of real estate? What were they thinking!?


Interesting. And yeah, I know you can do more on the extra CUs; Mark Cerny talked about that, how the PS4 has too many CUs to be considered balanced, but they want devs to use the extra ones for GPGPU further down the road. MS also talked about GPGPU, but in the next breath said they only have enough CUs to be balanced for graphics, so maybe you're right and they didn't want the APU any bigger.



To boil it down more succinctly, "it was much cheaper and provided a similar, though slower solution."



Pemalite said:

The other issue is: I won't accept 720p on a 5-inch phone in 2013, so on a gaming system with a 7+ year lifespan, with the image stretched over 60+ inches of real estate? What were they thinking!?

Easy:

At Sony: "Don't make a 600 USD/EUR console again!"

At Microsoft: "We have spent X on the new Kinect. Now we have Y USD/EUR left for the console"



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16GB RAM at 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

walsufnir said:
Ashadian said:
z101 said:
Ashadian said:
Lol! Lol! I wonder why PC GPUs use GDDR5 RAM? Maybe MS should send Nvidia and AMD an email telling them to use eSRAM instead. Lmfao! An MS shill talks up his own product while dismissing the competition without a shred of evidence to back it up!

But eDRAM is best when the programmer knows it is there and can fully utilize it. Using eDRAM in PCs is difficult: even when a developer wants to use it, they must write code for PCs both with and without eDRAM. But eDRAM is perfect for consoles with their standardized hardware.

Modern graphics cards use eDRAM too: http://www.theinquirer.net/inquirer/news/2265428/intel-haswell-chips-will-have-onpackage-dram-for-cpu-and-gpu

Even Sony's lead system architect for the PS4 admits that eDRAM gives a performance boost: http://techon.nikkeibp.co.jp/english/NEWS_EN/20130401/274313/?P=2

It isn't perfect for consoles, as there's only 32MB of it. It then becomes a puzzle to solve and a bottleneck. GDDR5 RAM is the better overall solution!

 

The PS2 had embedded RAM; the 360 has embedded RAM; so do the GameCube, Wii, Wii U, and even the PSP, according to Wikipedia. Seems the puzzle was solved by devs long ago.


Why would you want or need a tiny amount of embedded RAM when you can use one unified pool of GDDR5 RAM? This is exactly what Mark Cerny is saying. With the die area MS spent on the 32MB of eSRAM, Sony instead got a far more powerful GPU, and a much easier platform to develop for.
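As a rough illustration of the "32MB puzzle" both sides are arguing about, here's the arithmetic on whether a typical deferred-rendering G-buffer fits in the eSRAM at 1080p. The target count and formats below are a common layout I'm assuming, not any specific engine's:

```python
# Does a typical deferred G-buffer fit in 32 MB of eSRAM at 1080p?
# The render-target layout below is an assumed common setup, not taken
# from any shipped engine.

width, height = 1920, 1080
pixels = width * height

gbuffer_bytes_per_pixel = {
    "albedo (RGBA8)":      4,
    "normals (RGBA8)":     4,
    "material (RGBA8)":    4,
    "HDR lighting (FP16)": 8,
    "depth/stencil":       4,
}

total_mb = sum(gbuffer_bytes_per_pixel.values()) * pixels / (1024 ** 2)
print(f"G-buffer at 1080p: ~{total_mb:.1f} MB vs 32 MB of eSRAM")  # ~47.5 MB
# It doesn't fit, so developers must juggle which targets live in eSRAM
# or drop resolution (the same layout at 1600x900 is ~33 MB), whereas a
# unified 176 GB/s GDDR5 pool makes the placement question disappear.
```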