
DigitalFoundry: X1 memory performance improved for production console / eSRAM @ 192 GB/s

I will admit to having no knowledge of this whatsoever, but people at GAF and Beyond3D seem to believe this supports the downclock rumour.

quote:
Well,
800 MHz x 128 bytes = 102 GB/s
According to the article, the flux capacitor found by MS allows read and write at the same time (Intel, hire these guys), so the theoretical max bandwidth should be 102 x 2 = 204.
But the news says it is 192, which takes us to 96 GB/s read and 96 GB/s write, and so to 750 MHz x 128 bytes.

Downgrade or what?

/quote
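For anyone who wants to sanity-check that arithmetic, here's a quick sketch in Python. It assumes the eSRAM interface is 128 bytes (1024 bits) wide, which is what the quoted math implies; these are the poster's figures and the widely reported bus width, not confirmed specs.

code:
# Back-of-the-envelope check of the quoted math (assumed 1024-bit / 128-byte eSRAM bus).
BUS_BYTES = 128  # assumed interface width

def peak_bandwidth_gbps(clock_mhz, directions=1):
    """Peak bandwidth in GB/s for a given clock and number of simultaneous directions."""
    return clock_mhz * 1e6 * BUS_BYTES * directions / 1e9

def implied_clock_mhz(bandwidth_gbps, directions=1):
    """Clock (MHz) implied by a quoted peak bandwidth figure."""
    return bandwidth_gbps * 1e9 / (BUS_BYTES * directions) / 1e6

print(peak_bandwidth_gbps(800))       # ~102.4 GB/s, the original figure
print(peak_bandwidth_gbps(800, 2))    # ~204.8 GB/s if read + write really are simultaneous
print(implied_clock_mhz(192, 2))      # 750.0 MHz -- the basis of the downclock argument
/code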



kowenicki said:

No. What you have ALWAYS said was that it was downclocked.

lol no... show me that.

I said, and I will repeat to you... MS is having issues with production yields.

I always said that it's hard to believe in the downclock.

TELL THE TRUTH, Kowen! Why the panic?

PS: To run at 192 GB/s they changed the clock of the eSRAM, at least.



masterb8tr said:
I will admit to having no knowledge of this whatsoever, but people at GAF and Beyond3D seem to believe this supports the downclock rumour.

quote:
Well,
800 MHz x 128 bytes = 102 GB/s
According to the article, the flux capacitor found by MS allows read and write at the same time (Intel, hire these guys), so the theoretical max bandwidth should be 102 x 2 = 204.
But the news says it is 192, which takes us to 96 GB/s read and 96 GB/s write, and so to 750 MHz x 128 bytes.

Downgrade or what?

/quote

That's the eSRAM clock... not the GPU or CPU... but yeah, for 192 GB/s the eSRAM is running at 750 MHz (simultaneous read/write) or 1500 MHz... for 102 GB/s, 400 MHz (simultaneous read/write) or 800 MHz.

We don't know if the eSRAM clock is synced with the GPU or CPU clock.
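A tiny sketch of those four combinations, again assuming a 128-byte (1024-bit) eSRAM interface (that width is an assumption, not something the article states):

code:
BUS_BYTES = 128  # assumed 1024-bit eSRAM interface

def peak_gbps(clock_mhz, simultaneous_rw):
    # Two directions if read and write really can happen on the same cycle.
    directions = 2 if simultaneous_rw else 1
    return clock_mhz * 1e6 * BUS_BYTES * directions / 1e9

for clock, simul in [(750, True), (1500, False), (400, True), (800, False)]:
    print(f"{clock} MHz, simultaneous={simul}: {peak_gbps(clock, simul):.1f} GB/s")
# 750 MHz dual or 1500 MHz single -> 192.0 GB/s; 400 MHz dual or 800 MHz single -> 102.4 GB/s
/code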



They are still tinkering with their specs this late... they must be panicking.



theprof00 said:
disolitude said:
theprof00 said:
wow looks like disolitude is wrong again :D
Guess they did go for higher bandwidth in the end, despite dis saying they didn't need it.


Actually no, it proves me right. It shows that the X1 didn't need to use GDDR5 RAM. Perfectly capable with DDR3 + eSRAM.

I thought your point in that thread was that their setup was perfectly in sync with their output, and they didn't need the high speed of GDDR5. If I'm wrong about that, my bad. That's how I interpreted it; although I do see that a large part of it was about GDDR5, I thought it was more about the speed.

All I was saying is that the Xbox 1 GPU does not need a constant 150 GB/s throughput when it comes to bandwidth.

GDDR5 RAM comes with extra power usage, extra heat, extra cost... and that Microsoft was able to maximize the performance of the console without implementing GDDR5 RAM.

Microsoft designed and built the Xbox 360 in 6 months in early 2005. If they thought GDDR5 RAM was needed after the PS4 reveal, they would have made the change. Looks like they did make a change to compensate, judging by this article...
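For context on that bandwidth argument, here's a rough comparison sketch. The DDR3 and GDDR5 figures are the commonly reported peak numbers for the two consoles (my assumptions, not from the article), and real-world use of the combined DDR3 + eSRAM pool will be lower than a simple sum suggests:

code:
# Rough peak-bandwidth comparison (assumed figures, not from the article).
ddr3_gbps = 68.3    # Xbox One: DDR3-2133 on a 256-bit bus (commonly reported)
esram_gbps = 192.0  # eSRAM peak from the DigitalFoundry article
gddr5_gbps = 176.0  # PS4: GDDR5 at 5500 MT/s on a 256-bit bus (commonly reported)

print(f"DDR3 alone:        {ddr3_gbps:.1f} GB/s")
print(f"DDR3 + eSRAM peak: {ddr3_gbps + esram_gbps:.1f} GB/s (only for data resident in the 32 MB eSRAM)")
print(f"GDDR5 (PS4):       {gddr5_gbps:.1f} GB/s")
/code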



ethomaz said:
masterb8tr said:
I will admit to having no knowledge of this whatsoever, but people at GAF and Beyond3D seem to believe this supports the downclock rumour.

quote:
Well,
800 MHz x 128 bytes = 102 GB/s
According to the article, the flux capacitor found by MS allows read and write at the same time (Intel, hire these guys), so the theoretical max bandwidth should be 102 x 2 = 204.
But the news says it is 192, which takes us to 96 GB/s read and 96 GB/s write, and so to 750 MHz x 128 bytes.

Downgrade or what?

/quote

That's the eSRAM clock... not the GPU or CPU... but yeah, for 192 GB/s the eSRAM is running at 750 MHz (simultaneous read/write) or 1500 MHz... for 102 GB/s, 400 MHz (simultaneous read/write) or 800 MHz.

We don't know if the eSRAM clock is synced with the GPU or CPU clock.

Yeah, I did not understand any of that :P but thanks anyway :p



disolitude said:
theprof00 said:
disolitude said:
theprof00 said:
wow looks like disolitude is wrong again :D
Guess they did go for higher bandwidth in the end, despite dis saying they didn't need it.


Actually no, it proves me right. It shows that the X1 didn't need to use GDDR5 RAM. Perfectly capable with DDR3 + eSRAM.

I thought your point in that thread was that their setup was perfectly in sync with their output, and they didn't need the high speed of GDDR5. If I'm wrong about that, my bad. That's how I interpreted it; although I do see that a large part of it was about GDDR5, I thought it was more about the speed.

All I was saying is that the Xbox 1 GPU does not need a constant 150 GB/s throughput when it comes to bandwidth.

GDDR5 RAM comes with extra power usage, extra heat, extra cost... and that Microsoft was able to maximize the performance of the console without implementing GDDR5 RAM.

Microsoft designed and built the Xbox 360 in 6 months in early 2005. If they thought GDDR5 RAM was needed after the PS4 reveal, they would have made the change. Looks like they did make a change to compensate, judging by this article...

Really? I was under the impression that GDDR5 ran cooler and used less power.

And yeah, you've just proven they could make a change. However, a change to GDDR5 is much more radical than upping the eSRAM speed :/ changing the RAM would require a completely new motherboard.



It seems like this new generation the Xbox will be slightly more complicated to code for, so this time it might be the Xbox that has the wider visual and performance difference between first-party and third-party games.



disolitude said:

All I was saying is that the Xbox 1 GPU does not need a constant 150 GB/s throughput when it comes to bandwidth.

GDDR5 RAM comes with extra power usage, extra heat, extra cost... and that Microsoft was able to maximize the performance of the console without implementing GDDR5 RAM.

Microsoft designed and built the Xbox 360 in 6 months in early 2005. If they thought GDDR5 RAM was needed after the PS4 reveal, they would have made the change. Looks like they did make a change to compensate, judging by this article...

I agree... they don't need it, but it would be better with it.

Just to correct one thing... GDDR5 requires less power than DDR3 because it uses a lower voltage. The heat and cost points are right, BTW.



theprof00 said:

Really? I was under the impression that GDDR5 ran cooler and used less power.

And yeah, you've just proven they could make a change. However, a change to GDDR5 is much more radical than upping the eSRAM speed :/ changing the RAM would require a completely new motherboard.

GDDR5 uses less power but generates more heat.