
Anybody think 8 GB of GDDR5 is a mistake for PS4?

drkohler said:
fordy said:


Let me guess: you have some odd belief that buffering/caching comes at absolutely NO cost on a cache hit/miss, and that there's no such thing as latency. Once again, GDDR5 is there to handle LARGE chunks of data, not the small blocks of an instruction cache.

Wow.. a "s/he said..s/he said" thread... grrreat.

Look, it is apparent you don't know how memory access works. Yes, GDDR5 is there to handle large chunks of data. Guess what, DDR3 memory is also there to handle large chunks of data! My first post tried to explain in simple terms why the GDDR5 latency problem has become a myth (assuming what I called "ugly" programming). Technically, there are a lot of what-ifs, and things get rather complicated rather fast even though we are "only" dealing with a "get some memory" problem.

What you apparently don't know is the simple fact that every memory controller in every gpu/cpu in the world has a limited burst length. Whether it is a memory controller in a gpu or a memory controller in a cpu, they HAVE to use multiple burst sequences to do whatever they are supposed to do. If you want to learn what this means on the transistor level, you'd have to find controller manuals and figure out the timing diagrams (did I mention that I DESIGNED memory cards and memory controllers decades ago?).

In the case of a GPU, many, many, many bursts may target consecutive memory addresses. In the case of a CPU, many, many, many bursts may target consecutive memory addresses ("ugly" programming), or small bursts target non-aligned addresses ("clean, old-style" programming). The net result is that bandwidth wins over latency, since today's caches are so big that long runs of consecutive bursts happen far more often than single bursts.

This is getting waaaay too technical. So here is the ultimate result: 8 GB of GDDR5 in the PS4 wins hands down against any other PC-like setup. (It will be interesting to study the MS solution in the NextBox in detail, should that ever be revealed, but it is already known they use dedicated hardware "move" engines to connect DDR3 to eDRAM to the GPU to remedy the bandwidth problem.)
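A back-of-the-envelope model makes the "bandwidth wins over latency" claim concrete. This is only a sketch: the peak rate and latency numbers below are illustrative placeholders, not the PS4's actual memory timings.

```python
def effective_bandwidth(burst_bytes: int, latency_ns: float, peak_gb_s: float) -> float:
    """Effective bandwidth in GB/s for one access: a fixed latency cost,
    then the burst streams at the controller's peak rate.
    Note: 1 GB/s == 1 byte/ns, so bytes / (GB/s) gives nanoseconds."""
    transfer_ns = burst_bytes / peak_gb_s
    return burst_bytes / (latency_ns + transfer_ns)

# Illustrative numbers only: ~176 GB/s peak and a made-up 20 ns latency.
small = effective_bandwidth(64, 20.0, 176.0)    # one cache line per access
large = effective_bandwidth(4096, 20.0, 176.0)  # long consecutive burst
```

With one cache line per access, latency dominates and effective throughput collapses to a few GB/s; with a long consecutive burst, the same latency is amortized and throughput approaches the peak rate. That is the "many, many, many consecutive bursts" case described above.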

You're totally missing the point that, while prefetch/burst ATTEMPTS to hide latency, it's severely punished in areas where data access is a lot more random. GPUs do not suffer this problem because the majority of their data is a lot more sequential (such as textures). You're missing 2 points here:

1. DDR3 also uses prefetch, and with its improved latency over GDDR5, I can't see why you brought it up.

2. Higher latencies generally require a larger prefetch in order to try and mask latency. In a random-access environment, this merely increases the waste (for instance, fetching a 256-bit prefetch only to read a 32-bit integer wastes 87.5% of the bandwidth). Of course, instruction data is a LITTLE more sequential, but given the much smaller cache block sizes, that benefit is pretty much negligible compared to the more randomly-accessed data cache.
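The 87.5% figure follows directly from the ratio of useful bits to fetched bits; a two-line sketch of the arithmetic:

```python
def prefetch_waste(useful_bits: int, fetched_bits: int) -> float:
    """Fraction of transferred bits discarded when only part of a
    prefetch burst is actually used."""
    return 1.0 - useful_bits / fetched_bits

waste = prefetch_waste(32, 256)  # a 32-bit integer from a 256-bit prefetch
# waste == 0.875, i.e. 87.5% of the fetched bandwidth is thrown away
```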

Perhaps you're one of those people who thought GDDR2 was exactly the same as DDR2 back in the day, but I'll say this once again: DDR AND GDDR ARE COMPLETELY DIFFERENT MEMORY ARCHITECTURES FOR COMPLETELY DIFFERENT USES. ONE IS NOT BETTER THAN THE OTHER FOR EVERYTHING.

It's a tradeoff between the two, otherwise we'd have been using GDDR as system memory long ago...
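The sequential-vs-random distinction driving this tradeoff can be sketched as a tiny experiment. Caveat: in a low-level language the timing gap comes from cache and prefetch behavior, while in Python interpreter overhead masks most of it, so treat this as the shape of the experiment rather than a measurement of memory hardware.

```python
import array
import random
import time

N = 1_000_000
data = array.array("i", range(N))

seq_idx = list(range(N))   # GPU-style workload: consecutive addresses
rnd_idx = list(range(N))
random.shuffle(rnd_idx)    # CPU-style workload: scattered addresses

def walk(indices):
    """Sum the array in the given visiting order; return (seconds, total)."""
    t0 = time.perf_counter()
    total = 0
    for i in indices:
        total += data[i]
    return time.perf_counter() - t0, total

t_seq, s_seq = walk(seq_idx)
t_rnd, s_rnd = walk(rnd_idx)
```

Both walks touch exactly the same bytes and produce the same sum; only the access *order* differs, which is precisely the variable that prefetch and burst transfers are sensitive to.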




Duplicated.



This picture is an example of everything I said about GPU efficiency in consoles.



PS: I took this picture from the Hardware Overview document included in the PS3 DevKit... I think this document is public too, but I can't find it on Google (I gave up after a quick search).



I don't even know the meaning of the words.



ethomaz said:

This picture is an example of everything I said about GPU efficiency in consoles.



PS: I took this picture from the Hardware Overview document included in the PS3 DevKit... I think this document is public too, but I can't find it on Google (I gave up after a quick search).

The main advantage is the closed architecture, and that you have one fixed hardware target, with a fixed memory amount, fixed graphics capabilities... The rest is peanuts.



Kynes said:
ethomaz said:

This picture is an example of everything I said about GPU efficiency in consoles.



PS: I took this picture from the Hardware Overview document included in the PS3 DevKit... I think this document is public too, but I can't find it on Google (I gave up after a quick search).

The main advantage is the closed architecture, and that you have one fixed hardware target, with a fixed memory amount, fixed graphics capabilities... The rest is peanuts.


The picture is so generic that it applies to almost every console generation; it amounts to saying "tomorrow the sun will rise".



walsufnir said:
Kynes said:
ethomaz said:

This picture is an example of everything I said about GPU efficiency in consoles.



PS: I took this picture from the Hardware Overview document included in the PS3 DevKit... I think this document is public too, but I can't find it on Google (I gave up after a quick search).

The main advantage is the closed architecture, and that you have one fixed hardware target, with a fixed memory amount, fixed graphics capabilities... The rest is peanuts.


The picture is so generic that it applies to almost every console generation; it amounts to saying "tomorrow the sun will rise".

Only fools would fall for it, but it's useful for the system wars.



walsufnir said:


The picture is so generic that it applies to almost every console generation; it amounts to saying "tomorrow the sun will rise".

Yeah, but when I wrote the same thing earlier in this thread I was called a bullshitter... so the guys can now call Sony or Microsoft bullshitters instead of me.

ethomaz said:
walsufnir said:


The picture is so generic that it applies to almost every console generation; it amounts to saying "tomorrow the sun will rise".

Yeah, but when I wrote the same thing earlier in this thread I was called a bullshitter... so the guys can now call Sony or Microsoft bullshitters instead of me.


Oh, I didn't read that. But the slide offers no news. Can you show me your initial post again?



Yes, obviously 9GB of GDDR6 would have been better, 'cos it's more.


