TheBigFatJ said:
I happen to be an HPC admin. Only a minority of the jobs run on my clusters can be done in less than 512MB of memory per job, and even that wouldn't fit within the PS3's system memory (256MB). In fact, in 2004 we purchased our first 192GB memory machine so we could run models that consume hundreds of gigs of memory, and recently I upgraded one of our clusters to 16GB of memory per node. Necessities for performance. What's more, I/O is becoming a bigger and bigger necessity; a single gigabit connection isn't enough anymore. The PS3 could be used on a small scale for very specific tasks, but users would quickly discover they're limited to a tiny problem space, whereas much of the interesting stuff requires significantly more memory. Hell, I just added 250TB to our data center to hold data, and we already had significantly more spinning disk than that.
Not all things require that kind of memory. SETI workloads, protein folding, etc. all have small amounts of data that need to be in memory, but they require a complex, time-consuming procedure to be performed on that data. I realize you guys are trying to point out the memory limitations here, and I completely agree the PS3 is severely limited in the memory department, both for many science applications and for gaming.
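To put some rough numbers on that distinction, here's a quick back-of-envelope sketch (my own illustrative assumptions, not figures from either of our posts): a Folding@home-style molecular dynamics work unit fits comfortably inside the PS3's 256MB, while a large structured-grid model lands in the hundreds-of-gigs range you're describing.

```python
# Back-of-envelope working-set estimates. All sizes below are illustrative
# assumptions, not measurements of any real workload.

def mb(n_bytes):
    return n_bytes / (1024 ** 2)

def gb(n_bytes):
    return n_bytes / (1024 ** 3)

# Folding-style MD work unit: ~10,000 atoms, each carrying position, velocity
# and force vectors of 3 double-precision floats (8 bytes each).
atoms = 10_000
per_atom_bytes = 3 * 3 * 8
print(f"MD work unit working set: ~{mb(atoms * per_atom_bytes):.2f} MB")   # well under 256MB

# Large structured-grid model: 2000^3 cells, 5 double-precision fields per cell.
cells = 2000 ** 3
per_cell_bytes = 5 * 8
print(f"Large grid model working set: ~{gb(cells * per_cell_bytes):.0f} GB")  # hundreds of GB
```

The point of the sketch is just arithmetic intensity: the MD case is compute-bound on a tiny working set, while the grid case can't even be loaded without the kind of memory you're buying.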
But I don't think that affects my point at all. The architecture is still the root cause of many issues, as I said in my first post. In general, I think most people agree that even keeping Blu-ray, Sony could have gone with a very powerful architecture that was more developer friendly, and that change alone would have made a big difference: many games would likely not have been delayed, and the PS3 would be getting into the thick of this "console war" this holiday instead of waiting for the blockbuster games to arrive in '08. A pretty big difference imo.