
If you want to define "% used" as the share of the hardware's available resources actually consumed (memory, time, and fillrate), then I'd speculate that most games on the market right now use between 80% and 100% of the graphics fillrate, the video and main system memory, and the available CPU time. Lair, for example, seems to demand well over 100% of the PS3: it appears to need at least twice as much fillrate as the machine has just to reach a normal framerate.
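
To put that in concrete terms (the numbers below are invented for illustration, not measurements of Lair or the RSX): if a frame demands roughly twice the fill rate the GPU can deliver and the game is fill-bound, the best you can hope for is roughly half the target framerate.

/* Hypothetical fill-bound case; all figures are made up for illustration. */
#include <stdio.h>

int main(void)
{
    double target_fps     = 30.0;  /* "normal framerate" the game aims for      */
    double fill_needed    = 2.0;   /* fill rate the scene demands (relative)    */
    double fill_available = 1.0;   /* fill rate the GPU can deliver (relative)  */

    /* If the game is purely fill-bound, framerate scales with available fill. */
    double achievable_fps = target_fps * (fill_available / fill_needed);
    printf("fill-bound framerate: %.1f fps\n", achievable_fps);  /* ~15 fps */
    return 0;
}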

Suggesting that a game uses "less than 100%" of a system is naive in at least one way: a single bottleneck limits the performance of the whole game. Effectively, as soon as you hit a bottleneck at one point, you've set an upper limit on what you can squeeze out of the system. You could say there is headroom in terms of fillrate, CPU time, or memory (graphics or main), and perhaps you could use those figures to quantify "less than 100%", but don't expect a game to ship looking and running twice as good as Lair, because it likely won't happen.
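
A minimal sketch of the bottleneck point, with made-up per-stage costs: whichever stage of the frame is slowest sets the framerate ceiling on its own, no matter how much headroom the other stages have.

/* Per-stage frame costs in milliseconds; figures are illustrative only. */
#include <stdio.h>

static double max3(double a, double b, double c)
{
    double m = a > b ? a : b;
    return m > c ? m : c;
}

int main(void)
{
    double cpu_ms    = 20.0;  /* game logic and CPU-side work per frame */
    double fill_ms   = 33.0;  /* GPU fill/raster cost per frame         */
    double upload_ms = 10.0;  /* memory/bandwidth-bound work per frame  */

    /* The frame can't finish before its slowest stage does. */
    double frame_ms = max3(cpu_ms, fill_ms, upload_ms);
    printf("frame time %.1f ms -> %.1f fps\n", frame_ms, 1000.0 / frame_ms);
    return 0;
}

Even with the CPU and upload stages well under budget in that sketch, the 33ms fill stage pins the whole frame at roughly 30fps, which is the "upper limit" point above.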

The fact of the matter is that the PS3 has a fairly weak video card by today's standards and only 256MB of main system memory, approximately 80MB of which is consumed by the OS. That's not a lot of resources to work with. The only question mark is the Cell, because game developers see a lot of floating-point potential there and could come up with tasks the Cell is well suited to. But in general, it doesn't provide as much general-purpose power as the 360's CPU cores do.
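
A quick back-of-envelope using the figures above (256MB of main memory, roughly 80MB of it reserved by the OS), just to make the remaining budget explicit:

/* Memory budget arithmetic using the numbers quoted in the post. */
#include <stdio.h>

int main(void)
{
    int main_ram_mb   = 256;  /* main system memory on the PS3          */
    int os_reserve_mb = 80;   /* approximate OS reservation (per above) */

    printf("left for the game: %d MB\n", main_ram_mb - os_reserve_mb);  /* ~176 MB */
    return 0;
}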