Squilliam said:
Groucho said:
Some of those numbers are "conveniently" rounded to make the 360's GPU look better in more circumstances (last example, for instance, should have the RSX at "4.4 billion pixels/sec", but the author "conveniently" rounded it down to the nearest billion).
However, just about every graphics programmer I know prefers the 360's GPU over the PS3's. It's more flexible, easier to use, and, except in specific circumstances, faster. If the GPU were all that mattered, the 360 would definitely be the superior architecture, IMO.
The PS3 does have one (titanic) advantage in the graphics dept., though -- the SPUs are capable of doing a lot of the GPU's work for it, and merely feeding it simpler, pre-processed data via the PS3's awesome bus architecture. The RSX+SPUs are more than capable of out-performing the X360's GPU, but it takes a LOT of talent to do so -- so much so that you're probably only ever going to see the difference in exclusive titles, or very-late-era cross-platform games (~2010 or later).
|
Xbox 360 - 2.0 Billion Vertices/sec (using only 16 of the 48 Unified Pipelines)
Xbox 360 - 16.0 Billion Pixels/sec (using 32 of the 48 Unified Pipelines)
PS3 - 1.0 Billion Vertices/sec
PS3 - 16.0 Billion Pixels/sec
That's probably the more likely scenario, though with tessellation the ratio on the 360 changes to something more like:
Xbox 360 - 1.0 Billion Vertices/sec (using only 8 of the 48 Unified Pipelines)
Xbox 360 - 20.0 Billion Pixels/sec (using 40 of the 48 Unified Pipelines)
Just a rough guess, but you can see the significant edge the 360 GPU has with MSAA. Quincunx on the PS3 is getting better and better, though -- it produces a rough equivalent of 4xMSAA, and developers are mitigating the nasty blurriness with experience.
|
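For reference, here's the back-of-envelope math behind the "4.4 billion pixels/sec" figure in the quote -- a minimal sketch assuming the commonly cited specs (550 MHz and 8 ROPs for the RSX; 500 MHz, 8 ROPs, and a 4-samples-per-clock eDRAM resolve for Xenos), which are assumptions, not confirmed silicon numbers:

#include <cstdio>

int main() {
    // Commonly cited specs -- assumptions, not confirmed numbers.
    const double rsx_fill   = 550e6 * 8;      // RSX: 550 MHz x 8 ROPs
    const double xenos_fill = 500e6 * 8;      // Xenos: 500 MHz x 8 ROPs
    const double xenos_aa   = xenos_fill * 4; // eDRAM resolves 4 AA samples/clock

    std::printf("RSX peak fill:     %.1f Gpixels/s\n",  rsx_fill / 1e9);   // 4.4
    std::printf("Xenos peak fill:   %.1f Gpixels/s\n",  xenos_fill / 1e9); // 4.0
    std::printf("Xenos 4xMSAA fill: %.1f Gsamples/s\n", xenos_aa / 1e9);   // 16.0
    return 0;
}

(Peak ROP fill is an upper bound, of course -- real workloads go bandwidth- or shader-bound long before they hit it.)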
For once, I agree with Squilliam. In his example, which is very typical of 3rd-party cross-platform development, the 360 version just plain ends up with a higher fillrate, because the game engine authors aren't using the SPUs to augment the RSX, and because reducing the amount of fill your app wants is much, much harder than reducing the number of animated bones it wants, etc. This has been the case in the vast majority of cross-platform games to date -- and it's the reason why, typically, the PS3 "port" versions are done by merely cutting back the resolution and upscaling it, causing some mild blurriness, or framerate issues, on the PS3 that aren't there on the 360.
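To illustrate why cutting resolution is the go-to fix: fill cost scales linearly with the pixel count of the render target, so even a modest sub-HD target buys back a lot of fill. A minimal sketch -- the reduced resolution below is invented for the example, not any particular game's actual target:

#include <cstdio>

int main() {
    const double full_w = 1280.0, full_h = 720.0; // native 720p target
    const double port_w = 1152.0, port_h = 640.0; // hypothetical sub-HD target

    // Fill cost is (pixels shaded) x (overdraw) x (cost per pixel); only the
    // first factor changes when you shrink the target, and it scales linearly.
    const double saved = 1.0 - (port_w * port_h) / (full_w * full_h);
    std::printf("fill bought back by rendering sub-HD + upscaling: %.0f%%\n",
                saved * 100.0); // 20%
    return 0;
}

A 20% fill saving for a resolution drop most players barely notice (beyond the upscaler's mild blur) is exactly the cheap-port economics described above.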
Cross-platform 360-to-PS3 ports are done in this manner to save money and time, which are the primary concerns of a publisher, especially when considering a port. PS3-to-360 ports, however, are often written with the SPUs in mind -- the game devs know that if they farm some work off to the SPUs on the PS3, they can rely on the 360's GPU to handle the extra work where the SPUs aren't present. Such games also tend not to be designed in a manner that leaves them so fill-rate bound.
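A minimal sketch of what that design looks like in engine code -- the job struct and function names here are entirely hypothetical, not any real SDK's API; the point is only that the work is expressed as a platform-neutral job, and each platform decides who eats it:

#include <cstdio>

struct SkinningJob { int boneCount; int vertexCount; };

#if defined(PLATFORM_PS3)
// PS3 build: farm the job off to an SPU, keeping the RSX fed with
// pre-processed data.
static void dispatchSkinning(const SkinningJob& job) {
    std::printf("PS3: kicking SPU job (%d bones, %d verts)\n",
                job.boneCount, job.vertexCount);
}
#else
// 360 build: no SPUs, so let the GPU's vertex shaders (or a spare CPU
// hardware thread) absorb the same work.
static void dispatchSkinning(const SkinningJob& job) {
    std::printf("360: skinning on the GPU (%d bones, %d verts)\n",
                job.boneCount, job.vertexCount);
}
#endif

int main() {
    dispatchSkinning({ 60, 12000 });
    return 0;
}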
A smartly designed multi-threaded PS3 engine is a piece of cake to port to the 360... the only caveat is that you have to be careful with the amount of animation and SPU vertex computation you do on the PS3... because the 360 isn't going to be able to keep up if you overdo it (vector math is where the PS3's real muscle shows). Thus, in a PS3-to-360 port, the character animations have less detail than they could have (smaller bone counts, and less frequently sampled animations), but fillrate is never an issue.
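As a rough illustration of that trade-off (every number below is invented for the example): animation cost scales roughly with bone count times sample rate per character, so trimming both claws back a lot of CPU time without touching fill at all:

#include <cstdio>

int main() {
    // Illustrative skeletons only -- the point is the linear scaling.
    const int    ps3_bones  = 80;   // what the SPUs can comfortably chew through
    const double ps3_rate   = 60.0; // animation samples per second
    const int    port_bones = 50;   // trimmed skeleton for the 360 port
    const double port_rate  = 30.0; // halved sample rate

    const double ratio = (ps3_bones * ps3_rate) / (port_bones * port_rate);
    std::printf("animation workload, PS3 vs. 360 port: %.1fx\n", ratio); // 3.2x
    return 0;
}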
Honestly, it's much easier to tell your art team "cut back on the number of animated bones in the skeletons, and reduce the animation sample rate" than it is to tell them "pick some texture stages to lose, or find other fill to cut, so we can run this on a PS3 with a decent framerate", because tools exist to easily do the former... but not the latter.