MikeB on 20 September 2007
@ Hockeymac18
Additionally, you might wonder why I keep harping on about "the PS3 should have been launched in '05"? Really it's because it shows that the first generation games for the PS3 had a lot more time in development than most people realize.
Except that's not really the case. Sony had marketing reasons to hint at a possible premature PS3 launch.
Development kits weren't available to developers until a couple of months before launch. Most developers were just using legacy stuff; the Edge tools will help in this regard.
Having the Cell processor handle the majority of PS3 graphics should have been possible, but the problem is legacy code. Legacy code needs substantial rework to be broken into independent pieces that can be distributed across the multiple SPEs (it may even be easier to redesign the game engine from scratch).
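To make the point concrete, here's a minimal sketch (illustrative only, not real Cell/SPE code) of the kind of restructuring that's needed: a serial loop rewritten so the data is split into independent chunks, each of which could run on its own SPE. Here ordinary worker threads and a chunked sum stand in for SPEs and real game workloads:

```python
# Illustrative sketch only -- not real Cell/SPE code.
# A serial loop is split into independent chunks; each chunk could
# in principle run on a separate SPE (here: plain worker threads).
from concurrent.futures import ThreadPoolExecutor

def sum_chunk(data, start, end):
    # Each "SPE" touches only its own slice of the data.
    return sum(data[start:end])

def parallel_sum(data, workers=4):
    # workers=4 stands in for a handful of SPEs.
    n = len(data)
    step = n // workers
    bounds = [(w * step, n if w == workers - 1 else (w + 1) * step)
              for w in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(lambda b: sum_chunk(data, *b), bounds)
    return sum(partials)

print(parallel_sum([1] * 8000))  # → 8000, same answer as the serial loop
```

The hard part with a real engine is that legacy code rarely decomposes this cleanly: game state is shared and interdependent, which is exactly why the rework is so costly.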
This is a problem with many new technologies: they have to go up against a legacy install base. That goes for much better CPU architectures (going up against x86 legacy software), a better computer or OS (going up against a monopoly of Windows software and closed standards), etc. Say you developed the best OS in history, running circles around Windows. Why should consumers buy your OS when all the 3rd party software they need or want is available for Windows? And why should developers support your OS when there's no install base? It's a chicken-and-egg thing. IMO it made good sense for Sony to add the RSX GPU.
Having a PC-style GPU in the PS3 makes sense, as developers already know how to develop games for the PC. By using just the PPE and RSX, development isn't much different from developing a game for a decent single-core PC or Mac.