Pemalite said:
GaoGaiGarV said:
If they hadn't used a Blu-ray drive, they wouldn't have had the problems with manufacturing blue lasers, which caused them to postpone the launch of the PS3 by almost a year. So they would have released the console in the same timeframe as the Xbox 360, and the latter wouldn't have had the time advantage. Paired with PS2 and PS1 backwards compatibility, this would have been a huge advantage. By the time the PS3 was actually released, backwards compatibility wasn't that big of a deal anymore because we were already in the middle of the HD generation. Also, because of the lower price, the console would have been more competitive, and Sony wouldn't have had to axe the backwards compatibility feature in the first place.
If they hadn't used the Cell CPU, then maybe the whole architecture of the console would have been different.

The irony is... Despite the power and complexity of the Cell... Playstation 3 emulation is better than Xbox 360 emulation.

Clearly not a real excuse or a hindrance in hindsight... Plus Microsoft took a unique approach to backwards compatibility on the Xbox One: they repackaged the games, virtualized the older console environment, and only emulated what they absolutely needed... Which meant even anemic hardware like the Jaguar CPUs was doing just fine.
It helped that Microsoft had some Xbox 360 hardware support in the Xbox One SoC, so Microsoft was forward thinking on that front.


GaoGaiGarV said:
They wouldn't have had the absurd idea to include 2 Cell CPUs, one as the main processor and one acting as a GPU. They wouldn't have had to come to the realization that this was a bad idea, and wouldn't have had to include a stripped-down Nvidia GPU at the last moment which was even less powerful than the Xbox 360's GPU. They most likely would have developed a much more powerful GPU, one even more powerful than the Xbox's.

The lower price and earlier release date would have prevented many customers from switching to Xbox 360 and PS3 could have reached 100m sold units.

Sony learned very early that they couldn't use a Cell CPU as a GPU. For one, the Cell doesn't have any GPU features baked into its hardware anyway (like texture samplers), so it would have required a lot more development time and R&D.
Plus the yields on the Cell meant the CPU was expensive to manufacture. (And thus required a unit to be deactivated to increase yields for the final product.)

Nor does it even have the performance to make it feasible... Why spend the time and money bringing a new chip up to speed when ATI and nVidia have already done all the work?
Sony had already decided to go with RSX a year and a half before the console released, so it would have been chosen much earlier than that even.
https://www.anandtech.com/Show/Index/1683

nVidia and ATI were roughly the same level of performance though... In gaming.
https://www.anandtech.com/show/2080/4

The Xbox 360 GPU, however, could pull ahead in vertex operations thanks to its unified shaders, and it had better texture compression support, so it could do more with less memory and bandwidth... It also had smaller advantages like a tessellator, which was put to full use in games like Halo 3 for its water surfaces.

So when developers built and optimized games for the Xbox 360, they were able to take advantage of these small "tricks" to come out ahead.

It also helped that the Xbox 360 was just an easier environment to build games for, not only because of the hardware, but often because of the software and development tools.

So with the GPU advantage on the 360's side, just how much more capable was Cell than Xenon in practical gaming applications? How much of an actual edge did Sony's expensive and developer-unfriendly solution buy them in the CPU department?