windbane said:

This is getting very ridiculous. Apple switched to Intel because PowerPC had been slower for years. Much more expensive? Are you serious? The one good thing about Macs was that they were affordable, because their CPUs were so outdated. I've built computers for years, I've looked at benchmarks for years, and I can assure you that Mac CPUs were MUCH SLOWER than Intel CPUs at every point for at least the last decade. Why do you think gamers didn't buy them? And if you want to talk supercomputers, the Cell is used for those now. Supercomputers from 10 years ago aren't that fast and don't indicate consumer-level prices.


Apple switched to Intel in 2005 because the Core Duo was far more powerful than a PowerPC 970MP (G5) for the price. Back in 1999, when the G3 and Pentium III were in direct competition, Apple bragged that IBM's G3 was twice as powerful as the Pentium III at the same clock speed (which was somewhat true); soon afterwards Apple was bragging that the G4 was over twice (2.6 times) as powerful as the Pentium 4 at the same clock speed. Intel's response was to keep raising clock speeds, so that the 400MHz G3 ended up competing directly with the 1GHz Pentium III, and the 600MHz-800MHz G4s with 2GHz to 2.53GHz Pentium 4 processors.
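The matchups above follow from simple arithmetic on Apple's own marketing claims. A quick sketch (the 2x and 2.6x per-clock multipliers are the figures quoted in the post, not measured benchmark results):

```python
# Back-of-the-envelope check on the clock-for-clock claims above.
# The per-clock multipliers (2x for G3 vs. P3, 2.6x for G4 vs. P4)
# are Apple's marketing figures as quoted in the post, not benchmarks.

def equivalent_intel_clock(ppc_clock_mhz: float, per_clock_advantage: float) -> float:
    """Intel clock (MHz) needed to match a PowerPC part, assuming the
    claimed per-clock advantage held across all workloads."""
    return ppc_clock_mhz * per_clock_advantage

# 400 MHz G3, claimed 2x per clock -> roughly an 800 MHz Pentium III;
# Intel's 1 GHz parts therefore pulled ahead even by Apple's own math.
print(equivalent_intel_clock(400, 2.0))   # 800.0

# 800 MHz G4, claimed 2.6x per clock -> roughly a 2.08 GHz Pentium 4,
# inside the 2-2.53 GHz range the G4s actually competed against.
print(equivalent_intel_clock(800, 2.6))   # 2080.0
```

In other words, once Intel's clock advantage exceeded the claimed per-clock multiplier, the PowerPC parts fell behind even granting Apple's numbers.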

IBM (and Nintendo) took the G3 processor and heavily modified it for the Gamecube; roughly 50 vector instructions were added to dramatically improve its performance in 3D calculations. The overall result was that the Gekko was far more powerful than a standard G3 processor for 3D game workloads. Whether the Gekko was more powerful than the modified Celeron in the XBox has been debated since the consoles were released, but the general consensus is that they're very similar in performance.

windbane said:

Also, I LOVE how you guys are ignoring the other benchmarks. Look at the freaking GPU numbers on the Wii! It's barely better than the previous generation. The RAM situation is even sadder. So we can continue this architecture argument until someone shows some benchmarks, but you can't change the other facts.

Oh wait, next you'll be telling me that the GPU and RAM are SUPER-MEGA-OPTIMIZED LIKE NOTHING BEFORE, because for some reason Nintendo holds the secrets of computing and is selling it at a loss. Oh wait, the Wii is worth around $150. That's right, I forgot.

The Flipper (the Gamecube's GPU) was a very different beast from what was being developed for the PC at the time (or what is currently in the PS3 or XBox 360). In 2000/2001, both ATI and Nvidia moved towards GPUs that abandoned fixed-function pipelines (with lots of built-in graphical features) in favour of programmable pipelines, because feature incompatibility across video cards had prevented developers from taking advantage of those features. The Flipper, by contrast, was designed around built-in features and was (much) faster than the PC GPUs (like the XBox's GPU) as long as both systems stuck to graphical techniques the Flipper supported in hardware; however, the Flipper had difficulty emulating many of the pixel and vertex shaders that were possible on the PC GPUs.


Overall, we know very little about the Hollywood and Broadway processors in the Wii beyond the fact that they're based on the Gamecube's Flipper and Gekko. If they are merely overclocked versions of those processors, the Wii would be roughly 1.5 to 2 times as powerful as the XBox, and we should see games that look somewhat better than anything in the previous generation; if, on the other hand, the instruction sets have been extended, we should see games that look quite a bit better.
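Where the "roughly 1.5x" figure comes from can be sketched with clock ratios. Note the Wii clocks below (729 MHz CPU, 243 MHz GPU) are widely reported but unofficial figures, used here only as an assumption; the GameCube clocks are Nintendo's published specs:

```python
# If Broadway and Hollywood are only overclocked Gekko/Flipper parts,
# the speedup is roughly the clock ratio. GameCube clocks are official;
# the Wii clocks used here are widely reported but unconfirmed figures.

GEKKO_MHZ, FLIPPER_MHZ = 485, 162          # GameCube CPU / GPU (official)
BROADWAY_MHZ, HOLLYWOOD_MHZ = 729, 243     # Wii CPU / GPU (assumed)

cpu_speedup = BROADWAY_MHZ / GEKKO_MHZ
gpu_speedup = HOLLYWOOD_MHZ / FLIPPER_MHZ
print(f"CPU x{cpu_speedup:.2f}, GPU x{gpu_speedup:.2f}")  # CPU x1.50, GPU x1.50
```

Under those assumed clocks, a pure overclock lands at the low end of the 1.5-2x range; anything beyond that would have to come from architectural changes such as an extended instruction set.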