Aielyn said:
First of all, the reason that consoles improve in power is because the power/cost ratio has a moving optimum. If you go below the optimum power point, the overhead costs begin to dominate, and the overall cost per unit of power grows. On the flipside, as you increase the power beyond the optimum, the cost increases faster than the power - that is, it costs more than twice as much to get double the power, for instance. Think of it this way - it would probably cost more than $10 to manufacture a 1 GFLOP GPU now. Meanwhile, it might cost $50 to manufacture a 500 GFLOP GPU. If you wanted a 1500 GFLOP GPU, however, it would probably cost $200.

Add to this the consumer factor - consumers aren't going to want to spend $1000 on a games console, but on the flipside, they won't spend $50 on a games console that does what the one they already have does. This is why the consoles improve in power. The fact that the PC is always ahead of consoles demonstrates that "keeping up" isn't even remotely a factor. And the fact that PC ownership doesn't soar in the years preceding a new generation also proves that gamers aren't driven by the latest hardware.

As for the PS2/GC/Xbox numbers, please don't pull numbers from nowhere. We actually have accurate numbers for the three. Here they are (copied from another thread, where I posted it once already):

Gamecube - 1.6 GFLOPS CPU, 9.4 GFLOPS GPU

So, here's the total for each system:

Gamecube - 11 GFLOPS

This means that the Xbox is more than three times the power of the PS2, and just under double the power of the Gamecube, based on raw numbers. If you factor in that the PS2 therefore didn't have any specialised graphical operations (as all such operations were being done on the CPU), the PS2 was even weaker (see, for instance, Resident Evil 4 GC vs PS2, for demonstration of this fact). The only way that the Gamecube came out ahead of the Xbox was due to non-power factors - things like the TEV (or whatever it was called) gave it the ability to do more with less. And this is why, most of the time, the Xbox did better.

And there's absolutely no way that the Xbox was only about 30% more powerful than the PS2. It was, on raw numbers alone, a touch under 3.5x more powerful. More, when you factor in things like the presence of SDRAM, and the fact that the PS2 had no graphics logic.

Generations are defined by the dictionary - I suggest you look it up.
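For what it's worth, the power/cost point in the first paragraph is easy to see if you divide his own illustrative dollar figures out (they are made-up example prices, not real manufacturing costs, so this is just arithmetic on the example):

```
# Aielyn's illustrative price points: (GFLOPS, hypothetical manufacturing cost in $)
examples = [(1, 10), (500, 50), (1500, 200)]
for gflops, cost in examples:
    print(f"{gflops} GFLOPS at ${cost}: ${cost / gflops:.3f} per GFLOP")
# -> $10.000, $0.100 and $0.133 per GFLOP: huge at the low end, lowest in the
#    middle, creeping back up at the high end - the moving optimum he describes.
```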
You are way overshooting both the GC and the XB GPUs.
First, it seems like you are either counting the XB GPU as 8 pipelines, when it only has 4, or you are placing an insane amount of operations per second on each pipeline - very unlikely, considering we know very well what was inside each pipeline in those days of non-unified shaders. If either were the case, it wouldn't be able to come even close to decent performance with its memory bandwidth. Second, even with the fixed-function hardware inside the Gamecube's GPU, it wouldn't go beyond 8 gigaflops (there's a rough sanity check of this at the bottom of the post). So it's more like...
GC - 1.9 gigaflops CPU, 8 gigaflops GPU
XB - 2.9 gigaflops CPU, 11.1 gigaflops GPU
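Tallying those up (the PS2 and Dreamcast totals aren't spelled out here, so only the GC/XB comparison is computed - just a quick sketch of the arithmetic):

```
specs = {"GC": (1.9, 8.0), "XB": (2.9, 11.1)}   # (CPU, GPU) gigaflops, figures above
totals = {name: cpu + gpu for name, (cpu, gpu) in specs.items()}
print(totals)                                # {'GC': 9.9, 'XB': 14.0}
print(round(totals["XB"] / totals["GC"], 2)) # ~1.41x XB over GC - not "just under double"
```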
Those totals mean about twice the PS2 and perhaps four times the real performance of the Dreamcast. Four times, by the way, is about the same difference we're seeing between the Wii-U and Durango/Orbis specs.
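And for where GPU flop estimates like these come from in the first place, here's a rough sketch of the usual pipelines x ops-per-clock x clock model. The clock speeds (~233 MHz for the XB's NV2A, ~162 MHz for the GC's Flipper) are well documented; the 12 flops per pipeline per clock is purely an illustrative guess chosen to land near my figures, not a confirmed spec.

```
def gpu_gflops(pipelines, flops_per_pipe_per_clock, clock_mhz):
    # simple throughput model: pipelines * flops per pipe per clock * clock
    return pipelines * flops_per_pipe_per_clock * clock_mhz / 1000.0

print(gpu_gflops(4, 12, 233))   # ~11.2 - in the ballpark of my XB GPU figure
print(gpu_gflops(4, 12, 162))   # ~7.8  - in the ballpark of my GC GPU figure
print(gpu_gflops(8, 12, 233))   # ~22.4 - what you get if you count 8 pipelines
```

Count 8 pipelines instead of 4 and the estimate doubles - that's the kind of overshoot I mean.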