Viper1 said:
The best way to look at the Xbox vs Wii debate is to say the Wii has a slightly better CPU and the Xbox has a slightly better GPU. Technically, the Wii's GPU is just as capable, but it's more difficult to draw out those abilities because it's a fixed-function GPU, while the Xbox employed programmable shaders, which let developers use the tools, tricks, and methods they were already familiar with. And yes, FLOPS can be very misleading because they can be calculated in a way that would never hold up in the real world. It's like saying a human being can run 27 mph. While a human can hit that speed momentarily, they cannot actually run 27 miles in one hour. I already took that into consideration with the Xbox and docked it down from 83 to 23. A lot goes into whether a figure is near real-world or inflated, and even then other factors come into play. But G/TFLOPS are the closest single figure we have for comparing disparate computing systems. Dahuman, single-precision FLOPS are still relevant to gaming; it's double-precision FLOPS that have no value for games.
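The "docked down from 83 to 23" reasoning above can be sketched in a few lines. This is a minimal illustration, not official console specs: the function names, the parallel-unit parameters, and the ~28% efficiency factor are my assumptions, chosen only so the numbers line up with the post's 83 → 23 discount.

```python
# Illustrative sketch: why a theoretical peak FLOPS figure gets "docked down"
# to a real-world estimate. All figures here are hypothetical, not real specs.

def peak_gflops(clock_ghz: float, units: int, flops_per_cycle: int) -> float:
    """Theoretical peak = clock rate * parallel units * FLOPs per unit per cycle."""
    return clock_ghz * units * flops_per_cycle

def derated_gflops(peak: float, efficiency: float) -> float:
    """Scale a peak figure by an assumed sustained-efficiency factor (0..1)."""
    return peak * efficiency

# Hypothetical numbers matching the post's 83 -> 23 discount (~28% efficiency):
real_world = derated_gflops(83.0, 0.28)
print(round(real_world, 1))  # ~23
```

The point is that the headline number is a multiplication of best-case terms; the efficiency factor is where all the real-world judgment (memory stalls, fixed-function limits, workload mix) gets folded in.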
I said FLOPS are "not reliable" as a measuring unit for gaming, not that they're "not relevant." :P Too many factors go into a device, creating massive bottlenecks; what counts is how well they can build a machine with as few bottlenecks as possible while maintaining the form factor. The idea is not to treat it like a PC and just toss high numbers together, since consoles have different power-draw and form-factor constraints and can't brute-force things the way PCs can.