RolStoppable said:
No, it was a change in technology. PCs use 32-bit or 64-bit processors as far as I know, and consoles probably do too (I could be wrong on both counts). Anyway, the point is that the main CPUs of consoles stopped doubling the number of bits they can handle in one step, which is what the "bit" meant in the old days: it only referred to the CPU, so a 32-bit processor handles 32 bits in a single step. The real power of a processor only becomes clear from how frequently it can work with information, and that is determined by the clock speed in hertz (1 hertz means the processor can do it once per second, 1 MHz means a million times per second). Nowadays the bit count of processors stays the same while only the MHz or GHz go up. Another reason the "bit" stopped mattering is that the CPU alone can't accurately describe a console's power, and it was mainly marketing talk anyway ("Our console has more bits, therefore it is better."). For example, the Mega Drive's CPU is faster than the SNES's, but the SNES still managed better graphics thanks to effects like Mode 7 and a much bigger range of colors to choose from and display on screen at the same time (MD: 64 colors out of 512 available, SNES: 256 out of 32,768). I am sure someone else can explain this stuff a whole lot better, but that's basically why companies stopped using "bit".
Thanks for this info Rol.
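Out of curiosity I poked at those color numbers, and they fall straight out of how many bits each console's video hardware spends per color channel. Here's a quick Python sketch of the arithmetic (the 3 and 5 bits per channel are my reading of those palette totals, and the clock speeds are rough figures, so take it as an illustration rather than a hardware spec):

# Palette size follows from bits per RGB channel:
# three channels, each with 2**bits levels.
def palette_size(bits_per_channel):
    return (2 ** bits_per_channel) ** 3

print(palette_size(3))   # 512    -> Mega Drive total palette (64 shown on screen at once)
print(palette_size(5))   # 32768  -> SNES total palette (256 shown on screen at once)

# Clock speed is the other half of Rol's point: 1 MHz = a million steps per second,
# so the Mega Drive's 68000 at roughly 7.6 MHz steps through work faster than the
# SNES's 65C816 at roughly 3.6 MHz -- and the SNES still produces better graphics.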