Kasz216 on 04 November 2009
| Uberkiffer said: Wow, I've got a bachelor's in Information Systems, and I've noticed a lot of people for some reason think that 64-bit is just 2x faster than 32-bit. This is WAY OFF. To explain: a 32-bit value has 32 spaces of 0's and 1's. 0101 is 4 bits long, which gives 16 different possible input codes (0000, 0001, 0010, 0100... etc.). 0101 1100 is 8 bits long and gives 256 possible input codes. Every added bit doubles the number of possibilities. Once again, for example, 0101 is 4 bits (16 possibilities), 1 0101 is 5 bits and has 32 possibilities, just as 10 0101 is 6 bits and has 64 possibilities. If you followed any of that, then you can understand that going from 32-bit to 64-bit is actually a huge difference in computational potential. 32 bits: 2^32 = 4,294,967,296 possibilities. 64 bits: 2^64 = 18,446,744,073,709,551,616 possibilities, which is exactly 4,294,967,296 times more possibilities, far more than simply two times the power. Beyond all of that, it's about the coding and the algorithms to make the extra space that much more badass. |
See... that I actually knew, about bit sizes being far more drastic than double. I just didn't understand what impact that might have on the operating system.
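
For anyone who wants to verify the arithmetic in the quote above, here's a quick Python sketch (not from either post, just an illustration) that prints the number of distinct values a 32-bit and a 64-bit word can represent, and the ratio between them:

```python
# How many distinct values fit in a 32-bit vs. a 64-bit word,
# and how many times larger the 64-bit range is.

values_32 = 2 ** 32   # distinct values representable in 32 bits
values_64 = 2 ** 64   # distinct values representable in 64 bits

print(f"32-bit: {values_32:,} possible values")     # 4,294,967,296
print(f"64-bit: {values_64:,} possible values")     # 18,446,744,073,709,551,616
print(f"Ratio:  {values_64 // values_32:,}x")       # 4,294,967,296x, i.e. 2**32 times more
```

That ratio is the "far more than two times" point: doubling the word length squares the number of representable values rather than doubling it.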