SvennoJ said:
Lol we never used those terms at work. Allocating a megabyte of memory was always 1024 KB, not 1000 KB.
Lol, then you've been using the wrong terms for decades...
It was sloppy computer engineers who used the metric prefixes "kilo" (10^3, established since 1795) and "mega" (10^6, established since 1873) to describe binary quantities with roughly similar values, since 2^10 = 1024 ("close enough" to 1000) and 2^20 = 1048576 ("close enough" to 1000000). These were never official units in any way, just a kludge to get along. Standards documents always used power-of-10 prefixes, which is also why serial transmission rates have always been decimal: a 9.6 kbit line transfers 9600 bits per second, not 9830 :)
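Just to make the gap concrete, here's a tiny Python sketch (my own illustration, not from any standard):

```python
# Decimal (SI) reading vs. binary reading of the same prefixes
print(10**3, 2**10)    # 1000 vs. 1024      ("kilo" vs. 2^10, ~2.4% apart)
print(10**6, 2**20)    # 1000000 vs. 1048576 ("mega" vs. 2^20, ~4.9% apart)

# Serial lines are strictly decimal:
print(9.6 * 10**3)     # 9600.0 bits per second
print(9.6 * 2**10)     # 9830.4 -- NOT what a 9.6 kbit line transfers
```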
And they even used it inconsistently. Perhaps the most egregious nonsense is the high-density floppy disk, which is described as holding 1.44 megabytes, where a megabyte is defined as 1000 kilobytes and a kilobyte is defined as 1024 bytes, i.e. 1.44 × 1000 × 1024 bytes, which is plainly ridiculous.
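You can see how silly that mixed definition is by comparing it against the two consistent readings (again just an illustrative sketch):

```python
# The "1.44 MB" floppy mixes both conventions in one number:
print(1.44 * 1000 * 1024)   # 1474560.0 bytes (the actual formatted capacity)

# The two consistent readings, neither of which matches:
print(1.44 * 10**6)         # 1440000.0 bytes (pure decimal)
print(1.44 * 2**20)         # 1509949.44 bytes (pure binary)
```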
In the late 1980s/early 1990s it became obvious that a clear meaning was needed, so an international standard was proposed (kibi, mebi, gibi... for the binary prefixes) and accepted in the late 1990s.