Finally, someone gets the definition of "k" correct :)
In PC terms ---> k = 1024 (i.e. kilobyte = 1024 bytes), and M = 1024*1024 = 1,048,576 (i.e. megabyte).
Computers do not use base 10 like we humans do, but base 2 (binary). 2^10 = 1024, which is usually close enough to 1000 for most people to treat them as the same.
...
But yeah - when dealing in resolution, k 'probably' drops back to the "human" meaning --> i.e. $10k = $10,000 or 10k pixels = 10,000 pixels.
It's only a ~2.4% error when the wrong meaning is picked either way (a bit more for mega and giga), so it doesn't make a huge difference.
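A quick sketch of how far the binary and decimal prefixes actually drift apart (the percentage grows with each step up: kilo, mega, giga):

```python
# Compare binary prefixes (2^10, 2^20, 2^30) against decimal ones (10^3, 10^6, 10^9)
for power, name in [(1, "kilo"), (2, "mega"), (3, "giga")]:
    binary = 2 ** (10 * power)
    decimal = 10 ** (3 * power)
    error = (binary - decimal) / decimal * 100
    print(f"{name}: {binary} vs {decimal} -> {error:.1f}% difference")
```

This prints roughly 2.4% for kilo, 4.9% for mega, and 7.4% for giga, so the gap widens the larger the unit gets.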
Gesta Non Verba
Nocturnal is helping companies get cheaper game ratings in Australia:
Wii code: 2263 4706 2910 1099