Nvm, you just forgot to multiply 0.01 by 100 to convert it to a percentage (0.01 × 100 = 1%). Funny how you act condescending when you're the one who's wrong, though.
9/10 vs 8/10 is a 1-point gap out of 10, i.e. a 10% difference on the scale.
Hence, 9.1/10 vs 9/10 is a 0.1-point gap, i.e. a 1% difference. Unless you're going to tell me the gap above is only 1%.
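The arithmetic above can be sketched in a few lines. This is a hypothetical helper (`gap_percent` is my name, not from the thread), assuming the percentages are meant relative to the full 10-point scale rather than relative to one of the scores:

```python
def gap_percent(a: float, b: float, scale: float = 10.0) -> float:
    """Express the gap between two scores as a percentage of the full scale."""
    return abs(a - b) / scale * 100

# 9 vs 8: 1 point out of 10 -> 10% of the scale
print(round(gap_percent(9, 8), 2))
# 9.1 vs 9: 0.1 point out of 10 -> ~1% of the scale
# (rounded, since 9.1 - 9 is not exact in floating point)
print(round(gap_percent(9.1, 9), 2))
```

Note this measures percentage points of the scale; the *relative* difference (9 − 8) / 8 would be 12.5% instead, which is where this kind of argument usually goes off the rails.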
So tell me, what's the definition of average?