| CommonMan said: It's not a flaw, it's an aggregate. There are 3 "averages": the mean, the median and the mode. Mathematicians have never really settled which average is "best"; it's a case by case thing. What Metacritic shows is the mean: take the scores, add them up and divide by how many scores there are (plus weighting, but we can leave that out of this discussion). Each type of average has strengths and weaknesses. The mode is the most common number in a set, so Uncharted 2 would be 100 on that scale - in fact we'd have a lot of 100s this gen if that were used, not very helpful. The median is found by sorting the numbers and taking the middle one (or the mean of the two middle values if the count is even), which isn't very helpful here either, since it throws away how far scores sit above or below the middle. So unless you can come up with a brand new type of average that no mathematician since ancient Egypt has managed, it's as good as it's going to get. |
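The three averages described above can be sketched quickly with Python's standard library (the scores here are made up for illustration, not real Metacritic data):

```python
# Mean, median and mode of a small set of review scores.
# Scores are invented for illustration, not actual review data.
from statistics import mean, median, mode

scores = [100, 100, 100, 96, 95, 94, 90, 88, 60]

print(mean(scores))    # arithmetic mean: sum divided by count (~91.4)
print(median(scores))  # middle value once sorted: 95
print(mode(scores))    # most common value: 100 - the "lots of 100s" problem
```

Note how the mode lands on 100 even though most scores are lower, which is exactly the weakness the quote points out.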
That's basically correct, but Metacritic is nonetheless statistically flawed (to be fair, mostly in how people apply the results, but also in its own right) because:
1 - the number of samples is inconsistent, not only between different games but between different console versions of the same game. This makes it impossible to compare titles reliably. Right now FFXIII for PS3 has far more reviews, and because consoles have official mags, etc., the 360 version can statistically come out with a higher score despite being the same game with somewhat weaker graphics - that's clearly useless for comparison
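The sample-size point is easy to demonstrate with a quick sketch (the numbers are invented, not FFXIII's actual reviews): the same single harsh outlier barely moves the mean of a large review pool but drags a small one down noticeably, so scores built from different sample sizes aren't directly comparable.

```python
# Hypothetical review pools for the 'same' game on two platforms.
# Counts and scores are invented purely to illustrate sample-size sensitivity.
from statistics import mean

many_reviews = [90] * 40 + [50]  # 41 reviews, one harsh outlier
few_reviews  = [90] * 5 + [50]   # 6 reviews, same single outlier

print(round(mean(many_reviews), 1))  # 89.0 - outlier barely registers
print(round(mean(few_reviews), 1))   # 83.3 - same outlier drags the mean down
```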
2 - the real biggie for their overall average: they weight some reviews using a 'secret' formula, and for text-only reviews they 'guess' the score implied by the words. That makes the Metacritic average fundamentally unreliable.
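Metacritic's actual weights are secret, so the following is purely a hypothetical sketch of how outlet weighting shifts a mean (the outlet names, scores and weights are all invented):

```python
# Hypothetical weighted mean. Nothing here reflects Metacritic's real formula;
# the outlets, scores and weights are invented for illustration only.
reviews = [
    ("BigOutlet", 95, 1.5),   # (name, score, weight)
    ("MidOutlet", 85, 1.0),
    ("SmallBlog", 70, 0.5),
]

plain = sum(s for _, s, _ in reviews) / len(reviews)
weighted = sum(s * w for _, s, w in reviews) / sum(w for _, _, w in reviews)

print(round(plain, 1))     # 83.3 - unweighted mean
print(round(weighted, 1))  # 87.5 - weighting pulls the aggregate toward favoured outlets
```

Since readers can't see the weights, they can't tell how far the published number sits from the plain mean, which is the unreliability being described.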
Now the site does have some merit if you understand all this (and you clearly do) and approach its contents wisely. But few do, let's be honest.
Try to be reasonable... it's easier than you think...