crissindahouse said:
GameAnalyser said:

Man of Steel has a weighted average score of 6.2/10 and deserves Fresh instead of a Rotten 55. On the other hand, the marginally better 7.7/10 getting a 91 for CW shows how flawed the score highlighting on that site is. It's a hack site.

7.7 isn't marginally better than 6.2. That's a decent difference, both in what a 6.2 means compared to a 7.7 for many reviewers and in how you arrive at these average scores mathematically.

As for the average score and the Tomatometer of MoS: the Tomatometer just says that 55% of all reviews are positive. For Man of Steel, 55% of the reviews scored 60% or higher and the rest were below that. I don't see how this system is so flawed; it's pretty logical. More reviews are positive than negative, but it's still marked Rotten because the site only recommends a movie if at least 60% of all reviews recommend it.
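
To make that concrete, here's a rough sketch (Python, with made-up scores) of how a percentage-of-positive-reviews number and a plain average fall out of the same list; the 6/10 cutoff here just stands in for the "positive review" threshold described above:

```python
# Made-up scores; 6.0 stands in for the "positive review" cutoff described above.
scores = [7, 8, 6, 5, 4, 7, 3, 6, 5, 6]

positive = sum(1 for s in scores if s >= 6.0)
tomatometer_style = 100 * positive / len(scores)  # share of positive reviews
average = sum(scores) / len(scores)               # plain mean of the same scores

print(f"Percent positive: {tomatometer_style:.0f}%")  # 60%
print(f"Average score:    {average:.1f}/10")          # 5.7/10
```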

Sure, you could argue about this single point and whether there should maybe be an extra category between Fresh and Rotten.

Same with Civil War. Right now 91% of all reviews are positive, which means 91% scored 60% or higher. The average is "only" 7.7 because one bad review drags a high average down more than one good review lifts it. One perfect 10 doesn't raise a 7.7 average as much as one bad score of, say, a 3 lowers it, compared to the 6.2 of MoS. That's why the average is "only" 7.7 even with 91% of scores at 60% or above.
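
A quick back-of-the-envelope check of that claim (the numbers below are invented; only the arithmetic matters):

```python
# Invented score lists that average 7.7 and 6.2; the point is only how one
# 3/10 outlier shifts each mean.
high = [8, 8, 7, 8, 7.5, 8, 7.5, 8, 7, 8]  # mean 7.7
mid  = [6, 7, 6, 5, 7, 6, 6, 7, 6, 6]      # mean 6.2

def mean(xs):
    return sum(xs) / len(xs)

for name, xs in [("7.7 set", high), ("6.2 set", mid)]:
    before, after = mean(xs), mean(xs + [3])  # add one bad 3/10 review
    print(f"{name}: {before:.2f} -> {after:.2f} (drop {before - after:.2f})")
```

With these invented numbers the 7.7 set drops by about 0.43 and the 6.2 set by about 0.29, which is the asymmetry being described.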

That's also exactly why a difference of "only" 1.5 points in the average can reflect a big difference in how reviewers liked a movie. A single bad score harms a 7.7 average more than a 6.2, and a single good score helps a 6.2 more than a 7.7.

You will find examples of movies where the average score tracks the Tomatometer even in high regions, but usually a movie with an already good average suffers much more from a few horrible scores than a movie that is already considered only average, since that average is already closer to the low scores.

So yeah, it's all pretty logical and nothing crazy. 

Well, the problem here is that you don't actually understand the weighted average concept, and that's why you don't see how misleading the percentage numbers become when MORE critics are allowed to publish their scores.

"More reviews are positive as negative but it's still seen as rotten since the site says they can only really recommend a movie if minimum 60% of all reviews recommend it. " 

You have inadvertently spilled the beans there and exposed the flaw. You are mixing up percentage numbers and the weighted average. If only 55 percent of critics gave a score above 60...the weighted average of 6.2 would not make sense at all. The two metrics cannot both be applied at the same time. That's why, on sites like Metacritic, lower scores generally hit the average harder, as you explained, and you generally see lower Metascores than Tomatometers when the flaw isn't exposed.
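
Metacritic's actual weighting is not public, so the snippet below is only a hypothetical sketch of how a weighted average could hit low scores harder than a plain mean; the cutoff and the double weight are assumptions for illustration:

```python
# Hypothetical weighting; Metacritic's real formula is not published.
scores = [9, 8, 8, 7, 7, 8, 3, 8, 7, 8]

def plain_mean(xs):
    return sum(xs) / len(xs)

def low_penalizing_mean(xs, cutoff=6, low_weight=2.0):
    # Scores below the cutoff count double, dragging the result down harder.
    weights = [low_weight if x < cutoff else 1.0 for x in xs]
    return sum(w * x for w, x in zip(weights, xs)) / sum(weights)

print(f"Plain mean:          {plain_mean(scores):.2f}")           # 7.30
print(f"Low-penalizing mean: {low_penalizing_mean(scores):.2f}")  # 6.91
```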