
Forums - Gaming Discussion - Edge vs. Metacritic (Bias Confirmed!!!)

I am the author of that blog.
I simply went to Metacritic and sorted games (PS3/360/multi) from highest ranked to lowest, then picked the 30 highest-scoring ones for each. Some games are omitted because THERE WAS NO EDGE REVIEW of them, such as God of War Collection or Street Fighter IV. Everything that had an Edge review, I picked (it seems I forgot Infinite Undiscovery, but I'll add it, no problem, it won't change the graphs). I am human and made this all by myself, so errors may crop up.
I simply computed the difference and made the charts. So I did not pick the games at random; I picked them from highest Metacritic ranking to lowest.
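The per-game difference described above can be sketched in a few lines. The games and scores below are placeholders, not the blog's actual data; Edge scores out of 10 are scaled to Metacritic's 0-100 scale for comparison:

```python
# Hypothetical scores for illustration only, not the blog's actual data.
# Metacritic uses a 0-100 scale; Edge scores out of 10, so scale by 10.
games = {
    "Game A": {"metacritic": 92, "edge": 8},
    "Game B": {"metacritic": 85, "edge": 9},
    "Game C": {"metacritic": 78, "edge": 6},
}

# Positive difference = Metacritic rates the game higher than Edge does.
differences = {
    name: s["metacritic"] - s["edge"] * 10
    for name, s in games.items()
}
print(differences)  # {'Game A': 12, 'Game B': -5, 'Game C': 18}
```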

Metacritic is the most objective method of "ranking" games, since the really high and low reviews get lower weights in the average; at least, that's what Metacritic says. If you can find a better yardstick, I'll compare it to that.

You may ask: why just the top 30 games? Because those are the games that sell the most, get advertised the most, and are in the spotlight the most. Not many people read the review of Battlestations Midway or Folklore (both are included because they are in the top 30) compared to ODST and God of War 3.

I could easily have randomly sampled 20 games, and with 20 games out of 100 or so exclusives, that would make a 99% confidence interval less than one point wide. However, neither I nor many others care about games that rank worse than that, based on sales. So choosing the top 30 is valid. I chose the top 30 THAT EDGE REVIEWED. It's hard to compare Metacritic to Edge when there is no Edge review, isn't it?
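As a rough check on that claim, the half-width of a 99% confidence interval for a mean scales as z·σ/√n. A minimal sketch, where the standard deviation of the score differences is an assumed value, not measured from the charts:

```python
import math

def margin_of_error(sigma, n, z=2.576):
    """Half-width of a 99% confidence interval for a mean,
    using the normal approximation (z = 2.576 at 99%)."""
    return z * sigma / math.sqrt(n)

# Assumed spread of per-game score differences; illustrative only.
# With sigma = 1.5 and n = 20 games, the interval is under one point wide.
print(round(margin_of_error(sigma=1.5, n=20), 2))  # 0.86
```

Whether the real interval is that tight depends entirely on the actual spread of the Edge-minus-Metacritic differences.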



jarrod said:
Inherently flawed study, as it's only rating the top 30 games on each (plus multi). You can't claim institutional bias when you're looking at ~5% of the games on these platforms, it's borderline irresponsible. "Journalistic integrity" indeed, lol.

30 is more than enough.  10 is about the minimum sample for a two-proportion z-test.
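For reference, the test named above fits in a few lines. The counts below are made up for illustration, not taken from the article:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for H0: p1 == p2, using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical counts: 18 of 30 PS3 exclusives vs 10 of 30 360 exclusives
# scored lower by Edge than by Metacritic.
z = two_proportion_z(18, 30, 10, 30)
print(round(z, 2))  # 2.07, past the 1.96 cutoff at the 5% level
```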

 

The real problem is that the article gives point values for the averages instead of ranges (confidence intervals).  That's bad statistics.  Of course, it's bad statistics that nearly every study commits, but it's still bad statistics.

 



CommonMan said:
I'm so confused, I thought Metacritic was garbage? Is it okay to use Metacritic when proving a pro-PS3 point? I remember a thread that said something like "360 has 50% more AAA exclusives than PS3!" and used Metacritic as justification. There were like 2 pages of "LOL Meta is garbage" replies, and then it was locked.

I don't like moving the goalposts like this.

Actually, even though I don't think the linked article proves a malicious bias, there's no goalpost-moving here, because the two cases are very different:

1) "360 has more games with a 90+ Metacritic score than PS3..." is factual data; nobody can say it isn't true. But "...thus its library is the best one" is the part that makes little sense (Metacritic does not cover all games, Metacritic covers games that can be irrelevant today or for a given user, etc.). What is rejected is the significance of mechanically counting metascores to assess the quality of a library.

2) If I want to compare a single review source (say, Edge) with an average of many other reviews, Metacritic provides that average without my having to track down another 30 reviews per game, and that's all that matters for a purely statistical argument. Nowhere in this are we taking Metacritic scores at face value as "the right ones" for assessing game quality.

 

 



"All you need in life is ignorance and confidence; then success is sure." - Mark Twain

"..." - Gordon Freeman

OMG, who really cares what they say anyway? You still enjoy your games regardless of their reviews from some anonymous review site.



1. Since the Xbox 360 has more highly rated games than the PS3, the scores for its top 30 games have a lower spread. I have also noticed that highly ranked games tend to have very little spread in overall review scores.

2. Edge may not like certain styles of games. I haven't seen them particularly praise games with heavy cinematics and storytelling elements, and Sony titles tend to be heavier in these areas, so it may come down to the types of games that fall in the top 30. This doesn't make them biased; it means they have a certain preference.

3. You can prove that there's a difference, but that doesn't prove bias. All it proves is that Edge rates PS3 exclusives relatively lower compared to an aggregate of review sites.



Do you know what it's like to live on the far side of Uranus?

Twistedpixel said:
Twistedpixel said:

1. Since the Xbox 360 has more highly rated games than the PS3, the scores for its top 30 games have a lower spread. I have also noticed that highly ranked games tend to have very little spread in overall review scores.

...

What data are you looking at? Looking at the graphs, they seem to have very similar distributions: PS3 ranging from 96 to 74 and 360 from 96 to 73. You're welcome to calculate the average and variance for each, but I bet they'll be negligibly different.
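That calculation is a one-liner with Python's statistics module. The lists below are placeholder scores spanning the ranges quoted above, not the real 30-game charts:

```python
from statistics import mean, pvariance

# Placeholder score lists; the real comparison would use the 30 scores
# from each chart (both run roughly 96 down to the mid-70s).
ps3 = [96, 92, 88, 84, 80, 74]
x360 = [96, 91, 87, 84, 79, 73]

# Mean and population variance for each platform's list.
print(mean(ps3), pvariance(ps3))
print(mean(x360), pvariance(x360))
```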

 




What I find funny about this discussion is how people keep saying 30 games when it's 30 exclusives and 30 multiplats, so 60 per system. Still not perfect, no. I also think those saying they should look at all games are being a bit silly. If a game gets a 1 or a 4, it won't make as much of a difference to people as if it gets a 9 or a 6. Although the scores are there, video game scoring has been strongly shaped by an A/B/C/D/F rating system: 5 down to 1 would all still be an F, and arguing about who gives worse Fs would be kind of silly. So people focus on how outlets rate the games that score high to average. Yes, the person who did the study could have made sure it didn't miss any games by the criteria set.
The thing is, even if there is a bias, it could just be a coincidental bias, or those who rate the PS3 games could be harder critics. With multiplatform games you'd get a mix of the harder critics on the PS3 and the less hard ones on the 360, so those scores would be lower, but not as low as the PS3 exclusives alone, and the 360 exclusives would be the highest (all relative to Metacritic, since they are still lower scores for the most part). I don't think there is some intentional harsh bias towards PS3 games.
Now, comparing individual reviewers to Metacritic would be something else. Harder, but you could then see if it's individuals or just the combined totals.



WereKitten said:
Twistedpixel said:

1. Since the Xbox 360 has more highly rated games than the PS3, the scores for its top 30 games have a lower spread. I have also noticed that highly ranked games tend to have very little spread in overall review scores.

...

What data are you looking at? Looking at the graphs, they seem to have very similar distributions: PS3 ranging from 96 to 74 and 360 from 96 to 73. You're welcome to calculate the average and variance for each, but I bet they'll be negligibly different.

 

The scale threw me off, sorry. It was different between the two, so I misread the graph. The bigger version makes it clear.


