Machina said:
Ergh, I should bookmark that post I made ages ago and then just copy and paste it each time this comes up. Basically it's number 2 in Montana's post above.
I'll go see if I can find my old post.
edit 1 - the list in the post above me is out of date; it hasn't been updated in months, since I stopped tracking that.
edit 2 - found and modified my old post:
I keep track of a lot of stats for our official reviews. The pattern the OP sees isn't too far off actually - when compared with Metacritic across the board, VGC is slightly above average for Wii games, roughly on par for PS3 games, and slightly below for 360 games.
This is my explanation:
It's not a case of the site being biased - there's no way you could orchestrate such a bias amongst 16 different people (that's roughly how many active reviewers we have at any one time) and 5 different content editors. It's my view that the scoring differences are less about console preference, and more about how harsh or lenient the reviewer who happens to review a particular game (and own a particular console) is.
I'll use Torillian as an example because I know he won't mind. His main console is the PS3 and his secondary console is the Wii. I don't think anyone would accuse him of being anti-PS3 (and if they do, they're plain wrong), but he's one of our harsher scorers. Given that he writes far more PS3 reviews for the site than anyone else, the end result is that for PS3 reviews we tend to be roughly in line with Metacritic.
Then take myself. I'm our primary 360 reviewer, the site's harshest scorer across the board, and also the site's 2nd most active reviewer. The end result of all this is that we tend to be slightly below Metacritic for 360 games. Not because I dislike the 360 - it's my main console, after all - but because I'm a harsh scorer whose main console happens to be the 360. If my main reviewing console were the Wii, the same thing would happen to our Wii average.
You'll find this with our primary Wii reviewers as well - there are some who score harshly when compared with Metacritic (O-D-C is a good example), and then there are those who score generously (I won't give an example since I don't think it's fair to do so without their permission, but you could easily work it out for yourself).
In short, I think the reason for the difference is the harshness/leniency of the main reviewers vis-à-vis Metacritic and which consoles they happen to own and enjoy, not site bias.
Quoted so people can see this post.