Edge vs. Metacritic (Bias Confirmed!!!)


As if you care whether they're biased or not, they're just reviews.




I agree with Jarrod that the stats are confusing and seem cherry-picked in this article. Why does it include games ranging from a metascore of 100 to around 70, but not all the games in that range? How were the included games chosen? If it included all the games from 100-70 it would be fine. That would be a "vertical slice," as Jester said, I think. In fact, I've seen another article, and even just forum posts, that do just that. They would make a much better source than this article. Really, it doesn't even take that long to tally up all the exclusive games between 70 and 100 yourself (or even between 80 and 100).



 

Scores were taken from the top-tier games rather than hand-picked examples chosen to support an argument.

That data alone supports what many have already acknowledged. I don't see the need for a comprehensive compilation of all scores as further proof.

Edge is not a bad publication, but their review scores often leave much to be desired, at times seemingly just to be the contrary voice against the overwhelming majority.

Interpret it as you will, but Edge seems to rely upon its sometimes controversial review scores to stay topical. Nobody rants against review sources that basically confirm what most players acknowledge about a good (or bad) game and the editors at Edge clearly know this.



Alic0004 said:

I agree with Jarrod that the stats are confusing and seem cherry-picked in this article. Why does it include games ranging from a metascore of 100 to around 70, but not all the games in that range? How were the included games chosen? If it included all the games from 100-70 it would be fine. That would be a "vertical slice," as Jester said, I think. In fact, I've seen another article, and even just forum posts, that do just that. They would make a much better source than this article. Really, it doesn't even take that long to tally up all the exclusive games between 70 and 100 yourself (or even between 80 and 100).

I just want to point out that between a metascore of 100 and 70 there are 429 games per platform, and 195 between 100 and 80. So this is pretty much impossible for a single person to do.

ALSO, the point was to compare EDGE reviews to the industry standard, so high-profile games were chosen. Some games in the 100-80 range weren't even reviewed by EDGE. This is a hand-picked top tier of titles; the graphs show which titles these were.



MY HYPE LIST: 1) Gran Turismo 5; 2) Civilization V; 3) Starcraft II; 4) The Last Guardian; 5) Metal Gear Solid: Rising

Hm, another possible explanation is that Edge isn't biased, they just thought the PS3 games were worse than the 360 games. Shock!

That's called a 'result', something that comes AFTER examining / testing.
'Bias' means judging BEFORE you tried it.

You know, according to your (very popular) definition of 'bias', any comparison that doesn't result in "all contestants are equally good" is biased. That's pointless.



Currently playing: NSMB (Wii) 

Waiting for: Super Mario Galaxy 2 (Wii), The Last Story (Wii), Golden Sun (DS), Portal 2 (Wii? or OSX), Metroid: Other M (Wii), 
... and of course Zelda (Wii) 
aragod said:
jarrod said:
aragod said:
jarrod said:
Inherently flawed study, as it's only rating the top 30 games on each (plus multi). You can't claim institutional bias when you're looking at ~5% of the games on these platforms, it's borderline irresponsible. "Journalistic integrity" indeed, lol.

Guess not everyone has the resources to put hundreds of hours into this. Anyway even from these "30" titles, you can see that the graph lines are pretty much static, which can say a lot. These graphs cover pretty much every notable exclusive or release, so we are talking about the "important" titles.

If you care to add more titles, please do so. Until then, you haven't proven any flaw. It's like crying about public opinion research that only covers 50,000 people out of 50,000,000.

If he doesn't have the resources to do an actual comprehensive study, perhaps he shouldn't be casting aspersions of bias?

And actually, upon further inspection it looks like he's not even meeting his own criteria for the comparison. For example, Lost Odyssey should be in that 360 list, but it's missing despite its metascore of 78. EDGE only gave it a 6, meaning it'd also work against his "bias" claims... how convenient. ;)

This was never labeled a comprehensive study, and his assertion of bias is subject to his research, which was conducted under the circumstances described in the opening statement. I'm urging you to improve upon this research by adding whatever you think is missing; unless you can prove that his graphs are wrong, your statement isn't justified.

Lost Odyssey should be in that 360 list, but his choice to exclude it was probably a content limitation. Also, EDGE gave it a 7, not a 6; your claim was false, how convenient...

If you're trying to undermine this, at least take the effort not to fail on your first FACTUAL statement. Kk thx bye!

Well, I should hope not, given it's a pretty shallow look at the data. What I'm saying is you'd need to take a comprehensive look at things before staking a claim of bias, especially for a publication as well regarded and respected in the industry as Edge.

Also, there's no "limitation" here for LO; it ranked higher than other titles he did include in this "top 30" for 360 exclusives. Also missing are Tales of Vesperia, Blue Dragon... any JRPGs, in fact, a genre Edge is known to be particularly hard on. And yet Valkyria Chronicles, Disgaea 3 and Demon's Souls all seemed to make it into his PS3 list?



Packie said:
If you don't like their reviews and think they're biased, why don't you just, you know... ignore them. Seriously people, just enjoy your games.

Or one could like to read their reviews for the content, not put too much faith in scores as a whole, and like to put numbers in context - as in my case...

Reviews in Edge are not signed by individuals, so we never get to know the reviewers' tastes. We can't say "OK, this reviewer didn't like this game, but he also didn't like this one that I loved, so I won't give it much weight." Or "he loved it, but I'm not into overly long RPGs the way he is known to be."

I'd say it's a matter of policy more than journalistic integrity. Maybe the guy who reviewed Halo 3 and ODST (because he loves to review FPS games) is simply more generous with scores than the guy who reviewed GOW III, and there's no integrity or platform love/hate at stake.

But since the policy of not signing reviews deprives us of that info, and since we also don't know how much Edge cares about its scores or its policy on consistency (or lack thereof), the best we can do is infer from the existing numbers.



"All you need in life is ignorance and confidence; then success is sure." - Mark Twain

"..." - Gordon Freeman

ElRhodeo said:
Hm, another possible explanation is that Edge isn't biased, they just thought the PS3 games were worse than the 360 games. Shock!

That's called a 'result', something that comes AFTER examining / testing.
'Bias' means judging BEFORE you tried it.

You know, according to your (very popular) definition of 'bias', any comparison that doesn't result in "all contestants are equally good" is biased. That's pointless.

Hey, I could do that too!

Just say every 360 game sucks compared to PS3 games and slap them all with a 6/10; that is definitely not biased!!!



jarrod said:
Inherently flawed study, as it's only rating the top 30 games on each (plus multi). You can't claim institutional bias when you're looking at ~5% of the games on these platforms, it's borderline irresponsible. "Journalistic integrity" indeed, lol.

It's not a flawed study. Including only the top 30 games limits the generality of the results, but that doesn't mean it's flawed at all.

I really wish people like you would stop making such outlandish claims as "inherently flawed study" when your logic is just as flawed.

Clearly, you can't generalize to games that weren't included, but the results for the games that were included speak for themselves: in the games that were analysed, Edge rated PS3 exclusives more harshly than 360 exclusives when compared to the metacritic average for each game.

Of course, the results only apply to the top 30 games or whatever it was, but so what? People generally only care about those games in the first place...
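For what it's worth, the comparison being argued about here is simple to state precisely. A minimal Python sketch, with made-up titles and scores purely for illustration (none of these numbers are from the article): scale Edge's 10-point scores to 100, subtract each game's metascore, and average the differences per platform. A more negative average means Edge scored that platform's games further below the Metacritic consensus.

```python
# Hypothetical sketch of the per-platform comparison. All data below is
# invented for illustration; it is NOT the article's actual dataset.
games = [
    # (title, platform, Edge score out of 10, metascore out of 100)
    ("Exclusive A", "PS3", 7, 89),
    ("Exclusive B", "PS3", 8, 91),
    ("Exclusive C", "360", 9, 88),
    ("Exclusive D", "360", 8, 84),
]

def mean_deviation(platform):
    """Average of (Edge score * 10 - metascore) over one platform's games."""
    diffs = [edge * 10 - meta
             for _, plat, edge, meta in games
             if plat == platform]
    return sum(diffs) / len(diffs)

for plat in ("PS3", "360"):
    print(plat, mean_deviation(plat))
```

With these invented numbers the PS3 average lands well below the 360 average, which is the shape of the claim in the graphs; whether the real data supports it is exactly what the thread is disputing.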

 

 



aragod said:
Alic0004 said:

I agree with Jarrod that the stats are confusing and seem cherry-picked in this article. Why does it include games ranging from a metascore of 100 to around 70, but not all the games in that range? How were the included games chosen? If it included all the games from 100-70 it would be fine. That would be a "vertical slice," as Jester said, I think. In fact, I've seen another article, and even just forum posts, that do just that. They would make a much better source than this article. Really, it doesn't even take that long to tally up all the exclusive games between 70 and 100 yourself (or even between 80 and 100).

I just want to point out that between a metascore of 100 and 70 there are 429 games per platform, and 195 between 100 and 80. So this is pretty much impossible for a single person to do.

ALSO, the point was to compare EDGE reviews to the industry standard, so high-profile games were chosen. Some games in the 100-80 range weren't even reviewed by EDGE. This is a hand-picked top tier of titles; the graphs show which titles these were.

The problem is, he gave no methodology for how the "top 30" were chosen. And even calling it a "top 30" is disingenuous; that's not even what it is.