
Forums - Gaming Discussion - Edge vs. Metacritic (Bias Confirmed!!!)

cura said:
jarrod said:
Inherently flawed study, as it's only rating the top 30 games on each (plus multi). You can't claim institutional bias when you're looking at ~5% of the games on these platforms, it's borderline irresponsible. "Journalistic integrity" indeed, lol.

It's not a flawed study. Including only the top 30 games limits the generality of the results, but that doesn't make it flawed.

I really wish people like you would stop making outlandish claims like 'inherently flawed study' when your own logic is just as flawed.

Clearly you can't generalize to games that weren't included, but the results for the games that were included speak for themselves: among the games analysed, Edge rated PS3 exclusives harsher than 360 exclusives when compared to the Metacritic average for each game.

Of course the results only apply to the top 30 games or whatever it was, but so what? People generally only care about those games in the first place...

 

 

Read it again: it's not even the top 30 games in each category. It's 30 games that scored somewhere between 70 and 100 on Metacritic, which he somehow decided to use.

 



ElRhodeo said:
Hm, another possible explanation is that Edge isn't biased, they just thought the PS3 games were worse than the 360 games. Shock!

That's called a 'result', something that comes AFTER examining / testing.
'Bias' means judging BEFORE you tried it.

You know, according to your (very popular) definition of 'bias', any comparison that doesn't result in "all contestants are equally good" is biased. That's pointless.

Yup, maybe no one in the whole world is biased, and everybody farts sugar and rainbows, Dorothy. (Sorry, not sure where that came from :) )

 

The point is that if any game magazine could be biased in some way, you would show it by looking at their track record in a fair and accurate way. You probably shouldn't see a ten-point average divergence if the sample is big enough, for example. Of course, others may insist that a twenty-point difference is where people should start scratching their heads. But surely it's imaginable that, in an extreme enough case with a big enough sample size, anyone would have to admit something weird was going on. Imagine if one specific magazine rated every Wii game 50 points below everyone else's average. (And no, I'm not talking about IGN.)
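The "track record" comparison described above can be sketched in a few lines. This is a minimal example with invented scores (none of these numbers come from the study in the thread); it just shows how a per-game divergence from the consensus average would be computed:

```python
# Hypothetical scores on a 0-100 scale: (magazine score, aggregate average).
# All numbers below are invented for illustration.
games = {
    "Game A": (70, 85),
    "Game B": (60, 78),
    "Game C": (90, 88),
}

# Per-game divergence: how far this magazine sits from the consensus.
divergences = [mag - avg for mag, avg in games.values()]
mean_divergence = sum(divergences) / len(divergences)

# A negative mean means the magazine scores below the consensus on average.
print(f"mean divergence: {mean_divergence:+.1f}")
```

With a big enough list of games, a mean divergence far from zero is the kind of "something weird" the post is talking about; with only a handful of games it could easily be noise.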



 

ElRhodeo said:
Hm, another possible explanation is that Edge isn't biased, they just thought the PS3 games were worse than the 360 games. Shock!

That's called a 'result', something that comes AFTER examining / testing.
'Bias' means judging BEFORE you tried it.

You know, according to your (very popular) definition of 'bias', any comparison that doesn't result in "all contestants are equally good" is biased. That's pointless.

 

Thanks for pointing out exactly what is wrong with the majority of these bias claims. We are to assume that all PS3 and Xbox 360 exclusives are of equal quality ... unlikely.

 

In a way it is no surprise 360 games would be reviewed higher since, for the most part, 360 exclusives are safe games in very popular genres like shooters (the Halo, Left 4 Dead, and Gears of War series), while PS3 games have been a bit more high-risk / experimental recently (Heavy Rain, MAG, White Knight Chronicles, Demon's Souls).

I just don't see the bias even with the questionable graphs. It would be far more suspicious to me if both PS3 and 360 console exclusives were given nearly the same scores. If they just handed out 9's and 10's to every game regardless of quality then they would just be Game Informer. 

 

And before I am accused of anything, I own both consoles and play on both and blah blah blah. 



DarthVolod said:
ElRhodeo said:
Hm, another possible explanation is that Edge isn't biased, they just thought the PS3 games were worse than the 360 games. Shock!

That's called a 'result', something that comes AFTER examining / testing.
'Bias' means judging BEFORE you tried it.

You know, according to your (very popular) definition of 'bias', any comparison that doesn't result in "all contestants are equally good" is biased. That's pointless.

 

Thanks for pointing out exactly what is wrong with the majority of these bias claims. We are to assume that all PS3 and Xbox 360 exclusives are of equal quality ... unlikely.

 

In a way it is no surprise 360 games would be reviewed higher since, for the most part, 360 exclusives are safe games in very popular genres like shooters (the Halo, Left 4 Dead, and Gears of War series), while PS3 games have been a bit more high-risk / experimental recently (Heavy Rain, MAG, White Knight Chronicles, Demon's Souls).

I just don't see the bias even with the questionable graphs. It would be far more suspicious to me if both PS3 and 360 console exclusives were given nearly the same scores. If they just handed out 9's and 10's to every game regardless of quality then they would just be Game Informer. 

 

And before I am accused of anything, I own both consoles and play on both and blah blah blah. 

If you think about it, that doesn't make any sense.



 

That is the most retarded bit of research I've ever seen.

No offence to the OP but you must see how many errors are introduced by the study method.

Restricting any sample to a portion of the total compromises any study.

The study proved a few things, but it DOESN'T prove the startlingly bold final conclusion.




6.43 and 6.26 (mean Edge scores for all 360 and PS3 games respectively)



DarthVolod said:
ElRhodeo said:
Hm, another possible explanation is that Edge isn't biased, they just thought the PS3 games were worse than the 360 games. Shock!

That's called a 'result', something that comes AFTER examining / testing.
'Bias' means judging BEFORE you tried it.

You know, according to your (very popular) definition of 'bias', any comparison that doesn't result in "all contestants are equally good" is biased. That's pointless.

 

Thanks for pointing out exactly what is wrong with the majority of these bias claims. We are to assume that all PS3 and Xbox 360 exclusives are of equal quality ... unlikely.

 

In a way it is no surprise 360 games would be reviewed higher since, for the most part, 360 exclusives are safe games in very popular genres like shooters (the Halo, Left 4 Dead, and Gears of War series), while PS3 games have been a bit more high-risk / experimental recently (Heavy Rain, MAG, White Knight Chronicles, Demon's Souls).

I just don't see the bias even with the questionable graphs. It would be far more suspicious to me if both PS3 and 360 console exclusives were given nearly the same scores. If they just handed out 9's and 10's to every game regardless of quality then they would just be Game Informer. 

 

And before I am accused of anything, I own both consoles and play on both and blah blah blah. 

Ehhhh, what? MAG isn't part of a popular genre?

Anyway, this is Edge magazine, whose main excuse is that innovation is a huge plus for them. So why would they give shooters high scores when those do nothing new at all, while Sony's new IPs, which as you said offer something different, get hammered? Even the shooters on PS3 get low reviews for not doing anything new, yet the same doesn't apply to Left 4 Dead, Modern Warfare, or Halo? Give me a break. There, see, I've proved them biased yet again.



hsrob said:
6.43 and 6.26 (mean Edge scores for all 360 and PS3 games respectively)

The whole point is comparing the scores to their Metacritic (or GameRankings) averages. The question people are trying to answer is "how much lower or higher does this magazine review each console's games compared to the overall reviews they get?"

Not that this article answers the question very well...  I don't think it does, anyway.
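The comparison hsrob describes amounts to a per-platform average of those per-game gaps. A minimal sketch, with invented scores standing in for real Edge and Metacritic data:

```python
# For each platform, average the per-game gap between the magazine's score
# and that game's Metacritic average. All scores are invented (0-100 scale).
ps3 = [(60, 82), (50, 74), (70, 80)]    # (magazine, Metacritic) per PS3 exclusive
x360 = [(80, 84), (90, 92), (70, 73)]   # same for 360 exclusives

def mean_gap(pairs):
    """Average (magazine - aggregate) difference across a list of games."""
    return sum(mag - agg for mag, agg in pairs) / len(pairs)

# A large negative gap for one platform only is what the bias claim rests on;
# similar gaps for both platforms would just mean the magazine scores low overall.
print(f"PS3 exclusives: {mean_gap(ps3):+.1f}")
print(f"360 exclusives: {mean_gap(x360):+.1f}")
```

Note this says nothing about whether the gap is statistically meaningful; with small samples it could be chance or genre mix, which is exactly the objection raised elsewhere in the thread.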



 

ElRhodeo said:
Hm, another possible explanation is that Edge isn't biased, they just thought the PS3 games were worse than the 360 games. Shock!

That's called a 'result', something that comes AFTER examining / testing.
'Bias' means judging BEFORE you tried it.

You know, according to your (very popular) definition of 'bias', any comparison that doesn't result in "all contestants are equally good" is biased. That's pointless.

Sorry, but your explanation is complete BS.

The point is that Edge rates PS3 exclusives with a bigger difference than Xbox exclusives COMPARED to the industry standard. It's like saying that Xbox games are better because Edge "agrees" with the whole industry, but PS3 games are worse because Edge is "off" the industry standard. That makes absolutely no sense whatsoever.

Also, bias is universal; no one and nothing is completely unbiased, because bias is part of an underlying social structure affected by many factors which can't be empirically studied. There is a whole field, called structuralism, which tries to uncover these non-visible functions. No one is unbiased, therefore the question is:

How biased are you?



MY HYPE LIST: 1) Gran Turismo 5; 2) Civilization V; 3) Starcraft II; 4) The Last Guardian; 5) Metal Gear Solid: Rising

Cypher1980 said:
That is the most retarded bit of research I've ever seen.

No offence to the OP but you must see how many errors are introduced by the study method.

Restricting any sample to a portion of the total compromises any study.


The study proved a few things, but it DOESN'T prove the startlingly bold final conclusion.

Way too harsh.

Most studies are not exhaustive; they work on a sample of the population, sometimes a tiny one. Medicines are not tested on each and every person before they are deemed beneficial, and the numbers in their trials are not many orders of magnitude bigger than these. Particle events detected at the LHC are a fraction of those that really happen. The popularity of governments and leaders is assessed weekly by interviewing a few thousand people out of hundreds of millions.

In this case, take it for what it is. A sample of the top 90 games is somewhat restricted, but between them they probably account for a big slice of overall software sales, and as such they carry far more weight in mind share than the myriad of lesser titles.

PS: It seems the criteria for selecting the 30 titles were not objectively and rigidly followed, but the handful of "counter-examples" I saw won't budge the resulting numbers by any appreciable amount.



"All you need in life is ignorance and confidence; then success is sure." - Mark Twain

"..." - Gordon Freeman