
Forums - Nintendo - X Play - Tenchu 4 - 2 stars !!!!!!!!!!!!

Grampy said:

Why would the same criteria not apply to IGN? Or at least some other reviewer. If you really believe that being the absolute lowest score out of all the reviewers 5 out of 9 games and usually by a grotesque margin constitutes some rational system of judging ... well fine. What can I say, somewhere someone will believe almost anything. There are people that actually believe that George Bush will go down in history as a great president.

They use their inferior 5-point scale to its max too often.  Anyway, how can anyone possibly think that W was a great president?  He's the worst one we ever reelected.  I just watched a week-long special on the History Channel that went through every one, too.  Maybe the only one worse was Buchanan, because he presided over the Union splitting up, but we didn't reelect him...

 



Grampy said:
rajendra82 said:
Grampy said:

I decided to take a look at X-Play Scores versus the MetaCritic Average. I took their list of Latest Games. Out of the 35 games listed, X-Play reviewed 9. The table lists the MetaCritic Score, the IGN score, the X-Play Score, the difference between X-Play and MetaCritic and whether or not X-Play gave the lowest score of all reviews.

 

Game             MetaCritic   IGN   X-Play   Xp/Meta   Lowest
Mushroom Men             73    79       40       -33    yes
Rygar:BA                 54    61       20       -34    yes
Tenchu:SA                74    80       40       -34    yes
RRRabbids:TP             73    70       80        +7    -
Castlevania J            47    75       20       -27    -1
Skate It                 71    85       80        +8    -
Anim. Cross:CF           73    75       80        +7    -
TalesSymph:DNW           69    67       40       -29    yes
Average                  67    74       50       -17

The result wasn't quite what I expected, which was a constant bias. Not true; instead I found something I consider even worse. X-Play seems to treat Wii games in one of two ways. If they like a game, they give it a very reasonable, even slightly generous score. If they don't like a game, they stick it with an absurdly low score, about 30 points off the average and, with a single exception, the lowest score of all reviewers. In that single exception they were next to the lowest.

I think this is a deliberate effort to completely trash the MetaCritic average on any game they dislike. That really sucks. When I was a kid we called this torpedoing and even then we knew it was a lousy unfair thing to do.

THIS IS WORSE THAN BIAS BECAUSE YOU CAN'T FACTOR IN A CONSTANT ADJUSTMENT, YOU HAVE TO FIGURE OUT WHETHER THEY ARE ACTUALLY REVIEWING THE GAME OR JUST PLAIN F**KING IT OVER. 

I wonder if they do this to PS360 games as well. I leave that for someone else.

UPDATE: OK, I couldn't stand it, so I took a quick look at PS3 and Xbox 360 games for signs of the same pattern. Looking at all the low-scoring games (yellow or red), X-Play's reviews were consistent with or slightly higher than MetaCritic. Apparently this SCREW JOB is reserved exclusively for the Wii. Way to go, X-Play.

I understand why X-Play scores games the way they do.  The HD games are sometimes good looking and sometimes not, so they get noticed for that first, and then you find out other things about them, like gameplay.  The Wii games are generally plain looking, so the only thing to note is gameplay issues.  Therefore X-Play tends to break Wii games into two separate camps, good games (4 and 5 stars) and bad games (1 or 2 stars), based on what they think of the gameplay.  With HD games you can potentially have four camps: good looking and good gameplay (5 stars), good looking and bad gameplay (2 to 4 stars), bad looking but good gameplay (2 to 4 stars), and bad looking and bad gameplay (1 star).  Naturally the HD games fall more on a spectrum, whereas Wii games land in either the love-it or hate-it category.  X-Play's review format, being limited to a couple of minutes on TV, exaggerates the issues they notice and makes the scores vary more wildly than they would in a print or web format.

Why would the same criteria not apply to IGN? Or at least some other reviewer. If you really believe that being the absolute lowest score out of all the reviewers 5 out of 9 games and usually by a grotesque margin constitutes some rational system of judging ... well fine. What can I say, somewhere someone will believe almost anything. There are people that actually believe that George Bush will go down in history as a great president.

Here is a graph comparing the data in the post.  As you can see, when a game is not good, both IGN's and X-Play's ratings go down, and when it's good both go up, but IGN's don't swing way up and down like X-Play's do.  Both sites show a trend and a logic to scoring games.  It's just a different way of scoring between IGN and X-Play.  I don't see it as an intentional bias, but as an inherent result of two very different scoring systems.

 



Oh god, X-Play has a review scale that doesn't go from 6 to 9.

Quick, burn them!



I think why (some) people get so angry with G4 is how often they seem to be (dramatically) lower than the average review score, and how often their review doesn't (really) justify the score a game is given. Looking at their reviews on Gamerankings:

The following games were given a score at least 10% lower than the average review score:

Boogie (-38%)
Spider Man (-35%)
Rygar: The Battle of Argus (-33%)
Mushroom Men: The Spore Wars (-33%)
Tales of Symphonia: Dawn of the New World (-30%)
Castlevania Judgment (-30%)
Sonic and the Secret Rings (-29%)
Tony Hawk's Downhill Jam (-29%)
Call of Duty 3 (-29%)
Rayman Raving Rabbids 2 (-28%)
The Sims 2: Pets (-27%)
Avatar: The Last Airbender (-23%)
Tamagotchi Party On! (-22%)
SpongeBob SquarePants: Creature from the Krusty Krab (-20%)
Battle of the Bands (-19%)
Mario Strikers Charged (-19%)
Carnival Games (-18%)
Obscure: The Aftermath (-16%)
Elebits (-15%)
Metal Slug Anthology (-13%)
Ninja Reflex (-12%)                   
Alien Syndrome (-11%)
Baroque (-10%)
Kororinpa: Marble Mania (-10%)
MySims (-10%)

 

The following games were given a score at least 10% higher than the average review score:

Big Brain Academy: Wii Degree (+10% )
Boom Blox (+15%)
No More Heroes (+17% )
Cooking Mama: Cook Off (+18%)

 

Now, personally I think that part of this is that X-Play are simply tough reviewers and (unlike some reviewers) they tend to use the whole grading scale rather than focus on the top 40% of the scale. At the same time, I think they really need to hire better reviewers who can actually come up with a reasonable justification on why they gave a game a particular score.



From what I remember from my stats courses back in college, before compiling data into an average (like on Metacritic) you're supposed to remove a set of data points from the top and the bottom to ensure extremes won't skew your results.

I've always wondered why the hell Metacritic doesn't do this. Even taking out only the top and bottom 2 reviews would ensure a more accurate average for all games.
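In code, the trimming described here is just a sorted slice before averaging. A minimal sketch (the function name and the score list are made up for illustration, not real Metacritic data):

```python
def trimmed_mean(scores, k=2):
    """Average the scores after discarding the k lowest and k highest."""
    if len(scores) <= 2 * k:
        # Too few reviews to trim; fall back to a plain average.
        return sum(scores) / len(scores)
    kept = sorted(scores)[k:-k]  # drop k from each end
    return sum(kept) / len(kept)

# Hypothetical review list with one "torpedo" score at the end:
scores = [80, 74, 74, 73, 72, 70, 68, 40]
plain = sum(scores) / len(scores)      # 68.875 -- the 40 drags it down
trimmed = trimmed_mean(scores, k=2)    # 72.25  -- outliers on both ends removed
```

With the trim, a single extreme reviewer moves the average far less, which is exactly the point being argued.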





I don't think they played the same game I'm playing. Who sent them the wrong copy haha.



TruckOSaurus said:
From what I remember from my stats courses back in college, before compiling data into an average (like on Metacritic) you're supposed to remove a set of data points from the top and the bottom to ensure extremes won't skew your results.

I've always wondered why the hell Metacritic doesn't do this. Even taking out only the top and bottom 2 reviews would ensure a more accurate average for all games.

Those are called outliers and they're removed pretty much to avoid skewing the average.

Anyway, if Metacritic did remove the outliers then reviews would actively attempt to avoid being the outlier which leads to more dishonest reviewing.  Personally, I like seeing low reviews because it tells me that someone was totally unimpressed with a game.



Hey, if X-Play wants to review it badly, that's their business. It doesn't really matter to me if I disagree; they're just conflicting opinions. Although it sucks that their opinion is going to be taken as more valid than mine, when I highly doubt there's much of a difference. However, I guess that's just my arrogance.



The MC % dropped 5% after that review was added.



“When we make some new announcement and if there is no positive initial reaction from the market, I try to think of it as a good sign because that can be interpreted as people reacting to something groundbreaking. ...if the employees were always minding themselves to do whatever the market is requiring at any moment, and if they were always focusing on something we can sell right now for the short term, it would be very limiting. We are trying to think outside the box.” - Satoru Iwata - This is why corporate multinationals will never truly understand, or risk doing, what Nintendo does.

Words Of Wisdom said:
TruckOSaurus said:
From what I remember from my stats courses back in college, before compiling data into an average (like on Metacritic) you're supposed to remove a set of data points from the top and the bottom to ensure extremes won't skew your results.

I've always wondered why the hell Metacritic doesn't do this. Even taking out only the top and bottom 2 reviews would ensure a more accurate average for all games.

Those are called outliers and they're removed pretty much to avoid skewing the average.

Anyway, if Metacritic did remove the outliers then reviews would actively attempt to avoid being the outlier which leads to more dishonest reviewing.  Personally, I like seeing low reviews because it tells me that someone was totally unimpressed with a game.

I wouldn't want to hide the outlier reviews, just not include them in the average. That way reviewers wouldn't try to adapt their reviews to avoid being the outlier, nor would they try to score a game as low as possible just because they want to see the game's average drop like a rock.

 


