
Forums - Gaming - The Official Skyward Sword Thread!!


Boutros said:
Khuutra said:
Boutros said:
I'm thinking it will settle at 94. Though 94 would make it the worst reviewed main Zelda game rofl (even below Majora's Mask O.o)


Not exactly: http://www.gamerankings.com/browse.html?site=&cat=0&year=0&numrev=1&sort=0&letter=&search=zelda

I think it will settle out mid-95, but we won't be sure until about mid-December.

I don't really like the way gamerankings aggregates reviews. Metacritic stops adding reviews for a game after a certain period, whilst gamerankings keeps adding them. I find Metacritic's way more accurate, as it gives a better idea of how the game was received when it came out.

For example, someone gave the game 6/10 in 2009 and it was added on gamerankings but not on metacritic. I don't think a review from 2009 can judge the game as accurately.

I'm seeing Link To The Past as being around 92%. 

All credibility is lost.



 

Here lies the dearly departed Nintendomination Thread.

RolStoppable said:
Conegamer said:

I'm seeing Link To The Past as being around 92%. 

All credibility is lost.

As if Metacritic and Gamerankings were credible in the first place...

For one, review standards have changed over the years and nowadays games score several points higher than they would have a decade ago. The score inflation is omnipresent. And two, everything before the fifth generation isn't really accounted for which means that a big chunk of gaming history is missing. So the best games of all time you see on Metacritic are really only the best games of the last 15 years, but even that is a blurred picture due to the score inflation.

If the current review standards were in place when Super Mario Bros. 3 was released, it would have gotten a perfect Metascore. That game was so far ahead of everything else at the time.

True. After GTA IV I can't believe anything from any reviewer...

...as for the rest of your post? Score inflation may be present, but I certainly don't think it's as bad now as it was. Take games like Xenoblade, which were scored fairly (if lower than I reckon they deserved), and games like Super Mario Galaxy 2, this game and Uncharted 3, which, despite being better (?) than their predecessors, didn't score that way. That could be because the games aren't as 'innovative', or maybe it's because reviewers aren't being as lenient.

All in all, I doubt we'll see another GTA IV, Red Steel or Wii Play moment, especially this late in the gen. The 'wow' of HD and motion control has disappeared, and mediocre/poor games are being scored accordingly.

Finally, yes, the first 10-15 years of gaming are a bit hazy in terms of reviews, but that's because gaming has only really taken off in the media in the past 5-10 years, and so more outlets (the Guardian, for example) have expanded into the gaming sector. So there are more people with more games and more opinions. Surely... it's at least possible that reviews are becoming more reliable?

 

(Also, I don't agree with Metacritic, but it would be MUCH better if they used an outlier rule and removed the top 2 and bottom 2 scores for each game, or the top X and bottom X where there are 10X reviews for the game, if that makes sense. That way, troll reviews and ecstatic fanboy reviews would be ignored, and a fairer representation of the game's quality would be present.)
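That outlier rule is just a trimmed mean, and it's easy to sketch. The scores below and the 10% trim fraction are made up purely for illustration:

```python
def trimmed_mean(scores, frac=0.1):
    """Drop the top and bottom `frac` of scores, then average the rest.
    With 10*X reviews this removes the X highest and X lowest, as proposed."""
    s = sorted(scores)
    k = int(len(s) * frac)              # how many to trim from each end
    trimmed = s[k:len(s) - k] if k else s
    return sum(trimmed) / len(trimmed)

# ten hypothetical review scores: one troll (40), one ecstatic outlier (100)
scores = [40, 85, 88, 90, 90, 91, 92, 93, 95, 100]
print(round(sum(scores) / len(scores), 1))  # plain mean: 86.4
print(round(trimmed_mean(scores), 1))       # trimmed mean, 40 and 100 dropped: 90.5
```

One dragged-down average versus one that ignores the extremes, which is exactly the effect being asked for.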



 


Boutros said:

I don't really like the way gamerankings aggregates reviews. Metacritic stops adding reviews for a game after a certain period, whilst gamerankings keeps adding them. I find Metacritic's way more accurate, as it gives a better idea of how the game was received when it came out.

For example someone gave the game 6/10 in 2009 and it was added on gamerankings but not on metacritic. I don't think someone from 2009 can judge the game as accurately.


That's fair, I suppose, but you also have to consider that gamerankings is presenting the actual mean average value of the reviews, whereas Metacritic gives a higher weight to some reviewers as compared to others. No one knows which reviewers get more weight versus less, or how big that influence is.

I don't care about review aggregates much, but if I had to choose one it would be gamerankings, if only because they present a more accurate snapshot of the average review.
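The difference between the two approaches can be sketched in a few lines. The scores and weights below are entirely hypothetical; Metacritic's real weights aren't public, which is the whole complaint:

```python
def plain_mean(reviews):
    """GameRankings-style: straight average of the scores."""
    return sum(score for score, _ in reviews) / len(reviews)

def weighted_mean(reviews):
    """Metacritic-style: each score multiplied by an (undisclosed) weight."""
    total_weight = sum(w for _, w in reviews)
    return sum(score * w for score, w in reviews) / total_weight

# (score, hypothetical weight): two "big" outlets at weight 3, one small at 1
reviews = [(95, 3.0), (90, 3.0), (60, 1.0)]
print(round(plain_mean(reviews), 1))     # 81.7 — the 60 counts fully
print(round(weighted_mean(reviews), 1))  # 87.9 — the 60 is largely drowned out
```

Same three reviews, a six-point swing in the aggregate, and no way for a reader to tell which number is "right" without knowing the weights.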



Khuutra said:
Boutros said:

I don't really like the way gamerankings aggregates reviews. Metacritic stops adding reviews for a game after a certain period, whilst gamerankings keeps adding them. I find Metacritic's way more accurate, as it gives a better idea of how the game was received when it came out.

For example someone gave the game 6/10 in 2009 and it was added on gamerankings but not on metacritic. I don't think someone from 2009 can judge the game as accurately.


That's fair, I suppose, but you also have to consider that gamerankings is presenting the actual mean average value of the reviews, whereas Metacritic gives a higher weight to some reviewers as compared to others. No one knows which reviewers get more weight versus less, or how big that influence is.

I don't care about review aggregates much, but if I had to choose one it would be gamerankings, if only because they present a more accurate snapshot of the average review.

Don't forget about some strange conversion ratings (B+ is 84, 4/5 is 80 etc.)

The weighting is good for the most part, however. 



 


Conegamer said:
Khuutra said:


That's fair, I suppose, but you also have to consider that gamerankings is presenting the actual mean average value of the reviews, whereas Metacritic gives a higher weight to some reviewers as compared to others. No one knows which reviewers get more weight versus less, or how big that influence is.

I don't care about review aggregates much, but if I had to choose one it would be gamerankings, if only because they present a more accurate snapshot of the average review.

Don't forget about some strange conversion ratings (B+ is 84, 4/5 is 80 etc.)

The weighting is good for the most part, however. 

Why, pray tell?



Khuutra said:
Conegamer said:
Khuutra said:


That's fair, I suppose, but you also have to consider that gamerankings is presenting the actual mean average value of the reviews, whereas Metacritic gives a higher weight to some reviewers as compared to others. No one knows which reviewers get more weight versus less, or how big that influence is.

I don't care about review aggregates much, but if I had to choose one it would be gamerankings, if only because they present a more accurate snapshot of the average review.

Don't forget about some strange conversion ratings (B+ is 84, 4/5 is 80 etc.)

The weighting is good for the most part, however. 

Why, pray tell?

Because it stops completely biased reviews from the smaller sites (like the AV Club Uncharted review), and insanely high reviews from smaller sites inexperienced in reviewing, from skewing the score too much. On the other hand, reviews from sites like ONM, Edge, IGN, Destructoid etc., which do a lot more reviews, have a lot more respect and often land a lot nearer to the game's actual Metascore (and its actual quality), get a higher weighting.

It's a strange system and one I'm uncomfortable with, but I see why they do it.



 


Conegamer said:
Khuutra said:

Why, pray tell?

Because it stops completely biased reviews from the smaller sites (like the AV Club Uncharted review), and insanely high reviews from smaller sites inexperienced in reviewing, from skewing the score too much. On the other hand, reviews from sites like ONM, Edge, IGN, Destructoid etc., which do a lot more reviews, have a lot more respect and often land a lot nearer to the game's actual Metascore (and its actual quality), get a higher weighting.

It's a strange system and one I'm uncomfortable with, but I see why they do it.

Except that you don't actually know who's getting more weight, here. There's no transparency to the system. We don't even know if the weight is the same for sites between different reviews.

I don't hold one reviewer's set of opinions as being inherently worth more than others'. IGN's been around for about 15 years or so now (probably longer), but they still employ some of the worst reviewers in the industry. Giving them higher weight because of their age or popularity doesn't add to the veracity of the Metascore; it subtracts from it.

Keep in mind that the age of a publication has almost nothing to do with the experience of its reviewers, though it often has something to do with the experience of its editorial staff.

No, I prefer a simple mean average, unweighted, so I can see how the game actually reviewed. If I want to just look at a few sites, I'll just look at a few sites. If I just want to average a few sites, I'll do that. If I want to weight a few sites, as we do in our heads by giving more prominence to certain publications (Edge and Eurogamer for me, though I still disagree with them a lot), then I'll do that.

Review aggregates should leave the averages alone, or be transparent in how they weight them.



Conegamer said:
Khuutra said:
Boutros said:

I don't really like the way gamerankings aggregates reviews. Metacritic stops adding reviews for a game after a certain period, whilst gamerankings keeps adding them. I find Metacritic's way more accurate, as it gives a better idea of how the game was received when it came out.

For example someone gave the game 6/10 in 2009 and it was added on gamerankings but not on metacritic. I don't think someone from 2009 can judge the game as accurately.


That's fair, I suppose, but you also have to consider that gamerankings is presenting the actual mean average value of the reviews, whereas Metacritic gives a higher weight to some reviewers as compared to others. No one knows which reviewers get more weight versus less, or how big that influence is.

I don't care about review aggregates much, but if I had to choose one it would be gamerankings, if only because they present a more accurate snapshot of the average review.

Don't forget about some strange conversion ratings (B+ is 84, 4/5 is 80 etc.)

The weighting is good for the most part, however. 

That should be 90? 



 

Hey guys, great news for Skyward Sword. Another perfect score, from Meristation (10/10), the most famous Spanish video games magazine. Here is the link:

http://www.meristation.com/v3/des_analisis.php?pic=WII&id=cw4ec0266b39794&idj=cw4c17b4757bdea&idp=&tipo=art&c=1&pos=0



Lostplanet22 said:
Conegamer said:

Don't forget about some strange conversion ratings (B+ is 84, 4/5 is 80 etc.)

The weighting is good for the most part, however. 

That should be 90? 


I assume you are questioning Conegamer's maths, not correcting them (4/5 of 100 is 80).

The problem isn't the maths, but the fact that converting a 6-point scoring system (0-5) to a 101-point one (0-100) only works if the reviewers using the 6-point system follow similar rules to those using 101-point systems. I.e., if 75/100, or 7.5, is generally considered an average game score, then the 6-point system has only 2 possible scores for above-average games, but 4 for below average. This is clearly ridiculous, especially given that generally a lot of rubbish games don't actually get reviewed... so most of the publication's reviews would then fall under just 2 scores.

I would say reviewers that use a 5- or 6-point system will at minimum treat 3 as the benchmark for 'average' (which I hope is what reviewers think of a game you can enjoy, but that may not be worth your money unless you are a fan of the series/genre or whatever). Personally it would make more sense to me to use 2 as 'average'... that way you can use 1 to call a game dull or poor, and 0 to say 'stay away, don't even look at the game box'.

As such, 6 point systems don't gel with the 100 point systems, and shouldn't be combined.
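The squeeze is easy to demonstrate with the straight linear conversion the aggregators appear to use (4/5 → 80). The 75/100 "average" threshold below is the one assumed in the post above, not an official Metacritic figure:

```python
def linear_convert(score, scale_max):
    """Straight linear mapping onto 0-100, e.g. 4 on a 0-5 scale -> 80."""
    return 100 * score / scale_max

AVERAGE = 75  # the assumed "average game" threshold on the 100-point scale

# On a 0-5 scale, see which whole-number scores land above/below that line:
above = [s for s in range(6) if linear_convert(s, 5) > AVERAGE]
below = [s for s in range(6) if linear_convert(s, 5) < AVERAGE]
print(above)  # [4, 5] — only two possible above-average scores
print(below)  # [0, 1, 2, 3] — four below
```

So a publication on a 0-5 scale that mostly reviews decent games really does get funnelled into just two converted scores, which is the distortion being described.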