RolStoppable said:
Cerebralbore101 said:

"I want the recomendations of my peers, people that have played at least five good games a year for every year they were alive past five years old. People that actively seek out good games, in the same way that a foodie actively seeks out good food. People that aren't limited to a single platform, or a few genres for their experiences."

I don't think the average user on a storefront has these qualifications, though, and I think their judgement differs greatly from that of somebody who does.

Inexperienced users wouldn't need to all think the same way to affect the outcome of the score. If the average score for a game from users without these qualifications is 6/10, and the average score from users with these qualifications is 8/10, then we have a problem. Keep in mind that you wouldn't need every inexperienced user to rate it at exactly 6/10 in order to get an average of 6/10 from them. Some of them could rate it at 9/10, some at 2/10, and some at 4/10. If the average review among them is 6/10, then the score still gets affected in the end. The same thing goes for the qualified users. 
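
To put rough numbers on that (the scores below are made up purely for illustration), here's a quick Python sketch of the point: the individual ratings can be scattered all over the place, but what moves the final storefront number is each group's average and how many people are in each group.

```python
# Made-up ratings, just to illustrate the averaging argument.
casual_ratings    = [9, 2, 4, 6, 7, 5, 8, 6, 4, 9]   # scattered, but averages to 6.0
qualified_ratings = [8, 7, 9, 8]                      # averages to 8.0

def average(scores):
    return sum(scores) / len(scores)

combined = casual_ratings + qualified_ratings
print(average(casual_ratings))     # 6.0
print(average(qualified_ratings))  # 8.0
print(average(combined))           # ~6.57 -- the larger group drags the score toward 6
```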

What I'm trying to say here is that I only care about the opinions of the top 10% of gamers. People like you and me. People that have a passion for the medium that goes beyond just casually playing whatever seems popular. People that actively go out of their way to play old classics, so that they can have a better perspective on the industry as a whole. 

I know that sounds really elitist though. 

It sounds elitist because it is elitist.

You could browse through the 3DS eShop to see how ratings work in practice. Randomly pick games that you really like and see if you find any that are below 4 stars; or pick lame games and see how they have scored; or pick games you are unsure about. How big and varied your sample is will depend on how much time you are willing to invest, but if you write down the games and their average ratings, you should find that, as a rule, the ratings are in the right area. Exceptions to the rule tend to have a low number of ratings, which explains why the average is skewed.
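
That last point about low rating counts is easy to sanity check. Here's a quick simulation (the star distribution is assumed, not real eShop data) of a game whose ratings average out around 4.2 stars: with only a handful of ratings the displayed average can look much better or worse than that, while a big rating count settles close to it.

```python
# Assumed star distribution for a hypothetical game, just for illustration.
import random

random.seed(1)
STAR_POOL = [5]*50 + [4]*30 + [3]*12 + [2]*5 + [1]*3   # underlying average ~4.19 stars

def sample_average(n):
    """Average of n randomly drawn ratings from the assumed distribution."""
    return sum(random.choice(STAR_POOL) for _ in range(n)) / n

for n in (5, 20, 200, 2000):
    print(n, round(sample_average(n), 2))
# Small samples (5 or 20 ratings) can swing well above or below ~4.2;
# large ones settle near it, which is why the games whose averages look
# out of place tend to be the ones with very few ratings.
```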

I was originally going to find five games that I liked and five games that I didn't like and see if the user reviews matched up with what I thought of each game. But I rarely play a game I don't like, because I almost never pick up a game unless it has good review scores. Then I thought of checking user reviews of games that I know aren't very good, like Sticker Star, Federation Force, etc. But I only know those games are bad based on YouTube channels like Arlo, and on professional review articles. And that isn't giving those games a fair shake at all. It's also circular logic...

"How do I know user reviews aren't trustworthy? Because they say that bad games are good! How do I know those are bad games? Because metacritic/youtube says so! Therefore metacritic is more trustworthy than user reviews!" - Circular and unfair logic in favor of my own ideas. 

So instead of doing that, I'm going to find five games that got a mediocre reception from professional reviews (below 78 on Metacritic/OpenCritic), but have glowing eShop/PSN/Steam user reviews (4/5 stars, or 80% recommended or above). Then I'm going to play those games for at least twenty hours each and rate them based on how much I like them. I'll post my findings by the end of the year, and I'll find five more games every year after that just to add to the sample size. I'll also include games that I've always wanted to play, but have avoided due to their Metacritic/OpenCritic scores. That way I can counter any preconceived bias that Metacritic has given me with my own eagerness for the game to be good.