
A rant. Greedy lootboxes, and a better way to do user reviews.

For real though. There needs to be a new term coined for this B.S. Games get released these days with Silver, Gold, and Digital Deluxe editions, while still having a season pass, separate DLC, and lootboxes. The entire notion of a digital deluxe edition is laughable. It's like saying we made a collector's edition of the game, but it's all digital. What? I pay extra for the statue, the artbook, and the steel case. You expect me to pay $40 more just to get content that's missing from the base game?

Considering that I can pick up any $60 game in a physical edition for $42 on day one from Best Buy or Amazon, the digital version of a game should cost around $45, not $60. And as the game gets older, the price for a digital copy should go down. Why are years-old games still listed at $60 on digital storefronts? If I can't find the game in a brick-and-mortar store, do you really think I'm going to pay through the nose for a digital version instead of finding a used copy? This means that even if I wanted to support the developers by buying a game new, they're pricing it so high that I'd be a fool not to pick the game up used. It's either that or wait for a sale. But I want to play the game NOW, not in a year's time when it finally goes on sale for a reasonable price. So I might as well buy it used if I can't find it new on store shelves.

The more AAA games that come out in this manner and still get good Metacritic scores, the more I lose faith in the entire video game review process. My only hope is that maybe, just maybe, these games are so damned good that if they had been released as complete editions they would have gotten way higher scores. Maybe Shadow of War is so damned good that if it had dropped all the pricing schemes it would have been hailed as the best open world game of the year. Maybe Forza 7 is the greatest racing game of all time, and lootboxes dragged the score down a lot.

For most games I don't really need to look at the review scores to know whether or not they're good. If I know the developer's past games, or if the game is a sequel, I can already predict how good it is. I'm not the only one with this ability, since there are Metacritic prediction threads floating around. But at the same time, there are tons of games that I discovered through Metacritic. At least a quarter of my collection is games that I never would have thought to buy, if not for the spotlight that a good metascore puts on them.

But I think Metacritic is starting to really lose its relevance. More and more review outlets just aren't qualified to review a game. This is why I currently use OpenCritic, because I can block any sites or writers that I feel are unqualified from affecting the final score. Why don't I just use user reviews then? Well, most users are complete idiots, or simply don't have enough experience with games to recommend something to somebody of my tastes. I want the recommendations of my peers, people that have played at least five good games a year for every year they've been alive past age five. People that actively seek out good games, in the same way that a foodie actively seeks out good food. People that aren't limited to a single platform, or a few genres, for their experiences. For the most part professional reviewers provide this service, but a lot of them are either cowards, or flat out unqualified to review whatever they're playing in the first place. Just look at how many reviewers IGN has on staff. They change reviewers so fast that it's a revolving door over there. They'll hire anybody with good writing skills and a passing interest in video games.

There really needs to be a site that takes your PSN trophy count, Xbox Live gamerscore, and Steam achievements, and uses them to build a profile of experience for each user. Users should then be able to rate games that they have actually played on a scale of one to ten. To add value to the scores, each user should be limited in how many scores of ten and zero they can hand out in a given time frame. Users should also be able to set requirements for other users to be qualified to rate a game. Did Timmy rate that new JRPG a 10/10, but Timmy has only played two JRPGs in his entire life? Well then, that rating won't show up on the review aggregate for Spike, who has played twenty JRPGs. Why? Because Spike set his settings to "only allow reviews for JRPGs if the reviewer has played at least five of them". Meanwhile another user, Johnny, might not care how many games somebody has played, so Timmy's review affects the score that Johnny sees.
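To make the Timmy/Spike idea concrete, here's a rough Python sketch of how such a site could filter scores per viewer. Every name, field, and number here is made up for illustration; no real site works this way:

```python
# Hypothetical sketch: each review carries the reviewer's experience in the
# game's genre, and each viewer sets a minimum-experience threshold per genre.

def aggregate_score(reviews, viewer_thresholds):
    """Average only the reviews whose authors meet this viewer's bar."""
    eligible = [
        r["score"]
        for r in reviews
        if r["genre_games_played"] >= viewer_thresholds.get(r["genre"], 0)
    ]
    return sum(eligible) / len(eligible) if eligible else None

reviews = [
    {"score": 10, "genre": "JRPG", "genre_games_played": 2},   # Timmy
    {"score": 7,  "genre": "JRPG", "genre_games_played": 20},  # a veteran
]

# Spike requires at least five JRPGs played, so Timmy's 10/10 is dropped.
print(aggregate_score(reviews, {"JRPG": 5}))  # 7.0
# Johnny sets no threshold, so both reviews count.
print(aggregate_score(reviews, {}))           # 8.5
```

The point is that the same pool of reviews produces a different aggregate for each viewer, depending on the bar they set.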

Finally, people should be made to rate a game based on separate categories like gameplay, graphics, music, and longevity. GameSpot used to do reviews like this, and it was a very good system. It forced people to stop and think about each aspect of the game individually, and take their own personal likes and dislikes out of the equation for a moment (or at least to an extent). Ultimately, whether or not you like something is entirely subjective. Whether or not you liked the controls is just as subjective as whether or not you liked the game in general. But being forced to judge the controls separately from the rest of the game makes for a better review. Then, after each category has been evaluated, a final score is tallied from the average of all the category ratings.
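The tallying step is just an average over the categories. A tiny sketch, with invented category names and ratings:

```python
# The final score is the mean of the per-category ratings.
def final_score(ratings):
    return sum(ratings.values()) / len(ratings)

ratings = {"gameplay": 9, "graphics": 7, "music": 8, "longevity": 6}
print(final_score(ratings))  # 7.5
```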

TL/DR: Game developers are greedy, Metacritic is losing its credibility, and we need a better way to do user reviews, so that professional reviewers can be rendered obsolete.




"But I think Metacritic is starting to really lose its relevance. More and more review outlets just aren't qualified to review a game. This is why I currently use OpenCritic, because I can block any sites or writers that I feel are unqualified from affecting the final score."

Confirmation bias?



RolStoppable said:
Sounds like overcomplicating things way too much. If you want to know how good Forza 7 is, look up its rating on the Xbox store where you know that everyone paid for the game before it was rated.

That still lets plebeians have equal say to people with taste, though. Little Timmy with four games under his belt has just as much say as Spike with hundreds of games under his belt.



AngryLittleAlchemist said:
"But I think Metacritic is starting to really lose its relevance. More and more review outlets just aren't qualified to review a game. This is why I currently use OpenCritic, because I can block any sites or writers that I feel are unqualified from affecting the final score."

Confirmation bias?

Perhaps. Most of the reviewers that I block on OpenCritic are news organizations (Metro, Time Magazine, WaPo), sites with a company's name in the title (Nintendo Life, PlayStation Universe), or sites that have had a serious review scandal in the past (GameSpot).



Mar1217 said:

Did you watch the latest Jim Sterling video when you got the idea for this rant?

It's difficult not to agree with his major points.

It fueled a lot of it. 



RolStoppable said:
Cerebralbore101 said:

That still lets plebeians have equal say to people with taste, though. Little Timmy with four games under his belt has just as much say as Spike with hundreds of games under his belt.

That's why I said Forza 7.

https://www.microsoft.com/en-US/store/top-paid/games/xbox

2.5 out of 5 stars.

Metacritic: 87
Metacritic user score: 62
OpenCritic: 85

Which one of the above four do you think is the most trustworthy source?

The 2.5 stars would be the most trustworthy of all of them. But at the same time, I don't think Forza 7 is bad enough to warrant a 2.5/5. But then again, maybe it is that bad. I haven't played it. Even if somebody bought Forza 7 and played it, that doesn't stop them from posting a 0-star review out of spite. People that leave reviews like "It had stairs in it and I hate stairs 0/10 worst game evar!", or "11/10 because it had aliens in it and I like aliens. U must buy!" shouldn't have as much say as people that write a serious, well-thought-out review with a reasonable score attached. This is why I want a site that rations out extremely high or low scores, so that people have to think carefully about when they pass them out.



RolStoppable said:
Cerebralbore101 said:

The 2.5 stars would be the most trustworthy of all of them. But at the same time, I don't think Forza 7 is bad enough to warrant a 2.5/5. But then again, maybe it is that bad. I haven't played it.

What you get from people who have paid for a game is an answer to the important question of whether the game was worth buying to them. You also know that it is a closed ecosystem, and the likelihood that people bought the game just for an opportunity to troll it with a low score is minuscule. An average of 2.5 stars necessitates a high percentage of ratings that were either 1 or 2 stars, a score that people give when they are disappointed or displeased with their purchase. You can't see why people gave a low rating, but you can assume that they had actual reasons, unlike the user scores on Metacritic.

The simple rating system on storefronts generally gives you a good idea. Games worth buying usually end up with 4, 4.5, or 5 stars. A rating of 3.5 stars is acceptable, because it still means that more people were satisfied with the purchase than deemed it only okay (3 stars) or even bad.

Pretty much all other sources are skewed by troll ratings or come from people who did not have to pay for the game. That's one reason why so few reviewers do a proper job (commonly, games are reviewed under the premise of whether they are worth playing, which isn't the same thing as worth buying); another is that the reliance on ad income can lead to bumps in scores to appease publishers.

I agree with what you're saying, but I still think allowing inexperienced users to review a game is flawed. I don't care what some 16-year-old kid that has only ever owned an XB1 thinks of his purchase. But yeah, I'm going to let these storefronts influence my decisions more than I have in the past.



Many of the people who rate the games are just giving a bad score because of microtransactions, without even experiencing the game for themselves. Yes, Forza 7's loot boxes are an issue, but people have greatly exaggerated it. I have the game, and it's not nearly as bad as people make it seem. Progression still happens at a reasonable pace, and it never feels like the game urges you to buy them. I'm not condoning it, as I really don't like the loot boxes, though it doesn't "ruin" or "break" the game by any stretch of the imagination. As for the metascore, it's only at 87 because of the lootboxes. The game would easily be at 90-91 without them.




RolStoppable said:
Cerebralbore101 said:

I agree with what you're saying, but I still think allowing inexperienced users to review a game is flawed. I don't care what some 16-year-old kid that has only ever owned an XB1 thinks of his purchase. But yeah, I'm going to let these storefronts influence my decisions more than I have in the past.

You are too paranoid about the influence inexperienced gamers have on ratings. It takes two requirements for them to really mess up an average rating:

1) They outnumber experienced gamers.
2) Their judgment, i.e. their score, greatly differs from the one of experienced gamers.

Number 1 is already a reach; number 2 necessitates that all of them think the same way, which is just as unlikely as experienced gamers reaching a universal agreement. And when there is widespread agreement, it's usually a correct assessment of a game's quality. Loonies can only skew the average rating if the total number of ratings is very low.

"I want the recommendations of my peers, people that have played at least five good games a year for every year they've been alive past age five. People that actively seek out good games, in the same way that a foodie actively seeks out good food. People that aren't limited to a single platform, or a few genres, for their experiences."

I don't think the average user on a storefront has these qualifications, though, and I think their judgement does differ greatly from somebody who does.

Inexperienced users wouldn't all need to think the same way to affect the outcome of the score. If the average score for a game from users without these qualifications is 6/10, and the average score from users with them is 8/10, then we have a problem. Keep in mind that you wouldn't need every inexperienced user to rate it at exactly 6/10 in order to get an average of 6/10 from them. Some of them could rate it at 9/10, some at 2/10, and some at 4/10. If the average among them is 6/10, then the final score still gets affected in the end. The same thing goes for the qualified users.
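Here's the arithmetic spelled out, with purely made-up ratings. Neither group votes in lockstep, but the combined mean still drifts toward whichever group is larger:

```python
# Illustrative numbers only: individual votes vary, yet the group averages
# (6.0 vs 8.0) are what the combined mean gets pulled between.
inexperienced = [9, 2, 4, 9]   # group average: 6.0
qualified = [8, 7, 9, 8]       # group average: 8.0

combined = inexperienced + qualified
print(sum(combined) / len(combined))  # 7.0, halfway between the two groups
```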

What I'm trying to say here is that I only care about the opinions of the top 10% of gamers. People like you and me. People that have a passion for the medium that goes beyond just casually playing whatever seems popular. People that actively go out of their way to play old classics, so that they can have a better perspective on the industry as a whole. 

I know that sounds really elitist though. 



RolStoppable said:
Cerebralbore101 said:

"I want the recommendations of my peers, people that have played at least five good games a year for every year they've been alive past age five. People that actively seek out good games, in the same way that a foodie actively seeks out good food. People that aren't limited to a single platform, or a few genres, for their experiences."

I don't think the average user on a storefront has these qualifications, though, and I think their judgement does differ greatly from somebody who does.

Inexperienced users wouldn't all need to think the same way to affect the outcome of the score. If the average score for a game from users without these qualifications is 6/10, and the average score from users with them is 8/10, then we have a problem. Keep in mind that you wouldn't need every inexperienced user to rate it at exactly 6/10 in order to get an average of 6/10 from them. Some of them could rate it at 9/10, some at 2/10, and some at 4/10. If the average among them is 6/10, then the final score still gets affected in the end. The same thing goes for the qualified users.

What I'm trying to say here is that I only care about the opinions of the top 10% of gamers. People like you and me. People that have a passion for the medium that goes beyond just casually playing whatever seems popular. People that actively go out of their way to play old classics, so that they can have a better perspective on the industry as a whole. 

I know that sounds really elitist though. 

It sounds elitist because it is elitist.

You could browse through the 3DS eShop to see how ratings work in practice. Randomly pick games that you really like and see if you find any that are below 4 stars; or pick lame games and see how they scored; or pick games you are unsure about. How big and varied your sample is will depend on how much time you are willing to invest, but if you write down the games and their average ratings, the rule should be that the ratings land in the correct area. Exceptions to the rule tend to have a low number of ratings, which explains why the average is skewed.

That's a good idea. I think 4.0 is a good score on a five-star scale. I'll post a list of my findings another day.