
Forums - General Discussion - Would this change mitigate the problem of clickbait scores on Metacritic?

Here's the thing, if you're Metacritic, and you stop counting outlier scores to the overall average, then you're basically telling those outlets that their reviews are wrong, and basically there's no point even having those review outlets represented on Meta to begin with.

If you approve a review outlet to your site, you can't then just pick and choose which of their reviews really count, and which don't. That's just nonsense.

That aside, people really shouldn't let massive outliers bother them so damn much. What does it matter if one person out of 100 or something doesn't like what everyone else likes, or is being a contrarian for the sake of it? It's not going to change your personal enjoyment of that product, or the general perception of it.

And frankly, you haven't really made it big until you have people trying to knock you down, so hell....just think of it as a compliment.




People just need to get over their measly 1-3 points on Metacritic. The only point of that site is to see whether a game was generally well-received or not; people should not obsess over every minor point just because they seek some sort of weird validation.



I make game analyses on youtube:

FFVI: https://www.youtube.com/watch?v=mSO6n8kNCwk
Shadow of the Colossus: https://youtu.be/9kDBFGw6SXQ
Silent Hill 2: https://youtu.be/BwISCik3Njc
BotW: https://youtu.be/4auqRSAWYKU

Or people should just make an average for themselves. Pick two or three of the worst reviews, two or three of the middle ones, and two or three of the best ones, read them (or, if you don't want to bother, check their summary verdicts), and see how the reasoning holds up across them all. If the worst reviews have similar, justifiable reasoning to the others but on a negative note, then it's not clickbait, or at the very least it's well masked and can frankly pass for a legit review. If it's a bunch of nonsense and extreme nitpicking, in huge contrast to the rest (even the two other bad reviews), then you know where they might be coming from.

The same applies to the 10/10 reviews, btw. Sometimes you can sense they're legit, and sometimes you can sense they're pure bias, by using this comparison model. The thing about reviews is that you don't have to just stick to the number alone, even if that's what people generally do I guess.



That's actually a good solution. This way you remove the outliers and get a better average for the score. I imagine taking out 2 or 3 from each end of the spectrum would be ideal, as long as the game has at least 30 reviews or so.

Plus, I just noticed that we can now like posts on the forums? Nice.
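As a rough sketch of the trimming idea above (drop a fixed number of scores from each end before averaging, but only once there are enough reviews), assuming scores are plain numbers; the 30-review threshold and the sample scores are just illustrative:

```python
def trimmed_mean(scores, trim=2, min_reviews=30):
    """Average the scores after dropping `trim` values from each end.

    Only trims when there are at least `min_reviews` scores; the
    30-review threshold is the one suggested in the post above.
    """
    ordered = sorted(scores)
    if len(ordered) >= min_reviews:
        ordered = ordered[trim:-trim]
    return sum(ordered) / len(ordered)

# 30 reviews clustered around 90, plus one 40 and one 100 outlier
scores = [88, 90, 91, 92, 93] * 6 + [40, 100]
print(round(sum(scores) / len(scores), 1))   # plain mean: 89.5
print(round(trimmed_mean(scores), 1))        # trimmed mean: 90.8
```

With the two outliers trimmed away (along with one legitimate high and one legitimate low score), the average lands closer to where the bulk of the reviews sit.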



My (locked) thread about how difficulty should be a decision for the developers, not the gamers.

https://gamrconnect.vgchartz.com/thread.php?id=241866&page=1

Yes and no. If a game has 20 reviews with a score of 100 and 1 or 2 reviews with a score of 60 or below, it's obvious that those are the clickbait reviews. This also works the other way around, where a high review of a bad game counts as a clickbait or "paid" review.




That seems like it'd defeat the whole point of having an average score, which is supposed to represent all of the reviews.



You could take the median instead of the average.



RolStoppable said:
Angelus said:
Here's the thing, if you're Metacritic, and you stop counting outlier scores to the overall average, then you're basically telling those outlets that their reviews are wrong, and basically there's no point even having those review outlets represented on Meta to begin with.

If you approve a review outlet to your site, you can't then just pick and choose which of their reviews really count, and which don't. That's just nonsense.

That aside, people really shouldn't let massive outliers bother them so damn much. What does it matter if one person out of 100 or something doesn't like what everyone else likes, or is being a contrarian for the sake of it? It's not going to change your personal enjoyment of that product, or the general perception of it.

And frankly, you haven't really made it big until you have people trying to knock you down, so hell....just think of it as a compliment.

Metacritic assigns more weight to specific review outlets, so they are already choosing which reviews count more towards the Metascore. An exclusion of the top 10% and bottom 10% of review scores when calculating Metascores would at least be something that is clearly defined instead of being arbitrarily chosen like Metacritic's current method of giving certain review outlets more weight. The top 10% and bottom 10% of reviews would still be listed on Metacritic just like before, all that would change is that those scores aren't factored into the Metascore.

Do they? How needlessly convoluted. Do they at least tell you which outlets are given more weight?
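The 10% exclusion described in the quoted post could be sketched like this; a minimal illustration, not Metacritic's actual method, and the sample scores are made up:

```python
def trimmed_metascore(scores, trim_fraction=0.10):
    """Drop the top and bottom `trim_fraction` of scores (rounded
    down to whole reviews), then average what remains. Every review
    would still be listed; only the score calculation ignores the
    extremes."""
    ordered = sorted(scores)
    k = int(len(ordered) * trim_fraction)
    kept = ordered[k:len(ordered) - k] if k else ordered
    return sum(kept) / len(kept)

# 18 reviews at 95 plus two 60s: with a 10% trim, that's 2 reviews
# cut from each end, so the outliers no longer drag the score down.
print(trimmed_metascore([95] * 18 + [60, 60]))  # 95.0
```

Unlike per-outlet weighting, the rule is the same for every game: the cut-off depends only on the number of reviews, not on who wrote them.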



Maybe for the sanity of everyone, sites should stop doing scores and let their reviews speak for themselves, because that number they tack on at the end means absolutely nothing. If they want to summarize their reviews, just add a "buy this" or "don't buy this" at the end. But they won't, because scores add clicks and page upon page of fanboy fights.



Hiku said:

The importance of metacritic scores varies from person to person. But to some people they are very important.
One issue people often perceive (whether it really happens or not can be hard to prove in some cases) is that some publications will give a game or a movie a significantly higher, or more often than not, lower score to get clicks for their site.

Even if they do this, it may not be their intention to affect the overall Metacritic score. But it can nonetheless have a noticeable effect on it. And this seems to be why a lot of people get angry at so-called clickbait reviews. They'd be easier to ignore if they didn't just lower the metascore of your favorite game from a fantastic 92 to a terrible/unplayable/disastrous/I-will-cancel-my-preorder 89.

The other day it occurred to me that there may be a rather simple, and commonly used, fix to this problem. Or at least a way to mitigate the issue.

Sometimes when calculating an average value, a few of the highest and lowest values are removed from the calculation to filter out results that stray too far from the norm, and to get a better representation of the average score.

I imagine that if you removed a portion of the top and bottom review scores for games or movies, so-called clickbait reviews would be much less of a problem, considering that only a small portion of reviews tend to be suspected of this.

What do you guys think?

For stuff like this, a median score should be used instead of an average. With an average, extreme values have more influence on the result than moderate ones (it doesn't matter whether they're extremely high or extremely low). A median solves all of this.

https://en.wikipedia.org/wiki/Median
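A quick illustration of the difference, using Python's standard `statistics` module (the review scores here are made up):

```python
import statistics

# Six reviews in the low 90s plus one 40 "clickbait" outlier
scores = [90, 91, 92, 93, 94, 95, 40]

print(statistics.mean(scores))    # 85: the outlier drags the mean down
print(statistics.median(scores))  # 92: the median barely notices it
```

One low score shifts the mean by several points, while the median stays with the middle of the pack, which is exactly the robustness the post is describing.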



3DS-FC: 4511-1768-7903 (Mii-Name: Mnementh), Nintendo-Network-ID: Mnementh, Switch: SW-7706-3819-9381 (Mnementh)

my greatest games: 2017, 2018, 2019, 2020, 2021, 2022, 2023

10 years greatest game event!

bets: [peak year] [+], [1], [2], [3], [4]