For a 99 to happen, assuming around 100 reviews, the score breakdown would need to be something like 90 scores of 95 or higher (around 70 of them perfect) and no scores below 90. So yeah, it would be insane if it happened.
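To sanity-check that arithmetic, here's a quick calculation with one hypothetical breakdown fitting the description (the exact scores are made up, and this assumes an unweighted mean, whereas Metacritic's actual per-outlet weights are undisclosed):

```python
# Hypothetical breakdown: 100 reviews, 70 perfect, 20 at 97, 10 at 93.
# All scores are 90+, and 90 of them are 95+, matching the claim above.
scores = [100] * 70 + [97] * 20 + [93] * 10

mean = sum(scores) / len(scores)
print(mean)         # 98.7
print(round(mean))  # 99 once rounded to an integer Metascore
```

Even under those extreme assumptions, the raw average only clears 99 thanks to rounding, which is the point: a single low outlier sinks it.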
In my mind the solution is pretty obvious:
Anyone who studied even a bit of statistics at school would know that when you plot a graph of results and you have a consistent cluster of data with a few rare outliers, you tend to just ignore the outliers, since they're assumed to be affected by external factors.
That's not to say you ignore any score under 90 simply because we've already assumed the game deserves at least that much.
But when you have a case of 101 reviewers giving the game a score of no less than 90 (the majority being 100), one reviewer giving 84, and then three other reviewers giving 70, 60 and 60, you would just exclude the last three results from the aggregated score: they're so far off the margin that they can be considered anomalies, unrepresentative of the actual results.
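The exclusion rule described above can be sketched with the standard interquartile-range (IQR) fence. This is a minimal illustration, not Metacritic's actual method: the scores mirror the hypothetical breakdown above, and the fence width `k=3.0` is chosen so that only the three extreme anomalies fall outside it while the 84 survives:

```python
def trimmed_mean(scores, k=1.5):
    """Average the scores after dropping anything outside the IQR fence."""
    s = sorted(scores)
    n = len(s)
    # Crude quartile estimates (no interpolation) -- fine for a sketch.
    q1 = s[n // 4]
    q3 = (s[(3 * n) // 4])
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    kept = [x for x in s if lo <= x <= hi]
    return sum(kept) / len(kept)

# Hypothetical case from above: 101 scores of 90-100 (majority 100),
# one 84, and three anomalies at 70, 60, 60.
scores = [100] * 60 + [95] * 41 + [84, 70, 60, 60]

# With the wide fence (k=3.0) only 70, 60, 60 are excluded; 84 is kept.
print(round(trimmed_mean(scores, k=3.0)))  # 98
```

Note how sensitive the result is to the fence width: with the conventional `k=1.5`, the 84 would be trimmed as well, which is exactly the kind of judgment call an aggregator would have to make explicit.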
This is why I put the onus on Metacritic more so than on the individual reviewers: there will always be someone willing to abuse the system if it lets them.
P.S. In reality, the difference between 97 and 98 on Metacritic isn't important. Just seems like an easy fix that could save people a lot of trouble.