This study about reviews and sales might seem like conclusive evidence that reviews affect sales, but there are two reasons it is not.
- The participants were required to read the reviews; most consumers are under no such obligation, and many never read reviews at all.
- The study surveyed people who intended to buy games, not people who actually bought them; it measured intent, not purchases.
Thus the study gives no real indication of whether reviews affect sales under actual market conditions. Properly determining that might require something along the lines of a Zogby or Gallup poll: ask people why they did or didn't buy certain games, and see what percentage of the answers cite reviews.
Of course such a study would be hugely expensive, as you'd need a decent sample not only of people who own games and own various systems but, if you want to leave no doubt, a number of buyers approximating the actual sales of each game. Even a poll of the US President's approval rating wouldn't need to talk to that many people.
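To make the cost point concrete, here is a minimal sketch of the standard sample-size formula for a proportion at 95% confidence. The figures (a ±3% margin, 50 hypothetical titles) are illustrative assumptions, not numbers from any actual poll:

```python
import math

def sample_size(margin_of_error, z=1.96, p=0.5):
    """Minimum respondents needed to estimate a proportion within
    the given margin of error at 95% confidence (z = 1.96),
    using the worst-case assumption p = 0.5."""
    return math.ceil(z**2 * p * (1 - p) / margin_of_error**2)

# One national question (e.g. presidential approval, within 3 points)
# needs roughly a thousand respondents:
national = sample_size(0.03)

# But asking buyers of each game separately means each title needs
# its own sample. With a hypothetical 50 titles to cover:
num_games = 50  # assumed, purely for illustration
total = sample_size(0.03) * num_games
print(national, total)
```

The asymmetry is the whole problem: a single approval-rating poll is one sample, while a per-game buying study multiplies that sample by every title you want solid numbers for.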
Another method would be to compare review scores and sales. Let me tell you, the film industry has known for years that reviews might as well be dart throwing: the disparity between box office returns and review scores is blatant there, yet for some reason gaming hasn't caught on. Now perhaps with games there is a connection, but without verifying it, assuming a game did well or poorly because it got good or bad reviews is falling for the correlation-equals-causation fallacy.
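The fallacy can be illustrated with a few lines of code. All the numbers below are invented for the sake of the example: review scores and sales can correlate strongly while a third factor, such as marketing spend, drives both.

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical review scores and sales (millions) for six games:
scores = [62, 70, 75, 81, 88, 94]
sales  = [0.4, 1.1, 0.9, 2.3, 3.0, 5.2]

# Hypothetical marketing budgets (millions) -- a plausible confound
# that could be inflating both scores and sales:
marketing = [2, 5, 4, 12, 15, 30]

r_reviews = pearson_r(scores, sales)      # strong positive correlation
r_marketing = pearson_r(marketing, sales) # even stronger correlation
```

Both correlations come out high, and the data alone cannot say which variable, if either, is causing sales. That is exactly why comparing score charts to sales charts settles nothing on its own.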
It's holding on to assumptions like these without confirming them that cost Sony its first-place position, and made it all the harder, and more expensive, for Microsoft to get just to where they are.
A flashy-first game is awesome when it comes out. A great-first game is awesome forever.
Plus, just for the hell of it: Kelly Brook at the 2008 BAFTAs