Why can't a computer game just be a computer game? Why does it have to be the next bastion for scientific study?
There's no doubt that some video games are more mentally taxing than others--anyone who's played Portal can attest to that. But does the ability to excel at one game lead to increased aptitude in other mental tasks? That's a link that cognitive scientists have been trying to make for years, with numerous studies making cases for both sides. You've seen the headlines: "video games are good for the brain!" when at best these studies find a loose correlation between increased brain activity and gameplay. There has yet to be conclusive proof that just because someone is good at playing Team Fortress, they'll be better prepared for problem-solving challenges in the real world. But that's the story again being told by this Scientific American article about cognitive scientists studying StarCraft. And it bugs the heck out of me.
StarCraft is a likely candidate for these types of stories for a couple of reasons. First, it's a game with a clear spectrum of player skill. The top players in the world are measurably better than the rest of the population at playing the game, which makes their abilities worth studying. Second, the game exercises many mental abilities at once: response time, multi-tasking (working memory), and decision making. These are traits deemed interesting for cognitive study. And indeed, many academic institutions are studying StarCraft games and players. It's even a platform for AI research. But the game is an ideal case study not just for its innate complexity, but for the availability of data that can be mined from it for research. Cognitive scientists choose to study StarCraft not because it's some perfect modern analogue to chess, but because it's popular--millions of people play it, and the game saves replay files that are ripe with data to analyze.
But just because researchers choose to study StarCraft doesn't mean the game automatically has profound implications for cognitive research. It is not, like chess, "The Drosophila of Cognitive Science." And even the study of chess has not proved that expertise in one mentally challenging task is transferable to other tasks. The Scientific American article admits as much:
Part of the problem is that once developed, human skills generally stay specific to the original task. Expert chess players, for example, have significantly superior recall of the positions of chess pieces on a board after a brief exposure than non-experts...But chess players turn out to be no better than others when asked to remember the arrangements of chess pieces placed at random, in configurations that could not appear in a game. Experiments in numerous other domains have demonstrated a similar lack of transfer.
The claim, then, that expertise in StarCraft could be transferable is tenuous at best. The article cites video game studies where researchers found some general skills heightened in gamers, but those studies involved first-person shooter games. Even acknowledging that, the author makes the weak argument that generalized expertise is evident in the fact that many top StarCraft players were also top Warcraft players. While it makes sense that someone practiced in a real-time strategy game would find it easier to learn and play another RTS (or at least be inclined to embrace it), I'm dubious that expert decision-making in StarCraft can apply to decision-making in emergency management situations.
Studying thousands of game replay files won't reveal the secrets of mental "expertise." It will more likely show that the best StarCraft players are the ones conditioned to master pattern recognition and execute habitual commands through a time-tested technique: practice.
@TheVoxelman on twitter