I started playing Call of Duty 4 late last year. What had held me back from getting it sooner was that my machine was already four years old at that point, and just barely met the game's minimum requirement of a Radeon 9800 Pro graphics card (the computer also had an Athlon 64 processor and 1GB of RAM). Surprisingly, with the resolution at 640x480 and most everything turned down low, the game ran decently, with no framerate drops bad enough to make me stop playing.
A few weeks ago, I realized that my good ol' computer, which had served me through nearly all of my undergraduate and graduate studies combined, was beginning to break down on me, with minor component problems coming out of the woodwork: the DVD burner had recently turned illiterate, the screen flickered like a modern Outer Limits intro on startup half the time, the Safe Mode boot screen once flashed random characters after a crash like the intro of Metroid Prime 3: Corruption, and Memtest couldn't even complete a pass on the RAM to tell me whether it had a problem. Accordingly, I decided to quit while I was ahead: instead of waiting for some cataclysmic breakdown, I built myself a new PC, stuck the contents of the old hard drive into the new one like some digital Matryoshka doll, and kept going.
So now I'm playing Call of Duty 4 with a triple-core processor, a Radeon HD 4850 video card, and 4GB of RAM. I've gone from a lowly 640x480 with everything turned all the way down to a glorious 1600x1200 with everything turned all the way up. And you know what? Everything is shiny now! I can see the heat distortion rising out of the missile silos on Countdown, the shockwave from exploding grenades, the blurring of nearby objects when I aim down the sights at something far away. All this at a baby-butt-smooth 60 FPS most of the time, dipping to about 30 when the screen gets really busy with friends, enemies, gunfire, and smoke (and I wouldn't even notice the dips if it weren't for the framerate counter I run).
And yet, strangely, despite the graphical leap, the game has felt more or less the same to me. Once I got used to the shiny, I didn't find myself enjoying the game any more or less than I did back when I was chugging along on minimum settings, on a rig with roughly the processing power of a Wii. I think this means that, to me at least, as long as devs actually put in the effort to optimize their games, processing power has already hit the point of "good enough". Of course, it helps that absolutely nothing in the gameplay was lost in the jump, something devs seem all too willing to sacrifice when they develop for the Wii.
So as long as we're here, let's ask: where does everyone feel "good enough" is for game graphics? Has that threshold been passed already? Or do you feel we won't be there until graphics mirror real life perfectly?