I'm not sure I'd say the failure of most users to adopt Vista/DirectX 10 is the real bottleneck for PC developers.
While the additional effects in games that actually took advantage of DX10 were a decent improvement, they were hardly something most people were willing to upgrade OSes for, let alone to Vista, given its ruined reputation. Unless I'm specifically looking for the differences, I generally don't even notice them.
I'm personally curious to see how much of an improvement DX11 will be, since that will influence how soon I upgrade to Windows 7 and to DX11 video cards, but until I see the difference in games, I'm counting on it being another incremental bump in quality rather than a reason to drop everything and upgrade.
Typical hardware builds, on the other hand, may be more of a speed bump, given that the last game with practically mandatory hardware upgrade requirements was Crysis back in late 2007.
Most common PC builds are still catching up 18 months later.
Quad-core PCs are great for workstations and media processing, but no compelling games require them despite quad-core CPUs being available since 2006. And really, why would they, when such machines are still a small minority of PCs?
And that is the biggest holdup for PC games: developing for the lowest common denominator, and balancing increased hardware requirements against the shrinking pool of potential users whose machines meet those requirements.
I can't think of any compelling game since Crysis, and seeing as that's the first title PC gamers latch onto to "prove" the superiority of the platform, it's not the available hardware that's holding things back (that just depends on how much you're willing to pay), and I'd say it's not even the software developers; it's the typical PC user.