Fallout 3 is a really bad example of how much better PC hardware is for gaming. Everyone knows the PC hardware is more capable, but the visual differences have shrunk to the point where only enthusiasts are going to notice them.
Those screenshots in particular did little more than demonstrate the difference in lighting effects, which was by far the most drastic one.
So is the difference really worth noting? Not really, unless you want to run the game at a native render resolution higher than 1280x720, which is by far the most common resolution used on both the Xbox and the PS3.
Also: BioShock, Call of Duty 4, Unreal Tournament III, The Orange Box... I'm pretty much going down the list of games I own on both a console (PS3 or Xbox) and on PC.
Beyond render resolution, AA, and (typically) frame rate, the visuals are far more similar than different.
While *I* notice the difference enough to buy the games again for an overclocked gaming PC (they all run fine on a stock-clocked Core 2 Quad system as well), it's mainly because I prefer the sharper image and the smoother frame rates. Does it actually make the graphics look better? Not really, no. But as a fine-detail person, I prefer it, just like I prefer Blu-ray movies to DVD.
If a console were actually capable of rendering all games at a native 1920x1080 at around 60fps (at that resolution, 2-4x AA would be nice but optional), I'd see a lot less reason to buy multiplatform games on PC. But as it stands...
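For a rough sense of scale, here's a back-of-the-envelope sketch of the pixel throughput involved (illustrative only; the 30fps console figure is my assumption for a typical title of that era, not something measured):

```python
# Back-of-the-envelope pixel throughput comparison (illustrative only).
# Resolutions and frame rates are the targets discussed above, not benchmarks.

def pixels_per_second(width: int, height: int, fps: int) -> int:
    """Raw pixels the GPU has to render per second at a given resolution and frame rate."""
    return width * height * fps

# Assumed typical console target of the era: 720p at 30fps.
console_720p30 = pixels_per_second(1280, 720, 30)
# The 1080p/60fps target mentioned above.
pc_1080p60 = pixels_per_second(1920, 1080, 60)

print(f"720p @ 30fps:  {console_720p30:,} pixels/s")
print(f"1080p @ 60fps: {pc_1080p60:,} pixels/s")
# 2.25x the pixels per frame, times 2x the frame rate = 4.5x the raw throughput.
print(f"Ratio: {pc_1080p60 / console_720p30:.2f}x")
```

That 4.5x figure ignores AA and everything else a GPU does per pixel, but it gives a rough idea of why "just render everything at 1080p60" wasn't on the table for those consoles.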
Still, I have to disagree with this perceived "chasm" of graphical difference that the PC pundits seem to be claiming. I think in most cases they're more enamored with the platform and the hardware itself than with anything else.
The difference is nowhere near as arresting as the jump from a game on an SD console to the same game on an HD console.