Regardless, I still greatly prefer the closed-development platform of a console versus a PC.
The fact that you say you can get the highest-end stuff for $800 and play games X and Y on graphics setting B is rather funny. I was a PC gamer from the mid-1990s to the early 2000s, and I owned two computers in that span (plus the one I have now).
Paid $2,500 for a Pionex rig: 120 MHz, 16 MB of RAM, a 1 MB video card, a 1.6 GB HDD, a 17" SVGA monitor, and a printer.
Gaming-wise, it lasted me about two years on the newest and best games. After that there was a huge dropoff in what I could play once games started requiring 32 MB of RAM and I didn't upgrade.
Same thing happened with my next rig in '01: a 1.2 GHz AMD, 128 MB of RAM, a 40 GB HDD, and I think a 32 MB video card. It was a great, cheap rig when I got it, but the RAM became inadequate pretty quickly. It saw maybe one to two years of top-end games, then it was effectively dead.
There are real tradeoffs with both systems: consoles are cheaper and a closed dev space, so you're actually going to play new games for 5+ years from the start of the console's life cycle.
The PC will always get better graphics for any given game, but it's absolute bull to say that a PC can keep playing new games for more than four years on ANY setting.
In fact, someone show me a rig from three years ago that meets the minimum requirements for Crysis and cost under $2,000. Do that NOW if you want to prove PC superiority. Unless you can show me an affordable computer that lasts 3+ years for good games, you're full of crap.
Back from the dead, I'm afraid.