twesterm said:

One of the biggest things hurting PC games is that there's nothing that enforces quality.  While Microsoft, Sony, and Nintendo may have closed systems, those closed systems let them enforce a measure of quality.  The process is different for all three, but there are basic requirements that say the game must work and be above a certain bar.  The PC has nothing like that.

Because of that, more time is spent on the console versions of a game and the PC version is just kind of there.  If something gets fixed in the process, cool; if not, oh well.  More attention is *always* put on the console versions when they exist, because the big three can say no, you can't release this game, while you can release whatever you want on the PC.

Now it's obviously in the developer's best interest to release as good a product as possible, but when it comes down to the wire and you have a minor bug that keeps your game from getting certified on a console and a major bug that makes the PC version crash, the console bug will be fixed and the PC one will be pushed further down the list.  Always.

Which sucks, but I wouldn't necessarily blame that on the PC platform.  And it's not like the focus on consoles simply keeps developers from fixing bugs in their PC releases.  It can have far shittier, gameplay-altering results than that.

Stolen from Neogaf's official Crysis 2 thread:

Dragon Age 2 is another example.

twesterm said:

You can't deny that that control comes with advantages.  Every 360 runs just like every other 360.  The developer knows exactly what the target platform is and can optimize specifically for it.  Sure, the graphics card in my 360 may be five years old, but that doesn't mean the games look bad by any means.  Also, like I said in my previous posts, the consoles enforce certain standards.

With the PC, there's an infinite variety of setups, and it's unreasonable to ask developers to account for even most of them.  There are just too many options.  It's hard on the developer and it's confusing for the general consumer.  On top of that, there are no standards.  It does make things easier for indie devs, which is great, but it also means the PC version will almost always take a back seat to the console versions.  Finally, consoles have DRM that's completely invisible to the user, while almost every PC game has some sort of cumbersome DRM (some worse than others, some not so bad).

And I totally agree that modding is a great thing; the games industry wouldn't be anywhere near where it is now without it.  That said, it isn't what will keep PC gaming going, and I believe you're the one who told me you refuse to pay for a broken or bad game, so why do you think it's better or worthwhile for the community to fix a broken/bad game?

What's better to you?  Better quality game or better graphics?

Better quality game, which usually isn't happening when developers focus on consoles.  See above.

Also, modding isn't just used to fix a game.  Just look at Oblivion.