
What it amounts to these days is a change in economics and technology.

It used to be that a game wouldn't come out unless the vast majority of its bugs had been squashed. My dad used to beta test for Sierra, and having helped him out, I remember he'd receive entirely new builds of the game every two or three weeks, by mail and on CD, with the latest changes. Betas would occasionally go on for close to a year if there were enough bugs to chase down. That's the same reason that when you think of games back in the 80s and 90s, you can't really pin down many bugs in the ones you played. Costs were lower, naturally, so studios seemed to have more 'make it work properly' money.

Since the early 2000s, companies have been working with higher-fidelity graphics and far more complicated tools and requirements to make games that still shock and awe us, which is what a lot of people want. When we start a game, we want to feel like we're there, and that costs a bundle. So you're getting bigger, better-looking games in a surprisingly competitive market. If it comes down to it, a studio will burn a build with known bugs to the Gold version, because QA may have caught those bugs but the fixes are too big to make before the final release candidate, and send it off for mass production, then start working on the patches they couldn't get in at the time. The internet and connectivity have given them the ability to do that, so they use it liberally.

Basically, yes, games these days tend to be far more broken than they used to be, but it's typically for reasons like complicated fixes landing at a critical time before release, or plain economics. And that's not counting the fact that bigger companies can't do what companies like Mojang can, which is release snapshots that let a million players find what a couple hundred testers can't, so they don't even know some of these bugs are in the final game; it's only after release that they figure out they're there.