Shadow1980 said:
spemanig said:
Oh, for Pete's S-
WHO. CARES?
Is "subjectivity" the new "that's your opinion" now? Who cares if progress is subjective? That literally changes nothing about what I said. Being able to play new games on old hardware and upgrade when you want instead of the artificial cut offs we have now is progress. Having every new piece of hardware be backwards and forwards compatable is progress. There is literally no downside to a system like this. It is, by the numbers, superior to what was there before in literally every way. It isn't even like digital only where that would at least mess with collectors. This literally harms no party while adding flexibility were it was once technologically impractical to do so.
Bringing up that progress is subjective in a case as dry and numbers-driven as this is a careless abuse of the leniency of the word. Sure, calling forward compatibility progress is "subjective." Calling the cotton gin progress is "subjective." Calling the assembly line progress is "subjective." Calling the internet progress is "subjective." Calling the smartphone progress is "subjective." Everything is "subjective." Everything is "just an opinion." We live in an era where everyone is a delicate little flower and every idea has to be equal and no idea is wrong because hey, technically that's not how opinions work!
Well guess what? Having an opinion in and of itself doesn't make it valid, or intelligent, or give it weight. Stop trying to use subjectivity as some catch-all argumentative "get out of jail free" card so that you don't have to support your arguments. If you think your "idea" is more valid than mine, stop hiding behind the "it's just your opinion, bro" defense mechanism and actually support it. "That's subjective" isn't a valid response to anything. It's just a redundancy used to meekly deflect attention away from the point of view being contested. That's all it ever will be. An underhanded way out. But not today.
Stop muddying the discussion with semantics everyone already understands. If you have something to argue, argue it. If you think your opinion is more valid than mine, show me up by out-supporting me. If you don't, don't respond. Or be dignified and wave a white flag. Or do literally anything classier than "that's your opinion"-ing me.
|
Take a damn chill pill, dude. No sense in having a meltdown. You go on a rant about fucking cotton gins when I asked you to define "progress." I guess I should have specified "as it relates to video games." For example, we can quantify changes like the shift from carts to discs, or the means of storing saved game data, or the advent of online play. But is change equal to progress?
There was a debate 20 years ago about whether shifting to discs was truly "progress." Both sides had some merit. Carts had no load times, whereas discs had long ones (they still do in some cases, but back in the mid-90s they could be downright awful with the slow CD drives of the era). Carts were vastly more resilient than CDs, which could be easily scratched if not handled properly. They could also often (but not always) save data directly to the cartridge without needing memory cards, whereas discs cannot save any data to themselves. However, carts were also very expensive to manufacture relative to discs ($10-15 each, vs. $1-2 for a CD) and could not hold nearly as much data (the biggest N64 carts could hold, appropriately, 64MB of data, while a CD could hold up to 700MB). The low price and high capacity of discs won out over the sturdiness and instant loading of cartridges, which are now relegated to handhelds. Load times and increased hardware costs due to hard drives are something we've just had to deal with, but on the plus side we have cheaper software, and discs being more fragile than carts isn't an issue if you're responsible with them, so maybe on the whole discs were "progress."
But even if we agree that switching from carts to discs was progress, that does not entail that all shifts in media are "progress." I for one think digital downloads and especially streaming are inherently inferior to hard copies for numerous reasons, which would require several paragraphs to explain.
Switching from physical to digital may be "progress" to some, but not to others, thus why "progress" is often subjective.
But what about the advent of online play? Is that really progress? Some would argue "no." While hopping on XBL to play with your friends is more convenient than arranging a particular day and time for some local split-screen or LAN play, we also have all the negative aspects of online gaming. Griefers, cheaters, and assorted other assholes often go out of their way to give other players a hard time and ruin the experience. Playing with random strangers is very impersonal and lacks the feel of playing with your real friends, and this manifests not only in the aforementioned antisocial behavior, but in people simply treating the game as a cutthroat competition rather than simple fun with friends. Also, developers have increasingly made certain types of games ever more dependent on online play, and many major games lack local MP and/or co-op entirely. "Shared world"/quasi-MMO games like Destiny, The Division, and The Crew are entirely dependent on an internet connection. Don't have internet? Can't play. Have internet but don't want to pay for XBL or PS+? Can't play. Internet goes out? Can't play. Servers get shut down permanently? Congrats: your $60 game just became a $60 coaster. And the sad thing is, many of these games could easily have been made playable offline. Is any of this real progress? I don't think it is. Online play is an evolution of older forms of social gaming, but evolution doesn't necessarily produce "better" things. Some in this thread think that not only all multiplayer games, but even all single-player games being utterly dependent on an internet connection is a good thing; I think it's a bad thing... a very, very bad thing. "Progress" is thus subjective.
And iterative hardware running on 1-to-3-year cycles like Apple's iGadgets, instead of the normal 5-6-year console cycle, being "progress" isn't as clear-cut as you might think. When people buy a console, it's with the expectation that they'll get a good half-decade or more of use out of the product before their $300-400 box gets replaced by something newer and more advanced. It's a long-term investment. If we get a PS4 Neo 2 in 2018, a Neo 3 in 2020, and so on instead of a PS5 in 2020, a PS6 in 2027, etc., how is that going to benefit my wallet? As with PC gaming, I am eventually going to have to upgrade at some point, because the latest and greatest games will eventually be too advanced to run on older models. The PS4 Neo 5 I buy in 2025 might run GTA 8 and all my older PS4 games, but if I stuck with my 2016 original PS4 Neo, will it run a game designed for the newest hardware iteration released 5-10 years from now? I somehow doubt it, considering my first entry-level PC, bought in 2004, wouldn't have been able to handle the likes of Crysis, much less Battlefield 4 or Starcraft II. If consoles become iterative instead of generational, I'll still have to upgrade at some point to play newer games, perhaps just as often if not more so. As with PC games, a console game will still have a minimum spec requirement just to function. And since I'll likely have to upgrade eventually, will this new short-cycle console paradigm have manufacturer- or retailer-subsidized trade-in programs to swap my $300 box for another $300 box, so that I don't have to pay $300 fucking dollars every year or two? Consoles, iterative or generational, will continue to advance, and that means one must upgrade or be left behind. The Neo and Scorpio will eventually be unable to play the latest games as their power becomes increasingly primitive compared to the latest models.
MS bragging about "nobody gets left behind" rings hollow if one considers the simple facts of both consoles and PCs. A 15-year-old PC won't play the latest AAA games at any settings, a PS3 won't play PS4 games, and my OG PS4 almost certainly wouldn't be able to run a game optimized for, say, a "Neo 7" released 12-15 years from now.
Also, the advantage of consoles is that they're a fixed platform. If you suddenly have not just two, but eight or ten different extant hardware configurations to deal with, you have greatly increased the complexity of game development (itself already complex enough as it is), and thus not only increased the cost of development, but increased the odds of one particular port being optimized poorly compared to the others. Giving developers half a decade or more to familiarize themselves with current-gen hardware and extract the most out of it would become a thing of the past. Think of all the poorly optimized PC ports in recent years, as exemplified by last year's Arkham Knight. Now imagine the potential for that to become commonplace even on consoles, as having to develop for five or six different versions each of the Neo and Scorpio increases the odds of a porting disaster tremendously, and each port could have its own unique quirks requiring its own unique patches.
For the consumer, a fixed platform that gets replaced every 6-7 years and supported for at least a year or two afterward means not having to worry about newer games having compatibility issues with older hardware, and it means their games are less likely to be poorly optimized ports. And if neither Sony nor MS mandates forwards compatibility, it's entirely possible that developers might just abandon older iterations of the hardware sooner rather than later to simplify development, which would feed back into the "getting left behind" and "still needing to upgrade" issues covered in the prior paragraph. Even with a strong install base on last-gen systems, developers eventually abandon them for the new hotness and what it can do to help them push the medium forward. Older hardware not only holds them back, but complicates development. There's a reason why cross-gen AAA titles are all but dead this year.
Finally, there's the fact that the console cycle has always been dictated not by tech, but by sales. Eventually a console's sales peak and enter a terminal decline, and once they fall low enough the system is replaced by its next-gen successor. It just so happens that successful platforms' sales curves keep them alive long enough that, by the time sales have dropped far enough to warrant replacement, the tech has advanced enough to offer a "generational" leap in power. And the Neo and Scorpio aren't the first hardware revisions of their type. While there have been numerous upgrades of various kinds in the history of the console market, the closest analog would be Nintendo's upgrades to the Game Boy, DS, and 3DS. The DSi and New 3DS had about the same effect on sales as price cuts, similar to what slim models do for home consoles. In the DSi's case, it did nothing to prevent the DS's terminal decline, providing only a modest short-term boost to sales. The DS had already been out for a while, the DS Lite had already had a big effect on sales, and the market for DS buyers had been mostly tapped, leaving Nintendo running out of potential new customers for the platform. The New 3DS is much the same, providing a solid short-term boost, but with sales quickly declining back to where they were beforehand. Only the GB Color succeeded in providing a substantial boost to sales, but it came nine whole years after the original model and was assisted by the Pokemon boom; the GBC was a unique case facing very different circumstances. Now, what if the Neo and Scorpio end up providing only a modest boost to hardware sales, equivalent to a mid-to-late-gen price cut? Will a Neo 2 or Scorpio 2 even succeed in keeping sales steady? What if none of it stops the sales decline of the PS4 & XBO "product families"?
We wouldn't just be walking into uncharted territory in terms of the impact on players' wallets and on game development, but also on the very commercial viability of consoles. Without a true "next-gen" console to "reboot" the market, we could see hardware sales crater over time, potentially leading us to a new Great Video Game Crash. As first parties, SIE, MS, and Nintendo depend not just on software sales, but also on hardware sales to bolster their fortunes, and we have a proven generational model that keeps hardware sales relatively stable over the long run. No sense in replacing a proven model with an unproven one for no good reason. If it ain't broke, don't fix it.
One should not underestimate the potential for shifting to an iGadget model of iterative hardware to wreak all sorts of havoc on a market that, quite frankly, has done just fine for 30 years with the current status quo. The model may work for phones and tablets, but those are fundamentally different machines; they aren't dedicated "hardcore" gaming devices but rather general-purpose, utilitarian gadgets, and there's no guarantee the model will work for consoles. Similarly, there would be many problems with trying to make consoles more closely resemble the PC gaming market, which, while definitely "hardcore," is still a very different market with different demands and considerations and a multitude of hardware spec standards.
The end of console generations as we know them may seem like "progress" to you, but when you actually step back and look at all the possible implications for the consumer, it could end up being a complete and total shit-show. The corollary is that there is every reason to think the Neo and Scorpio are one-off things, likely brought about by VR and 4K TVs, and don't represent the advent of a new status quo in the console market.
|