WC4Life said:
potato_hamster said:

Tell me the incentive for third-party developers to put in the extra time, money and effort to develop for more advanced hardware when there is zero indication it will lead to increased sales. If games aren't coming out that take advantage of the new hardware, you end up in "New Nintendo 3DS land": a console with additional processing power over its predecessor that only one game has ever taken advantage of, because there's literally no point in putting in the extra effort. None.

Obviously Sony has to make it easy for developers to change graphics settings between systems, and that capability already exists. Compare PC games to PS4/XB1: it's not much work to change resolution, fps, shadows or lighting; developers only need to use the extra HW power like they do in PC games. This whole generational thing consoles have had will disappear. There is no benefit in purposely limiting your potential customer base with generational splits. Make a game, change settings to make it playable on PS4, PS4.5, PS5, XB1, XB1.5, PC, Nintendo, Steambox X/Y/Z, and sell the game. We have already seen all the remasters this gen and those games sell, but because of the last-gen spec hardlock, developers have had to put in time, money and effort to make them work on PS4/XB1. Future games will be developed scalable from the very beginning so they can be sold to a wider audience right from the start while being future-proof for coming hardware upgrades. Yes, of course this means there won't be "coding to the metal" to the same degree, but looked at closely, "coding to the metal" does not really exist in 3rd-party releases anyway, not compared to first-party exclusives.
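A minimal sketch of the kind of per-SKU scaling being described here. The SKU names, preset fields and numbers below are all illustrative assumptions, not any real engine's API:

```cpp
#include <cstdio>
#include <string>

// Hypothetical "one build, many SKUs" settings table. Every value
// here is made up for illustration.
struct QualityPreset {
    const char* sku;
    int renderWidth, renderHeight;
    int targetFps;
    int shadowMapSize;   // higher = sharper shadows
    bool volumetricFog;
};

constexpr QualityPreset kPresets[] = {
    {"PS4",     1920, 1080, 30, 2048, false},
    {"PS4.5",   3840, 2160, 30, 4096, true },
    {"XB1",     1600,  900, 30, 2048, false},
    {"PC-High", 3840, 2160, 60, 4096, true },
};

// Pick the preset for the hardware we booted on; fall back to the
// baseline console if the SKU is unknown.
const QualityPreset& PresetForSku(const std::string& sku) {
    for (const auto& p : kPresets)
        if (sku == p.sku) return p;
    return kPresets[0];
}

int main() {
    const auto& p = PresetForSku("PS4.5");
    std::printf("%s: %dx%d @ %d fps, shadow map %d, fog %d\n",
                p.sku, p.renderWidth, p.renderHeight,
                p.targetFps, p.shadowMapSize, p.volumetricFog);
}
```

In this view, a new SKU is just another row in the table.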

Things change, and I don't believe the console business is immune to change. This whole conversation follows the same path (not just this thread, but conversations about "change" in general): inevitable change starts happening, people resist it, but in the end the change happens anyway. The combo of upgradeable HW and digital games will bury the current console model in the future.

edit: By upgradeable HW in the console business I mean multiple SKUs varying in power.

See, here is the problem with your reasoning: it requires a fundamental misunderstanding of how console games are made vs. PC games. You just see PC as "the right way" and ignore all of the benefits of owning and developing for one specific hardware specification. By creating multiple versions of the same console you do several things:

You add confusion to a consumer base that is mostly looking for an easy and affordable solution to gaming. Prospective buyers now have to wonder just how compatible their games will be and whether they're at a disadvantage in multiplayer because they own the less powerful version of a console. Currently, when you buy a PS4, that is the only PS4 you need to buy, and you know for a fact that you will have the optimal experience on that console for all PS4 games that have been or will be released. No worrying about anything else. You are taking that away from consumers, and that will piss them off greatly.

Now add developers. When developers create console engines, they optimize them, and continue to optimize them, for that specific hardware spec: "coding to the metal", as you said. However, remember that any changes to that engine to accommodate extra hardware resources add bloat, which means that games on the lower spec will fundamentally not run as well as they would on an engine optimized for that specific hardware alone. This is a huge advantage of consoles that you are effectively removing. It's the main reason why a console will run a game better than a PC with the exact same specs; PC engines are a lot more bloated than console engines as a result, and that will hold as long as consoles have one specific hardware specification. While it's true that most third-party engines aren't as "to the metal" as first-party ones, the fact remains that any subsequent engine changes will make these engines far more bloated than their earlier counterparts.
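To illustrate the kind of bloat I mean, here's a hypothetical sketch (the tiers and shadow settings are made up): with one fixed spec, the function below collapses to a single path the team can tune exactly; every extra SKU adds a branch, alternate assets, and alternate code that every configuration has to carry and maintain.

```cpp
#include <cstdio>

// Hypothetical hardware tiers; illustrative only, not real SKU names.
enum class HwTier { Base, Mid, High };

// One tier = one straight-line path. Each additional tier adds a
// branch here and in every other hot path like it across the engine.
void ConfigureShadows(HwTier tier) {
    switch (tier) {
        case HwTier::Base: std::puts("2K shadow atlas, 1 cascade");        break;
        case HwTier::Mid:  std::puts("4K shadow atlas, 2 cascades");       break;
        case HwTier::High: std::puts("4K shadow atlas, 4 cascades + PCF"); break;
    }
}

int main() {
    for (HwTier t : {HwTier::Base, HwTier::Mid, HwTier::High})
        ConfigureShadows(t);
}
```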

On top of that, console game developers currently optimize things like AI, 3D models, animation rigs, lighting effects etc. to run within a specific resource budget. With very few exceptions they always, and I do mean always, develop these things to the lowest common denominator: the weakest console. That's why when you see games on PS4 and X1, the main difference between them isn't shaders or higher-res models; the biggest difference is almost always frame rate. That's because the rendering process is really easy to scale. Almost everything else does not work on a "slider" like it does in PC games, because it takes far more resources to do it that way.
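A toy sketch of that asymmetry; the frame budget, GPU timing and NPC count below are assumed numbers, not from any real game:

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    // Rendering is a continuous knob: if the GPU is over budget,
    // render fewer pixels and the same frame still ships.
    const double frameBudgetMs = 33.3;   // 30 fps target (assumed)
    const double measuredGpuMs = 40.0;   // assumed: over budget
    double scale = std::min(1.0, frameBudgetMs / measuredGpuMs);
    std::printf("resolution scale: %.0f%%\n", scale * 100.0);

    // An AI/animation budget is not a knob: fewer or dumber NPCs
    // changes the game itself, so it is authored once against the
    // weakest console and ships identically on every SKU.
    const int npcBudget = 50;            // assumed design constant
    std::printf("NPC budget: %d on every SKU (not a slider)\n", npcBudget);
}
```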

Also consider - PC games don't have any kind of certification process. Well, besides Steam, but even then the certification process is a joke compared to consoles. With consoles, games have to be certified by Sony, MS or Nintendo to meet a minimum standard before they can be printed on discs or sold on the digital stores (I know some games still come out buggy as shit, but that's not the point). The point is, because of this, there is significantly more testing that needs to be done on console games. Every hardware specification has to be tested. During development, many competent studios require that every single change submitted to the mainline of the application pass a battery of automated tests before it can go in. On the last multi-platform game I worked on (PS4/X1), spending 5 minutes fixing a typo in a menu resulted in 45 minutes of testing before your change could be submitted. If both Sony and MS released ".5" versions of the PS4 and X1, you would effectively be doubling the time this process takes for every single change that is submitted into the system. Every one.

On top of that, in the final testing stages of a game (after it has "gone gold"), I have seen testers with an array of literally every type of PS3 with different internals Sony has ever released, running the game on each just to make sure that there are no hardware-specific bugs. Those bugs are harder to find, which is why you will occasionally hear of a game that crashes if you run it on X model of PS3. There is absolutely zero doubt that making this change would increase development costs for every single PS4 game developer.
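As a back-of-the-envelope illustration of that doubling, here's a tiny sketch using the 45-minute figure from my example above; the even per-spec split is an assumption for illustration:

```cpp
#include <cstdio>

int main() {
    // 45 minutes of test gating for a 2-platform project (PS4/X1),
    // per the example above; assume the cost splits evenly per spec.
    const double minutesFor2Specs = 45.0;
    const double perSpec = minutesFor2Specs / 2.0;
    for (int specs : {2, 3, 4}) {  // add a PS4.5, then an X1.5
        std::printf("%d hardware specs: ~%.0f min of testing per submitted change\n",
                    specs, perSpec * specs);
    }
}
```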

So you're right, PS4 games could adopt the PC model of game development with "sliders" and shit. But that comes at the cost of poorer-performing games. It comes at the cost of increased development costs for everyone developing and maintaining a game engine. It comes at an increased cost to game developers who now have to test for extra hardware specifications. It comes at the cost of confusing consumers who just want the simplest gaming experience possible. It comes at the cost to Sony of developing the new hardware, dev kits, test kits, educating developers, etc. There are so many added costs you are completely glossing over.

It is not as simple as you think it is.

All for what? So that the less than 5% of HDTV owners with a 4K-capable set might buy a PS4.5, assuming they don't already own a PS4 and assuming they feel the difference is significant enough to upgrade. Seems like a huge risk for such a minuscule reward.

The day consoles effectively turn themselves into locked-down PCs is the day the vast majority of their consumer base will abandon them. People play on consoles because they want a "pick up and play" gaming experience. Any complication to that pisses that consumer base off.