potato_hamster said:

See, here is the problem with your reasoning. It requires a fundamental misunderstanding of how console games are made vs. PC games. You just see PC as "the right way" and ignore all of the benefits of owning and developing for one specific hardware specification. By creating multiple versions of the same console you do several things:

You add confusion for a consumer base that is mostly looking for an easy and affordable gaming solution. Prospective buyers now have to wonder just how compatible their games will be, and whether they're at a disadvantage in multiplayer because they own the less powerful version of a console. Currently, when you buy a PS4, that is the only PS4 you need to buy to know for a fact that you will have the optimal experience for every PS4 game that has been released or ever will be. No worrying about anything else. You are taking that away from consumers, and that will piss them off greatly.

Now add developers. When developers create console engines they optimize them, and keep optimizing them, for that specific hardware spec. "Coding to the metal," as you said. However, any change to that engine to accommodate extra hardware resources adds bloat, which means games on the lower spec will fundamentally not run as well as they would on an engine optimized for that one hardware specification. This is a huge advantage of consoles that you are effectively removing. It is the main reason a console will run a game better than a PC with the exact same specs; PC engines are a lot more bloated than console engines as a result. This will always be the case as long as consoles run on one specific hardware specification. While it's true that most third-party engines aren't as "to the metal" as first-party ones, the fact remains that any subsequent engine changes will make these engines far more bloated than their earlier counterparts.
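To make the "bloat" point concrete, here is a minimal sketch (not from any real engine; all names, budgets and TFLOPS figures are made up for illustration): with one hardware target, a budget can be a hard-coded constant the whole team tunes against, while a second SKU forces a lookup or branch into the same path.

```python
# Hypothetical sketch of single-SKU vs. dual-SKU engine code paths.
# All constants and cost figures are invented for illustration only.

GPU_BUDGET_MS = 16.6  # single known GPU, single known frame budget

def frame_cost_single(draw_calls):
    # Cost model tuned against exactly one GPU; no capability checks.
    return draw_calls * 0.01

# With a second SKU, the same hot path must carry per-SKU data and a
# branch/lookup, and both variants must be profiled and maintained.
SKU_SPECS = {"base": 1.84, "pro": 4.2}  # illustrative TFLOPS numbers

def frame_cost_dual(draw_calls, sku):
    scale = SKU_SPECS["base"] / SKU_SPECS[sku]
    return draw_calls * 0.01 * scale
```

The base SKU pays the same cost either way, but the code servicing it is no longer the simplest possible code, which is roughly what the post means by bloat.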

On top of that, console game developers currently optimize things like AI, 3D models, animation rigs, lighting effects, etc. to run within a specific resource budget. With very few exceptions they always, and I do mean always, develop these things to the lowest common denominator: the weakest console. That's why when you see games for PS4 and X1, the main difference between them isn't shaders or higher-res models; the biggest difference is almost always frame rate. That's because the rendering process is really easy to scale. Almost everything else does not work on a "slider" like it does in PC games, because it takes far more resources to do it that way.
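A rough sketch of why rendering scales on a slider but AI does not (all numbers and function names are hypothetical): GPU cost tracks pixel count, so a resolution scale is a cheap knob, whereas agent counts and asset detail are authored decisions fixed against one budget.

```python
# Illustrative cost models only; not taken from any real engine.

def render_cost(pixel_scale):
    # Shading work tracks pixel count: half the resolution scale
    # means a quarter of the pixels. Trivial to expose as a slider.
    return pixel_scale ** 2

AI_BUDGET_MS = 2.0  # authored once, against the weakest console

def ai_cost(num_agents):
    # No cheap knob here: changing the agent count changes the game
    # design itself, so it is authored to one fixed budget.
    return num_agents * 0.1
```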

Also consider: PC games don't have any kind of certification process. Besides Steam, but even then the certification process is a joke compared to consoles. Console games have to be certified by Sony, MS and Nintendo to meet a minimum standard before they can be printed on discs and sold on their digital stores (I know some games still come out buggy as shit, but that's not the point). The point is that, because of this, significantly more testing needs to be done on console games. Every hardware specification has to be tested. During development, many competent studios require that every single change submitted to the mainline of the application pass a battery of automated tests before it can go in. On the last multi-platform game I worked on (PS4/X1), spending 5 minutes fixing a typo in a menu resulted in 45 minutes of testing before your change could be submitted. If both Sony and MS released ".5" versions of the PS4 and X1, you would effectively double the time this process takes for every single change submitted into the system. Every one.

On top of that, in the final testing stages of a game (after it has "gone gold"), I have seen testers line up an array of literally every type of PS3 with different internals Sony has ever released, and run the game on each just to make sure there are no hardware-specific bugs. These bugs are of course harder to find, which is why you will occasionally hear of a game that crashes if you run it on one specific model of PS3. There is absolutely zero doubt that making this change would increase development costs for every single PS4 game developer.
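The arithmetic behind "effectively doubling" can be sketched like this (the 45-minute total is from the account above; the per-target split and function names are my own hypothetical assumptions): the pre-submit matrix is the product of platforms and SKUs per platform, so adding a ".5" SKU on each platform doubles it.

```python
# Hypothetical model of a pre-submit test matrix. The 45-minute figure
# comes from the post; splitting it evenly across targets is an assumption.

MINUTES_PER_TARGET = 22.5  # assumed: 45 min total across 2 targets

def presubmit_minutes(platforms, skus_per_platform):
    # Every (platform, SKU) pair is a target the change must pass on.
    targets = platforms * skus_per_platform
    return targets * MINUTES_PER_TARGET

# PS4 + X1, one SKU each: the 45 minutes described for a typo fix.
# PS4 + X1, two SKUs each: 90 minutes for the same one-line change.
```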

So you're right, PS4 games could adopt the PC model of game development with "sliders" and shit. But that comes at the cost of poorer-performing games. It comes at the cost of increased development costs for everyone developing and maintaining a game engine. It comes at an increased cost to game developers who now have to test extra hardware specifications. It comes at the cost of confusing consumers who just want the simplest game experience possible. It comes at the cost of Sony developing the new hardware, dev kits, test kits, educating developers, etc. There are so many added costs you are completely glossing over.

It is not as simple as you think it is.

All for what? So that the less than 5% of HDTV owners with a 4K-capable set might buy a PS4.5, assuming they don't already own a PS4 and assuming they feel the difference is significant enough to upgrade. Seems like a huge risk for such minuscule reward.

The day consoles effectively turn themselves into locked down PCs is the day the vast majority of that console's consumer base will abandon them. People play on consoles because they want a "pick up and play" gaming experience. Any complications to that piss that consumer base off.


I do understand console vs. PC game development differences at some level. It is you who, through assumption, have concluded that I supposedly think PC is "the right way" or that I must completely misunderstand game development. What I have said is my opinion of what I think will happen, not necessarily what I would like or want to happen. So start off by understanding that your whole reasoning rests on that assumption of yours.

First of all, there are obvious downsides to a multiple-SKU model and I do understand that. However, I think it is fuckin naive to believe that things only ever change for the better with 0% negative aspects. Consumers will accept some drawbacks when things change. The PS4 put multiplayer behind a paywall, and nobody gives a shit anymore.

Where does the confusion come from? To make it work, it needs to be simple. Whether you own a PS4 or a PS4.5, you go to a store where there are PS4 games, you buy one, you plug and play. Every single PS4 game would work on both SKUs. The responsibility for balancing multiplayer is up to the developers, as it has always been, and if you look at PC, developers can make it work. SFV works.

Having 2-3 console SKUs similar to each other is not comparable to the hundreds of different configurations the PC has. The console engines would still be highly optimized compared to their PC counterparts. As long as the lower-spec SKU has the bigger active install base, developers would prioritize development for it and brute-force some extra effects, resolution and fps on the more powerful SKU. That would be the starting point, and eventually the development process would evolve so that games and engines are designed from the very start to anticipate more powerful hardware SKUs, making development more cost- and resource-effective.
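The "author for the base SKU, brute-force the rest" approach can be sketched as follows (SKU identifiers and all settings values are invented for illustration): the game is built once against the base SKU, and the stronger SKU only overrides the cheaply scalable parts.

```python
# Hypothetical settings tiering; SKU names and values are made up.

# Everything here is authored once, against the base SKU.
BASE_SETTINGS = {"resolution": (1920, 1080), "fps_cap": 30, "shadows": "medium"}

def settings_for(sku):
    settings = dict(BASE_SETTINGS)  # copy, so the base stays untouched
    if sku == "ps4.5":  # hypothetical stronger SKU
        # Only the scalable knobs change; gameplay, AI and assets do not.
        settings["resolution"] = (2560, 1440)
        settings["fps_cap"] = 60
        settings["shadows"] = "high"
    return settings
```

Note that any unrecognized SKU falls through to the base settings, which matches the claim that every PS4 game keeps working on both boxes.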

Of course there are costs involved for all parties, but compared to a total "reset" of the current model, this smoother, ongoing approach to hardware upgrades would decrease investment costs considerably. And yes, consoles are moving towards PC; that much should have been obvious when the change to x86 happened.


