Mr Puggsly said:
potato_hamster said:

It might hypothetically be that simple in your mind, but in reality it isn't. It's tons of additional work for developers to modify their engines and optimize their games for different specs when there is absolutely zero reason to think that additional work will increase profits. It's a waste of time and money for a development perspective. That's what this man is saying. This is what I have been saying based on my own experience making console video games.

And people here still think "it can be that simple". No it can't. Not without a significant amount of work that no one wants to pay for.

Look at the New 3DS, where we already see developers utilizing the upgraded specs for performance and graphical improvements. http://www.eurogamer.net/articles/digitalfoundry-2016-nintendo-3ds-vs-new-3ds-face-off

In theory, the primary focus is optimizing the game for the original CPU and GPU, as developers do now. For new specs running the same architecture, they should be able to raise graphics settings without ruining performance, kinda like a PC game. Perhaps I am oversimplifying the process, but we already see it happening with the New 3DS.

If for some reason utilizing the new specs is too difficult (expensive) for some developers, they can simply optimize their game for the old specs so the entire audience has access to their game. Many indie games, for example, don't need the extra power unless they wanna push 4K or something like that.

Maybe I make it seem too simple, but you're making this a bigger problem than it actually is.

You want to use the New 3DS as an example? Really? A device that has been out for over a year, where less than 5% of the games released since launch actually take advantage of the hardware, and most of the ones that do are first-party Nintendo or Nintendo-published games? The New 3DS is a perfect example of why this concept doesn't work!

See that part where you mention how it's too expensive for some developers, so they'll just develop for the old spec? That's the problem! That's exactly what will happen for about 95% of games, because why bother putting in the extra time and effort when there's nothing to gain? This makes owning hardware with additional processing power completely and utterly pointless for 95% of users, as it quite literally offers the exact same experience as the lower spec.

So why should Sony, MS, or anyone bother in the first place? This is why, if there is a "PS4K", it probably won't offer additional processing power for games in the way you imagine - because it makes no sense. It'll probably play 4K Blu-rays, though.