sc94597 said:
This ignores the economics of video game development. A game is more than pure visual fidelity maximization; there are other parts of the game that need to be funded and developed. Since publishers and developers don't have infinite resources, there must be trade-offs, and sometimes those trade-offs mean that a game won't push boundaries in asset quality but might in other ways. The argument being made is that publishers have grown complacent and know that if they release the same game with better assets, it will still sell quite a bit, while developers who want to create something new suffer in a market where the standard is always the maximization of asset quality. The frustration is not so different from that currently found in the movie industry. Many moviegoers complain that superhero and other action films with simplistic, complacent stories get tons of resources, while more thoughtful, story-heavy films that might not have the best assets get the shaft.
That's really just a lot of supposition. If we had stopped at the Switch instead of the PS4, there would basically be no difference except in the visuals of high-end games. It wouldn't suddenly lead to more innovation, and it wouldn't have much of an effect on the budgets of most projects.
Last gen, developers ran into the limits of the PS3/360 very quickly. What was the gain in innovation there? Why would stopping at the Switch now, which is stronger than both, be any different?
If anything, developers said that those limits, especially in terms of memory, held them back from doing what they wanted with level design.