LegitHyperbole said:
They have to course correct and bring down production costs: smaller teams, less graphical-detail nonsense, until AI can reduce the cost of that significantly enough to make it worth it. If they put the price on the players, they risk crashing the AAA industry, and indies aren't enough to sell systems. Nintendo and indies on PC will be all that's left after the implosion. Some of the most popular games of the past decade have not been graphically intensive: go back as far as Minecraft, or Among Us and Palworld more recently, and none of them launched anywhere near $70.
I personally think raising prices and hiring more staff (or higher-quality staff at higher salaries) to share the workload is the course correction that's needed. Reducing staff will just lead to more crunch time and less happy employees, and it will erode the talent pipeline over the medium term. Pro-worker policies are the antidote to this.
Also, graphics alone aren't why games are more expensive this decade; middleware has proliferated throughout most of the industry and helped control costs. Asset production is the biggest expense, and it exists just as much for developers who choose more stylized presentations. Almost all of Nintendo's studios, for example, have expanded significantly over the last decade as average asset quality (which, again, is only partly dependent on graphical fidelity) has improved.
I don't think the video game industry will crash over an $80-$100 price point. Again, gamers were paying that much (inflation-adjusted) in the past and the industry was fine. There is also heavy price discrimination in the industry, where prices drop on a schedule or games land on subscription services, and that likely won't change.
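As a rough sanity check on "paying that much in the past," here's a minimal back-of-the-envelope sketch in Python. The CPI values are approximate BLS CPI-U annual averages, and the $70 figure is assumed as a typical mid-90s cartridge MSRP, so treat the output as ballpark only:

```python
# Hedged sketch: CPI figures are approximate annual CPI-U averages,
# and $70 is an assumed typical mid-90s SNES cartridge MSRP.
cpi_1995, cpi_2024 = 152.4, 313.7
price_1995 = 70.0
adjusted = price_1995 * (cpi_2024 / cpi_1995)
print(f"${price_1995:.0f} in 1995 is roughly ${adjusted:.0f} in 2024 dollars")
# -> roughly $144, well above today's proposed $80-$100 range
```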
What will crash the game industry (and what has crashed it in the past) is poor-quality titles and a lack of creative innovation. The microtransaction model and the long development times that currently plague the industry are more likely to cause that than inflation-tracking price increases are.