From a cost-benefit analysis perspective, I would say they're correct ...
Taking an average game and adding or polishing gameplay enough to produce a significant increase in quality might raise its development cost by 25% to 50% for most developers. That cost would be spread across hiring more developers, paying more to attract better talent, and giving the team more time to fix issues. In contrast, taking an average game and improving its graphics enough to produce a comparable increase in quality might push development costs to 2 to 4 times their current level, with most of that cost coming from producing more graphical assets at a higher quality level.
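To put rough, purely hypothetical numbers on that: if a game currently costs $20M to make, the gameplay route lands somewhere around $25M to $30M, while the graphics route lands somewhere around $40M to $80M, for what most players would perceive as a similar jump in quality.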
If you were a developer and spent twice as much to develop a game, you would get far more bang for your buck by putting that money into truly making the best gameplay experience across multiple platforms than by improving the graphics. A game that kept current HD-quality graphics (or a moderate enhancement of them) but really pushed gameplay forward by exploiting each console's unique user interface would be far more impressive, and would probably sell better, than a game that spent the same money pushing graphical hardware.
In that case, can you please answer Sal's post, because I'm not sure who's correct anymore. Is it still true, or has middleware gotten rid of this issue?
I keep hearing the same thing over and over, but never with a conclusion.