Nintendo: next gen is about “improving gaming experience”, “wonderful graphics won’t help”

HappySqurriel said:

From a cost-benefit analysis perspective I would say they're correct ...

To take an average game and add/polish gameplay enough to see a significant increase in quality may increase the development cost by 25% to 50% for most developers. This cost would be spread between hiring more developers, paying them more to attract better talent, and giving them more time to fix issues in the game. In contrast, to take an average game and improve its graphics enough to see a significant increase in quality may increase development costs to 2 to 4 times their current level. Most of this cost would come from producing more graphical assets at a higher quality level.

If you were a developer and you spent twice as much to develop a game, you would get far more bang for your buck by spending that money truly making the best gameplay experience across multiple platforms than by improving the graphics. A game that maintained current HD-quality graphics (or a moderate enhancement of them) but really pushed the gameplay forward based on the unique user interface of each console would be far more impressive (and probably sell better) than a game that used that money to push graphical hardware.
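
To put rough, made-up numbers on that (a minimal back-of-envelope sketch; the baseline budget and the exact multipliers below are purely illustrative assumptions, not real figures):

```python
# Hypothetical budgets, just to make the multipliers in the post above concrete.
BASE_BUDGET = 20_000_000  # assumed cost of an "average" HD game (made-up figure)

gameplay_push = BASE_BUDGET * 1.5   # +50%: the top of the quoted 25-50% range
graphics_push = BASE_BUDGET * 3.0   # 3x: the middle of the quoted 2-4x range

print(f"Gameplay-focused budget: ${gameplay_push:,.0f}")
print(f"Graphics-focused budget: ${graphics_push:,.0f}")
print(f"The graphics push costs {graphics_push / gameplay_push:.1f}x as much.")
```

Obviously the real figures vary wildly from studio to studio; the point is only the relative size of the two multipliers.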

In that case, can you please answer Sal's post? Because I'm not sure who's correct anymore. Is it still true, or has middleware gotten rid of this issue?

I keep hearing the same thing over and over, but with no conclusion.




Amazing visuals only matter to graphics whores. Good-looking games are good for the eyes; fun games are great for the soul.
MGS4 was a beautiful game, but in my opinion MGS was ten times the fun and intrigue while looking less stunning.
That is the first game that comes to mind, but I know there are countless others to compare.
Nintendo are on the right track, as usual.



                                  Gaming Away Life Since 1985


happydolphin said:
HappySqurriel said:

You were right about that Zoo comment a while back. (That is all, guys. I won't spam, I promise.)



Unless the gameplay is good, graphics won't help.

That is true, and I 100% agree with Nintendo's philosophy of focusing on gameplay first.

That said, once they get the gameplay right, there is no excuse not to make it better by wrapping the experience in deservedly high production values for sound and graphics.

We don't pay for prototypes... we pay for polished final products that are up to industry production standards, and we do expect sound, graphics, etc. to keep improving on the technical side, even as we expect Nintendo to prioritize more timeless aspects like gameplay and art direction.



happydolphin said:
HappySqurriel said:

From a cost-benefit analysis perspective I would say they're correct ...

To take an average game and add/polish gameplay enough to see a significant increase in quality may increase the development cost by 25% to 50% for most developers. This cost would be spread between hiring more developers, paying them more to attract better talent, and giving them more time to fix issues in the game. In contrast, to take an average game and improve its graphics enough to see a significant increase in quality may increase development costs to 2 to 4 times their current level. Most of this cost would come from producing more graphical assets at a higher quality level.

If you were a developer and you spent twice as much to develop a game, you would get far more bang for your buck by spending that money truly making the best gameplay experience across multiple platforms than by improving the graphics. A game that maintained current HD-quality graphics (or a moderate enhancement of them) but really pushed the gameplay forward based on the unique user interface of each console would be far more impressive (and probably sell better) than a game that used that money to push graphical hardware.

In that case, can you please answer Sal's post? Because I'm not sure who's correct anymore. Is it still true, or has middleware gotten rid of this issue?

I keep hearing the same thing over and over, but with no conclusion.

Middleware reduces the amount of work programmers have to do to produce a game, but the vast majority of the work required to produce a video game is in creating the artistic assets ... It isn't just the increase in time it takes to create the core assets; you suddenly need far more "filler" assets, these need to be produced at a higher quality, and there is no middleware that generates this content.




"Gameplay first,graphics after".It 's always right
I have played many games with good graphics but poor gameplay this gen,I want to see the difference next gen.



HappySqurriel said:

Middleware reduces the amount of work programmers have to do to produce a game, but the vast majority of the work required to produce a video game is in creating the artistic assets ... It isn't just the increase in time it takes to create the core assets; you suddenly need far more "filler" assets, these need to be produced at a higher quality, and there is no middleware that generates this content.

But what about a game like RDR, which seems to make use of a lot of generic assets and is still a quality game?



Andrespetmonkey said:

Because improving how a largely VISUAL medium looks won't improve the experience.

He was pretty clear in saying that if the gameplay is bad, then wonderful graphics won't help, not that wonderful graphics don't improve the experience for great games.

The Avatar movie has great visuals, but it will never be a great game... because the gameplay is practically (or rather, actually) nonexistent. Graphics are one of those "all other things being equal" factors: better graphics won't make a game better than another game with better gameplay, but for two games with equally good gameplay, the one with better graphics will be the better game.

Iwata is arguing for more focus on making better gameplay, because graphics are already more than good enough, and improving them won't make anywhere near as much of an impact now.



happydolphin said:
HappySqurriel said:

Middleware reduces the amount of work programmers have to do to produce a game, but the vast majority of the work required to produce a video game is in creating the artistic assets ... It isn't just the increase in time it takes to create the core assets; you suddenly need far more "filler" assets, these need to be produced at a higher quality, and there is no middleware that generates this content.

But what about a game like RDR, which seems to make use of a lot of generic assets and is still a quality game?

There are ways that companies can reduce those costs, but using libraries of graphical assets would reduce the benefit of the enhanced graphics.



Aielyn said:
Andrespetmonkey said:

Because improving how a largely VISUAL medium looks won't improve the experience.

He was pretty clear in saying that if the gameplay is bad, then wonderful graphics won't help, not that wonderful graphics don't improve the experience for great games.

The Avatar movie has great visuals, but it will never be a great game... because the gameplay is practically (or rather, actually) nonexistent. Graphics are one of those "all other things being equal" factors: better graphics won't make a game better than another game with better gameplay, but for two games with equally good gameplay, the one with better graphics will be the better game.

Iwata is arguing for more focus on making better gameplay, because graphics are already more than good enough, and improving them won't make anywhere near as much of an impact now.

That's a good point. I guess my post is directed at the "who cares about graphics / graphics aren't important" camp rather than Iwata himself or his comments. Good catch.