Soleron said:
Slimebeast said:
Soleron said:

...

Great post. It was an eye-opener a few years ago when I realized that the performance of desktops and laptops had suddenly reached the level where the average consumer's needs are fully satisfied. Same with HD TV. Only a very small niche will crave 4K.

But I disagree about video game graphics. I really can't even imagine the plateau there. Someone might say that Avatar-quality graphics are almost around the corner, but then I'll say I want fully destructible Avatar-quality graphics in huge worlds, with environments as interactive as real life.

So our GPUs really need to become 100 times (if not 1000 times) faster than a GTX 680 before we reach that level.
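For scale, here's a rough back-of-the-envelope sketch of what that demand means in raw shader throughput, assuming the GTX 680's published peak of roughly 3.1 TFLOPS FP32 (1536 CUDA cores at ~1006 MHz, 2 FLOPs per core per clock):

```python
# Back-of-the-envelope: what "100x to 1000x a GTX 680" means in raw FP32 throughput.
# The ~3.1 TFLOPS peak figure is assumed from the card's published specs.
GTX_680_TFLOPS = 1536 * 1.006e9 * 2 / 1e12  # ~3.09 TFLOPS

for factor in (100, 1000):
    print(f"{factor}x GTX 680 ~= {GTX_680_TFLOPS * factor:,.0f} TFLOPS")
# 100x  -> ~309 TFLOPS
# 1000x -> ~3,090 TFLOPS (about 3.1 PFLOPS)
```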

A plateau in what the average consumer requires of graphics, not in the best possible graphics.

As long as a game looks like a good 360/PS3 game (i.e. 2005-era PC graphics), no one will actually be dissuaded from buying it on those grounds.

Plus, seriously, game companies don't have the budget to make anything but the highest-selling games look better than that. Until we can scan real-world objects in and use raytracing on them, that won't change.

Yes, we are definitely on our way to that demand plateau, and with each console gen significantly fewer gamers are choosing graphics as the difference-maker in a purchase decision. On that I agree. Nintendo definitely made the correct analysis.

It depends on how you define a demand plateau. If, say, 75% of core gamers had their mouths watering last gen when they got a glimpse of current-gen graphics, and that number has fallen to only 40-50% of core gamers being wowed by the UE4 demo, then we're certainly trending toward a demand plateau. But I personally think there are still plenty of gamers who value improved graphics enough to justify the substantial leaps in hardware power that console gens bring.