
Forums - Gaming Discussion - When did you first notice that grafix had diminishing returns?

For me it was Mafia 2 in 2010. The game opens with a WW2 sequence where you're running through some historic building or something, and there was so much detail (for the time). I remember thinking: damn, they put so much work into this just to have the player run through it in mere seconds, with no time to appreciate all the little details.

In those years (2009-2011) my brain had to adjust to modern games because there were so many little details everywhere. I wanted to stop at every table, for example, to check whether there was anything useful on it or if it was all junk.

Now I've recently upgraded my grafix card to play everything on ultra at 1440p, but I keep coming back to the PS3. I'm currently playing Uncharted 2, which is 10 years old now but still looks so good.


Deus Ex (2000) - a game that pushes the boundaries of what the video game medium is capable of to a degree unmatched to this very day.


I haven't.

You'll need to tell me what else the resources would be used for in order for me to determine whether spending resources on improving graphics is the best use of them.

It first happened around the end of the sixth gen. I didn't think early Xbox 360 games looked better than RE4. Of course, I was wrong...

Then it happened again at the end of the 7th gen... and continued through the 8th gen. Games look better than ever, but that "wow" just doesn't happen much anymore. Newer hardware can do things older hardware can't, but it FEELS like everything we get now could have been done on Xbox 360/PS3 with some cuts. Those games still look good to me.

Twitter: @d21lewis

Well, I have certainly noticed diminishing returns, but there's still enough room to improve.
The leap from early 7th gen to now is gigantic. Even something that was once considered a good-looking game from the middle of the 7th gen (Resident Evil 5) now looks pretty shitty, even though it has seen a remaster.
I only mention it because I'm playing the remaster right now. The character models look acceptable, but the effects and environments are pretty poor.


Somewhere during the PS3/360 era. Naturally, graphics are still constantly improving, but since then there haven't really been the kind of huge leaps that happened from PS2 to PS3, or NES to SNES, and so on. The generational differences aren't as pronounced as before, so even though I still appreciate the improvements developers are constantly making, it's quite rare that a game will blow me away with just its graphics anymore.

Going from Xbox to 360 wasn't a very big leap. At least not in the early years. SNES blew away NES. N64 blew away SNES. PS2 blew away PS1. 360 did not blow away OG Xbox. PS3 blew away PS2, but PS2 was a very low bar. Hell, OG Xbox blew PS2 away.

Wii, Xbox 360, and PS3 games look like shit these days.

 "I think people should define the word crap" - Kirby007

Join the Prediction League

Instead of seeking to convince others, we can be open to changing our own minds, and seek out information that contradicts our own steadfast point of view. Maybe it’ll turn out that those who disagree with you actually have a solid grasp of the facts. There’s a slight possibility that, after all, you’re the one who’s wrong.

I would say the transition from PS3 to PS4.

In all fairness, it takes more horsepower to improve graphics than to improve, say, audio or gameplay mechanics. Storytelling, of course, remains more or less consistent, though it varies with audience and pop culture.

Personally, I first noticed it in the transition from 8-bit to 16-bit (yup, old-timer here), when scaling and 3D rotation were introduced. Still, to this day I am extremely impressed and in complete awe standing in front of a UHD 4K set. Yes, that includes PC graphics. And I still retain a soft spot for 3D formats.