Chrkeller said: I don't agree with the diminishing returns argument. There is massive improvement still to be had; the problem is that it requires complementary hardware and is expensive. The one aspect where the PS5 crushes the PS4 is lighting, but without an OLED the difference is somewhat negated. 120 Hz is superb, but it requires a TV that supports 120 Hz. The list goes on and on.
I just upgraded to an RTX 4090, and it slaughters the PS5. But again, it requires a good TV/monitor and of course isn't cheap. So diminishing returns exist, depending on how they're defined, but graphics can get far better than what's seen on consoles.
We went from one generation where experiences were mostly 60fps (PS2) to one where they were barely hitting 30fps (PS3/360), and the general gaming public didn't really bat an eyelid at the new 30fps baseline because they were wowed by the overall graphical presentation. There was no Digital Foundry to rally people around the numbered comparisons. Things like 120 Hz are simply not going to be meaningful to most gamers.
The differences you reference are really tech-enthusiast ones; the average person doesn't even know whether their TV has the awful Motion Plus (forced motion smoothing) switched on. What makes a difference to people are worlds, stories, characters, and gameplay that wouldn't have been possible on prior hardware, and we're really getting to a point where that rarely applies to new games.
This kind of comparison could never have been made in the past between one of the best-looking games, over a year into a new generation, and another version of the same game on prior hardware.
4K for sure makes a difference, but the essence of the visuals and what they're able to achieve are the same.