HappySqurriel said:

Personally, I would expect the raw processing power jump to be smaller than the jump from the PS2 to the PS3, and for diminishing returns to make the perceived jump even smaller. If Sony and Microsoft push for many "technical" enhancements (1080p, 60fps, 3D, etc.), the perceived benefit could be pretty minimal.

Where I expect a big difference is in interactivity ...

Since the dawn of 3D video games, developers have limited interactivity in the environment to ensure that resources are directed toward producing the best visual results: static lighting and objects to limit real-time calculations, non-deformable objects to limit the polygon count, and so on. These are all tactics that restrict interactivity but improve graphics. The "classic" example I would use from this generation is the original FEAR, a game that was more static than the original DOOM and Wolfenstein games released more than a decade earlier.

The reason I see growth in interactivity coming is that diminishing returns will heavily reduce the graphical benefit of limiting it, which will change how that trade-off is weighed.

I disagree. I think the raw processing power will make an even bigger jump. Graphics technology, however, seemingly won't be able to advance to the same degree. The thing is, the amount of detail we are able to put into games needs to increase exponentially for us even to see a difference between generations. For instance, you will see a huge difference between a picture made with 100 pixels and one made with 1,000. However, the difference between a picture made with 1,000 pixels and one made with 2,000 may not be that noticeable, even though the absolute increase is larger than in the previous comparison. This is partly why PC gaming hasn't really changed all that much over the past few years: the best-looking PC game, arguably Crysis, is already four years old, but the hardware since then has gotten far better.
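The arithmetic behind that pixel comparison can be sketched in a few lines. This is just an illustration of the diminishing-returns point, using the hypothetical pixel counts from the example above:

```python
# Each jump adds more pixels in absolute terms, yet the relative
# (and roughly, perceived) improvement shrinks.
pixel_counts = [100, 1000, 2000]

for before, after in zip(pixel_counts, pixel_counts[1:]):
    absolute_gain = after - before      # raw number of extra pixels
    relative_gain = after / before      # multiplicative improvement
    print(f"{before} -> {after}: +{absolute_gain} pixels, {relative_gain:.1f}x more detail")

# 100 -> 1000 adds 900 pixels but is a 10x improvement;
# 1000 -> 2000 adds 1000 pixels yet is only a 2x improvement.
```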