Tarumon said:
1080p is enough, more than enough to display graphics the eyes can appreciate as high def, just like 720p is lifelike in my example of Discovery Channel programs. You start going into higher resolution not because the eyes need it; it's the programmers who need it to fool the eyes, and the polygons, shading, light sources, anti-aliasing, and distortions are extremely resource-intensive computing processes. Avatar is not a video game where you can navigate around without lag and actually perform actions that cause rendering and re-rendering of all the environmental variables. The box we buy that is the console just doesn't have what it takes to do that. If my tone in my initial pass upset you, I apologize; I edited it, but you replied before my edit came out.


If you want to stick to Avatar, go back in time a little and look at the CGI movies of yesteryear alongside the game graphics of the same era, then compare that gap with where game graphics stand against that old CGI today. Avatar is quite an extreme example, but the point behind it is that game graphics manage to catch up to CGI quality over time, even though CGI keeps progressing. The point still stands.

Now, when it comes to resolution, I'm not quite sure what you're trying to say, or why you think my post deserves to be argued with. Resolution is more taxing on the hardware than almost anything else.

Case in point: I can run Witcher 2 at its highest graphics settings at 1360x768. My machine is roughly 3 years old and was built for gaming back then, for a moderate price. Now, if I run it with the same settings but at 1080p, I have to start lowering some of the graphics options because the frame rate just can't keep up. I prefer to stick to the lower resolution, because all the effects are consistent with the resolution I use. Whereas if I use 1080p, the assets and effects take a bigger hit, because I have to scale back their quality to keep the frame rate remotely smooth. By doing so, I end up with a less consistent look throughout. The verdict being that the game looks better overall at a lower resolution, even if some of the edges may not appear as clean.
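
To put a rough number on that, here's a quick back-of-the-envelope sketch (plain Python, nothing official) assuming frame cost is dominated by per-pixel shading, which is an oversimplification since geometry and CPU-side work don't scale with resolution. The 60 fps baseline is purely hypothetical, not a measurement from my machine.

```python
# Back-of-the-envelope: how much more per-frame work 1080p asks of the GPU
# compared to 1360x768, assuming cost scales roughly with pixel count.
# (Only holds for fill-rate/shading-bound work; geometry and CPU work don't
# care about resolution, so the real hit is usually smaller than this.)

def pixels(width, height):
    return width * height

low = pixels(1360, 768)       # 1,044,480 pixels per frame
full_hd = pixels(1920, 1080)  # 2,073,600 pixels per frame

ratio = full_hd / low
print(f"1360x768  -> {low:,} pixels per frame")
print(f"1920x1080 -> {full_hd:,} pixels per frame")
print(f"1080p shades roughly {ratio:.2f}x as many pixels per frame")

# Hypothetical illustration: if a card holds 60 fps at 1360x768 and the game
# is mostly shading-bound, the naive estimate at 1080p is:
fps_low = 60
print(f"~{fps_low / ratio:.0f} fps at 1080p under that (oversimplified) assumption")
```

Roughly twice the pixels per frame, so something has to give: either the frame rate or the quality of the effects layered on top of those pixels.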

The way you're talking seems to imply that if the resolution of games doesn't go beyond 1080p, then graphics will stall and won't improve in quality even if the hardware gets better. I may be wrong, but that's what I got from the way you wrote this; it reads like you're talking about something you don't fully grasp. In any case, I'll just reiterate that a higher resolution isn't the only way to improve graphics, and is in fact more demanding on the hardware than most other graphical processes.

I already explained what could still be improved greatly even if the resolution remains the same. I don't see why you'd try to refute what I said, to be honest.

My case isn't about Next Gen and how powerful it's gonna be. My point revolves only around the 1080p resolution and what can still be pushed even if we stick with it for the foreseeable future.