Hynad said:
Tarumon said:
1080p is enough, more than enough, to display graphics the eye perceives as high-def, just as 720p looks lifelike in my example of the Discovery Channel programs. You go to higher resolutions not because the eyes need it; it's the programmers who need it to fool the eyes, and the polygons, shading, light sources, anti-aliasing, and distortions are extremely resource-intensive computing processes. Avatar is not a video game where you can navigate around without lag and actually perform actions that force rendering and re-rendering of all the environmental variables. The box we buy, the console, just doesn't have what it takes to do that. If my tone in my initial pass upset you, I apologize; I edited it, but you replied before my edit went up.


If you want to stick with Avatar, go back in time a little and look at the CGI films of yesteryear, then look at the game graphics of that era. Compare them with what game graphics look like now next to that old CGI. Avatar is an extreme example, but the point behind it is that game graphics catch up to CGI quality over time, even as CGI keeps progressing. The point still stands.

Now, when it comes to resolution, I'm not quite sure what you're trying to say, or why you think my post needed arguing with. Resolution is more taxing on the hardware than almost anything else.

Case in point: I can run The Witcher 2 at its highest graphics settings at 1360x768. My machine is roughly three years old and was built for gaming back then, at a moderate price. If I run it with the same settings but at 1080p, I have to start lowering some of the graphics options because the frame rate just can't keep up. I prefer to stick to the lower resolution, because all the effects are consistent with the resolution I use. If I use 1080p, the assets and effects take a bigger hit, because I have to scale back their quality to keep the frame rate remotely smooth. By doing so, I end up with a less consistent look throughout. The verdict: the game looks better overall at the lower resolution, even if some of the edges aren't as clean.
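
To put a rough number on it (this is just pixel counts, not a benchmark of my machine):

```python
# Back-of-envelope: per-frame pixel counts at the two resolutions I use.
low_res = 1360 * 768      # 1,044,480 pixels per frame
full_hd = 1920 * 1080     # 2,073,600 pixels per frame

print(f"1080p shades {full_hd / low_res:.2f}x as many pixels")  # ~1.99x
```

Every per-pixel effect roughly doubles in cost with that jump, which is where my frame rate goes.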

The way you're talking seems to imply that if game resolutions don't go beyond 1080p, then graphics will stall and won't improve in quality even as the hardware gets better. I may be wrong, but that's what I got from the way you wrote this; it reads like you're talking about something you don't fully grasp. In any case, I'll just reiterate that a higher resolution isn't the only way to improve graphics, and is in fact more demanding on the hardware than most other graphical processes.

I already explained what could still be improved greatly even if the resolution remains the same. I don't see why you'd try to refute what I said, to be honest.

My case isn't about the next gen and how powerful it's going to be. My point revolves only around the 1080p resolution and what can still be pushed even if we stick with it for the foreseeable future.


I think you have a fundamental misunderstanding of computer graphics. What taxes the system is not the display resolution; 1920x1080 is all it takes to fill the pixels. It's the physics required to render believable images at that resolution that is taxing. It is much easier for a computer to draw the same 100 images on the same-sized screen with the smaller pixels of a higher-resolution display than to force a programmer to fake the same detail with bigger pixels. TV resolution refers to the number of physical pixels. Your computer has a hard time rendering at 1080p because your GPU and CPU can't support the number of calculations needed to render the image in time.
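
To illustrate what I mean, here's a toy frame-cost model with made-up numbers (not measurements from any real GPU): a fixed simulation cost per frame plus a shading cost per pixel. Piling on per-pixel effects taxes the system far more than merely having more pixels to fill.

```python
# Toy model (illustrative numbers only): frame time = simulation + shading.
def frame_ms(pixels, shader_ns_per_pixel, sim_ms=5.0):
    return sim_ms + pixels * shader_ns_per_pixel / 1e6  # ns -> ms

px_720p  = 1280 * 720    # 921,600 pixels
px_1080p = 1920 * 1080   # 2,073,600 pixels

# Cheap shading: the resolution bump costs little.
print(frame_ms(px_720p, 2),  frame_ms(px_1080p, 2))   # ~6.8 vs ~9.1 ms
# Heavy per-pixel physics: that is what blows the frame budget.
print(frame_ms(px_720p, 20), frame_ms(px_1080p, 20))  # ~23.4 vs ~46.5 ms
```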

Your verdict is based on a system that rendered the pictures, but not as they were intended. Systems that can easily produce 1080p with all the physics produce BETTER quality pictures with cleaner edges. There are artifacts you simply don't see, such as shimmering.

Video game = interactive computer-generated images. What is taxing is emulating reality with computer-generated images. A monitor with higher resolution just gives you finer dots to draw with; 1080p is far fewer pixels than most decent-sized computer monitors. Graphics have improved as GPUs and CPUs have improved. Current-gen consoles have trouble populating 720p screens at much beyond 30 frames per second. The limiting factor was what was under the hood.
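
Some simple pixel arithmetic (ignoring everything a real pipeline does beyond filling pixels) shows the scale of the jump the new consoles are being asked to make:

```python
# Raw pixels per second at each target, before any per-pixel work.
def pixels_per_second(width, height, fps):
    return width * height * fps

current_gen = pixels_per_second(1280, 720, 30)   # ~27.6 million px/s
next_target = pixels_per_second(1920, 1080, 60)  # ~124.4 million px/s
print(f"{next_target / current_gen:.1f}x the throughput")  # 4.5x
```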

We now have 1080p, and the Wii U is 1080p-capable. That doesn't mean that if everyone rendered at 1080p, the graphics would all be nearly the same quality. The physics can be infinitely taxing; layers and layers of effects can still be piled on. When I said Avatar was a bad example, I didn't argue with much else of what you said, but Avatar is CGI (with human artistry alongside the computer generation in spots). Just because the new Super Bug can do 0-62 in 1.8 seconds doesn't mean you can use it to prove that every four-wheeled car on a Scion budget has much room for improvement. No! Consoles are budget computers, just like the one you've got. There is no way in hell these consoles can get anywhere near Avatar, which is just a series of images displayed one after another with ZERO interaction to cause any recalculation.

Again, I'm OK with you insulting me with "as if I don't understand what I'm saying." But I do hope you're able to separate TV resolution from the quality of the computer-generated images. Understand that it wasn't the resolution that made your computer huff and puff, but the PHYSICS. Things wouldn't look as jagged if they were painted with a finer brush. The granularity is EASY. But making all those grains dance, shimmer, and reflect light is super hard and super taxing. If you lower the resolution, your PC finally catches up, but the physics is wasted if your display resolution is, by default, less granular than the detail the damn engine is trying to refine.

That's why I really do agree that 1080p is going to be the display resolution for a while, at the expense of 4K TVs. So I agree with you that resolution is not the only way to improve graphics (even though it's the easiest way), but I respectfully disagree about how much room there is within console budgets.