
Resolution of games. Seriously needs an explanation.

The image resolution of games needs a serious explanation from someone. Remedy are the first dev to actually tell us that all the different aspects of a game vary greatly in resolution.

I'm guessing that normally the native res is determined by the geometry, i.e. the actual hard edges of objects, and that's it. This is quite bad, actually. Everything else that makes up the image could be 240p and no one would know until the game is out that the image looks atrocious.
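
From what I can tell, that's basically how the pixel counters get their numbers too: they count the stair-steps along a hard geometry edge after the image has been upscaled. Here's a rough numpy sketch of the idea (the 1120x720 native size is just made up):

```python
import numpy as np

# Toy example of "pixel counting": recover a made-up 1120-wide native
# framebuffer from the stair-steps of a hard edge after upscaling to 1280.
native_w, out_w, h = 1120, 1280, 720

# Native buffer with a hard, nearly vertical edge that drifts one native
# pixel to the right every 4 rows.
native = np.zeros((h, native_w), dtype=np.uint8)
for y in range(h):
    native[y, 100 + y // 4:] = 255

# Nearest-neighbour horizontal upscale to 1280 (roughly what the console's
# scaler does before the image reaches the TV).
upscaled = native[:, np.arange(out_w) * native_w // out_w]

# Pixel counting: find the edge on every row, count the stair-steps, and
# compare with the horizontal distance the edge covers in output pixels.
edge_x = upscaled.argmax(axis=1)            # first lit pixel per row
steps = np.count_nonzero(np.diff(edge_x))   # one step per native column crossed
span = edge_x[-1] - edge_x[0]               # same distance, in output pixels

print(round(out_w * steps / span))          # ~1120, not 1280
```

So the number you get back is the resolution of the geometry edges, which is exactly why everything else in the frame could be lower res without the count picking it up.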

Doing a bit of digging, I hooked up VT3 on my TV. Looking at it now, it's so obvious: the geometry lines look great, but the shoes, shirts, hair and grass look shifty as hell. The geometry of the image looks great, but the effects and textures look like Xbox 1.

Remedy talk about 50-odd different image elements that have differing resolutions. I'm guessing shadows are one, as some games have very low-res shadows where you can clearly see major res drops.

Personally, I would like an average of all the image resolutions to be the overall resolution count on the box, not just the geometry, as that's only about 1/50th of the image. No?

I have left a message with Remedy asking for a list, if possible, so as to start understanding this more.

If anyone here has a lot of knowledge about this (but for some reason has stayed quiet and never let us know), could you please help me understand why devs knowingly hid this info before?

Personally, I would gladly take 540p if the resolution of everything else in the image is upped. The overall look would be much better.




higher resolution = higher def lol
in all seriousness, people just want the best graphics, so they buy expensive TVs/games/movies so stuff looks better



"After you win, son, I feel like going for a ride on your bike, haha." ~Doc Louis (Punch Out Wii)

 

 

At the end of the day, I don't need a number on the back of the box to tell me what looks better on my TV. When I pop in Call of Duty: MW2 it looks good but not great, even though it says 1080p on the box; when I pop in Uncharted 2 it looks amazing, even though it says 720p on the box. So it's whatever for the whole Alan Wake thing. It doesn't matter what it says on the box, the first 10 minutes or whatever look awesome... I'll let me and my TV do the judging of graphics.



"even a dead god still dreams"

 

The screen resolution is what it is; nothing that goes into creating a frame will change that.

http://www.eurogamer.net/articles/digitalfoundry-alanwake-sub-hd-blog-entry

...So, who is right - Remedy or the pixel counters? Perhaps the most crucial thing is that there is nothing in Maki's carefully worded statement that is at odds with what the pixel counters are saying. Native resolution of the actual framebuffer is never mentioned. That metric is indeed just one element in overall image quality, but it is also one of the most important. Remedy's argument is very similar to the one put forward by Bungie in the wake of Halo 3 being revealed as running at 640p. The bottom line there is that there's little doubt that the Master Chief epic is sub-HD, and would look significantly improved running at native 720p - indeed, the team's own shots confirm that.

Maki is quite right to point out that individual elements of the image operate at their own individual resolutions, but in most cases the opaque geometry usually operates at 720p. Killzone 2 has a 640x360-sized buffer for particles. Conversely, some of the textures on Kratos in God of War III are 2048x2048 in size, but both games are obviously 720p: no-one claims that these games are 360p or 2048p.

Moving on from that, when we select custom resolutions in PC titles, opaque geometry is the key metric being used to define the size of the framebuffer. It's the amount of pixels used to create the image: higher-resolution shadowmaps or textures can't change that, although they do of course play their own part in overall image quality. Regardless, it's also the case that going lower than 720p usually results in scaling artifacts (most noticeable on high detail and edges) and a blurrier image overall.

...

-----------------------------------------------------------------------------

Basically, the actual resolution is easy to comprehend, but that doesn't mean the components of a frame can't be of higher detail than what is expected of something rendered at 720p.
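
To picture what that means in practice, here's a toy numpy sketch (not any real engine's code) of a 720p opaque layer with a 640x360 particle buffer, like the Killzone 2 example above, upsampled and blended on top. The framebuffer stays 720p no matter what resolution the particle layer was drawn at:

```python
import numpy as np

# The opaque geometry defines the size of the framebuffer; other elements
# (here a particle layer) can be rendered smaller and upsampled before
# being composited. All buffers are random stand-ins for rendered output.

FRAME_W, FRAME_H = 1280, 720        # opaque geometry / framebuffer resolution
PART_W, PART_H = 640, 360           # quarter-resolution particle buffer

opaque_rgb = np.random.rand(FRAME_H, FRAME_W, 3)      # 720p colour
particles_rgba = np.random.rand(PART_H, PART_W, 4)    # low-res particles + alpha

# Nearest-neighbour upsample of the particle buffer to framebuffer size
# (a real engine would use bilinear filtering to hide the blockiness).
ys = np.arange(FRAME_H) * PART_H // FRAME_H
xs = np.arange(FRAME_W) * PART_W // FRAME_W
particles_up = particles_rgba[ys[:, None], xs]

# Alpha-blend the upsampled particles over the opaque layer.
alpha = particles_up[..., 3:4]
frame = particles_up[..., :3] * alpha + opaque_rgb * (1.0 - alpha)

print(frame.shape)   # (720, 1280, 3) -- the framebuffer is still 720p
```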



The chase for resolution comes from PC gamers...or console gamers thinking that they are just as cool as PC gamers and that higher res means better graphics.

In today's console world, however, with all the upscaling and elements involved, it really doesn't matter.

I mean, rent Deadly Premonition... that game uses some 100p textures... but it is 720p resolution. And regardless of everything, the game is still awesome... :)




As the DF feature pointed out, while resolution is very important (and the higher the better), it's not the be-all and end-all. If the developer makes, say, higher-fidelity textures to hide a smaller resolution, that hopefully negates some of the loss in quality.



Because all engines are so different, it would be very difficult to come up with a fair metric.

Take shadows. It's really impossible to measure shadow fidelity unless it's obviously bad-looking, or the devs let us in on it, because shadow mapping is a texture-space operation. Because the shadows are textured across geometry at angles that are typically not the same as the viewer's, you really can't pixel-count shit.
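
To make that concrete, here's a bare-bones sketch of a shadow lookup (the 1024 map size and the light matrix are made-up numbers, and real engines add filtering and depth bias on top). Notice that the screen resolution never appears anywhere in it:

```python
import numpy as np

# Shadow mapping happens in the light's texture space: a pixel's world
# position is projected into the light's view and the shadow map is
# sampled there. Map size and light setup are invented for the example.
SHADOW_MAP_SIZE = 1024
shadow_map = np.random.rand(SHADOW_MAP_SIZE, SHADOW_MAP_SIZE)  # depths seen from the light

def in_shadow(world_pos, light_matrix):
    # Project the point into the light's clip space.
    p = light_matrix @ np.append(world_pos, 1.0)
    ndc = p[:3] / p[3]
    # Map [-1, 1] to shadow-map texel coordinates.
    u = int((ndc[0] * 0.5 + 0.5) * (SHADOW_MAP_SIZE - 1))
    v = int((ndc[1] * 0.5 + 0.5) * (SHADOW_MAP_SIZE - 1))
    # Shadowed if something closer to the light was already recorded there.
    return ndc[2] > shadow_map[v, u]

# A crude orthographic "sun" covering a 50-unit box.
light = np.diag([2 / 50, 2 / 50, 2 / 50, 1.0])
print(in_shadow(np.array([3.0, 1.5, -10.0]), light))

# How many screen pixels one shadow texel ends up covering depends on the
# surface's distance and angle to the camera, so there is no single
# "shadow resolution" you could count off a screenshot.
```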

I just think of the effects available... What about depth of field? Some devs might be using 4 depth samples, others 8. Some games I have seen seem to be using a nasty block filter for DOF, while others must be using a nice Gaussian filter (it's so hard to tell).
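
For what it's worth, the difference between those two filters is easy to show on a single scanline. A tiny sketch with arbitrary kernel sizes (nothing to do with any particular game's DOF):

```python
import numpy as np

# Blurring a hard edge with a box ("block") filter versus a Gaussian.
def box_kernel(radius):
    k = np.ones(2 * radius + 1)
    return k / k.sum()

def gaussian_kernel(radius, sigma):
    x = np.arange(-radius, radius + 1)
    k = np.exp(-(x * x) / (2.0 * sigma * sigma))
    return k / k.sum()

# One scanline containing a hard edge.
scanline = np.concatenate([np.zeros(16), np.ones(16)])

box_blur = np.convolve(scanline, box_kernel(4), mode="same")
gauss_blur = np.convolve(scanline, gaussian_kernel(4, 2.0), mode="same")

# The box filter turns the edge into a straight ramp (which reads as blocky
# in 2D); the Gaussian gives a smooth falloff.
print(np.round(box_blur[12:20], 2))
print(np.round(gauss_blur[12:20], 2))
```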

Another example I can think of is luminance extraction for dynamic exposure. I can guarantee you that the initial luminance extraction of the scene is done at half resolution in 99% of games with HDR implementations. Why? It saves a ton of operations, and it's more than good enough as the start towards calculating the average luminance.

Could you imagine people screaming "X game only calculates scene luminance at 360p!"? It's nuts.
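
Out of interest, that half-res luminance pass is only a few lines. A rough sketch, assuming a 720p HDR buffer (the Rec. 709 luminance weights are the standard ones, everything else is invented):

```python
import numpy as np

# Downsample the 720p HDR frame to 640x360, convert to luminance, and take
# the average log-luminance used to drive auto exposure.
hdr_frame = np.random.rand(720, 1280, 3) * 4.0   # stand-in HDR colour buffer

# 2x2 box downsample to half resolution (i.e. "360p") before doing any math.
half = hdr_frame.reshape(360, 2, 640, 2, 3).mean(axis=(1, 3))

# Per-pixel luminance, then the geometric mean used for exposure.
lum = half @ np.array([0.2126, 0.7152, 0.0722])
avg_log_lum = np.exp(np.mean(np.log(lum + 1e-4)))

print(half.shape[:2], round(float(avg_log_lum), 3))
```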

Now, geometry is measured for fidelity because, well, it's an easy target, really. Engines are all about making trade-offs to get what you want. If you want to make a world with high-res sunlit shadows, a big view distance and real-time reflections, then why should you have to do that at 720p? You don't have to, so the devs don't.



restored_lost said:
At the end of the day, I don't need a number on the back of the box to tell me what looks better on my TV. When I pop in Call of Duty: MW2 it looks good but not great, even though it says 1080p on the box; when I pop in Uncharted 2 it looks amazing, even though it says 720p on the box. So it's whatever for the whole Alan Wake thing. It doesn't matter what it says on the box, the first 10 minutes or whatever look awesome... I'll let me and my TV do the judging of graphics.

MW2 is sub-HD, and the 1080p sign on the box only means that it can be upscaled to 1080p, whereas Uncharted 2 only supports up to 720p.



Barozi said:
restored_lost said:
At the end of the day, I don't need a number on the back of the box to tell me what looks better on my TV. When I pop in Call of Duty: MW2 it looks good but not great, even though it says 1080p on the box; when I pop in Uncharted 2 it looks amazing, even though it says 720p on the box. So it's whatever for the whole Alan Wake thing. It doesn't matter what it says on the box, the first 10 minutes or whatever look awesome... I'll let me and my TV do the judging of graphics.

MW2 is sub-HD, and the 1080p sign on the box only means that it can be upscaled to 1080p, whereas Uncharted 2 only supports up to 720p.

Yes, I know that; that's the point I was trying to make: it doesn't really matter what it says on the box or how many pixels DF tells you are on the screen. The point I was trying to make was that you should look at the game with your own eyes: does it look awesome or not? Often you can tell without having to break out a calculator, unless you want to win a forum debate about which game looks better than game X. But if you have to go into that much detail, chances are both games look great.



"even a dead god still dreams"

 

_mevildan said:
Because all engines are so different, it would be very difficult to come up with a fair metric.

Take shadows. It's really impossible to measure shadow fidelity unless it's obviously bad-looking, or the devs let us in on it, because shadow mapping is a texture-space operation. Because the shadows are textured across geometry at angles that are typically not the same as the viewer's, you really can't pixel-count shit.

I just think of the effects available... What about depth of field? Some devs might be using 4 depth samples, others 8. Some games I have seen seem to be using a nasty block filter for DOF, while others must be using a nice Gaussian filter (it's so hard to tell).

Another example I can think of is luminance extraction for dynamic exposure. I can guarantee you that the initial luminance extraction of the scene is done at half resolution in 99% of games with HDR implementations. Why? It saves a ton of operations, and it's more than good enough as the start towards calculating the average luminance.

Could you imagine people screaming "X game only calculates scene luminance at 360p!"? It's nuts.

Now, geometry is measured for fidelity because, well, it's an easy target, really. Engines are all about making trade-offs to get what you want. If you want to make a world with high-res sunlit shadows, a big view distance and real-time reflections, then why should you have to do that at 720p? You don't have to, so the devs don't.

Thanks for this post. It was kinda what I was thinking, but you made it sound simple. I think I understand it a lot more now. Cheers. ;)