VanceIX said:
BenVTrigger said:
VanceIX said:
900p upscaled to 1080p is perfectly fine in my book as long as it allows devs to make the game better in other ways. Too many people are caught up in minuscule resolution details.
If you have some insanely great eyes that can make out the minute differences in every game, good for you. Most people don't. Most people in this thread probably haven't even played an Xbox One for any decent amount of time to tell how the games look in comparison.
|
Image quality is a massive difference maker. High resolutions and good AA are incredibly important to me.
And the myth that no one can see the difference in resolutions needs to stop. Once your eyes are accustomed to 1080p and higher resolutions, anything less, especially a massive drop like 720p, is instantly noticeable. It doesn't take a crazy eye for detail.
|
Did I say no one can see it? You can, but it is hard to see unless you are looking for the differences. 900p upscaled to 1080p looks great, amazing really, depending on the game. Did you think that TLOU was a shit game when it came out because it was 720p? Do you think Call of Duty games beat TLOU because they played in a higher native resolution? I doubt it. Resolution is a small aspect of a much bigger picture. Lighting, textures, physics, etc all play just as big a role in making a game look great.
When someone first saw TLOU on PS3, they said "damn, that game looks amazing!", not "damn, I can easily see that the game is 720p, they should have downgraded some of the textures and lighting to put it at 1080p!"
If a game can achieve 1080p without sacrificing other visual aspects, great. But it isn't the end of the world if a dev decides to prioritize other visual aspects over resolution.
|
People rightfully have higher expectations from consoles that are almost a decade newer and with outlandishly higher GPU horsepower. Not to mention that 1080p sets of 55"+ are commonplace now, something that wasn't true through most of last gen.
TLOU on PS3 was judged fairly: it was a PS3 title on a system with a potato for a GPU and 256MB of VRAM, and an architecture about as complex as trying to decipher inverted hieroglyphics while tripping your balls off. For that, the accomplishment was astonishing.
EXCELLENT programming for a AAA title on the 8GB PS4/XB1 should hit 1080p/30 easily while maintaining awesome levels of detail. Comparing games and standards between gens is a false equivalence. Otherwise, what will you get when 4K 80" TVs are common in gen9? "Oh, it's just great that we're still playing in 720p and 900p in 2023!"?
Progress should be demanded and welcomed. When I got SMW for SNES, I was incredibly happy to hear the massive upgrade in audio quality and see the vibrant, expanded color palette and large, expansive levels. When Gears of War came out, people could obviously tell that the PS2 and OG Xbox would have died trying to run it at 5fps.
We already have examples of superb technical achievement in 8th gen titles at 1080p.
And yes, it goes without saying that things like art direction, quality of the textures, level/game design are absolutely critical. It's not asking too much for that level of quality to also include native 1080p HD in freaking 2014.
Now some options would make this better for everyone. Frame-locked 60fps at 720p should be an option for many titles, since the XX?% of people who only have 720p TVs get nothing extra from a 1080p title. Upscaling sucks: it introduces artifacts, and depending on the chosen resolution, it can look WORSE than native 720p + GOOD AA. Titanfall is a great example. I've run it at 720p as a test with max AA on PC, and it looks waaaaaaaaay better than it did at XB1's 792p upscaled to 1080p on the same IPS 120hz 1440p display. But running the PC version at 1080p native with medium AA? Looks far better than the 720p + AA settings.
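For scale, the resolutions being argued over differ more in raw pixel count than the "p" labels suggest. A quick sketch (standard 16:9 widths assumed for each height; 1408x792 is the commonly reported figure for Titanfall on XB1):

```python
# Raw pixel counts for the resolutions debated above, compared
# against native 1080p. Widths assume a 16:9 aspect ratio, except
# 792p, where 1408x792 is the widely reported XB1 Titanfall figure.
resolutions = {
    "720p":  (1280, 720),
    "792p":  (1408, 792),
    "900p":  (1600, 900),
    "1080p": (1920, 1080),
}

full_hd = 1920 * 1080  # 2,073,600 pixels

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {w}x{h} = {pixels:,} px "
          f"({pixels / full_hd:.0%} of native 1080p)")
```

So 720p renders only about 44% of the pixels of 1080p, 792p about 54%, and 900p about 69%; the 900p-to-1080p gap is real, but nowhere near the more-than-doubling between 720p and 1080p.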