Anyone else feel resolution is overrated?


I prefer...

Resolution over detail/effects: 23 (26.14%)
Detail/effects over resolution: 65 (73.86%)
Total: 88
AngryLittleAlchemist said:

This is one of the reasons why I kind of don't understand this topic, or rather the many replies in this thread. Old games look infinitely better at higher resolutions; sometimes resolutions they don't even support. 

Resolution is ALWAYS going to be one of those things that many people don't care about much in the present. Why? Because you're playing games that are already at an acceptable resolution, which were tailored for the standards of today. When that happens, of course people are going to say resolution is "overrated". They take it for granted, especially in an era where remasters allow us to look at games the same way we thought they looked back when we first played them. 

It really is an important factor. Not the most important but it's up there. 

Edit: Also next gen I want to see better AI mostly : ) 

Oh yeah. I forgot about AI since this was a mostly graphics-focused thread. I've found it amazing that AI hasn't advanced much over the past decade. Most of it's fairly basic stuff. It's pretty bad when F.E.A.R. still stands head and shoulders above nearly every AAA game out there today. That game came out 12 years ago, yet I haven't seen anything since that really made me take a step back and go "Whoa, that's some good AI." Maybe the closest thing to it was BioShock, whose enemies seemed to make decent use of their environment.
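F.E.A.R.'s combat AI is widely credited to goal-oriented action planning (GOAP), where enemies search for a sequence of actions that satisfies a goal instead of following a fixed script. Below is a toy Python sketch of that idea; the action names, world-state keys, and brute-force search are hypothetical simplifications, not the game's actual code.

# Toy GOAP-style planner: actions declare preconditions and effects, and the
# planner searches for an action sequence that turns the current world state
# into the goal state. Everything here is illustrative, not F.E.A.R.'s code.
from itertools import permutations

ACTIONS = {
    # name: (preconditions, effects)
    "take_cover":   ({"under_fire": True},                  {"in_cover": True}),
    "reload":       ({"in_cover": True},                    {"has_ammo": True}),
    "flank_target": ({"has_ammo": True, "in_cover": True},  {"has_flank": True}),
    "attack":       ({"has_ammo": True, "has_flank": True}, {"target_down": True}),
}

def satisfied(state, conditions):
    return all(state.get(k) == v for k, v in conditions.items())

def plan(state, goal):
    """Brute-force search over action orderings; real planners use A*."""
    for length in range(1, len(ACTIONS) + 1):
        for sequence in permutations(ACTIONS, length):
            current = dict(state)
            valid = True
            for name in sequence:
                preconditions, effects = ACTIONS[name]
                if not satisfied(current, preconditions):
                    valid = False
                    break
                current.update(effects)
            if valid and satisfied(current, goal):
                return list(sequence)
    return None

# An enemy caught in the open with an empty magazine "figures out"
# cover -> reload -> flank -> attack rather than running a canned script.
print(plan({"under_fire": True}, {"target_down": True}))

The emergent-feeling squad behaviour people remember presumably comes from replanning like this every time the world state changes.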



Shadow1980 said:
AngryLittleAlchemist said:

This is one of the reasons why I kind of don't understand this topic, or rather the many replies in this thread. Old games look infinitely better at higher resolutions; sometimes resolutions they don't even support. 

Resolution is ALWAYS going to be one of those things that many people don't care about much in the present. Why? Because you're playing games that are already at an acceptable resolution, which were tailored for the standards of today. When that happens, of course people are going to say resolution is "overrated". They take it for granted, especially in an era where remasters allow us to look at games the same way we thought they looked back when we first played them. 

It really is an important factor. Not the most important but it's up there. 

Edit: Also next gen I want to see better AI mostly : ) 

Oh yeah. I forgot about AI since this was a mostly graphics-focused thread. I've found it amazing that AI hasn't advanced much over the past decade. Most of it's fairly basic stuff. It's pretty bad when F.E.A.R. still stands head and shoulders above nearly every AAA game out there today. That game came out 12 years ago, yet I haven't seen anything since that really made me take a step back and go "Whoa, that's some good AI." Maybe the closest thing to it was BioShock, whose enemies seemed to make decent use of their environment.

Don't forget Killzone 2 and S.T.A.L.K.E.R. Shadow of Chernobyl. All three games stand above all others regarding AI.




1080p is still the magic number. I think it looks very clear, and with good effects and textures combined with 60fps it's stunning. 700-900p looks OK, but it's evident it's less sharp. The 1080p standard looks very good. GT Sport in 1080p looks stunning, and I'm also playing Rise of the Tomb Raider on PC at 1080p/60fps; it's a looker.
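For a sense of what those numbers mean in raw rendering work, here is a quick back-of-the-envelope pixel count comparison (standard 16:9 frame sizes assumed):

# Pixel counts for the resolutions mentioned in this thread, relative to
# 1080p. Standard 16:9 frame sizes assumed.
RESOLUTIONS = {
    "720p":  (1280, 720),
    "900p":  (1600, 900),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = RESOLUTIONS["1080p"][0] * RESOLUTIONS["1080p"][1]
for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name:>5}: {pixels:>9,} px ({pixels / base:.2f}x 1080p)")

# Prints roughly: 720p 0.44x, 900p 0.69x, 1080p 1.00x, 1440p 1.78x, 4K 4.00x.
# 4K means shading four times as many pixels as 1080p, which is GPU headroom
# that could otherwise go into effects and detail.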

...not much time to post on here anymore, used to be some good ol times on VGchartz...

Playstation Fan for life... PSN: Skeeuk - XBL: SkeeUK - Steam/Uplay/Origin: Skeeuk

Miss the VGCHARTZ of 2008 - 2013...

curl-6 said:
Mr Puggsly said:

RAM was certainly a limitation long term, especially on PS3 which had less flexibility in how RAM was used. Consider that 7th gen consoles had games like Crysis, Far Cry, Battlefield, Assassin's Creed, GTA, Elder Scrolls, even some MMOs, etc. The biggest RAM hog is often textures. Cross-gen 7th gen games had abysmal textures, likely because the game engines made less efficient use of RAM. But overall 7th gen consoles did amazing things with a small amount of RAM.

So I don't think RAM was the biggest limitation; if last gen had double the RAM then MOST games would have just had better textures. More often games struggled with performance or made significant visual compromises because of GPU limitations.

I will put it like this. If I could have chosen double the RAM or double the GPU power for last gen, I would have opted for double the GPU power, because more games would have utilized it better: better resolutions (which Alan Wake needs), performance, and higher quality effects. Last gen already had just enough RAM to make great open world games. Also, we now have consoles with significantly more RAM but the open world experiences haven't evolved much. You know what has improved significantly? Textures.

It's not as simple as that; RAM impacts a lot more than just textures. In open world games last gen, the limited RAM meant games had to be constantly shunting data in and out of memory. This not only resulted in a lot of pop-in and loading stalls, but also impacted performance because the CPU was bogged down by managing all this data streaming. These factors are a big part of why even top tier productions like Skyrim and GTA 5 could run so poorly.

It was also a big headache for developers; ask pretty much any dev who worked on PS3/360, and apart from the PS3's ridiculous architecture, they'll likely tell you that the hardest part was working with such extreme memory constraints.
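A rough sketch of the distance-based streaming curl-6 is describing here; the memory budget, chunk cost, and radius are made-up numbers rather than any particular engine's values, but the constant load/evict churn (and the CPU work it implies) is the mechanism in question.

# Sketch of distance-based world streaming under a hard memory budget: keep
# the chunks nearest the player resident, evict the farthest when over budget.
# All numbers are hypothetical.
MEMORY_BUDGET_MB = 256   # imagined slice of a 512 MB console's memory
CHUNK_COST_MB = 24       # imagined cost of one streamed world chunk
STREAM_RADIUS = 1        # chunks kept loaded around the player

def desired_chunks(player_chunk):
    px, py = player_chunk
    return {(px + dx, py + dy)
            for dx in range(-STREAM_RADIUS, STREAM_RADIUS + 1)
            for dy in range(-STREAM_RADIUS, STREAM_RADIUS + 1)}

def update_streaming(resident, player_chunk):
    """Return (to_load, to_evict) for the player's current position."""
    wanted = desired_chunks(player_chunk)
    to_load = wanted - resident
    to_evict = set()
    projected_mb = (len(resident) + len(to_load)) * CHUNK_COST_MB
    if projected_mb > MEMORY_BUDGET_MB:
        px, py = player_chunk
        # Evict the chunks farthest from the player first.
        candidates = sorted(resident - wanted,
                            key=lambda c: abs(c[0] - px) + abs(c[1] - py),
                            reverse=True)
        while projected_mb > MEMORY_BUDGET_MB and candidates:
            to_evict.add(candidates.pop(0))
            projected_mb -= CHUNK_COST_MB
    return to_load, to_evict

# Simulate driving across the map: every step triggers loads and evictions,
# which on a 7th-gen console also meant disc seeks, decompression and CPU time.
resident = set()
for step in range(6):
    to_load, to_evict = update_streaming(resident, (step, 0))
    resident = (resident | to_load) - to_evict
    print(f"step {step}: +{len(to_load)} chunks, -{len(to_evict)} chunks, "
          f"{len(resident) * CHUNK_COST_MB} MB resident")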

I hear ya, but we have consoles with like 10x the memory to work with and generally it's not producing a world of difference in game design. Even modern open world games suffer from the same issues of clearing out data after a certain distance, pop-in, loading stalls, etc. If you play modern PC games you also see that a big RAM hog (VRAM) is textures. I feel there is also just less efficient use of RAM given there are fewer limitations to work with. Many of the performance issues on 7th gen were likely due to GPU bottlenecks.

Just an example: Gears of War Ultimate is really just the original game with improved graphics, but it's somehow massively more resource intensive than the original in every way. I mean, it's still a linear shooter, it's essentially using the same code, yet much more demanding. I see that as an example of less efficient use of resources.

In spite of the memory constraints, they did amazing things on 7th gen specs. I also feel the average game would have benefited more from GPU power than from extra RAM.



Recently Completed
Uncharted: Lost Legacy
for PS4 (4/5) - EA UFC 3 for X1 (4/5) - Doom for X1 (4/5) - Titanfall 2 for X1 (4/5) - Super Mario 3D World for Wii U (4/5) - South Park: The Stick of Truth for X1 BC (4/5) - Call of Duty: WWII for X1 (4/5) -Wolfenstein II for X1 - (4/5) - Dead or Alive: Dimensions for 3DS (4/5) - Marvel vs Capcom: Infinite for X1 (3/5) - Halo Wars 2 for X1/PC (4/5) - Halo Wars: DE for X1 (4/5) - Tekken 7 for X1 (4/5) - Injustice 2 for X1 (4/5) - Yakuza 5 for PS3 (3/5) - Battlefield 1 (Campaign) for X1 (3/5) - Assassin's Creed: Syndicate for X1 (4/5) - Call of Duty: Infinite Warfare for X1 (4/5) - Call of Duty: MW Remastered for X1 (4/5) - Donkey Kong Country Returns for 3DS (4/5) - Forza Horizon 3 for X1 (5/5)

Mr Puggsly said:
curl-6 said:

It's not as simple as that; RAM impacts a lot more than just textures. In open world games last gen, the limited RAM meant games had to be constantly shunting data in and out of memory. This not only resulted in a lot of pop-in and loading stalls, but also impacted performance because the CPU was bogged down by managing all this data streaming. These factors are a big part of why even top tier productions like Skyrim and GTA 5 could run so poorly.

It was also a big headache for developers; ask pretty much any dev who worked on PS3/360, and apart from the PS3's ridiculous architecture, they'll likely tell you that the hardest part was working with such extreme memory constraints.

I hear ya, but we have consoles with like 10x the memory to work with and generally it's not producing a world of difference in game design. Even modern open world games suffer from the same issues of clearing out data after a certain distance, pop-in, loading stalls, etc. If you play modern PC games you also see that a big RAM hog (VRAM) is textures. I feel there is also just less efficient use of RAM given there are fewer limitations to work with. Many of the performance issues on 7th gen were likely due to GPU bottlenecks.

Just an example: Gears of War Ultimate is really just the original game with improved graphics, but it's somehow massively more resource intensive than the original in every way. I mean, it's still a linear shooter, it's essentially using the same code, yet much more demanding. I see that as an example of less efficient use of resources.

In spite of the memory constraints, they did amazing things on 7th gen specs. I also feel the average game would have benefited more from GPU power than from extra RAM.

Well, open world games this gen do tend to run a lot better than they did last gen. It's mostly a developer convenience thing; devs no longer have to do insane memory management hijinks to work around harsh memory restrictions, which makes their job easier. Nowadays they may focus mostly on memory-hungry textures, but that's only cos now they have the luxury of doing so; back then it was a struggle just to get the fundamental design of an open world game to run decently, before even worrying about textures.

By, say, 2008, PS3/360 were still fairly capable in the CPU/GPU department (far from cutting edge, but decent), but even by then 512MB was just minuscule. Their GPUs continued to pump out great looking games throughout their lifespan, while their RAM limitations became quite evident long before their replacements arrived.

That said, I definitely agree devs did amazing things on last gen; there are still PS3/360 games that I can boot up today and be like "holy shit, how the hell does this run on less than 500MB of RAM and a GPU from 2005?" Just this year I replayed Gears 3 and Halo 4 expecting them to have aged poorly and was taken aback by how good they still look.
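Some quick arithmetic on why 512MB was so tight and why textures end up being the biggest memory hog; the sizes below are generic examples, not taken from any specific game.

# Memory cost of an uncompressed 32-bit texture: width * height * 4 bytes,
# plus roughly a third more for the full mipmap chain.
def texture_mb(width, height, bytes_per_pixel=4, mipmaps=True):
    size = width * height * bytes_per_pixel
    if mipmaps:
        size = size * 4 // 3   # mip chain adds ~33%
    return size / (1024 * 1024)

print(f"1024x1024: {texture_mb(1024, 1024):.1f} MB")   # ~5.3 MB
print(f"2048x2048: {texture_mb(2048, 2048):.1f} MB")   # ~21.3 MB

# Even with 4:1 or 8:1 block compression (the DXT/BC formats consoles use),
# a few hundred unique textures consume most of a 512 MB pool that also has
# to hold geometry, audio, animation, scripts and the OS. Doubling the RAM
# mostly buys bigger or more varied textures, which is the tradeoff being
# argued about in this exchange.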

Last edited by curl-6 - on 04 December 2018

curl-6 said:
Mr Puggsly said:

I hear ya, but we have consoles with like 10x the memory to work with and generally it's not producing a world of difference in game design. Even modern open world games suffer from the same issues of clearing out data after a certain distance, pop-in, loading stalls, etc. If you play modern PC games you also see that a big RAM hog (VRAM) is textures. I feel there is also just less efficient use of RAM given there are fewer limitations to work with. Many of the performance issues on 7th gen were likely due to GPU bottlenecks.

Just an example: Gears of War Ultimate is really just the original game with improved graphics, but it's somehow massively more resource intensive than the original in every way. I mean, it's still a linear shooter, it's essentially using the same code, yet much more demanding. I see that as an example of less efficient use of resources.

In spite of the memory constraints, they did amazing things on 7th gen specs. I also feel the average game would have benefited more from GPU power than from extra RAM.

Well, open world games this gen do tend to run a lot better than they did last gen. It's mostly a developer convenience thing; devs no longer have to do insane memory management hijinks to work around harsh memory restrictions, which makes their job easier. Nowadays they may focus mostly on memory-hungry textures, but that's only cos now they have the luxury of doing so; back then it was a struggle just to get the fundamental design of an open world game to run decently, before even worrying about textures.

By, say, 2008, PS3/360 were still fairly capable in the CPU/GPU department (far from cutting edge, but decent), but even by then 512MB was just minuscule. Their GPUs continued to pump out great looking games throughout their lifespan, while their RAM limitations became quite evident long before their replacements arrived.

That said, I definitely agree devs did amazing things on last gen; there are still PS3/360 games that I can boot up today and be like "holy shit, how the hell does this run on less than 500MB of RAM and a GPU from 2005?" Just this year I replayed Gears 3 and Halo 4 expecting them to have aged poorly and was taken aback by how good they still look.

Everything tends to run better than last gen because there is a greater focus on stable performance. Later in the generation we saw greatly improved performance, but you still saw technically impressive stuff like Sleeping Dogs or Far Cry 3 running pretty badly. Meanwhile, GTAV was stunning and maintained fairly stable performance.

You keep saying bringing open world games to last gen was a struggle, but it was done many times over. It also depends on what the game was attempting to do. For example, we've seen Bethesda games struggle when there are too many changes in the world, which have to be saved in some sort of memory. But the average open world game isn't doing that. Maybe the way Bethesda games did it was also inefficient; their engine(s) aren't exactly praised. The way you speak, though, games like Assassin's Creed should never have existed on last gen, let alone improved with each release.

I started playing Gears 3 recently and it looks especially fantastic on the X1X. Halo 4 is the more technically impressive game of the two because it features larger scale action. The campaign maps are like small open worlds with lots of enemies, vehicles, etc. It even does that in split screen, which is like an anomaly in modern games. Even multiplayer can have up to 16 players on a large map with vehicles; some of the most fun I had last gen. I'm sure it took effort to make all of this work on ~500MB of RAM, but I feel the average game was struggling more with GPU limitations and was more often pushing the limits of those.
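On the Bethesda point above, about changes to the world having to be kept in memory: a minimal sketch of that kind of persistent world-state tracking (a hypothetical structure, not Bethesda's actual engine), where every object the player disturbs leaves a record that stays resident and grows the save file.

# Hypothetical persistent world-delta tracking: every object the player
# moves, drops, loots or kills gets a change record that must stay in memory
# and be serialized into the save, so a heavily "disturbed" world keeps
# growing its footprint even when nothing new is on screen.
world_deltas = {}   # object_id -> latest change record

def record_change(object_id, change):
    world_deltas[object_id] = change

record_change("clutter_cup_0042", {"moved_to": (120.5, 33.2, 9.0)})
record_change("bandit_017", {"state": "dead", "inventory_looted": True})
record_change("door_whiterun_03", {"state": "unlocked"})

print(f"{len(world_deltas)} persistent deltas held in RAM and written to the save file")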




Mr Puggsly said:
curl-6 said:

Well, open world games this gen do tend to run a lot better than they did last gen. It's mostly a developer convenience thing; devs no longer have to do insane memory management hijinks to work around harsh memory restrictions, which makes their job easier. Nowadays they may focus mostly on memory-hungry textures, but that's only cos now they have the luxury of doing so; back then it was a struggle just to get the fundamental design of an open world game to run decently, before even worrying about textures.

By, say, 2008, PS3/360 were still fairly capable in the CPU/GPU department (far from cutting edge, but decent), but even by then 512MB was just minuscule. Their GPUs continued to pump out great looking games throughout their lifespan, while their RAM limitations became quite evident long before their replacements arrived.

That said, I definitely agree devs did amazing things on last gen; there are still PS3/360 games that I can boot up today and be like "holy shit, how the hell does this run on less than 500MB of RAM and a GPU from 2005?" Just this year I replayed Gears 3 and Halo 4 expecting them to have aged poorly and was taken aback by how good they still look.

Everything tends to run better than last gen because there is a greater focus on stable performance. Later in the generation we saw greatly improved performance, but you still saw technically impressive stuff like Sleeping Dogs or Far Cry 3 running pretty badly. Meanwhile, GTAV was stunning and maintained fairly stable performance.

You keep saying bringing open world games to last gen was a struggle, but it was done many times over. It also depends on what the game was attempting to do. For example, we've seen Bethesda games struggle when there are too many changes in the world, which have to be saved in some sort of memory. But the average open world game isn't doing that. Maybe the way Bethesda games did it was also inefficient; their engine(s) aren't exactly praised. The way you speak, though, games like Assassin's Creed should never have existed on last gen, let alone improved with each release.

I started playing Gears 3 recently and it looks especially fantastic on the X1X. Halo 4 is the more technically impressive game of the two because it features larger scale action. The campaign maps are like small open worlds with lots of enemies, vehicles, etc. It even does that in split screen, which is like an anomaly in modern games. Even multiplayer can have up to 16 players on a large map with vehicles; some of the most fun I had last gen. I'm sure it took effort to make all of this work on ~500MB of RAM, but I feel the average game was struggling more with GPU limitations and was more often pushing the limits of those.

GTA5 didn't run stably on last gen though; it could and often did drop hard when moving through detailed areas as the systems struggled to hustle data in and out of memory. And Assassin's Creed was a series notorious for uneven performance and characters popping in meters ahead of the player. Both were examples of the RAM limitations biting. It's not that last gen systems couldn't do open world, just that it was not their forte, since of their three key components (CPU/GPU/RAM) the last was the biggest bottleneck by far.



720-1080 is sufficient for me. Give me good framerate and gameplay and I'm good.

CGI-Quality said:
Shadow1980 said:

Oh yeah. I forgot about AI since this was a mostly graphics-focused thread. I've found it amazing that AI hasn't advanced much over the past decade. Most of it's fairly basic stuff. It's pretty bad when F.E.A.R. still stands head and shoulders above nearly every AAA game out there today. That game came out 12 years ago, yet I haven't seen anything since that really made me take a step back and go "Whoa, that's some good AI." Maybe the closest thing to it was BioShock, whose enemies seemed to make decent use of their environment.

Don't forget Killzone 2 and S.T.A.L.K.E.R. Shadow of Chernobyl. All three games stand above all others regarding AI.

Whoops. Forgot about KZ2. I did borrow that from a friend and the Helghast AI seemed pretty competent. Haven't played Shadow of Chernobyl yet, though.



4K is the worst thing TV manufacturers came up with. 1080p is fine. At most 2K or 1440p. More detailed gfx/effects are more important. This is the advantage that PC gamers have, as they can scale gfx settings/resolution according to their preference.
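To make that settings/resolution tradeoff concrete, here is a crude frame-budget model; the per-megapixel costs are invented purely to illustrate the tradeoff, not measured from any game.

# Crude model: frame cost scales with pixel count times per-pixel effects
# work. The cost figures are made up for illustration only.
FRAME_BUDGET_MS = 16.7   # 60 fps target

def frame_cost_ms(width, height, effects_cost_per_mpixel):
    megapixels = width * height / 1e6
    return megapixels * effects_cost_per_mpixel

for label, (w, h) in [("1080p", (1920, 1080)), ("4K", (3840, 2160))]:
    for effects, cost in [("low effects", 4.0), ("high effects", 8.0)]:
        ms = frame_cost_ms(w, h, cost)
        verdict = "fits 60 fps" if ms <= FRAME_BUDGET_MS else "misses 60 fps"
        print(f"{label:>5} + {effects:<12}: {ms:5.1f} ms ({verdict})")

# Under these toy numbers, 1080p with heavy effects still fits the 60 fps
# budget while 4K does not even at low settings; that is the pixels-vs-effects
# tradeoff PC players tune with a settings menu.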