
Anyone else feel resolution is overrated?

 

I prefer...

Resolution over detail/effects: 23 votes (26.14%)
Detail/effects over resolution: 65 votes (73.86%)
Total: 88
AngryLittleAlchemist said:
curl-6 said:

In case my OP was misunderstood by some, as CGI rightly pointed out, I am not saying resolution is entirely unimportant. Just that I personally think the end result is often more pleasing when other graphical aspects are prioritized over sheer pixel count, and that I feel the raw number of pixels is sometimes given more emphasis than it warrants.

A lot of us don't only play games made to today's standards though. This year I've spent more time playing on my 360 than my Switch due to the lack of appealing games on the latter, so a lot of what I've played recently is games made 6-13 years ago. Speaking of Switch, due to hardware constraints when porting from more powerful hardware, some of its games don't reach the generally accepted resolution "standards of today" either.

Totally agree with your edit; that and interactivity are two areas I want to see future games focus more on.

Curl .... literally my second sentence. 

I'm not talking about playing old games at higher resolutions though. I'm talking about the tradeoffs made when a game is created. For example, when Alan Wake was made, the devs chose to target 540p instead of the typical 7th gen 720p in order to focus more resources on other aspects.



curl-6 said:
AngryLittleAlchemist said:

Curl .... literally my second sentence. 

I'm not talking about playing old games at higher resolutions though. I'm talking about the tradeoffs made when a game is created. For example, when Alan Wake was made, the devs chose to target 540p instead of the typical 7th gen 720p in order to focus more resources on other aspects.

But that's why the topic doesn't make sense. 

How many devs nowadays really make the tradeoff in favor of resolution instead of against it? In order for something to be overrated, people have to be massively in favor of it, but I've certainly never heard someone say that they wanted a game to look worse for a better resolution. Games which are using the PS4 and Xbox One to their absolute maximum capabilities are cutting back on resolution as a result, and most games (at least on base PS4, and every console more powerful than it) strike a near-perfect balance between a high resolution and great graphical capabilities. There has never been a time where I turned on a PS4 game and thought "Damn, if only they cut back on the resolution, then this game would be prettier" - most of them fit their pursuits perfectly. Realistically 99% of triple A games meet the graphical quality they want to meet, unless you're talking about the PC version, which will always have more options. This topic would make a lot more sense for graphics over frame-rate, because it doesn't make a lot of sense in its current state. Especially because resolution is a big factor in how good a game looks.



curl-6 said:
Conina said:
Mr Puggsly said:

It's also worth noting Remedy went surprisingly low on resolution for Quantum Break. I mean, it's a 720p game released well into the X1's life, and it's not even aiming for 60 fps. However, it makes great use of post processing effects, so you get a better image than a standard 720p game. Alan Wake, on the other hand, not so much. I'm surprised MS helped get the game relisted but hasn't funded a remaster or even a 4K patch.

The 7th gen could have really benefited from dynamic resolution given GPU power was often its biggest obstacle. A game like Alan Wake could have used it. Rage was one of the first times I recall it being implemented on consoles, and that was pretty late into the 7th gen.

Remedy do seem to have a tendency to focus on effects over pixel count, which I like.

There were a few games on 7th gen with dynamic resolution (in addition to Rage, Wipeout HD, Doom 3 BFG Edition, Wolfenstein: The New Order, and Syndicate on PS3 come to mind) but yeah, it didn't really become common until the PS4/Xbone hit their stride. If I had to pick out one obstacle for last gen consoles though, I'd say it would be RAM, not GPU power. Less than 500MB of RAM available to games was a real bottleneck, especially in open world titles.

RAM was certainly a limitation long term, especially on PS3, which had less flexibility in how its RAM was used. Consider that 7th gen consoles had games like Crysis, Far Cry, Battlefield, Assassin's Creed, GTA, Elder Scrolls, even some MMOs, etc. The biggest RAM hog is often textures, and cross-gen 7th gen games had abysmal textures, likely because their engines used RAM less efficiently. But overall, 7th gen consoles did amazing things with a small amount of RAM.

So I don't think RAM was the biggest limitation; if last gen had double the RAM, then MOST games would have just had better textures. More often, games struggled with performance or made significant visual compromises because of GPU limitations.

I'll put it like this: if I could have chosen between double the RAM or double the GPU power for last gen, I would have opted for the GPU power, because more games would have utilized it better. Better resolutions (which Alan Wake needed), better performance, and higher quality effects. Last gen already had just enough RAM to make great open world games. Also, we now have consoles with significantly more RAM, but the open world experiences haven't evolved much. You know what has improved significantly? Textures.
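As an aside, the dynamic resolution technique discussed above generally amounts to a frame-time feedback loop that scales the internal render target up or down. Here is a minimal sketch in Python; the 30 fps budget, the deadband percentages, and the step size are all made-up illustrative values, not any real engine's numbers:

# Minimal sketch of a frame-time-driven dynamic resolution heuristic.
# TARGET_MS, the 5%/10% deadbands, and the 0.05 step are illustrative.

TARGET_MS = 33.3               # frame budget for 30 fps
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def update_render_scale(scale: float, last_frame_ms: float) -> float:
    """Nudge the per-axis resolution scale toward the frame-time budget."""
    if last_frame_ms > TARGET_MS * 1.05:    # over budget: shed pixels
        scale -= 0.05
    elif last_frame_ms < TARGET_MS * 0.90:  # comfortably under: add pixels
        scale += 0.05
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# e.g. a 1280x720 target rendered at scale 0.75 becomes 960x540 internally
scale = 1.0
for frame_ms in (30.0, 36.0, 38.0, 31.0, 28.0):
    scale = update_render_scale(scale, frame_ms)
    print(f"{frame_ms:5.1f} ms -> render scale {scale:.2f}")

Because the scale applies to each axis, dropping it to 0.75 cuts the pixel count roughly in half, which is why such small steps recover frame time quickly.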




AngryLittleAlchemist said:
curl-6 said:

I'm not talking about playing old games at higher resolutions though. I'm talking about the tradeoffs made when a game is created. For example, when Alan Wake was made, the devs chose to target 540p instead of the typical 7th gen 720p in order to focus more resources on other aspects.

But that's why the topic doesn't make sense. 

How many devs nowadays really make the tradeoff in favor of resolution instead of against it? In order for something to be overrated, people have to be massively in favor of it, but I've certainly never heard someone say that they wanted a game to look worse for a better resolution. Games which are using the PS4 and Xbox One to their absolute maximum capabilities are cutting back on resolution as a result, and most games (at least on base PS4, and every console more powerful than it) strike a near-perfect balance between a high resolution and great graphical capabilities. There has never been a time where I turned on a PS4 game and thought "Damn, if only they cut back on the resolution, then this game would be prettier" - most of them fit their pursuits perfectly. Realistically 99% of triple A games meet the graphical quality they want to meet, unless you're talking about the PC version, which will always have more options. This topic would make a lot more sense for graphics over frame-rate, because it doesn't make a lot of sense in its current state. Especially because resolution is a big factor in how good a game looks.

The push for resolutions over 1080p on Pro/X is probably the prime example of putting too much emphasis on pixel count, in my opinion.



Mr Puggsly said:
curl-6 said:

Remedy do seem to have a tendency to focus on effects over pixel count, which I like.

There were a few games on 7th gen with dynamic resolution (in addition to Rage, Wipeout HD, Doom 3 BFG Edition, Wolfenstein: The New Order, and Syndicate on PS3 come to mind) but yeah, it didn't really become common until the PS4/Xbone hit their stride. If I had to pick out one obstacle for last gen consoles though, I'd say it would be RAM, not GPU power. Less than 500MB of RAM available to games was a real bottleneck, especially in open world titles.

RAM was certainly a limitation long term, especially on PS3, which had less flexibility in how its RAM was used. Consider that 7th gen consoles had games like Crysis, Far Cry, Battlefield, Assassin's Creed, GTA, Elder Scrolls, even some MMOs, etc. The biggest RAM hog is often textures, and cross-gen 7th gen games had abysmal textures, likely because their engines used RAM less efficiently. But overall, 7th gen consoles did amazing things with a small amount of RAM.

So I don't think RAM was the biggest limitation; if last gen had double the RAM, then MOST games would have just had better textures. More often, games struggled with performance or made significant visual compromises because of GPU limitations.

I'll put it like this: if I could have chosen between double the RAM or double the GPU power for last gen, I would have opted for the GPU power, because more games would have utilized it better. Better resolutions (which Alan Wake needed), better performance, and higher quality effects. Last gen already had just enough RAM to make great open world games. Also, we now have consoles with significantly more RAM, but the open world experiences haven't evolved much. You know what has improved significantly? Textures.

It's not as simple as that; RAM impacts a lot more than just textures. In open world games last gen, the limited RAM meant games had to be constantly shunting data in and out of memory. This not only resulted in a lot of pop-in and loading stalls, but also impacted performance because the CPU was bogged down by managing all this data streaming. These factors are a big part of why even top tier productions like Skyrim and GTA 5 could run so poorly.

It was also a big headache for developers; ask pretty much any dev who worked on PS3/360, and apart from the PS3's ridiculous architecture, they'll likely tell you that the hardest part was working with such extreme memory constraints.
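To make the memory-shunting point concrete, here's a toy sketch of the kind of least-recently-used streaming cache a 7th gen open world game effectively had to run all the time. The asset names, sizes, and the 448 MB budget are invented for illustration, not taken from any real engine:

from collections import OrderedDict

# Toy asset streaming cache: assets are paged in on demand and the
# least recently used ones are evicted to stay under a hard memory
# budget. All sizes here are illustrative.

class AssetCache:
    def __init__(self, budget_mb: float):
        self.budget = budget_mb
        self.used = 0.0
        self.assets = OrderedDict()  # name -> size in MB, kept in LRU order

    def request(self, name: str, size_mb: float) -> None:
        if name in self.assets:                 # already resident: mark hot
            self.assets.move_to_end(name)
            return
        while self.used + size_mb > self.budget and self.assets:
            old, old_size = self.assets.popitem(last=False)  # evict coldest
            self.used -= old_size
            print(f"evict {old} ({old_size} MB)")
        self.assets[name] = size_mb             # "load" from disc
        self.used += size_mb
        print(f"load  {name} ({size_mb} MB), {self.used:.0f}/{self.budget:.0f} MB used")

cache = AssetCache(budget_mb=448)               # rough 7th-gen-scale budget
for asset, size in [("city_block_A", 200), ("city_block_B", 200),
                    ("city_block_C", 150), ("city_block_A", 200)]:
    cache.request(asset, size)

Note how the final request has to re-load city_block_A from disc because it was already evicted; with a tight budget this churn happens constantly, which is exactly the pop-in and CPU overhead described above.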



curl-6 said:
AngryLittleAlchemist said:

But that's why the topic doesn't make sense. 

How many devs nowadays really make the tradeoff in favor of resolution instead of against it? In order for something to be overrated, people have to be massively in favor of it, but I've certainly never heard someone say that they wanted a game to look worse for a better resolution. Games which are using the PS4 and Xbox One to their absolute maximum capabilities are cutting back on resolution as a result, and most games (at least on base PS4, and every console more powerful than it) strike a near-perfect balance between a high resolution and great graphical capabilities. There has never been a time where I turned on a PS4 game and thought "Damn, if only they cut back on the resolution, then this game would be prettier" - most of them fit their pursuits perfectly. Realistically 99% of triple A games meet the graphical quality they want to meet, unless you're talking about the PC version, which will always have more options. This topic would make a lot more sense for graphics over frame-rate, because it doesn't make a lot of sense in its current state. Especially because resolution is a big factor in how good a game looks.

The push for resolutions over 1080p on Pro/X is probably the prime example of putting too much emphasis on pixel count, in my opinion.

So you made this thread because completely optional consoles which haven't sold a lot and don't dictate game development in the slightest (meaning very few devs are going to put an emphasis on resolution over graphics) exist?



AngryLittleAlchemist said:
curl-6 said:

The push for resolutions over 1080p on Pro/X is probably the prime example of putting too much emphasis on pixel count, in my opinion.

So you made this thread because completely optional consoles which haven't sold a lot and don't dictate game development in the slightest (meaning very few devs are going to put an emphasis on resolution over graphics) exist?

Nope, I made a thread because I wondered how many others felt that graphical resources on any current system, not just Pro or X, are better spent on detail/effects than on pixel count.



curl-6 said:
AngryLittleAlchemist said:

So you made this thread because completely optional consoles which haven't sold a lot and don't dictate game development in the slightest (meaning very few devs are going to put an emphasis on resolution over graphics) exist?

Nope, I made a thread because I wondered how many others felt that graphical resources on any current system, not just Pro or X, are better spent on detail/effects than on pixel count.

But that doesn't change the fact that the comment was nonsensical. There is no massive push for Pros and One Xs; they're completely optional. Sony and Microsoft might advertise them extensively, but that's because they want you to buy a more expensive model. Sure, people upgrade to them ... when they have disposable income. It is not the lifeblood of gaming or the gaming community. So again, how can resolution be "overrated"?

The setup developers have is perfectly fine; most games strike a great balance.



AngryLittleAlchemist said:
curl-6 said:

Nope, I made a thread because I wondered how many others felt that graphical resources on any current system, not just Pro or X, are better spent on detail/effects than on pixel count.

But that doesn't change the fact that the comment was nonsensical. There is no massive push for Pros and One Xs; they're completely optional. Sony and Microsoft might advertise them extensively, but that's because they want you to buy a more expensive model. Sure, people upgrade to them ... when they have disposable income. It is not the lifeblood of gaming or the gaming community. So again, how can resolution be "overrated"?

The setup developers have is perfectly fine; most games strike a great balance.

The same way anything can be overrated: when people, such as myself and those who voted accordingly in the poll, feel too much emphasis is placed on it.



Me.

In terms of signal processing, a large part of an image's energy is usually in the lower frequencies. When rendering at lower resolutions, we lose some of the higher frequency content, but there's usually not much of it.
Mathematically, a 1080p image carries something like 95% of the 4K content, while being 4x fewer pixels to render (1920x1080 is about 2.07 million pixels vs about 8.29 million for 3840x2160). Do the math.
Checkerboarding helps a lot too; I'd say 98%. And AI solutions like DLSS create a 4K image from a lower resolution render, letting the AI fill in the higher frequency content after rendering: 99.9% maybe.
Even without those techniques, I'd rather have a 1080p image with good AA than spend money on 4x more compute for a 5% gain.
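A quick numpy sketch of that spectral-energy argument. It builds a synthetic low-frequency-dominated image (the sinusoids and noise level are arbitrary choices, so the exact percentage is illustrative rather than a measurement of real game frames) and checks how much energy survives at half resolution:

import numpy as np

# Synthetic "natural-ish" image: smooth structure plus a little fine detail.
# The amplitudes here are arbitrary; they just make low frequencies dominate,
# as they typically do in natural images.
h, w = 540, 960
y, x = np.mgrid[0:h, 0:w]
smooth = np.sin(x / 60.0) + np.cos(y / 45.0)   # low-frequency content
rng = np.random.default_rng(0)
detail = 0.2 * rng.standard_normal((h, w))     # high-frequency content
img = smooth + detail

spec = np.fft.fftshift(np.fft.fft2(img))
energy = np.abs(spec) ** 2

# Halving the resolution halves the Nyquist frequency on each axis,
# i.e. it keeps only the central quarter of the 2D spectrum.
cy, cx = h // 2, w // 2
low = energy[cy - h // 4:cy + h // 4, cx - w // 4:cx + w // 4]
print(f"energy representable at half resolution: {low.sum() / energy.sum():.1%}")

With these particular values it prints a share in the high-90s percent range, which is the shape of the claim above: most of the image's energy sits in frequencies a lower-resolution render can still represent.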