
Forums - Gaming Discussion - Image quality or graphics. Which is more important? (Poll)

 

You can only choose one.

Shit image quality, great graphics. 8 votes (38.10%)

Shit graphics, pristine image quality. 13 votes (61.90%)

Total: 21
Pemalite said:

We all have different tastes as gamers... So this question will never be clear cut.

But I definitely prefer image quality. A clean 1440P image will beat a 1080P one that uses every rendering trick to dress up a turd.

I think this is like a question of... would you rather:

Would you rather every game in the world looked like Roblox, but ran at 4K60?

4k Roblox! wooohoooooo.....  (not really lol)


or


Would you rather every game was like Monster Hunter Wilds, but sometimes ran at sub-720p, and maybe not a stable 60?

Watch the 360p part :P

Personally, I would much rather have graphics > image quality (high resolution without upscaling etc.), if I could only choose one.

Monster Hunter at 360p looks much better to me than 4K Roblox would.

Last edited by JRPGfan - 4 days ago


We didn't have any issues with TVs before the HD age. On the other hand, I personally really dislike using a high resolution in games that were clearly not meant for it, because it really highlights the lack of detail in a game. As long as a game is designed to work well with low image quality, I'm OK with it, but I'm not as OK with the reverse.

I also have zero interest in upgrading to 4K because it's just a waste of processing power to me, and I'm really hesitant about 1440p as well for the same reason, although to a lesser extent. I'd probably still be fine with 720p if stuff was designed with it in mind.



I really dislike all that DLSS and FSR upscaling crap. Often it looks like hot garbage. I got myself a 4K OLED screen and a PS5 Pro in the last couple of weeks, expecting superior image quality and all that. Then I started playing Monster Hunter Wilds in performance mode and it looked absolutely horrible. It is somewhat OK in balanced mode, but that only gets me 40 fps or so.

I got curious and hooked up my good ol' PS3. And as it turns out, a nice and clean 720p image without all that upscaling BS looks better than some of the games I play on PS5. I admit it is not always the case. But it is a friggin' shame that it happens at all.

With that said, in this day and age, I definitely prefer image quality.



Official member of VGC's Nintendo family, approved by the one and only RolStoppable. I feel honored.

Pemalite said:
SvennoJ said:

However would you choose 1080p with ray traced lighting/shadows over 1440p with baked lighting/shadows?
1080p with PBR or 1440p with flat lighting?

We're not talking about dressing up turds, a turd would look even worse in 1440p :p


I would choose 1440P every single time... Especially with HDR and double especially if I get to run at 144hz or more.

Well then, 1080p with HDR or 1440p SDR ;)

HDR is a graphical upgrade especially if done correctly (fp32/fp16 render pipeline and textures)
You're also more likely to hit 144hz at lower resolutions...

So still choosing 1440p over proper HDR and higher fps?



Resolution is better. I'd rather play a PS2 game in 4K than a PS5 game in 480p.



Lifetime Sales Predictions 

Switch: 161 million (was 73 million, then 96 million, then 113 million, then 125 million, then 144 million, then 151 million, then 156 million)

PS5: 115 million (was 105 million) Xbox Series S/X: 40 million (was 60 million, then 67 million, then 57 million, then 48 million)

PS4: 120 mil (was 100 then 130 million, then 122 million) Xbox One: 51 mil (was 50 then 55 mil)

3DS: 75.5 mil (was 73, then 77 million)

"Let go your earthly tether, enter the void, empty and become wind." - Guru Laghima

OdinHades said:

I really dislike all that DLSS and FSR upscaling crap. Often it looks like hot garbage. I got myself a 4K OLED screen and a PS5 Pro in the last couple of weeks, expecting superior image quality and all that. Then I started playing Monster Hunter Wilds in performance mode and it looked absolutely horrible. It is somewhat OK in balanced mode, but that only gets me 40 fps or so.

I got curious and hooked up my good ol' PS3. And as it turns out, a nice and clean 720p image without all that upscaling BS looks better than some of the games I play on PS5. I admit it is not always the case. But it is a friggin' shame that it happens at all.

With that said, in this day and age, I definitely prefer image quality.

Yeah, all those upscaling artifacts muddy the water.

The comparison should really be between native 720p/1080p on a 720p/1080p TV/monitor versus a dynamic resolution that gets upscaled several times before eventually being displayed on a 4K TV/monitor.

At least 720p and 1080p scale to a 4K screen by clean integer factors (3x and 2x), so you just get bigger pixels instead of upscaling artifacts.
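The clean-multiple point above can be sketched in a few lines. This is my own illustration, not something from the thread: it checks whether a source resolution maps onto a 4K (3840x2160) panel by an exact integer factor, so each source pixel becomes a whole block of screen pixels instead of being resampled.

```python
def integer_scale_factor(src_w, src_h, dst_w=3840, dst_h=2160):
    """Return the integer scale factor, or None if the fit isn't exact."""
    if (dst_w % src_w == 0 and dst_h % src_h == 0
            and dst_w // src_w == dst_h // src_h):
        return dst_w // src_w
    return None

print(integer_scale_factor(1920, 1080))  # 2 -> 1080p doubles cleanly
print(integer_scale_factor(1280, 720))   # 3 -> 720p triples cleanly
print(integer_scale_factor(2560, 1440))  # None -> 1440p must be resampled on 4K
```

Note that 1440p is the odd one out here: 3840/2560 is 1.5, so a 4K screen can't show it with simple pixel doubling.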



The problem is that games with very primitive graphics can actually look better at lower resolutions, where the flaws aren't as visible, while the opposite holds for advanced visuals, where high resolutions are needed to discern the detail. I picked image quality, but such a binary choice doesn't make much sense.



SvennoJ said:
Pemalite said:

I would choose 1440P every single time... Especially with HDR and double especially if I get to run at 144hz or more.

Well then, 1080p with HDR or 1440p SDR ;)

HDR is a graphical upgrade especially if done correctly (fp32/fp16 render pipeline and textures)
You're also more likely to hit 144hz at lower resolutions...

So still choosing 1440p over proper HDR and higher fps?

Depends on the type of HDR.

You have HDR lighting like what we had in Halo 3 on the Xbox 360... Which was a graphical effect, still an SDR output.

But there is HDR which is just developers pointing out different light and colour levels of a scene for a monitor to display and doesn't come with any real rendering overhead.

Then you have HDR which is essentially inverse tone mapping of a scene which comes with overhead.

I will still choose 1440P.
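The "inverse tone mapping" case mentioned above can be illustrated in miniature. A minimal sketch using the textbook Reinhard operator (my choice of example, not the pipeline of any particular console or game): tone mapping compresses unbounded HDR scene luminance into the SDR range, and the inverse expands an SDR value back toward HDR.

```python
def reinhard(l_hdr):
    """Reinhard tone mapping: compress [0, inf) luminance into [0, 1)."""
    return l_hdr / (1.0 + l_hdr)

def inverse_reinhard(l_sdr):
    """Inverse of the above: expand an SDR value in [0, 1) back to [0, inf)."""
    return l_sdr / (1.0 - l_sdr)

# Mid-grey up to bright highlights; the round trip recovers the original value.
for lum in (0.18, 1.0, 4.0, 16.0):
    sdr = reinhard(lum)
    print(f"HDR {lum:5.2f} -> SDR {sdr:.3f} -> back {inverse_reinhard(sdr):.2f}")
```

The per-pixel division is why this path carries rendering overhead, unlike the metadata-only flavour of HDR where the display is simply told what the scene's light levels mean.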



--::{PC Gaming Master Race}::--

Pemalite said:
LegitHyperbole said:

You game at a PC though, up real close. If you gamed on console at a distance, I believe you would see things differently, like literally see things differently. Try gaming back 4 feet from your monitor and see how little difference 1440p makes over 1080p.

I game on everything, I own everything.

I have a large computer monitor... Not a small 27" or mid-sized 32". - So at 4 feet there is still a large discernible difference between 1440P and 1080P.
On my TV in the living room, I can also see a massive difference between 1440P and 1080P.

1440P is just a superior resolution.

Don't take me as someone who is naive when it comes to gaming on other platforms, this is primarily a console gaming website, so obviously that is one of the draw cards of me being a part of this community.

SvennoJ said:

However would you choose 1080p with ray traced lighting/shadows over 1440p with baked lighting/shadows?
1080p with PBR or 1440p with flat lighting?

We're not talking about dressing up turds, a turd would look even worse in 1440p :p


I would choose 1440P every single time... Especially with HDR and double especially if I get to run at 144hz or more.

Idk how. I still far exceed 20/20 vision, and I got out a measuring tape from my 50-inch and used the PS5 menus as the perfect example of detail. If you're saying you notice a clear difference at 4 feet, you're lying to yourself. Between 3 and 4 feet is where it melts away and stops mattering much; between 4 and 5 feet it matters next to nothing; and at my usual 10 feet back there is zero difference.

It's like having a 1440p phone. I had a 6-inch 1440p phone for about 2 years, then "upgraded" to a 1080p phone at 5.2 inches (which most phones are now, and for good reason), and it looked clearer and sharper, or at the very least no less sharp than the battery-draining 1440p panel. To that point, 60fps vs 120fps is a noticeable difference if you're paying attention, but it hardly matters for daily use; only when you put your intent on it and switch between the two will you actually notice it. Perhaps that is what you are doing: switching between the two in the settings with your attention on the difference makes it feel bigger than it is.

I would suggest you get someone to randomise the settings while you're not looking, stay no closer than 4 feet from your large monitor, and see if you can tell the difference in an unbiased, controlled manner. After a few attempts you'll see that you are guessing.

Edit: I will admit there is a slight difference on rounded edges; looking at some big blocky text, the roundness of the lettering is apparent, but it's not enough to notice if I weren't switching back and forth between 1080p and 2160p, going out of my way to check.

Last edited by LegitHyperbole - 3 days ago
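The viewing-distance argument above can be sanity-checked with basic trigonometry. A rough sketch, assuming the common one-arcminute threshold for 20/20 vision and a 16:9 panel (my assumptions, not the poster's measurements): it estimates the distance beyond which individual pixels subtend less than one arcminute and so blend together.

```python
import math

ARCMIN = math.radians(1.0 / 60.0)  # one arcminute in radians

def max_useful_distance_m(diag_inches, horizontal_pixels):
    """Distance (metres) beyond which single pixels drop below ~1 arcminute."""
    # Width of a 16:9 screen from its diagonal, converted to metres.
    width_m = diag_inches * 0.0254 * 16 / math.hypot(16, 9)
    pixel_pitch = width_m / horizontal_pixels
    return pixel_pitch / math.tan(ARCMIN)

for res in (1920, 2560, 3840):  # 1080p, 1440p, 4K on a 50-inch screen
    d = max_useful_distance_m(50, res)
    print(f'{res:4d} px wide on 50": pixels blend past ~{d:.1f} m')
```

Under these assumptions, 1080p pixels on a 50-inch screen become unresolvable somewhere around 2 m (about 6.5 feet), which is at least consistent with the claim that at 10 feet the difference disappears, though real-world sharpness also depends on rendering and post-processing, not just pixel pitch.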

Sitting 3 metres back from a 42-inch screen as I do, stuff like Ace Combat 7 on Switch at 720p still looks fine.

Image treatment is a major factor though; AC7 is relatively clean without a lot of post-processing and such, whereas games that throw around a lot of post-processing tend to look blurrier at the same pixel count, so there's more to image quality than just resolution.

4K is a waste of resources in my opinion, for consoles anyway, when 1440p and even a well-treated 1080p still look good enough.