Leynos said:
Pemalite said:

Film grain. - Just muddies the image.
Lens flare. - Unrealistic and often intrusive.
Ambient occlusion. - Gives assets a floaty look.
Motion Blur. - Exception is quality per-object motion blur done tastefully.
Morphological Anti-Aliasing. - A pixel-shader-based technique that blurs the edges of polygons. It's shit.


Funny you mention lens flare. It's something I don't get when the camera isn't supposed to exist. I forgave it in Mario 64 because they were showing off this new-fangled 3D console. One recent game just annoyed the shit out of me with a stupid effect: Lords of the Fallen. Snow hitting the lens. Why? It's not Samus's visor, it's not first-person, and it's a fantasy setting. Why is snow hitting the fucking lens?

In some games it adds to the aesthetic and the scene really well... Mass Effect is probably one of the best examples.

Other games like Serious Sam just abused the technique.


Chrkeller said:

Bad HDR is a good one.  I love Tiny Tina, but it has the worst HDR I have ever seen.  I have to turn it off because it burns my eyes.  And the in-game calibration doesn't work.

Motion blur and film grain are the first two things I turn off.  I hate both.  

We need to keep in mind that there are different "types" of HDR.

Halo 3 on the Xbox 360 used an HDR rendering pipeline... As did games like Far Cry and Half-Life 2.
For a good breakdown, give this piece from Ars Technica and Valve a read: https://arstechnica.com/features/2005/09/lostcoast/

In short, it differs from today's "HDR" by being a rendering technique: the scene is rendered with a much wider range of brightness values to better capture differences in brightness... And developers can use that extra frame information to build additional effects like bloom or motion blur... Nothing happens on the display end; the final image still gets tone-mapped down to what a standard screen can show.
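
Roughly, the render-side idea looks like this. A toy sketch in C, not anything Valve or Bungie actually shipped; the Colour struct, the bloom threshold, and the Reinhard tone-map here are just illustrative choices:

/* Sketch of an "HDR rendering" resolve step, assuming the scene was already
 * rendered into a floating-point buffer where values can exceed 1.0.
 * The over-bright parts feed a bloom pass; the frame is then tone-mapped
 * back to the 0..1 range an ordinary display shows. Illustrative only. */
#include <stddef.h>
#include <stdio.h>

typedef struct { float r, g, b; } Colour;

/* Keep only what is brighter than "white" as the source for a bloom blur. */
Colour bloom_source(Colour c, float threshold)
{
    Colour out = {
        c.r > threshold ? c.r - threshold : 0.0f,
        c.g > threshold ? c.g - threshold : 0.0f,
        c.b > threshold ? c.b - threshold : 0.0f,
    };
    return out;
}

/* Simple Reinhard tone mapping: squashes unbounded HDR values into 0..1. */
Colour tonemap_reinhard(Colour c)
{
    Colour out = { c.r / (1.0f + c.r), c.g / (1.0f + c.g), c.b / (1.0f + c.b) };
    return out;
}

void resolve_frame(const Colour *hdr, Colour *bloom, Colour *ldr, size_t n)
{
    for (size_t i = 0; i < n; ++i) {
        bloom[i] = bloom_source(hdr[i], 1.0f); /* blurred and added back later */
        ldr[i]   = tonemap_reinhard(hdr[i]);   /* what actually gets displayed */
    }
}

int main(void)
{
    /* Two made-up pixels: a mid-grey and a very bright highlight. */
    Colour hdr[2]  = { { 0.4f, 0.4f, 0.4f }, { 6.0f, 5.0f, 4.0f } };
    Colour bloom[2], ldr[2];

    resolve_frame(hdr, bloom, ldr, 2);
    for (int i = 0; i < 2; ++i)
        printf("pixel %d: ldr %.2f %.2f %.2f, bloom %.2f %.2f %.2f\n",
               i, ldr[i].r, ldr[i].g, ldr[i].b,
               bloom[i].r, bloom[i].g, bloom[i].b);
    return 0;
}

The point is that all of this happens before the image reaches the screen: the wide-range values only ever exist inside the frame buffer, and the display still receives an ordinary image.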

Today we use "HDR" to mean additional information that tells the display which areas to brighten or darken.
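
One common way that works in practice is HDR10-style output: the game keeps real luminance values and encodes them with the PQ (SMPTE ST 2084) transfer curve, and the display decides how bright to actually drive each pixel. Another quick sketch; the PQ constants are the published ones, the sample nit values are made up:

/* Display-side "HDR": encode real luminance (in nits) with the PQ curve
 * so an HDR display can reproduce the difference between bright and very
 * bright, instead of the game flattening it away. Build with -lm. */
#include <math.h>
#include <stdio.h>

/* Inverse EOTF: luminance in nits (0..10000) -> normalised signal 0..1 */
double pq_encode(double nits)
{
    const double m1 = 2610.0 / 16384.0;
    const double m2 = 2523.0 / 4096.0 * 128.0;
    const double c1 = 3424.0 / 4096.0;
    const double c2 = 2413.0 / 4096.0 * 32.0;
    const double c3 = 2392.0 / 4096.0 * 32.0;

    double y  = nits / 10000.0;          /* normalise to PQ's 10,000-nit peak */
    double yp = pow(y, m1);
    return pow((c1 + c2 * yp) / (1.0 + c3 * yp), m2);
}

int main(void)
{
    /* Paper white, a bright highlight, and a sun glint: the display keeps
     * the difference between them rather than the game clipping it. */
    double samples[] = { 100.0, 1000.0, 4000.0 };
    for (int i = 0; i < 3; ++i)
        printf("%6.0f nits -> PQ signal %.3f\n", samples[i], pq_encode(samples[i]));
    return 0;
}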



--::{PC Gaming Master Race}::--