MikeB said:
@NightAntilli: "What makes other games have 'proper' HDR and why is AW lacking? What other games are you referring to, specifically?"
Apart from Halo 3, there are no other Xbox 360 games doing 'proper' HDR. The Xbox 360 is not powerful enough to do this with good results, hence Microsoft introduced the sub-HDR FP10 format for the Xbox 360.
"Or is UC2 FP16 while AW is FP10?"
Yes, and also the first Uncharted game.
It actually has nothing to do with any "power" of the X360; rather, it's about the way the hardware processes code.
As an example, you can't say the HD4870 is weaker or less powerful than an HD5450 just because the 4870 lacks DX11 while the HD5450 has that feature.
As an older and maybe better example, you can't say an X850XT PE is weaker or less powerful than a 6800GS/GT just because it only supports SM2.0 instead of SM3.0.
WereKitten said:
I think he was referring to Halo 3 using FP16 for the buffer. It gives a full 16 bits of precision for each of the RGBA components, and Xenos supports multisampling with it, but it needs double the memory of a "normal" 8-bit RGBA target. That means that if you want the bandwidth advantage of staying inside the 10 MB eDRAM, you have to cut AA and resolution (which is what Halo 3 did) or resort to tiling, which many other games did. Plus, I think Xenos doesn't support alpha blending in FP16.

The 360 offers a different option that is almost penalty-free: the FP10 render target format. In theory the limited range per component (10 bits versus 16) can cause artifacts when the image is toned down to the visible range; in practice it's often "good enough", and it can be used with MSAA. That's what most 360 games use, but I suppose that's what some consider not "true HDR".

On the PS3 (though you can do the same on the 360 if you have enough free pixel shader budget), engines such as Uncharted's have moved to a format called NAO32 (pioneered in Heavenly Sword) instead of FP16. AFAIK it stores the logarithm of luminance in 16 bits of an RGBA8 space (thus a full HDR luminance range) and the two chrominance components in the remaining 16 bits. It solves the bandwidth/memory problems, and I understand it works well with AA, filtering and shader-based postprocessing, but it requires extra encoding/decoding phases in your shader pipelines and extra work for alpha.

Which HDR method AW employed is a mystery to me, and I'm not sure how (if at all) FP10 artifacts would show given the heavy use of (deferred) postprocessing for lighting and bloom effects, or how badly the absence of hardware blending would hurt a choice of FP16.

Anyway, I for one don't like the notion of "true" HDR as needing 16 bits. If it works for the eye (and HDR is all about mimicking our perceptual space) given the game's overall visual style, and if the artifacts are not too detrimental to image quality, then it's real enough for me. That said, the Luv encoding in NAO32 is closer to the real perceptual meaning of HDR than the standard "flat" FP16 solution or its downgraded FP10 cousin.
Thanks. That's the explanation I was looking for. To make it concrete for myself, I put together two quick sketches below: one putting rough numbers on the eDRAM trade-off, and one playing with the LogLuv packing.
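First the eDRAM budget, as a back-of-the-envelope calculation. The per-pixel sizes are the standard ones for these render-target formats; the depth format, resolutions and MSAA settings are just illustrative assumptions, not claims about any particular game's setup:

```python
# Render-target footprints against the Xenos 10 MB eDRAM budget.
EDRAM = 10 * 1024 * 1024          # bytes
DEPTH_BPP = 4                     # assuming a typical 24-bit depth + 8-bit stencil

COLOR_FORMATS = {                 # bytes per pixel
    "RGBA8 (8:8:8:8)":    4,
    "FP10 (10:10:10:2)":  4,
    "FP16 (16:16:16:16)": 8,
}

def footprint(width, height, color_bpp, msaa):
    # eDRAM stores every sample, so MSAA multiplies both colour and depth.
    return width * height * (color_bpp + DEPTH_BPP) * msaa

for name, bpp in COLOR_FORMATS.items():
    for w, h, msaa in [(1280, 720, 1), (1280, 720, 2), (1152, 640, 1)]:
        size = footprint(w, h, bpp, msaa)
        verdict = "fits" if size <= EDRAM else "needs tiling"
        print(f"{name:20s} {w}x{h} {msaa}xMSAA: {size / 2**20:5.2f} MB ({verdict})")
```

An FP16 target plus depth already overshoots 10 MB at 1280x720 with no MSAA (about 10.5 MB), while 1152x640, the resolution Halo 3 actually shipped at, squeezes back in. That matches the "cut resolution or tile" choice described above.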
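And here is the LogLuv idea itself, sketched on the CPU in Python rather than as shader code. The channel layout and constants are my own illustrative choices in the spirit of NAO32 as described above, not Heavenly Sword's or Uncharted's actual encoding: chroma stored as CIE (u', v') in one byte each, and log2 luminance spread over the remaining two bytes of an RGBA8 target.

```python
import math

# Linear RGB (Rec.709 primaries) <-> CIE XYZ, standard matrices.
def rgb_to_xyz(r, g, b):
    return (0.4124 * r + 0.3576 * g + 0.1805 * b,
            0.2126 * r + 0.7152 * g + 0.0722 * b,
            0.0193 * r + 0.1192 * g + 0.9505 * b)

def xyz_to_rgb(x, y, z):
    return ( 3.2406 * x - 1.5372 * y - 0.4986 * z,
            -0.9689 * x + 1.8758 * y + 0.0415 * z,
             0.0557 * x - 0.2040 * y + 1.0570 * z)

def encode(r, g, b):
    """Pack linear HDR RGB into four bytes: (u', v') chroma + 16-bit log2 luminance."""
    x, y, z = rgb_to_xyz(r, g, b)
    y = max(y, 1e-6)                        # avoid log2(0)
    d = max(x + 15.0 * y + 3.0 * z, 1e-6)
    u = 4.0 * x / d                         # CIE u' chromaticity, stays in [0, 1]
    v = 9.0 * y / d                         # CIE v' chromaticity, stays in [0, 1]
    # Map log2(Y) from [-16, 16) into 16 bits: a ~2^32 : 1 luminance range.
    le = (math.log2(y) + 16.0) / 32.0
    le16 = min(max(int(le * 65535.0 + 0.5), 0), 65535)
    return (round(u * 255.0), round(v * 255.0), le16 >> 8, le16 & 0xFF)

def decode(cu, cv, hi, lo):
    """Unpack the four bytes back to linear HDR RGB."""
    u = cu / 255.0
    v = max(cv / 255.0, 1e-6)
    y = 2.0 ** ((((hi << 8) | lo) / 65535.0) * 32.0 - 16.0)
    # Invert the chromaticities: X = 9u'Y / 4v', Z = (12 - 3u' - 20v')Y / 4v'.
    x = 9.0 * u * y / (4.0 * v)
    z = (12.0 - 3.0 * u - 20.0 * v) * y / (4.0 * v)
    return xyz_to_rgb(x, y, z)

hdr = (7.5, 2.0, 0.25)                      # a well-out-of-[0,1]-range colour
packed = encode(*hdr)
print(packed, "->", [round(c, 3) for c in decode(*packed)])
```

The round trip comes back at roughly (7.58, 1.98, 0.24): luminance survives almost exactly while the 8-bit chroma introduces a small error. It also shows why alpha needs extra work, since blending these packed bytes directly would be meaningless.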
Truth does not fear investigation