Kynes said:
All this "fiasco" makes me remember when they developed Doom3 and, instead of calculating the lightning shader in real time, they developed texture lookups. In the nVidia FX cards it helped a lot, due to their poor shading capabilities, but in the top cards of the 9 series of ATI it made the game slower. A graphics enthusiast (Humus, nowadays the main developer of the engine used in the avalanche studios games -> Just Cause 2) discovered this, and developed a substitution shader that improved the performance in almost a 40%.

When you develop a game for more than one architecture, you sometimes have to adapt to the shortcomings of one of them, and you can't always develop close to the metal. Some PS3 fans think they deserve to have developers spend much more time on their version than on the others. That was reasonable last generation, with one console capturing almost 75% of the market, but it's unreasonable in a market that is split almost evenly.


Very inaccurate post...

The lighting calculations are all performed in real time. What you are talking about is the specular component of the default lighting interaction shader: instead of a power function, a texture lookup was performed to simulate the power function. This was done to make the game look the same on all the hardware codepaths, because the codepaths for older hardware at the time couldn't do a per-pixel specular calc without a texture lookup. The high-end NV and ATI cards both ran the vertex/fragment shader path, and it just so happened the NV cards were faster at the lookup. Never mind the fact that the game ran faster on NV cards by miles anyway, because of ATI's lagging GL driver...
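For anyone who wants to see what the swap actually amounts to, here's a minimal C sketch of the two approaches: a precomputed 1D lookup table approximating pow() (what the default interaction shader effectively sampled from its specular falloff texture) versus evaluating the power function directly (what the Humus replacement shader did). The names, table size, and exponent here are illustrative assumptions on my part; the real shaders were ARB vertex/fragment program assembly, not C.

[code]
#include <math.h>
#include <stdio.h>

#define LUT_SIZE 256
#define SPECULAR_EXPONENT 16.0f   /* illustrative; not id's actual value */

static float specular_lut[LUT_SIZE];

/* Precompute pow(x, n) into a 1D table, analogous to the specular
 * falloff texture the lookup codepath sampled. */
static void build_specular_lut(void)
{
    for (int i = 0; i < LUT_SIZE; i++) {
        float x = (float)i / (float)(LUT_SIZE - 1);
        specular_lut[i] = powf(x, SPECULAR_EXPONENT);
    }
}

/* Lookup path: one table fetch per pixel (nearest-neighbour here;
 * the GPU would filter the texture). */
static float specular_lookup(float n_dot_h)
{
    if (n_dot_h < 0.0f) n_dot_h = 0.0f;
    int i = (int)(n_dot_h * (LUT_SIZE - 1) + 0.5f);
    return specular_lut[i];
}

/* Direct path: evaluate the power function in the shader itself,
 * as the substitution shader did. */
static float specular_direct(float n_dot_h)
{
    return powf(n_dot_h > 0.0f ? n_dot_h : 0.0f, SPECULAR_EXPONENT);
}

int main(void)
{
    build_specular_lut();
    for (float x = 0.0f; x <= 1.0f; x += 0.25f)
        printf("N.H=%.2f  lookup=%.5f  direct=%.5f\n",
               x, specular_lookup(x), specular_direct(x));
    return 0;
}
[/code]

The trade-off is exactly the one described above: on hardware where texture fetches are cheap relative to shader math (NV3x), the lookup wins; on hardware with stronger ALUs (R3xx), computing the power function directly is faster and frees up a texture fetch.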