WereKitten said:
That's a lot of speculation you have there: the buffers can have different resolution and different bit depth, so that math is questionable. I'll wait for the final version to be in the hands of players and reviewers. And of course none of these technical details will make or break the overall visual quality of the game by itself. The Remedy guy was absolutely right in saying that it's a complex process, with oh so many factors influencing the final result. If people loved what they saw in the gameplay videos, it shouldn't be a resolution number to change their opinion.
I never talked about visual quality, just about rendering. So... about the rendering, I find it very hard to believe that they use buffers at many different resolutions and bit depths. Such a solution would be very heavy: you could save a bit of VRAM, but the computational cost for the GPU would be unbearable. Scaling (and converting, if they're at different bit depths) every buffer (or a large number of them) at every frame would be impossible. It's much easier (and more reasonable) to use a small number of native resolutions (like 2 or 3), compose the scene at the low resolution, and then upscale (once or twice).
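To see why rescaling every buffer is so much heavier, here's a back-of-envelope sketch. All the numbers (internal resolution, buffer count) are hypothetical assumptions for illustration, not anything confirmed about the actual pipeline; it just counts pixels touched per frame under each strategy:

```python
# Hypothetical figures for illustration only -- not the game's real pipeline.
TARGET = 1920 * 1080   # assumed output resolution
LOW = 1280 * 720       # assumed internal rendering resolution
NUM_BUFFERS = 8        # assumed number of intermediate buffers

# Strategy A: rescale (and possibly bit-depth-convert) every buffer
# up to the target resolution each frame, then composite at full res.
per_buffer_rescale = NUM_BUFFERS * TARGET

# Strategy B: composite all buffers at the low internal resolution,
# then upscale the composed image once.
compose_then_upscale = NUM_BUFFERS * LOW + TARGET

print(per_buffer_rescale)                          # pixels touched, strategy A
print(compose_then_upscale)                        # pixels touched, strategy B
print(per_buffer_rescale / compose_then_upscale)   # rough cost ratio
```

With these assumed numbers, strategy A touches roughly 1.8x as many pixels per frame, and the gap grows with every extra buffer, which is why composing at one low resolution and upscaling once at the end is the reasonable choice.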