Dr.Henry_Killinger said:
You make textures when you develop (create) a game, not when you render (run) it. Textures determine the color applied to each pixel via texels (a texel is a unit of texture). They don't affect the CPU; the texture itself is stored in VRAM so the GPU can draw it during the render pass. Texture is not resolution: all resolution determines is how many pixels are stored in the framebuffer. The texture itself is just a 2D image applied to a 3D object. So yeah, you can render a game developed in 480i in 4K, but it will still have 480i textures applied to it. The framebuffer will hold 4096 x 2160 pixels, but the textures applied to the models will still look blocky even when rendered with more pixels.
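Purely as an illustration of the mechanics described above (toy sizes, no real graphics API, all names made up), here is nearest-neighbour sampling of one fixed texture into framebuffers of two different resolutions. The higher-resolution framebuffer holds more pixels, but they are all drawn from the same handful of texels, which is why the result still looks blocky:

```python
# Hypothetical sketch, not any engine's actual code: one fixed-size
# texture sampled into framebuffers of different resolutions.

def sample_texture(texture, u, v):
    """Nearest-neighbour lookup: map UV coordinates in [0,1) to a texel."""
    h = len(texture)
    w = len(texture[0])
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return texture[y][x]

def render(texture, fb_width, fb_height):
    """Fill a framebuffer by sampling the same texture at every pixel."""
    return [[sample_texture(texture, x / fb_width, y / fb_height)
             for x in range(fb_width)]
            for y in range(fb_height)]

# A tiny 2x2 "texture": four texels, standing in for a low-res asset.
tex = [[10, 20],
       [30, 40]]

low  = render(tex, 4, 4)   # low-resolution framebuffer (toy size)
high = render(tex, 8, 8)   # high-resolution framebuffer (toy size)

# Both framebuffers contain only the same four texel values; the larger
# one just repeats each texel over more pixels -- the "blocky" look.
assert {v for row in low for v in row} == {10, 20, 30, 40}
assert {v for row in high for v in row} == {10, 20, 30, 40}
```

Raising the framebuffer size changes how many pixels get filled, not what the texture can supply.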
Yeah, I know all that. What I'm saying is, there is no such thing as a "480i texture", or a "4K texture." Textures are going to vary by object size, distance, and the level of object detail the developer is going for, and typically come in 256x256, 512x512, 1024x1024, 2048x2048, and higher. All those texture sizes can be used in a 1080p rendering just as easily as they can be used in a 4K rendering (or 720p, or 480p).
I get the feeling that a lot of people, when they mention 4K, automatically think this refers to a higher level of graphical fidelity, when it literally means rendering the screen at 3840 x 2160. Nothing more, nothing less.
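To put numbers on the point above (my own arithmetic, not anything from the thread), rendering at 4K UHD just means filling a framebuffer with four times as many pixels as 1080p; the texture sizes listed earlier are a completely independent choice:

```python
# Quick arithmetic: "4K" is a framebuffer size, nothing more.

def pixel_count(width, height):
    return width * height

fb_1080p = pixel_count(1920, 1080)   # 2,073,600 pixels
fb_4k    = pixel_count(3840, 2160)   # 8,294,400 pixels

# 4K UHD holds exactly four 1080p framebuffers' worth of pixels.
assert fb_4k == 4 * fb_1080p

# Common texture sizes, usable at any of those render resolutions:
for side in (256, 512, 1024, 2048):
    print(f"{side}x{side} texture -> {side * side:,} texels")
```

Swapping the render resolution changes only the framebuffer math; none of the texel counts move.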
Anyways, I have a feeling we're severely off topic on this one.