MikeB said: @LordTheNightKnight What's up with the cheapshots? It's a trademark of blind fanboys, I think I myself and most of those people posting at Beyond3D understand things better than you do. If you are going to continue on this road don't expect me to respond to you anymore.
And you don't even understand cheapshots. Calling you stupid would be a cheapshot. I just pointed out that if you don't know a fundamental fact of something, your claims about that thing are suspect. That isn't a cheapshot; it's a logical deduction. The same applies to everyone. I don't really understand the three laws of thermodynamics, so if I argued about heat dispersal, I wouldn't have solid ground to be making my arguments.
The thing is that you don't seem to understand how a frame buffer works, or why the screen resolutions of both versions of CoD 4 are as low as they are.
The basics are that a frame buffer doesn't store the actual graphics; it just controls what part of the graphics we see on the screen. That is a very important distinction, because 3D graphics cannot "see" the screen. For example, they can't "see" whether they cause jaggies, so anti-aliasing has to be handled in the frame buffer.
Yet a jaggie is not limited to a set number of pixels on a screen. Jaggies can happen on every single pixel. That means higher resolutions mean more work for the frame buffer. And it's not just anti-aliasing that has more work with more pixels; almost every part of the frame buffer has more work with more pixels.
In other words, lower resolutions mean more memory is available for other things, such as texture resolution, which is the real measurement of how detailed HD graphics are.*
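The per-pixel cost argument above can be sketched with some rough arithmetic. The byte counts and MSAA factor below are illustrative assumptions for the sake of the example, not the actual numbers either console uses:

```python
# Rough sketch of how frame buffer memory scales with resolution.
# Assumes 4 bytes/pixel for color and 4 bytes/pixel for depth/stencil,
# plus a multiplier for multisampled anti-aliasing. These are
# illustrative values, not the real console figures.

def framebuffer_mib(width, height, msaa=1, bytes_color=4, bytes_depth=4):
    pixels = width * height
    total_bytes = pixels * (bytes_color + bytes_depth) * msaa
    return total_bytes / (1024 * 1024)

for name, (w, h) in {"1024x600": (1024, 600),
                     "1280x720": (1280, 720),
                     "1920x1080": (1920, 1080)}.items():
    print(f"{name}: {framebuffer_mib(w, h):.1f} MiB plain, "
          f"{framebuffer_mib(w, h, msaa=4):.1f} MiB with 4x MSAA")
```

Under these assumptions 1080p with 4x MSAA needs about nine times the frame buffer memory of plain 720p, which is the kind of gap the post is talking about.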
This also applies to both the PS3 and 360. The thing with the PS3 is that its frame buffer is a conventional frame buffer. It still has to follow the rules of such. That's why it also has to sacrifice native resolution for some games. If the frame buffer were larger, it could do 1080p in every game, but it isn't. The same applies to the 360.
Both can output some games in 1080p, but the bandwidth is eaten up. They run much better as 600p-720p systems.
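The bandwidth point can be sketched the same way. The fps, overdraw, and bytes-per-pixel figures here are made-up illustrative assumptions; the point is only that the demand scales with the pixel count:

```python
# Back-of-the-envelope sketch of how frame buffer write bandwidth
# scales with resolution. All parameters are illustrative assumptions.

def fill_bandwidth_gbs(width, height, fps=60, overdraw=2.5, bytes_per_pixel=8):
    # Bytes written to the frame buffer per second, in GB/s.
    return width * height * fps * overdraw * bytes_per_pixel / 1e9

print(f"720p:  {fill_bandwidth_gbs(1280, 720):.2f} GB/s")
print(f"1080p: {fill_bandwidth_gbs(1920, 1080):.2f} GB/s")
```

Whatever values you plug in, 1080p demands 2.25x the fill bandwidth of 720p, because that's the ratio of pixel counts; that's the sense in which the bandwidth gets "eaten up" at higher resolutions.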
*The best way to tell is to take a late-1990s PC game and run it on a PC with an HD graphics card and monitor. You can easily run those games at HD resolutions, but they will still look like N64-era games, since only the screen resolution is up, not the texture resolution.
A flashy-first game is awesome when it comes out. A great-first game is awesome forever.
Plus, just for the hell of it: Kelly Brook at the 2008 BAFTAs