WereKitten said:
Digital Foundry makes accurate, objective frame-rate and resolution measurements, and I respect them very much as a source. The same goes for my respect for Carmack.

On the other hand, taking a 3x factor out of context, and pulling a 4x for an undefined "performance" parameter out of thin air, is a) a logical fallacy and b) rhetorical dishonesty, because quoting numbers from authoritative sources out of context for pure non sequiturs is a way to have some of the trust they rightfully earned rub off on your wishful thinking and speculation. As someone who works in the scientific field, I am very touchy about the necessary distinction. Try pulling any rhetorical, logically inconsistent trick like this in an objective scientific debate and you become a laughing stock.

Let's try to keep the level higher, and wait, for example, for the in-depth analysis Digital Foundry has promised of the performance and features of the KZ2 engine (which they seem to be praising highly, notwithstanding the 20 fps dips they measured in instances they will surely detail).

As for the "awesomeness" of Infinity Ward, you're free to tag them any way you like, and they are the authors of a greatly entertaining game. But again, in the context of graphics engines? That's misleading. CryEngine2 is lauded for its realism. id's Tech 5, or even the deferred rendering techniques Guerrilla developed for the KZ2 engine, are the subject of whitepapers and discussions among workers in the field. When a commercial project needs to license an existing engine, UE3 or Source are discussed. But the engine of COD4 (a modified id Doom 3 engine with bloom, self-shadows, dynamic lighting updates, and proprietary physics, as I recall) was merely adequate for the game, and never more than "good looking". The lighting in particular seemed to me (PC version, where framerate was never an issue) quite sub-par, and the texturing in the game was quite poor, probably a trade-off for framerate.
But if you can point me to objective peer analysis substantiating the "awesomeness" of the COD4 engine, I'll be happy to read about it.
|
Call of Duty 4 = 1024x600 (2x AA) PS3
Call of Duty 4 = 1024x600 (2x AA) Xbox 360
Unfortunately I do not have an exact figure for the 'time cost' per frame of implementing MSAA. Let's just assume it takes 0.5 ms per frame; the exact figure isn't significant unless you want to definitively prove something, which I do not have the time to do.
- 60 frames per second = 16.67ms per frame
- 30 frames per second = 33.33ms per frame
- 20 frames per second = 50.0ms per frame
Implementing MSAA (assumed cost 0.5 ms)
- 60 fps = 3% of rendering time spent implementing MSAA
- 30 fps = 1.5% of rendering time spent implementing MSAA
- 20 fps = 1% of rendering time spent implementing MSAA
Either way, the relative point stands: the same fixed MSAA cost eats three times as much of the frame budget at 60 fps as it does at 20 fps.
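The arithmetic above can be sketched out quickly. This is just the post's own back-of-the-envelope calculation; the 0.5 ms MSAA cost is the assumed figure from the post, not a measured number:

```python
# Assumed fixed per-frame cost of MSAA, per the post (not a measured value).
MSAA_COST_MS = 0.5

def frame_budget_ms(fps: float) -> float:
    """Time available to render one frame at a given frame rate, in ms."""
    return 1000.0 / fps

for fps in (60, 30, 20):
    budget = frame_budget_ms(fps)
    share = MSAA_COST_MS / budget * 100  # % of the frame budget spent on MSAA
    print(f"{fps} fps: {budget:.2f} ms/frame, MSAA = {share:.1f}% of budget")
```

The same fixed cost is a proportionally bigger slice of a 16.67 ms frame than of a 50 ms frame, which is the whole argument.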
John Carmack was referring to a game called Rage, which is set in a high-contrast desert environment where aliasing is especially visible, so of course it is likely he would implement MSAA.
He was likely referring to how long he could run the shader units on the respective consoles; once you factor in the other stages of the rendering pipeline, running at 60 fps with MSAA leaves very little headroom compared to running the game at 30 or 20 fps.
Tease.