| Zekkyou said: Here is the video: http://killzone.dl.playstation.net/killzone/kzsf_multiplayer/KZSF_HIGHTBITRATE_H264.zip
| As if by divine intervention, the majority of the blur is gone, yet it's still using the blending effect for the 1080p. It's almost as though a large chunk of the motion blur was added in post-processing, rather than during the actual render of each frame set. I can understand why they did it: while the blend effect does create an image incredibly close to 1080p in terms of clarity, it does give the impression of poorer AA (see the trees, for example).
| Regardless of whether I think that was a good move, the video does a perfectly adequate job of demonstrating that the simulated resolution itself is not the cause of most of the blur seen in the final game. I'd also say they cut down the lighting slightly, no doubt for increased stability, but for the sake of this argument that's irrelevant.
| I'm sure you can understand why it's such a bad example for comparison now? If the blur comes from post-processing, it would look the same whether the game was native 1080p + motion blur or simulated 1080p + motion blur. The difference only becomes apparent when the motion blur is removed, in which case the simulated 1080p image shows signs of lesser AA (in fact caused by the blended frames overlapping). Personally I'd say that's a pretty good trade-off for an extra 15 fps, as did Guerrilla, it seems. Shame about the motion blur; not a fan myself, but it's understandable (they used the same motion blur effect in Killzone 2, so it's not like this is something new for them).
| Alas, my words no doubt fall upon deaf ears. I'm once again thoroughly bored of how often this topic needs to be explained to you, so I'll leave you to it. If you look around hard enough, I'm sure you can find something else to cling to for another couple of months. It's nice to see users who apparently don't "play resolutions" take such a deep interest in them :) |
You do see less blur. Of course, you also see virtually none of the artifacts or vertical lines present in the retail release when the game cannot accurately "predict" the next frame. So either they did not have the reprojection fully working for that video, or they made massive changes for the retail release. GG themselves have said the lines and blur are caused by the game not being able to correctly predict where certain things will be. I hope that helps clear that up for you; as for the rest...
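For anyone following the back-and-forth, the technique being argued over is temporal reprojection: each frame the engine renders 960x1080 worth of new columns and interleaves them with columns carried over from the previous reconstructed frame. The sketch below is my own simplification (the function name and the plain column reuse are mine, not Guerrilla's actual pipeline, which also motion-reprojects the old columns); it just shows why stale columns produce the vertical-line artifacts described above when prediction fails.

```python
import numpy as np

def reconstruct_1080p(new_half, prev_full, frame_index):
    """Blend a freshly rendered 960x1080 half-frame into a 1920x1080 image.

    new_half:    (1080, 960, 3) array, this frame's rendered columns
    prev_full:   (1080, 1920, 3) array, the last reconstructed frame
    frame_index: even frames fill even columns, odd frames fill odd ones
    """
    full = prev_full.copy()
    offset = frame_index % 2
    # Insert the new columns at every other x position.
    full[:, offset::2, :] = new_half
    # The remaining columns are simply reused from the previous frame here;
    # in the real renderer they are motion-reprojected, and when that
    # prediction misses (fast or erratic motion) the stale columns no longer
    # match their neighbours, which reads as vertical striping and blur.
    return full
```

With a static scene the reused columns match perfectly and the output is indistinguishable from native 1080p, which is the crux of the "simulated resolution isn't the source of the blur" argument.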
My point was: how important can full native 1080p60 graphics really be when the people most vocally championing it here could not even tell they were playing a game that was neither of those? You can qualify how close the game came to achieving them, point to this or that footage from before the game came out, or make excuses all you want; it was just a simple question. If you can answer the underlined, please do. If not, feel free to respond, but I won't reply.
Actually, I probably won't either way; it seems the OP is more about how advanced televisions are compared to the graphics we get, which I agree with. The games are still great, but 1080p has been around in the PC world for well over a decade. It seems silly that console technology is truly that far behind the curve.