NavyNut said:
Can't say I'm surprised by this, but it's still disappointing when a console that's supposed to be 10x more powerful can't even produce 1080p with a consistent 60fps. Actually, the console is probably powerful enough; it's more than likely poor programming.


Well, when you actually think about it, it makes sense. 720p (and I doubt Watch Dogs will actually be 720p on last-gen consoles) is 921,600 pixels, while 1080p is 2,073,600, which is 2.25x as many. Right there you've used a large chunk of your new power just upping the resolution.

Going from 30fps to 60fps also needs more than double the performance. At 30fps you have 33.33ms per frame and at 60fps you have 16.67ms, but that doesn't mean you simply have half the time to render each frame. You still need to update your animations, physics, streaming assets, etc. So if everything else is equal and you need, say, 10ms each frame for non-rendering prep work, then at 30fps you have 23.33ms left to render the frame, but at 60fps you have just 6.67ms. In this made-up-numbers example you would need 3.5x the rendering performance just to go from 30fps to 60fps.

Multiply those together and you've burnt through almost 8x more rendering performance just going from 720p/30fps to 1080p/60fps. So with a 10x more powerful system you barely have room to upgrade other things like textures, draw distance, poly counts, particles, shadow resolution, physics, and post-processing (anti-aliasing, depth of field, motion blur, etc.).

Now, that is a very simplified example of course; there are a lot of other variables in play, and there is a lot more to total performance than just rendering. But it should still show you how quickly 8x the performance disappears when you up the resolution and framerate.
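If you want to check the numbers yourself, here's a quick back-of-the-envelope script. The 10ms of per-frame prep work is the same made-up figure as above, not a real measurement:

```python
# Back-of-the-envelope check of the resolution + framerate cost multipliers.

pixels_720p = 1280 * 720          # 921,600 pixels
pixels_1080p = 1920 * 1080        # 2,073,600 pixels
resolution_factor = pixels_1080p / pixels_720p   # 2.25x more pixels to shade

frame_time_30fps = 1000 / 30      # ~33.33 ms per frame
frame_time_60fps = 1000 / 60      # ~16.67 ms per frame

prep_time = 10.0                  # assumed fixed non-rendering work per frame (ms)

render_budget_30 = frame_time_30fps - prep_time  # ~23.33 ms left to render
render_budget_60 = frame_time_60fps - prep_time  # ~6.67 ms left to render
framerate_factor = render_budget_30 / render_budget_60   # ~3.5x

total_factor = resolution_factor * framerate_factor       # ~7.9x, i.e. "almost 8x"
print(f"Resolution: {resolution_factor:.2f}x, "
      f"framerate: {framerate_factor:.2f}x, "
      f"combined: {total_factor:.1f}x")
```

Running it prints roughly "Resolution: 2.25x, framerate: 3.50x, combined: 7.9x", which is where the "almost 8x" above comes from.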


