fatslob-:O said:
curl-6 said:
I don't think we can lay all of the blame on the hardware here. Sure, slowdown during scenes with tons of AI/animations/physics is at least partly due to the lower clocked CPU, but it also slows down during scenes where bugger all is happening, which is just shitty coding.
Heck, Assassin's Creed 3 on Wii U performs better with way more NPCs in play, and that was a launch title.
Also, if it fully utilized the hardware we'd see memory related improvements, like the higher resolution textures in Need for Speed Most Wanted U.
|
It's not always about clock speeds ... There are TONS of aspects to consider such as memory subsystem, SIMD widths, memory models, instruction throughput capabilities and other things as well.
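To make that concrete, here's a back-of-envelope sketch (all numbers are made up for illustration, not actual Wii U/PS360 specs) of why raw clock speed alone doesn't decide throughput: issue width and SIMD lanes matter too.

```python
# Rough peak-throughput comparison using hypothetical numbers.
# Peak FLOPS per core ~ clock (Hz) x instructions per cycle x SIMD lanes.
def peak_flops(clock_hz, ipc, simd_lanes):
    """Rough peak single-precision FLOPS for one core."""
    return clock_hz * ipc * simd_lanes

# A high-clocked, narrow in-order core...
core_a = peak_flops(3.2e9, ipc=1, simd_lanes=2)  # 6.4 GFLOPS
# ...vs a lower-clocked core with wider issue and wider SIMD.
core_b = peak_flops(1.2e9, ipc=2, simd_lanes=4)  # 9.6 GFLOPS

print(core_a < core_b)  # the lower-clocked core has the higher peak
```

Real workloads also hinge on cache behaviour and memory latency, which a peak-FLOPS figure ignores entirely, but it shows why "lower clock" doesn't automatically mean "slower".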
Ubisoft definitely could've done a better job of profiling the cutscenes.
As for AC3, it's not always about having higher NPC counts either.
Textures depend on more than just memory capacity too. It's no use adding in higher resolution textures when you don't have enough texture units or memory bandwidth to sustain them.
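For a sense of scale (illustrative numbers, assuming uncompressed 32-bit texels and no caching): doubling texture resolution quadruples the data that has to sit in memory and stream through the texture units.

```python
# Rough texture-footprint estimate with hypothetical sizes.
def texture_bytes(width, height, bytes_per_texel=4):
    """Uncompressed memory footprint of one texture, in bytes."""
    return width * height * bytes_per_texel

base = texture_bytes(1024, 1024)     # ~4 MiB
doubled = texture_bytes(2048, 2048)  # ~16 MiB

print(doubled / base)  # 4.0 -- four times the capacity and bandwidth cost
```

In practice compression and mipmapping soften this, but the quadratic scaling is why higher-res textures demand bandwidth headroom, not just free RAM.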
|
I agree about the cutscenes, but it's not just those; when he's sneaking out of the stadium it can't even hit 30fps in bland rooms devoid of effects and with a mere handful of characters. That's clearly not the hardware's fault when we've seen it run much more demanding scenes more smoothly in other games.
And if Need for Speed Most Wanted U, a game that blows Watch Dogs Wii U out of the water graphically (and is also an open world game you can move through at high speed), can spare the texture units and bandwidth for better-than-PS360 textures, I'm sure WD could as well had it received the same level of care.
Of course, Ubisoft knew the game would flop, so they had no incentive to really invest in getting the most out of the hardware.