curl-6 said:
ninjablade said:
Barozi said:
RazorDragon said:
zippy said: @RazorDragon- If I am playing on a GameCube or even a Wii and a game has framerate issues, I can deal with that, as I know I am playing on outdated hardware. However, in this day and age, a game of Lego City's caliber should be running smoothly on a console that is perceived to be more powerful than last-gen tech. What annoys me more is the fact that the Wii U is capable of much, much more than a Lego game, so is this game another example of a "rush job"? |
Framerates are really not about power. The PS3 promised us 120FPS, yet even announced PS4 games are designed to target 30FPS. In this case, though, the game probably wasn't really optimized to run on the Wii U, as Lego City doesn't look like a GPU- or CPU-intensive game.
|
Open world games are usually CPU intensive. NPCs are a big reason for that.
Draw distance can also be a problem.
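Just to illustrate the point, here's a toy sketch (not from any real engine, names are made up) of why NPC count and simulation/draw radius hit the CPU every frame:

```c
/* Toy sketch: per-frame NPC update with a simulation radius.
 * Cost grows with NPC count and with how far out you simulate/draw,
 * which is one reason open-world games lean so hard on the CPU. */
#include <stddef.h>

typedef struct { float x, z; int active; } Npc;

void update_npcs(Npc *npcs, size_t count,
                 float player_x, float player_z, float sim_radius)
{
    float r2 = sim_radius * sim_radius;
    for (size_t i = 0; i < count; ++i) {
        float dx = npcs[i].x - player_x;
        float dz = npcs[i].z - player_z;
        if (dx * dx + dz * dz > r2) {
            npcs[i].active = 0;   /* too far out: skip AI/animation this frame */
            continue;
        }
        npcs[i].active = 1;
        /* ...pathfinding, animation, collision would go here... */
    }
}
```

Double the crowd density or push the radius out and that loop (plus everything hiding behind the comment) scales right along with it.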
|
You also forgot the memory bandwidth is slower than the 360/PS3's; that can be a huge problem.
|
Not nearly as huge a problem as the PS3/360 being capped at less than 500MB of usable RAM. Memory is key in open world games.
Then there's the 360 having DVD9 discs and only 10MB of eDRAM, and the PS3's inferior GPU and problematic split RAM. Bottlenecks aren't a Wii U-exclusive feature; every system has them.
|
As curl-6 mentioned, eDRAM is a big factor in the Wii U. In the 360, the 10MB of eDRAM sits on a daughter die, which increases latency, while the 32MB of eDRAM in the Wii U is on the same die. This allows faster access and helps negate the "slower" memory bandwidth, potentially pushing it into the XXX GB/s range. You can think of it as a separate cache for the GPU that handles the image processing rather than the actual main RAM.
I suggest reading up here: http://www.notenoughshaders.com/2013/01/17/wiiu-memory-story/
OT: It is likely TTF made the game access the main RAM rather than the eDRAM for image processing, so streaming in areas while driving and such leads to framerate drops. Again, developer issue, not hardware.
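For anyone who wants the idea in concrete terms, here's a generic illustration (plain C, not Wii U SDK code, all names invented) of keeping the working set of an image pass in a small fast buffer standing in for on-die eDRAM, instead of bouncing every intermediate pixel through main RAM:

```c
/* Tile an image pass so each tile's working set fits in a small fast
 * buffer (a stand-in for on-die eDRAM). Main memory is only touched
 * once on the way in and once on the way out per tile. */
#include <stdint.h>
#include <string.h>

#define TILE 64                      /* tile edge; picked so a tile fits in the fast buffer */

static uint8_t scratch[TILE * TILE]; /* pretend this lives in fast on-die memory */

void process_image(uint8_t *img, int width, int height)
{
    for (int ty = 0; ty < height; ty += TILE) {
        for (int tx = 0; tx < width; tx += TILE) {
            int th = (ty + TILE <= height) ? TILE : height - ty;
            int tw = (tx + TILE <= width)  ? TILE : width  - tx;

            /* copy the tile into the fast scratch buffer */
            for (int y = 0; y < th; ++y)
                memcpy(&scratch[y * TILE], &img[(ty + y) * width + tx], tw);

            /* do the per-pixel work against the fast buffer */
            for (int y = 0; y < th; ++y)
                for (int x = 0; x < tw; ++x)
                    scratch[y * TILE + x] = 255 - scratch[y * TILE + x];

            /* write the finished tile back to main memory once */
            for (int y = 0; y < th; ++y)
                memcpy(&img[(ty + y) * width + tx], &scratch[y * TILE], tw);
        }
    }
}
```

If the intermediate work goes straight through main RAM instead, it competes with everything else hitting that bus, like world streaming while you're driving around, which is where the framerate drops would come from.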