| Heavenly_King said: I admit I don't know technical hardware/programming stuff, so you got me there. But using a bit of logic, when you port a game, aren't you supposed to port the engine to the console first, so that it takes advantage of the hardware at hand and the actual game can run nicely? (I think it should work that way, I guess.) For example, if a current-gen game like AC3 is developed for the 360 (more graphics memory), then the engine needs to be ported to make use of more CPU (the Cell in the PS3), and so on, right? Games looked bad at first on the PS3 because third-party developers did not know, or did not want to learn, how to properly use the Cell. I think that optimization would not be necessary if the other console were so much more powerful that the port could just be done with raw power, for example cross-generational PS2/PS360 games, and upcoming PS360/PS4-720 titles. The bolded text would mean that the 3 consoles are in the same "power range", and I thought that was not the case, considering some Nintendo supporters' posts, and the fact that we don't have any official specs yet (although there have been analyses of the hardware showing it is not a big leap from current gen). So if in reality the Wii U is more powerful than the PS3, just as it is more powerful than the 360, then yeah, this kind of stuff is to be expected: ported games not looking good on the Wii U because developers already have much more experience on the other consoles. Which in the end means it is not uber-powerful as some people thought. So it is more powerful, but not by much. |
Ever try to run a PS2 emulator on your PC? It probably runs like ass. Why? Trying to run code designed for one thing on something completely different nullifies the power difference.
Now granted, a port is rewritten with the new hardware in mind, but if the initial game engine itself expects something like a 3.2 GHz clock and is running on hardware with a 1.2 GHz clock, you're going to have problems no matter how many more operations per second the new system can handle. That's just the nature of code and game engines.
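A back-of-the-envelope sketch of that clock-speed point (the per-frame cycle budget below is completely made up for illustration): if an engine's per-frame workload was tuned to fit a 60 fps budget on a ~3.2 GHz core, the same fixed workload blows the budget on a 1.2 GHz core, whatever other advantages the new chip has:

```python
# Hypothetical numbers, purely for illustration: suppose an engine's
# per-frame CPU work was tuned to fit 60 fps on a ~3.2 GHz core.
CYCLES_PER_FRAME = 53_000_000  # made-up per-frame cycle budget

def frame_time_ms(clock_hz):
    """Milliseconds one frame's worth of fixed work takes at a given clock."""
    return CYCLES_PER_FRAME / clock_hz * 1000

print(frame_time_ms(3.2e9))  # ~16.6 ms -> fits a 60 fps (16.7 ms) budget
print(frame_time_ms(1.2e9))  # ~44.2 ms -> only ~23 fps: same code, slower clock
```

Of course real engines aren't this simple, but it shows why "more operations per second overall" doesn't rescue code that was budgeted around a specific clock.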
Brute force can only take you so far. To really take advantage of the hardware, it must be optimized for it. Otherwise, it will run like ass.
Also to note, this really depends on the game engine, game code, and hardware. Trine 2's developers stated they were able to get better performance from the Wii U without any optimization. That means their game engine likely wasn't dependent on CPU clock speed, so it was able to brute-force good performance.
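For contrast, here's a minimal sketch of a clock-agnostic main loop (hypothetical, not Trine 2's actual code): by stepping the simulation by measured wall-clock time rather than an assumed cycle count, the engine never cares what the CPU's clock speed is, so extra raw power just shows up as smoother frames:

```python
import time

def run_frames(update, render, num_frames):
    """Clock-agnostic main loop: advance the simulation by the *measured*
    elapsed time, not by a hard-coded clock or cycle assumption."""
    last = time.perf_counter()
    for _ in range(num_frames):
        now = time.perf_counter()
        dt = now - last          # seconds since the previous frame
        last = now
        update(dt)               # simulation advances by real elapsed time
        render()

# Usage: the physics moves at the same real-world speed on any CPU;
# a faster machine just produces more (smaller) steps per second.
pos = 0.0
def update(dt):
    global pos
    pos += 5.0 * dt              # 5 units per second, regardless of clock
run_frames(update, lambda: None, 3)
```

An engine built this way is the kind that can "brute force" good performance on unfamiliar hardware, whereas one with baked-in timing assumptions needs real optimization work first.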
Everything is a very specific case. You have the hardware differences, the game engine and game code and their levels of optimization for the hardware, the developers' time and resources to do the optimizing, and on and on.
I'll show you what I meant earlier about the AMD and Intel differences with game engines. It might give you a better perspective on just how much optimizing for certain hardware can make a difference.
AMD versus Intel:
First, take a look at otherwise-equal systems running Battlefield 3. Practically even across all the CPUs.
http://www.tomshardware.com/reviews/core-i7-3970x-sandy-bridge-e-benchmark,3348-11.html
Now look at those same systems playing Skyrim. The AMD CPUs all drop compared to Intel. Even Intel's bottom-end CPU gives better FPS than AMD's best.
http://www.tomshardware.com/reviews/core-i7-3970x-sandy-bridge-e-benchmark,3348-12.html
Same thing with World of Warcraft:
http://www.tomshardware.com/reviews/core-i7-3970x-sandy-bridge-e-benchmark,3348-13.html
You get the same effect with video cards. Ever see a PC game with an nVidia splash logo that says "nVidia: The way it's meant to be played"? That means the developers optimized the game to run on nVidia GPUs.
The rEVOLution is not being televised