fatslob-:O said:
megafenix said:
fatslob-:O said:
megafenix said:
fatslob-:O said:
megafenix said:
Ports will always make the Wii U underperform unless the developers change the source code to reallocate resources from main RAM to cache or eDRAM. And of course, they also have to leave sound processing to the DSP and use the other core of the Wii U CPU for other work to speed things up.

Of course this takes time and effort, and most of them don't bother doing it.

This sounds like damage control for the Wii U.

If the Wii U truly had more "power", these ports wouldn't be struggling, because a game automatically takes advantage of more powerful hardware, just like on PC. It wouldn't take any effort whatsoever if the hardware were truly more powerful.

Clearly the Wii U fails to provide the bandwidth, fillrate, and processing power that games like COD and Batman need.

It doesn't really matter how powerful the Wii U is if the source code doesn't take advantage of the system.

The port comes from the Xbox 360, so what does the source code take into account?

only 1 MB of CPU cache

only 10 MB of eDRAM

Even if the Wii U has triple that, the source code doesn't use it, since it's a port of the Xbox 360 version. Everything that could have fit into the extra cache and eDRAM goes directly to main RAM, because that's what the source code tells the system to do.

So, to solve it, developers have to reallocate those resources, but most of them don't bother, since they are in a hurry.

The PS4 doesn't have to struggle with this because it has no eDRAM, just one big pool of GDDR5 main RAM, so no matter how lazy the developers are, a bad port is very unlikely.
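The reallocation argument above can be sketched in a few lines. This is purely illustrative: the allocator, buffer names, and budget numbers are all made up, and only mimic the idea of a port carrying over the old console's hardcoded eDRAM budget, so that everything past it spills to main RAM no matter how much fast memory the new machine actually has.

```python
# Illustrative only: hypothetical allocator with a hardcoded eDRAM budget
# carried over from the original console's build.
EDRAM_BUDGET_MB = 10  # Xbox 360 figure baked into the ported source

def place_buffers(buffers, edram_budget_mb=EDRAM_BUDGET_MB):
    """Greedily place buffers in eDRAM up to the budget; spill the rest to main RAM."""
    placement, used = {}, 0
    for name, size_mb in buffers:
        if used + size_mb <= edram_budget_mb:
            placement[name] = "eDRAM"
            used += size_mb
        else:
            placement[name] = "main RAM"  # spilled, even if more eDRAM exists
    return placement

buffers = [("framebuffer", 7), ("depth", 3), ("texture_cache", 12)]
# With the ported 10 MB budget, the texture cache spills to slow main RAM
# even though the Wii U actually has 32 MB of eDRAM available.
print(place_buffers(buffers))                      # texture_cache -> main RAM
print(place_buffers(buffers, edram_budget_mb=32))  # texture_cache -> eDRAM
```

Until someone edits that budget, the extra eDRAM just sits there unused.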

You need to understand that there are more factors in a game's performance than memory usage!

It is not memory usage alone that makes a game perform better. There are other things too: fillrate, which determines how fast textures can be written to a surface and how many pixels can be pushed out, and the processing power required to calculate animations and lighting. But none of those matter if they are left underutilized by low bandwidth.
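As a rough illustration of the fillrate point: theoretical pixel fillrate is usually estimated as ROPs times core clock. The function below is a back-of-the-envelope sketch, and the 8 ROPs / 500 MHz figures are the commonly cited Xbox 360 (Xenos) numbers, used here only as an example.

```python
# Back-of-the-envelope estimate: pixel fillrate = ROPs * core clock.
def pixel_fillrate_gpixels(rops, clock_mhz):
    """Theoretical pixel fillrate in gigapixels per second."""
    return rops * clock_mhz / 1000.0

# Xbox 360 (Xenos): 8 ROPs at 500 MHz -> 4.0 Gpixels/s theoretical peak.
print(pixel_fillrate_gpixels(8, 500))  # 4.0
```

A GPU with fewer ROPs or a lower clock pushes fewer pixels per second, regardless of how much memory it has.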

The Wii U has clearly demonstrated a lack of these elements in some of the games shown.

 

You underestimate the eDRAM too much. Unlike on the Xbox 360, it's not just for the framebuffer: the Wii U's eDRAM also holds textures, serves as something like an L3 cache for CPU-intensive work, etc.

The problem is that when developers don't make good use of the eDRAM and lean too much on main RAM, the main RAM bottlenecks the GPU's and CPU's capabilities. That's what's going on.

Imagine an HD 8800 with only 128 megabytes of VRAM.

Does it matter? Can't we just rely on the main RAM?

I'll say it one more time and never again in this thread. You cannot rely on the eDRAM every time, because it is only used for caching frequently accessed data. There will be times when the eDRAM misses, and that's when the Wii U has to access the main memory, and only the main memory. The Wii U cannot be fed with just 32 MB of data; it needs more than that, and accessing the main memory is essential for that operation.
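The miss-penalty point can be made concrete with the standard average-access-cost formula: avg = hit_rate × fast_cost + (1 − hit_rate) × slow_cost. The relative costs below (eDRAM = 1, main RAM = 10) are invented for illustration, not measured Wii U figures.

```python
# Average memory access cost with a fast on-chip pool and slower main RAM:
#   avg = hit_rate * fast_cost + (1 - hit_rate) * slow_cost
def avg_access_cost(hit_rate, edram_cost, main_ram_cost):
    """Expected per-access cost given the fraction of accesses served by eDRAM."""
    return hit_rate * edram_cost + (1 - hit_rate) * main_ram_cost

# With made-up relative costs (eDRAM = 1, main RAM = 10), even a 90% hit
# rate leaves the average at ~2x the eDRAM cost; misses dominate quickly.
print(avg_access_cost(0.90, 1, 10))  # 1.9
print(avg_access_cost(0.50, 1, 10))  # 5.5
```

Which is why the main memory's speed still matters even with a fast 32 MB pool in front of it.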

There are other bottlenecks in the Wii U too, such as fillrate and processing power! The Wii U is theoretically weaker all around than the PS360!

http://www.techpowerup.com/gpudb/1919/xbox-360-gpu.html

Look at the specifications of the Xbox 360's GPU. Even if the Wii U's GPU has more of anything, it won't be used if the game is a port of an Xbox 360 build that assumes a specific number of SPUs, TMUs, ROPs, and CUs. You also ignore differences in the hardware that cause further gaps: the Xbox 360's and the Wii U's eDRAM operate differently in a couple of ways. The ROPs are embedded in the eDRAM on the Xbox 360, while on the Wii U they are not! That's a major difference...

The only reason you can't get the picture about the Wii U's hardware is that you don't look at the differences in architecture and how the two actually operate...

If the Wii U has fewer SPUs, TMUs, ROPs, or CUs than the Xbox 360, then it would most likely not run the port well, since the build of that game is designed around exactly those resources. If the game uses just 10 MB of eDRAM on the GPU, then it will only use 10 MB, along with all the tricks that were needed on the Xbox 360; those tricks are not needed on the Wii U, and they may even cause issues for the Wii U's GPU, which is different in every way.

You are basically assuming that, say, code built for Itanium should run on x86-64 because it is 64-bit code... It won't; the architectures are different.

Can you finally understand now?