RazorDragon said:
Captain_Tom said:

LOL no it doesn't!  I play it maxed out 100% and it only uses around 1500MB-1800MB.  They then tried to fit that into 256MB.  What happened?  The PS3 and especially the 360 version can't even hold a steady 25 FPS:

http://www.youtube.com/watch?v=cLLWtgVglTE

This is all while having some butt-ugly textures. So yeah, the future for the Wii U looks great. I can't wait to see what happens to console versions of games when 2+ GB is standard (which it will be by next year).


RAM has almost nothing to do with framerate. The 360 and PS3 versions can't hold 25 FPS because the GPU and CPU are outdated compared to current PC specs. If you run, for example, Crysis on an HD 4870 512MB and then on an HD 4670 1GB, which card runs the game with better framerates? Obviously the HD 4870 runs it much better, even with less RAM. The butt-ugly textures may be because the PS3 and 360 lack RAM, but in the framerate department, how much RAM you have doesn't matter at all.

1) RAM is very important to framerate. RAM feeds the CPU and GPU. The difference between 1600MHz and 400MHz RAM is huge in modern games. The difference between the Wii U and PS4 is 6000 to 1600. That is a massive difference, and that is only concerning the CPU. With GPUs, slow RAM makes a very large difference, period (compare the 7750 GDDR5 to the DDR3 version for proof).
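To put some rough numbers behind the 7750 comparison: peak memory bandwidth scales directly with the effective transfer rate, so at the same bus width, GDDR5 feeds the GPU several times faster than DDR3. This is a minimal sketch; the 128-bit bus width and the 1600/4500 MT/s rates are assumptions for illustration, not exact card specs.

```python
def peak_bandwidth_gb_s(transfers_mt_s, bus_width_bits):
    """Theoretical peak bandwidth in GB/s (1 GB = 1e9 bytes):
    effective transfers/s * bus width in bytes."""
    return transfers_mt_s * 1e6 * (bus_width_bits / 8) / 1e9

# Illustrative figures (assumed): both 7750 variants on a 128-bit bus,
# DDR3 at an effective 1600 MT/s, GDDR5 at an effective 4500 MT/s.
ddr3  = peak_bandwidth_gb_s(1600, 128)   # -> 25.6 GB/s
gddr5 = peak_bandwidth_gb_s(4500, 128)   # -> 72.0 GB/s
print(f"DDR3:  {ddr3:.1f} GB/s, GDDR5: {gddr5:.1f} GB/s")
```

Under these assumptions the GDDR5 card has nearly 3x the memory bandwidth of the DDR3 card, which is why the same GPU chip performs so differently between the two versions.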

 

2) The CPU and GPU in the Wii U are massively weaker than the PS4/720's too, so those other points you made are moot.