timmah said:
ninjablade said:
timmah said:
ninjablade said:
timmah said:
ninjablade said:
timmah said:

LOL, you didn't address any of the obvious issues that were pointed out regarding your flamebait sig. The WiiU CPU is out-of-order (OoOE) while the 360 CPU is not, the WiiU CPU has more and faster cache, and it's highly specialized. Just based on what's assumed, 353 GFLOPS with added fixed functions and more modern tech (such as tessellation) is not 'on par' with 240 GFLOPS on outdated tech with DX9-era instructions and no modern enhancements. Even so, about half of the blocks on the GPU are unidentified, and there's speculation that some of them could be asymmetric shaders that would push the 353 GFLOPS guess higher. You're assuming the 50% of unidentified blocks add absolutely nothing to real-world power, so what are they, decoration? It's pretty obvious that some of the unidentified parts of a 'custom' GPU would be the 'custom' parts, designed to enhance performance/efficiency in some way (fixed functions, asymmetric shaders, whatever). 2GB of RAM is not 'on par' with 512MB (and your throughput assumption is just a guess based on single-channel RAM; it could be double that with dual-channel, so that's just more speculation). Your sig is such obvious flamebait.

Look, nobody thinks the WiiU is as powerful as the PS4/Nextbox will be, but it is simply not 'on par' with or weaker than the PS360 either, simply due to its modern architecture and DX11 feature set - irrespective of FLOPS (which are higher in your sig anyway!). It's obvious that the WiiU's GPU is very efficient, something the PS3 and 360 certainly are not. If we go with the 'guess' that exists now, 353 highly efficient GFLOPS (based on only 50% of the GPU blocks) with fixed functions & a better instruction set >>>>>>>> 240 GFLOPS on old, outdated, inefficient tech. We'll just need to wait for games built specifically for the architecture to see this.

I changed my sig to be more accurate. Still, the overall consensus on Beyond3D is that it's on par with current gen, and I'm not gonna trust your Nintendo-biased analysis. I would love to see you post your theory on Beyond3D, and if people agree with you I will be happy to eat crow. I still find it funny that every single CPU-intensive game was inferior on WiiU compared to current gen, not to mention many games had bandwidth issues, and no graphical upgrades whatsoever sure doesn't scream new tech to me. If a mod has a problem with my sig, they can message me.

Some rushed CPU-intensive ports programmed & optimized for older architecture had issues, while a GPU-intensive game programmed for newer architecture (Trine 2) was able to pull off graphical effects not possible on the older consoles, at higher resolution, with better results in every category. Different architecture: not equal to, not on par, and not fully understood yet. I already said the PS4/Nextbox will be quite a bit more powerful, so I'm not sure why you think I'm being Nintendo-biased on that. You're clearly more biased against Nintendo than just about anybody I've seen here - why the crusade against Nintendo? You pretty much have an orgasm any time something suggests the WiiU is 'weak' or 'underpowered'. It's pretty pathetic IMO.


Trine 2 played into the WiiU GPU's strength; the CPU was hardly being used. I just don't see how BLOPS 2 was running sub-HD with a worse framerate than the 360/PS3 unless the system is bottlenecked, and many tech experts have confirmed this to me. And I'm sure your tech knowledge can't be compared to the mods on Beyond3D, who have actually developed games.

The 360/PS3 are very heavily bottlenecked, this is widely known. It took a long time for devs to get around the bottlenecks in the 360/PS3 (just look at the early games on those consoles, especially ports between the systems). As far as bottlenecks on the WiiU, we don't really know if or where bottlenecks exist. Nintendo has been very adamant that it focused heavily on memory latency as well as architectural efficiency, and I remember a couple of developers praising the memory architecture, so it could be more of an optimization issue than anything else.

In the case of BLOPS2, again, they had very little time to optimize the game for the hardware given the dev kit release date, and there are many reasons why a quick port (quick by necessity, not laziness) generally performs poorly compared to the lead console, even on a console with more raw power. Add to this the fact that those games were optimized for consoles with higher raw clock speed, and it makes sense that CPU-intensive scenes would have issues without proper time to re-code the CPU tasks.

Keep in mind that the raw clock speed on the cores of the PS4/Nextbox is rumored to be lower than last gen as well (with more cores), and you'll realize that re-coding would be necessary to utilize those processors correctly (meaning an algorithm designed to run a single thread at 3.2GHz would run like shit on one of the Nextbox/PS4 1.6GHz cores if not re-coded properly). Again, WiiU is weaker than the PS4/Nextbox by some margin yet to be known, but somewhat stronger and much more efficient than the current systems.
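The clock-speed point can be sketched with a toy throughput model. Only the 3.2GHz and 1.6GHz clocks come from the post above; the work amount and per-core scaling are hypothetical, and real workloads never scale perfectly across cores:

```python
# Toy model: per-core throughput scales with clock, total throughput
# scales with the number of cores the code actually uses.
def frame_time_ms(work_units, clock_ghz, cores_used=1):
    """Time to finish `work_units` of CPU work, assuming perfect scaling."""
    throughput = clock_ghz * cores_used  # work units per ms (arbitrary units)
    return work_units / throughput

work = 16.0  # arbitrary amount of per-frame CPU work

# Engine tuned for one fast 3.2 GHz thread:
old_console = frame_time_ms(work, clock_ghz=3.2, cores_used=1)   # 5.0 ms

# Same single-threaded code dropped onto one slower 1.6 GHz core:
naive_port = frame_time_ms(work, clock_ghz=1.6, cores_used=1)    # 10.0 ms

# Re-coded to spread the work across 4 of the slower cores:
recoded = frame_time_ms(work, clock_ghz=1.6, cores_used=4)       # 2.5 ms
```

The naive port runs twice as slow as the original despite the newer hardware, while the re-coded version comes out ahead, which is the "re-coding is necessary" argument in miniature.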

Also, you still don't realize the potential positive effect of tessellation plus DX11-type instructions (less processing cost for better visual results) vs DX9-era instructions. Newer architecture can do more with less brute force. You have to look at the whole picture and at the results in games DESIGNED FOR THE NEW ARCHITECTURE to get an accurate representation; this will take time.

Can you please post on Beyond3D? I'm the poster shinobi, and I asked several times: is the WiiU more powerful than current gen? Why do they tell me it's on par and not stronger? I'm sorry, but I can't see you being a better source than them, because I know they have a rep as the best place to discuss tech, not to mention they were on the money on the amount of GFLOPS before the GPU pic even came out.

They are biased, plain and simple. Just from raw numbers, how is 353 the same as 240 and not better? How do they assume that the higher efficiency and DX11-level instructions mean nothing? That newer tech gives no additional real-world benefit per FLOP over older tech? That rushed day-one ports are indicators of the system's max potential? It makes no sense to me, and I've been in the IT field for almost 10 years. Also, I have no interest in joining Beyond3D. I'm not saying this is some massive leap over PS360 - it's not - but it is certainly more powerful by at least some meaningful margin.

I wouldn't call them biased; I mean, they corrected NeoGAF when they thought it was 160 SP - they told them it was 320.
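That 320 SP figure is where the ~353 GFLOPS estimate in this thread comes from. A quick sanity check, assuming the commonly-rumored clocks (~550 MHz for the WiiU GPU, 500 MHz for the 360's Xenos) and the standard 2 FLOPs per ALU per cycle for a fused multiply-add:

```python
def gflops(shader_alus, clock_mhz, flops_per_alu_per_cycle=2):
    """Peak shader throughput: ALUs x clock x FLOPs per ALU per cycle."""
    # 2 FLOPs/cycle assumes one multiply-add (MAD) per ALU per clock.
    return shader_alus * clock_mhz * flops_per_alu_per_cycle / 1000.0

wiiu_guess = gflops(320, 550)  # 352.0 - right next to the ~353 quoted above
xenos      = gflops(240, 500)  # 240.0 - the 360 figure from the sig
```

Both numbers in the argument fall straight out of this one formula, which is why the SP count mattered so much to the estimates.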