
Wii U GPU Die Image! Chipworks is AWESOME!

ninjablade said:
timmah said:
ninjablade said:
timmah said:

LOL, you didn't address any of the obvious issues that were pointed out regarding your flamebait sig. The Wii U CPU is out-of-order (OoOE) while the 360 CPU is not, the Wii U CPU has more and faster cache, and it is highly specialized. Just based on what is assumed, 353 GFLOPS with added fixed functions and more modern tech (such as tessellation) is not 'on par' with 240 GFLOPS on outdated tech with DX9 instructions and no modern enhancements. Even so, about half of the blocks on the GPU are not identified, and there's speculation that some of them could be asymmetric shaders that would push the 353 GFLOPS guess even higher. You're assuming that the 50% of blocks that are unidentified do absolutely nothing to add to the real-world power, so what are they, decoration? It's pretty obvious that some of the unidentified parts of a 'custom' GPU would be the 'custom' parts, designed to enhance performance/efficiency in some way (fixed functions, asymmetric shaders, whatever). 2GB of RAM is not 'on par' with 512MB (and your throughput assumption is just a guess based on single-channel RAM; it could be double that with dual-channel, so that's just another piece of speculation). Your sig is such obvious flamebait.

Look, nobody thinks that the Wii U is as powerful as the PS4/Nextbox will be, but it is simply not 'on par' with or weaker than the PS360 either, simply due to its modern architecture and DX11 feature set, irrespective of FLOPS (which are higher in your sig anyway!). It's obvious that the Wii U's GPU is very efficient, something that the PS3 and 360 certainly are not. If we go with the 'guess' that exists now, 353 highly efficient GFLOPS (based on only 50% of the GPU blocks) with fixed functions & a better instruction set >>>>>>>> 240 GFLOPS on old, outdated, inefficient tech. We'll just need to wait for games built specifically for the architecture to see this.
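To put rough numbers on where those figures even come from (a back-of-the-envelope sketch; the shader counts and clocks below are the commonly cited guesses, which is exactly what's in dispute):

```python
# Theoretical shader throughput: ALUs x 2 ops/cycle (multiply-add) x clock.
# Counts are the commonly cited guesses for each chip, not confirmed specs.

def gflops(alus, clock_ghz, ops_per_cycle=2):
    """Peak GFLOPS assuming one fused multiply-add per ALU per cycle."""
    return alus * ops_per_cycle * clock_ghz

wii_u = gflops(320, 0.550)  # guessed 320 SPs @ 550 MHz (~352, often quoted as "353")
xenos = gflops(240, 0.500)  # Xbox 360 Xenos: 240 ALUs @ 500 MHz

print(f"Wii U (guess): {wii_u:.0f} GFLOPS")    # ~352
print(f"Xenos:         {xenos:.0f} GFLOPS")    # 240
print(f"Raw ratio:     {wii_u / xenos:.2f}x")  # ~1.47x, before any efficiency factors
```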

I changed my sig to be more accurate. Still, the overall consensus on beyond3d is that it's on par with current gen, and I'm not gonna trust your Nintendo-biased analysis. I would love to see you post your theory on beyond3d, and if people agree with you I will be happy to eat crow. Still, I find it funny that every single CPU-intensive game was inferior on Wii U compared to current gen, not to mention many games had bandwidth issues, and no graphical upgrades whatsoever sure doesn't scream new tech to me. If a mod has a problem with my sig, they can message me.

Some rushed CPU-intensive ports programmed & optimized for older architecture had issues, while a GPU-intensive game programmed for the newer architecture (Trine 2) was able to pull off graphical effects not possible on the older consoles, at higher resolution, with better results in every category. Different architecture: not equal to or on par, and not fully understood yet. I already said the PS4/Nextbox will be quite a bit more powerful, so I'm not sure why you think I'm being Nintendo-biased on that. You're clearly more biased against Nintendo than just about anybody I've seen here, so why the crusade against Nintendo? You pretty much have an orgasm any time something suggests the Wii U is 'weak' or 'underpowered'. It's pretty pathetic IMO.


Trine 2 played into the Wii U GPU's strengths; the CPU was hardly being used. I just don't see how BLOPS 2 was running sub-HD with a worse framerate than the 360/PS3 unless the system is bottlenecked, and many tech experts have confirmed this to me. And I'm not sure your tech knowledge can be compared to the mods on beyond3d, who have actually developed games.

The 360/PS3 are very heavily bottlenecked; this is widely known. It took a long time for devs to get around the bottlenecks in the 360/PS3 (just look at the early games on those consoles, especially ports between the systems). As for bottlenecks on the Wii U, we don't really know if or where bottlenecks exist. Nintendo has been very adamant that it focused a lot on memory latency as well as architectural efficiency, and I remember a couple of developers praising the memory architecture, so it could be more of an optimization issue than anything else. In the case of BLOPS 2, again, they had very little time to optimize the game for the hardware given the dev kit release date, and there are many reasons why a quick port (quick by necessity, not laziness) generally performs poorly compared to the lead console, even on a console with more raw power. Add to this the fact that those games were optimized for consoles with higher raw clock speeds, and it makes sense that CPU-intensive scenes would have issues without proper time to re-code the CPU tasks. Keep in mind that the raw clock speed of the PS4/Nextbox cores is rumored to be lower than last gen as well (with more cores), and you'll realize that re-coding would be necessary to utilize those processors correctly (meaning an algorithm designed to run as a single thread at 3.2GHz would run like shit on one of the Nextbox/PS4's 1.6GHz cores if not re-coded properly). Again, the Wii U is weaker than the PS4/Nextbox by some margin yet to be known, but somewhat stronger and much more efficient than the current systems.
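To make that re-coding point concrete, here's a toy Amdahl-style model (the clocks are the rumored figures from above; the core count and parallel fraction are purely hypothetical):

```python
# Toy model: the same workload on one fast core vs. spread across more, slower cores.

WORK = 1000.0  # arbitrary units of CPU work per frame

def runtime(work, cores, clock_ghz, parallel_fraction):
    """Amdahl-style estimate: only parallel_fraction of the work splits across cores."""
    serial = work * (1 - parallel_fraction) / clock_ghz
    parallel = work * parallel_fraction / (clock_ghz * cores)
    return serial + parallel

# Engine code written as a single 3.2 GHz thread:
print(runtime(WORK, cores=1, clock_ghz=3.2, parallel_fraction=0.0))  # ~312
# The same code dropped unmodified onto one 1.6 GHz core: twice as slow.
print(runtime(WORK, cores=1, clock_ghz=1.6, parallel_fraction=0.0))  # ~625
# Re-coded so 90% of the work spreads across 8 slower cores: faster than the original.
print(runtime(WORK, cores=8, clock_ghz=1.6, parallel_fraction=0.9))  # ~133
```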

Also, you still don't realize the potential positive effect of tessellation + DX11-type instructions (less processing cost for better visual results) vs DX9-era instructions. Newer architecture can do more with less brute force. You have to look at the whole picture and the results in games DESIGNED FOR THE NEW ARCHITECTURE to get an accurate representation; this will take time.
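A crude way to see the tessellation point (illustrative numbers only; the vertex layout and tessellation factor are made up for the example): instead of storing and streaming a dense mesh, you store a coarse mesh and let the hardware tessellator amplify it on-chip.

```python
# Vertex data you must store/stream for the same on-screen triangle count,
# with and without hardware tessellation. All numbers are illustrative.

BYTES_PER_VERTEX = 32          # position + normal + UV, a common layout
FINAL_TRIANGLES = 1_000_000    # detail the player actually sees

# DX9-era: every final triangle comes from stored vertex data (~1 vertex per triangle)
dense_mesh = FINAL_TRIANGLES * BYTES_PER_VERTEX

# DX11-era: store a coarse mesh, let the tessellator amplify it 16x on-chip
TESS_FACTOR = 16
coarse_mesh = (FINAL_TRIANGLES // TESS_FACTOR) * BYTES_PER_VERTEX

print(f"Dense mesh:  {dense_mesh / 1e6:.0f} MB of vertex data")             # 32 MB
print(f"Coarse mesh: {coarse_mesh / 1e6:.0f} MB + a displacement texture")  # 2 MB
```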



ninjablade said:
timmah said:
*snip: exchange quoted in full above*

Can you please post on beyond3d? I'm the poster shinobi, and I asked several times whether the Wii U is more powerful than current gen; why do they tell me it's on par and not stronger? I'm sorry, but I can't see you being a better source than them, because I know they have a rep as the best place to discuss tech, not to mention they were on the money on the GFLOPS figure before the GPU pic even came out.



How can they tell the power when the GPU is a ground-up component not based on any existing GPU, by what Morrison is saying? They don't know the tech but can somehow determine its full performance? Sounds like you're very gullible.



ninjablade said:
*snip: exchange quoted in full above*


Like I mentioned before, a lot of those people are thinking in PC terms :P. It's pretty much what I've always said: the PS3 has more raw power than the 360, but the 360 won on the features front. Not to mention, look at your own sig: do you really believe your own statement after looking at those specs, when most devs don't even know how to utilize the hardware yet? We already know it's customized as hell; that can't be easy for devs, and Nintendo is fucking crazy for it lol.



_crazy_man_ said:

Jim Morrison from Chipworks (the amazing peeps that gave us the GPU die image and will soon be posting the CPU as well) has stated some details:


1. This GPU is custom.
2. If it was based on ATI/AMD or a Radeon-like design, the chip would carry die marks to reflect that. Everybody has to recognize the licensing. It has none. Only the Renesas name, which is a former unit of NEC.
3. This chip is fabricated in a 40 nm advanced CMOS process at TSMC and is not low tech.
4. For reference's sake, the Apple A6 is fabricated in a 32 nm CMOS process and is also designed from scratch. Its manufacturing cost, in volumes of 100k or more, is about $26-$30 a pop. Over 16 months it degrades to about $15 each.
a. Wii U only represents like 30M units per annum vs the iPhone, which is more like 100M units per annum. Put things in perspective.
5. This Wii U GPU costs more than that by about $20-$40 bucks each, making it a very expensive piece of kit. Combine that with the IBM CPU and the Flash chip all on the same package, and this whole thing is closer to $100 a piece when you add it all up.
6. The Wii U main processor package is a very impressive piece of hardware when it's said and done.

Trust me on this. It may not have water cooling and heat sinks the size of a brownie, but it's one slick piece of silicon. eDRAM is not cheap to make. That is why not everybody does it. Because it's so damn expensive.


What the bolded part tells me is that they licensed AMD's technology and designed their own chip to include those functions, so it's not an AMD chip design but a product with AMD technology inside.
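For what it's worth, the quick math on Morrison's cost figures (just taking the quoted ranges at face value):

```python
# Adding up the package-cost ranges quoted from Jim Morrison above.

a6 = (26, 30)                         # Apple A6 cost at 100k+ volumes, per the quote
wii_u_gpu = (a6[0] + 20, a6[1] + 40)  # "more than that by about $20-$40 bucks"

print(f"Wii U GPU alone: ${wii_u_gpu[0]}-${wii_u_gpu[1]} each")  # $46-$70
# Add the IBM CPU and the flash chip on the same package and you land
# near the "closer to $100 a piece" figure, so the numbers hang together.
```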



ninjablade said:
*snip: exchange quoted in full above*

They are biased, plain and simple. Just from raw numbers, how is 353 the same as 240 and not better? How do they assume that the higher efficiency and DX11 level instructions mean nothing? How do they assume that newer tech does not give any additional real-world benefits per flop over older tech? How do they assume that rushed day 1 ports are indicators of the max potential of the system? It makes no sense to me and I've been in the IT field for almost 10 years. Also, I have no interest in joining beyond3d. I'm not saying this is some massive leap over PS360, it's not, but it is certainly more powerful by at least some meaningful margin.
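To make the efficiency argument concrete (the multipliers here are pure assumptions, which is exactly the part nobody can measure yet):

```python
# How an assumed "real-world utilization" factor swings the comparison.
# The efficiency values are hypothetical -- that's the disputed part.

WII_U_GFLOPS, XENOS_GFLOPS = 352, 240  # theoretical peaks from the current guesses

for eff in (1.0, 1.2, 1.5):  # hypothetical per-flop efficiency vs. Xenos
    effective = WII_U_GFLOPS * eff
    print(f"efficiency {eff:.1f}x -> {effective:.0f} effective "
          f"vs {XENOS_GFLOPS} ({effective / XENOS_GFLOPS:.2f}x Xenos)")
```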



timmah said:
*snip: exchange quoted in full above*

I wouldn't call them biased. I mean, they corrected NeoGAF when they thought it was 160 SPs; they told them it was 320.



Wyrdness said:
How can they tell the power when the GPU is a ground up component not based on any GPU by what Morrison is saying, they don't know the tech but some how can determine full performance? Sounds like you're very gullible.

Maybe you should try reading from page 173 to 179, or here is a quote from the mod Shifty Geezer. They based the GPU numbers on these facts and were pretty confident, and correct; of course I'm gonna trust these guys.

 

On a par. Any performance advantages Wii U may have are offset by limitations (being small and low power draw), such that any increase in overall performance (nigh impossible to measure) above current gen will be fractional rather than a multiple.
Informed speculation based on a number of data as Function has outlined (including dev comments). It is nigh impossible for Nintendo to have produced a smaller, lower-power draw device that improves (at all, let alone significantly) on overall performance, especially when we know the memory is so damned slow. Only if they have secretly used a much smaller node is that possible.

I don't see how anyone can question the evidence. That's illogical.
Originally Posted by function 
Die sizes, CPU process node, main memory type and quantity and bus size, clocks, power consumption, and the DF analyses are not pure speculation.

We easily know enough to say that the Wii U is in the PS360 ballpark.
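For reference, the "memory is so damned slow" point is about the main RAM bus, and the arithmetic is simple (assuming the commonly reported DDR3-1600 on a 64-bit bus for the Wii U; the 32 MB of on-die eDRAM is exactly the part these peak numbers can't capture):

```python
# Peak main-memory bandwidth = transfer rate x bus width.

def bandwidth_gb_s(mega_transfers_per_s, bus_bits):
    return mega_transfers_per_s * (bus_bits / 8) / 1000  # -> GB/s

wii_u = bandwidth_gb_s(1600, 64)   # reported DDR3-1600 on a 64-bit bus
x360 = bandwidth_gb_s(1400, 128)   # Xbox 360 GDDR3-1400 on a 128-bit bus

print(f"Wii U main RAM: {wii_u:.1f} GB/s")  # 12.8 GB/s
print(f"Xbox 360 RAM:   {x360:.1f} GB/s")   # 22.4 GB/s (plus 10 MB eDRAM daughter die)
```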


ninjablade said:
*snip: exchange quoted in full above*


Except there is no evidence. We know the GPU is a new chip using AMD tech that no one in the PC scene has even seen, so telling people what it can do when they don't even know what the tech is is, quite frankly, beyond amusing, and those sucking it up are gullible, since even the person himself has made assumptions and speculations, which flat out says he's guessing. They didn't even know what GPU it was and spent months speculating whether it was an HD this or that; now it's come to light that it's a new chip built specifically for the Wii U that no one has ever seen. That alone should set alarm bells off about the speculation.



Wyrdness said:
*snip: exchange quoted in full above*

I already know what GPU it is; it's 99% not a new GPU that no one has ever seen before. Do you know how much a thing like that would cost? And for what, exactly? What reason would Nintendo have to make a new chip from the ground up? You're kidding me, right? You question DF's and beyond3d's analysis over a misinterpretation of what Jim said. Anyway, I'm done with this thread; anybody who wants to message me, PM me.