
Wii U graphics power finally revealed - "we can now finally rule out any next-gen pretensions for the Wii U"

ninjablade said:
curl-6 said:
ninjablade said:
curl-6 said:
ninjablade said:

We do have the specs, for the most part: http://www.eurogamer.net/articles/df-hardware-wii-u-graphics-power-finally-revealed

Look under my sig; those are all confirmed specs, barring some kind of miracle.

The specs are incomplete, and some of them are assumptions rather than fact.


Did you read the article? Most of it is cold hard facts. At first some people said Eurogamer rushed the article, but it's going on two days now and nobody has corrected anything; even NeoGAF is accepting the 352 GFLOPS number now. Here is a quote I found interesting in the comment section.

 

Am I the only one being shocked by the fact that the Wii U CPU is basically the same type that I had in my 1998 PowerMac G3?! Am I correct here, it's basically a triple-core overclocked PowerPC 750?!! ROFLMAO! No wonder the poor ports and performance, the GPU on that thing is no powerhouse by any means but that ancient CPU architecture is choking it blue. A 3DFX Voodoo2 would have been a better match. They'd be better off if they'd stuck a current low-end, dual-core celeron in there instead but I guess they got a ridiculously good deal on tons of left-over GC/Wii CPU's laying around the manufacturers warehouse. Facepalm Nintendo, facepalm...

I read the article; there's plenty of unknown factors and guesstimates. And I'm not sure how a troll post helps your point?

I don't see how it's a troll post; it actually makes sense of why they put such a weak CPU in the machine. Anyway, I believe DF; they have an amazing track record, no reason to doubt them, and nobody has proved them wrong.

"I guess they got a ridiculously good deal on tons of left-over GC/Wii CPU's laying around the manufacturers warehouse. Facepalm Nintendo, facepalm..." That's trolling. People who drop those kind of lines strike me as people who are not going to give an objective perspective on Nintendo hardware. Not to mention, it hasn't been proven that the Wii U CPU is a "triple core overclocked PowerPC 750", in fact it's most likely not. It might be based on it for backwards compatibility purposes, but newer chips are often "based" on older ones because they're a direct descendent, just with improvements that come over time like large/faster caches, more cores, etc.

Nobody has proven Digital Foundry's analysis right either. What will prove it one way or another are games that come out once developers get a handle on the system's innards.



ninjablade said:

Actually, we know more about the CPU than the GPU; it's confirmed to be the same as the Wii's, with 3 cores and overclocked. Hector Martin ("marcan") confirmed this a few days ago. I doubt DF is going to be corrected; they are always on point.

Correction: We know that it's BASED on the Wii's CPU, three of them. There's a lot more to CPU design than the basic architecture. Kind of like how the Pentium 4 was able to use the same instructions as the Pentium 3, plus some more. The Wii CPU functionality is included within the Wii U CPU, but we don't know anything more than that with regard to functionality... except that the Wii U's CPU has out-of-order execution (OOE), whereas the Wii CPU didn't - there's ONE difference we already know of.

And again, DF got their information from NeoGAF, who themselves have said that the DF article isn't right. About 50% of the chip isn't accounted for by the analysis on which DF based their article, and NeoGAF are still working to try to figure out what that half of the chip is actually doing. It's not empty space on the chip.



curl-6 said:
ninjablade said:
curl-6 said:
ninjablade said:
curl-6 said:
ninjablade said:

We do have the specs, for the most part: http://www.eurogamer.net/articles/df-hardware-wii-u-graphics-power-finally-revealed

Look under my sig; those are all confirmed specs, barring some kind of miracle.

The specs are incomplete, and some of them are assumptions rather than fact.


Did you read the article? Most of it is cold hard facts. At first some people said Eurogamer rushed the article, but it's going on two days now and nobody has corrected anything; even NeoGAF is accepting the 352 GFLOPS number now. Here is a quote I found interesting in the comment section.

 

Am I the only one being shocked by the fact that the Wii U CPU is basically the same type that I had in my 1998 PowerMac G3?! Am I correct here, it's basically a triple-core overclocked PowerPC 750?!! ROFLMAO! No wonder the poor ports and performance, the GPU on that thing is no powerhouse by any means but that ancient CPU architecture is choking it blue. A 3DFX Voodoo2 would have been a better match. They'd be better off if they'd stuck a current low-end, dual-core celeron in there instead but I guess they got a ridiculously good deal on tons of left-over GC/Wii CPU's laying around the manufacturers warehouse. Facepalm Nintendo, facepalm...

I read the article; there's plenty of unknown factors and guesstimates. And I'm not sure how a troll post helps your point?

I don't see how it's a troll post; it actually makes sense of why they put such a weak CPU in the machine. Anyway, I believe DF; they have an amazing track record, no reason to doubt them, and nobody has proved them wrong.

"I guess they got a ridiculously good deal on tons of left-over GC/Wii CPU's laying around the manufacturers warehouse. Facepalm Nintendo, facepalm..." That's trolling. People who drop those kind of lines strike me as people who are not going to give an objective perspective on Nintendo hardware. Not to mention, it hasn't been proven that the Wii U CPU is a "triple core overclocked PowerPC 750", in fact it's most likely not. It might be based on it for backwards compatibility purposes, but newer chips are often "based" on older ones because they're a direct descendent, just with improvements that come over time like large/faster caches, more cores, etc.

Nobody has proven Digital Foundry's analysis right either. What will prove it one way or another are games that come out once developers get a handle on the system's innards.


Seems to me like he's a Nintendo fan that's upset about the CPU, but anything negative is called trolling. As a 360 fan, when DF broke the first real news about the PS4 being more powerful, I believed them. I will always trust DF and Beyond3D. Some of those guys on NeoGAF are using wishful thinking instead of hard facts; most of them, like thrakter and bigassian, were claiming 600 GFLOPS, while more reasonable members told them 300-350 GFLOPS.



ninjablade said:

Seems to me like he's a Nintendo fan that's upset about the CPU, but anything negative is called trolling. As a 360 fan, when DF broke the first real news about the PS4 being more powerful, I believed them. I will always trust DF and Beyond3D. Some of those guys on NeoGAF are using wishful thinking instead of hard facts; most of them, like thrakter and bigassian, were claiming 600 GFLOPS, while more reasonable members told them 300-350 GFLOPS.

In other words, you're placing your trust in DF. Which you're welcome to do. But that doesn't make it "cold hard fact", and again, there's about 50% of the GPU that is still unaccounted for. And as I pointed out, DF got their "information" from NeoGAF in the first place... except that, where NeoGAF described it as ongoing analysis that is subject to change, DF suggested that it was certain information.



Aielyn said:
ninjablade said:

Actually, we know more about the CPU than the GPU; it's confirmed to be the same as the Wii's, with 3 cores and overclocked. Hector Martin ("marcan") confirmed this a few days ago. I doubt DF is going to be corrected; they are always on point.

Correction: We know that it's BASED on the Wii's CPU, three of them. There's a lot more to CPU design than the basic architecture. Kind of like how the Pentium 4 was able to use the same instructions as the Pentium 3, plus some more. The Wii CPU functionality is included within the Wii U CPU, but we don't know anything more than that with regard to functionality... except that the Wii U's CPU has out-of-order execution (OOE), whereas the Wii CPU didn't - there's ONE difference we already know of.

And again, DF got their information from NeoGAF, who themselves have said that the DF article isn't right. About 50% of the chip isn't accounted for by the analysis on which DF based their article, and NeoGAF are still working to try to figure out what that half of the chip is actually doing. It's not empty space on the chip.

NeoGAF got their info from Beyond3D; they have been saying 300-350 GFLOPS for the longest time. On the math alone it's impossible to be more than that, because of the ~30 watt power draw for the whole system. At first NeoGAF thought it was 160 GFLOPS. DF did not get their info from NeoGAF; they are tech experts and have connections in the industry. They did their own analysis and that's what they came up with. If you mean they got the pic from NeoGAF, then fine.
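
As a rough illustration of the arithmetic behind the 352 GFLOPS figure being thrown around: it falls out of assuming 320 stream processors at 550 MHz, with each ALU doing a multiply-add (2 FLOPs) per cycle. Those inputs are the thread's working assumptions, not confirmed specs; a minimal sketch:

# Peak-GFLOPS arithmetic behind the 352 GFLOPS estimate discussed above.
# Assumes 320 stream processors at 550 MHz and 2 FLOPs (a multiply-add) per
# ALU per cycle -- the thread's working numbers, not confirmed specs.

def peak_gflops(stream_processors, clock_mhz, flops_per_cycle=2):
    """Theoretical peak single-precision GFLOPS for a simple shader array."""
    return stream_processors * clock_mhz * flops_per_cycle / 1000.0

print(peak_gflops(320, 550))  # 352.0 -- the figure cited in the thread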



ninjablade said:
curl-6 said:
ninjablade said:
curl-6 said:
ninjablade said:
curl-6 said:
ninjablade said:

We do have the specs, for the most part: http://www.eurogamer.net/articles/df-hardware-wii-u-graphics-power-finally-revealed

Look under my sig; those are all confirmed specs, barring some kind of miracle.

The specs are incomplete, and some of them are assumptions rather than fact.


Did you read the article? Most of it is cold hard facts. At first some people said Eurogamer rushed the article, but it's going on two days now and nobody has corrected anything; even NeoGAF is accepting the 352 GFLOPS number now. Here is a quote I found interesting in the comment section.

 

Am I the only one being shocked by the fact that the Wii U CPU is basically the same type that I had in my 1998 PowerMac G3?! Am I correct here, it's basically a triple-core overclocked PowerPC 750?!! ROFLMAO! No wonder the poor ports and performance, the GPU on that thing is no powerhouse by any means but that ancient CPU architecture is choking it blue. A 3DFX Voodoo2 would have been a better match. They'd be better off if they'd stuck a current low-end, dual-core celeron in there instead but I guess they got a ridiculously good deal on tons of left-over GC/Wii CPU's laying around the manufacturers warehouse. Facepalm Nintendo, facepalm...

I read the article; there's plenty of unknown factors and guesstimates. And I'm not sure how a troll post helps your point?

I don't see how it's a troll post; it actually makes sense of why they put such a weak CPU in the machine. Anyway, I believe DF; they have an amazing track record, no reason to doubt them, and nobody has proved them wrong.

"I guess they got a ridiculously good deal on tons of left-over GC/Wii CPU's laying around the manufacturers warehouse. Facepalm Nintendo, facepalm..." That's trolling. People who drop those kind of lines strike me as people who are not going to give an objective perspective on Nintendo hardware. Not to mention, it hasn't been proven that the Wii U CPU is a "triple core overclocked PowerPC 750", in fact it's most likely not. It might be based on it for backwards compatibility purposes, but newer chips are often "based" on older ones because they're a direct descendent, just with improvements that come over time like large/faster caches, more cores, etc.

Nobody has proven Digital Foundry's analysis right either. What will prove it one way or another are games that come out once developers get a handle on the system's innards.


Seems to me like he's a Nintendo fan that's upset about the CPU, but anything negative is called trolling. As a 360 fan, when DF broke the first real news about the PS4 being more powerful, I believed them. I will always trust DF and Beyond3D. Some of those guys on NeoGAF are using wishful thinking instead of hard facts; most of them, like thrakter and bigassian, were claiming 600 GFLOPS, while more reasonable members told them 300-350 GFLOPS.

You can be negative without trolling. You could just say that you're disappointed with the chip's apparent lack of power; no need for an over-the-top cheap shot like the Wii/GC CPU line, which is misleading because while Espresso is probably based on Broadway for backwards compatibility, it's most likely had much more done to upgrade it than just an overclock and two extra cores. Can you really see Assassin's Creed 3 being run on three 1.2GHz Broadway cores?



ninjablade said:
NeoGAF got their info from Beyond3D; they have been saying 300-350 GFLOPS for the longest time. On the math alone it's impossible to be more than that. At first NeoGAF thought it was 160 GFLOPS. DF did not get their info from NeoGAF; they are tech experts and have connections in the industry. They did their own analysis and that's what they came up with. If you mean they got the pic from NeoGAF, then fine.

NeoGAF got SOME info from Beyond3D. Beyond3D are just as stumped about the other 50% of the chip as NeoGAF are. And DF actually said IN THE ARTICLE ITSELF that they got it from NeoGAF. Or are you suggesting that DF are lying about where they got the info from?



Aielyn said:
ninjablade said:
NeoGAF got their info from Beyond3D; they have been saying 300-350 GFLOPS for the longest time. On the math alone it's impossible to be more than that. At first NeoGAF thought it was 160 GFLOPS. DF did not get their info from NeoGAF; they are tech experts and have connections in the industry. They did their own analysis and that's what they came up with. If you mean they got the pic from NeoGAF, then fine.

NeoGAF got SOME info from Beyond3D. Beyond3D are just as stumped about the other 50% of the chip as NeoGAF are. And DF actually said IN THE ARTICLE ITSELF that they got it from NeoGAF. Or are you suggesting that DF are lying about where they got the info from?


No, they said they did their own analysis; they got the GPU pic from NeoGAF, but not their analysis.



zero129 said:
ninjablade said:
curl-6 said:
ninjablade said:
curl-6 said:
ninjablade said:
curl-6 said:
ninjablade said:

We do have the specs, for the most part: http://www.eurogamer.net/articles/df-hardware-wii-u-graphics-power-finally-revealed

Look under my sig; those are all confirmed specs, barring some kind of miracle.

The specs are incomplete, and some of them are assumptions rather than fact.


Did you read the article? Most of it is cold hard facts. At first some people said Eurogamer rushed the article, but it's going on two days now and nobody has corrected anything; even NeoGAF is accepting the 352 GFLOPS number now. Here is a quote I found interesting in the comment section.

 

Am I the only one being shocked by the fact that the Wii U CPU is basically the same type that I had in my 1998 PowerMac G3?! Am I correct here, it's basically a triple-core overclocked PowerPC 750?!! ROFLMAO! No wonder the poor ports and performance, the GPU on that thing is no powerhouse by any means but that ancient CPU architecture is choking it blue. A 3DFX Voodoo2 would have been a better match. They'd be better off if they'd stuck a current low-end, dual-core celeron in there instead but I guess they got a ridiculously good deal on tons of left-over GC/Wii CPU's laying around the manufacturers warehouse. Facepalm Nintendo, facepalm...

I read the article; there's plenty of unknown factors and guesstimates. And I'm not sure how a troll post helps your point?

I don't see how it's a troll post; it actually makes sense of why they put such a weak CPU in the machine. Anyway, I believe DF; they have an amazing track record, no reason to doubt them, and nobody has proved them wrong.

"I guess they got a ridiculously good deal on tons of left-over GC/Wii CPU's laying around the manufacturers warehouse. Facepalm Nintendo, facepalm..." That's trolling. People who drop those kind of lines strike me as people who are not going to give an objective perspective on Nintendo hardware. Not to mention, it hasn't been proven that the Wii U CPU is a "triple core overclocked PowerPC 750", in fact it's most likely not. It might be based on it for backwards compatibility purposes, but newer chips are often "based" on older ones because they're a direct descendent, just with improvements that come over time like large/faster caches, more cores, etc.

Nobody has proven Digital Foundry's analysis right either. What will prove it one way or another are games that come out once developers get a handle on the system's innards.


Seems to me like he's a Nintendo fan that's upset about the CPU, but anything negative is called trolling. As a 360 fan, when DF broke the first real news about the PS4 being more powerful, I believed them. I will always trust DF and Beyond3D. Some of those guys on NeoGAF are using wishful thinking instead of hard facts; most of them, like thrakter and bigassian, were claiming 600 GFLOPS, while more reasonable members told them 300-350 GFLOPS.

See, you only like to believe the stuff that agrees with what you want to believe, without taking other facts into account. This is all I'll say on the subject to you, as clearly you have no idea what you're talking about when it comes to this.

Yes, I believe what I see. So far people are saying 352 GFLOPS; when they come out with a new number and everybody agrees on the explanation, then fine. Is that a problem, or should I start believing in wishful thinking? The GPU is a bit hard to figure out because no console ever came with a tablet, and then you have BC on the GPU, so it's a bit customized.



 

ethomaz said:
Seems like the Nextbox has "special sauce" units to help and free up the GPU... so less work for the GPU... in the end the performance is near the PS4's.

It's like this...

PS4 power: 5 tasks per cycle
720 power: 3 tasks per cycle

You have to run 5 tasks on both GPUs, but the Nextbox can do 2 of those tasks in the "special sauce" units... so in the end both give you the same performance. Of course that's a dummy example, lol... and these tasks are not for graphics at all (they are, but not shader-specific)...

Source? 
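
To make the quoted "dummy example" concrete, here is a toy sketch of the offloading argument being made; the task counts are the poster's invented numbers, not real hardware figures:

# Toy model of the quoted "dummy example": offloading work to fixed-function
# ("special sauce") units so a weaker GPU keeps up. The task counts are the
# poster's invented numbers, not real hardware figures.

tasks_needed_per_cycle = 5

ps4_gpu_tasks = 5           # everything runs on the GPU shaders
nextbox_gpu_tasks = 3       # the GPU handles 3 tasks...
nextbox_offload_tasks = 2   # ...and the "special sauce" units take the rest

print(ps4_gpu_tasks >= tasks_needed_per_cycle)                              # True
print(nextbox_gpu_tasks + nextbox_offload_tasks >= tasks_needed_per_cycle)  # True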

ethomaz said:
Another point is that the PS4 will reserve only 896 shader units (1.4 TFLOPS) for graphics... the other 256 shader units (400 GFLOPS) will be free for developers to use for anything (graphics, GPGPU, etc.).

I already explained to you why this makes no sense. All the shaders in the HD 7000 series are the same. Sony does not need to use different shaders since they don't care about BC with the PS3 on a hardware level. If you had 2 arms, would you only do push-ups with 1 arm? If you have 1152 shaders, you can use all of them. The unified shader + compute architecture of Graphics Core Next allows ANY shader to do compute or graphical work. You do not need to "free up" 256 shaders for GPGPU tasks. This is not how game code works.
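
For what it's worth, the FLOPS figures in that exchange follow from simple shader-count arithmetic. The sketch below assumes the commonly rumored ~800 MHz GPU clock and 2 FLOPs (a multiply-add) per shader per cycle; the clock is an assumption here, and only the shader counts come from the posts above.

# Arithmetic behind the 1.4 TFLOPS / 400 GFLOPS figures quoted above.
# Assumes a ~800 MHz GPU clock and 2 FLOPs (a multiply-add) per shader per
# cycle; the clock is an assumption, only the shader counts are from the posts.

def peak_tflops(shaders, clock_ghz, flops_per_cycle=2):
    return shaders * clock_ghz * flops_per_cycle / 1000.0

print(round(peak_tflops(896, 0.8), 2))   # ~1.43 -- the "graphics-reserved" portion
print(round(peak_tflops(256, 0.8), 2))   # ~0.41 -- the claimed extra 256 shaders
print(round(peak_tflops(1152, 0.8), 2))  # ~1.84 -- all 1152 unified shaders together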

ethomaz said:
In the end, what will show how well a console performs is the optimization and full use of the hardware.

Yeah, and the PS4 seems to be easier to code for/optimize directly to the metal of the hardware, and it supposedly has lower OS overhead, based on comments from developers.

http://www.edge-online.com/news/the-next-xbox-always-online-no-second-hand-games-50gb-blu-ray-discs-and-new-kinect/

+1 Sony

0 MS

So far not a single rumor has said anything positive about the Xbox 720 over the PS4. We've heard consistent rumors that the Wii U was underpowered/not a true next-generation console on the hardware side, and that proved true. Is it a coincidence that nearly every rumor out there is claiming the PS4 is a superior console to the Xbox 720 on the hardware level? If MS plans to integrate Kinect 2.0 into every console, then unless they take larger losses on the hardware, they have to cut costs somewhere else. If MS chose Kinect 2.0 in hopes of targeting casual gamers/making it a stand-out console feature, they have less money left for a powerful CPU+GPU setup.