
Wii U vs PS4 vs Xbox One FULL SPECS (January 24, 2014)

Now it's just a matter of time until we know the Wii U's final GPU specs.

The console is already hacked.

http://www.ps3news.com/console-news/video-wii-u-teaser-the-homebrew-channel-on-wii-u-by-fail0verflow/

Sony really did a good job with the PS3's security o.O.




Here's a new chart, based on VoodooPower ratings (thanks to BlueFalcon for the link): http://alienbabeltech.com/abt/viewtopic.php?p=41174

| System / GPU             | Approximate Card   | Config                              | VP Rating | vs 360 |
|--------------------------|--------------------|-------------------------------------|-----------|--------|
| Xbox 360                 | 2x2600XT*0.625     | 48(*5):24:8 (VLIW5)                 | 14.6      | 1.0    |
| WiiU (Redwood LE@550MHz) | HD5550 (DDR3)      | 320:16:8 (VLIW5)                    | 21.8      | 1.5    |
| WiiU (Redwood@550MHz)    | HD5570 (DDR3)*0.85 | 400:20:8 (VLIW5)                    | 25.5      | 1.7    |
| WiiU (Turks@550MHz)      | HD6570 (DDR3)*0.85 | 480:24:8 (VLIW5)                    | 28.9      | 2.0    |
| A10-5700 (7660D@760MHz)  | -                  | 384:24:8 (VLIW4)                    | 31        | 2.1    |
| A10-5700 + HD6670        | HD6770             | 384:24:8 (VLIW4) + 480:24:8 (VLIW5) | 76        | 5.2    |
| HD7770                   | -                  | 640:40:16 (GCN)                     | 93        | 6.4    |
| HD7970m                  | HD7870*0.85        | 1280:80:32 (GCN)                    | 147       | 10.1   |

PS3 is too difficult to estimate (I'd say around 12-13, but not too sure), so no "vs PS3".

EDIT: Multipliers in "Approximate Card" are to compensate for differences in clock.
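
To make that a bit more concrete, here's a rough sketch of how those multipliers seem to work, assuming they're just clock ratios against the desktop cards (e.g. the Wii U's rumoured 550MHz vs the desktop HD5570's 650MHz gives the 0.85 in the table, and would also explain why the HD5550 row has no multiplier, since that card already runs at 550MHz). The reference clocks and helper names are my own, not from the VoodooPower page:

```c
/* Rough sketch of the table's "Approximate Card" multipliers, read as clock
 * ratios (assumption: 550MHz Wii U GPU clock vs a 650MHz desktop HD5570).
 * The VP ratings come from the table above; everything else is illustrative. */
#include <stdio.h>

int main(void) {
    const double vp_xbox360      = 14.6;   /* "2x2600XT*0.625" row               */
    const double vp_hd5570_stock = 30.0;   /* 25.5 / 0.85, the full-clock HD5570 */

    double clock_scale = 550.0 / 650.0;                 /* ~0.85                 */
    double vp_wiiu     = vp_hd5570_stock * clock_scale; /* ~25.5                 */

    printf("WiiU (Redwood@550MHz) estimate: VP %.1f, %.1fx the 360\n",
           vp_wiiu, vp_wiiu / vp_xbox360);
    return 0;
}
```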



@HoloDust seems a reasonable table... I like it.



The Twitter guy gives us a little more about the Wii U's CPU.

- "we suspect a cross between the 750CL and the 750FX but it's unclear. The SMP is new anyway."
- "no VMX, Just paired singles, like the Broadway. It's a 750 (perhaps close to 750FX?) with SMP, no more, no less."



Shinobi-san said:
darkknightkryta said:

I actually like that setup. Though I can only see problems with it depending on what Microsoft does. Like, I can see porting between consoles being a problem again.

Rumour has it that both Durango and Orbis are going to be based on the same tech.

Also, about the APU + 8770 rumour: while I agree those are decent specs for next gen (a tad low though), I'm more excited about the fact that they have moved to Jaguar instead of Bobcat. This opens up a range of possibilities for the GPU size of the APU alone, which makes it possible to leave out the 8770 entirely.

I would imagine this would help a lot with keeping the power usage very low.

Also, the shift from older AMD tech to GCN is a very good sign :)

Obviously this is all rumour though... but given that these rumours make a lot of sense, I would say this is pretty much spot on. The main thing to keep in mind when discussing early dev kits is that they are usually only used to mimic the final specs, which makes the lone-APU theory entirely plausible.

No, but I mean like, the CPU portion of the APU is gonna be very weak; it's only really gonna be good for A.I. The GPU on the APU should be powerful enough to handle all the physics in a game plus some graphics pre-processing, essentially replacing Cell. Send the processed image to the GPU to be drawn and you'll have a very powerful setup at low cost. Problem is, if Microsoft goes for a single GPU plus a powerful CPU (the rumoured FX processor should be plenty more powerful than what's going in the APU), then physics is gonna be CPU-based, which the PS4 is gonna struggle with. If devs make games with the next Xbox in mind, they're gonna have trouble going back to the PS4, and you're going to have the problems that the Wii U has right now; the current problems between the PS3 and the 360 will carry over into the next generation of hardware.



darkknightkryta said:
Shinobi-san said:

*snip*

No, but I mean like, the CPU portion of the APU is gonna be very weak; it's only really gonna be good for A.I. The GPU on the APU should be powerful enough to handle all the physics in a game plus some graphics pre-processing, essentially replacing Cell. Send the processed image to the GPU to be drawn and you'll have a very powerful setup at low cost. Problem is, if Microsoft goes for a single GPU plus a powerful CPU (the rumoured FX processor should be plenty more powerful than what's going in the APU), then physics is gonna be CPU-based, which the PS4 is gonna struggle with. If devs make games with the next Xbox in mind, they're gonna have trouble going back to the PS4, and you're going to have the problems that the Wii U has right now; the current problems between the PS3 and the 360 will carry over into the next generation of hardware.

Fair point :)

But let's look at it this way: the ports the Wii U got happened really quickly, right? And then look at the games it got ports of:

Batman, COD, Mass Effect, Darksiders, Ninja Gaiden, Assassin's Creed 3, etc. These games are the pinnacle of visual fidelity of this gen. I know there's GOW 3 and U3 etc., but let's face it, those are the exceptions. Back when COD 4 released it was a MAJOR step up from all the PS2 ports the consoles were getting.

It took developers about 2-3 years to get to this level of quality on the PS3 and 360; first-party devs got there quicker, but for most it was 2-3 years. But look how quickly, and I assume easily, it was done on the Wii U. People don't seem to realise this.

Now imagine a game made from the ground up specifically for the Wii U in 2-3 years' time. It will probably look better than the best of the current gen, and that's despite the MUCH weaker CPU. It's all about designing games according to the hardware you have. There's no one single way of developing games...

And I think if Sony, MS, and manufacturers all go with this GPU-heavy push then developers will need to adapt.

You might also notice that a lot of the next-gen showcases, demos, tech demos etc. are all very GPU-focused. Might be a sign of things to come...

Edit: The comparison with the Wii U is also a bit skewed, as the Wii U has a very bad CPU, as in it's close to a Wii CPU. That's just bad all around. And while the APU CPU rumoured to be in the PS4 is no powerhouse, it will be significantly better than the Wii U CPU.

And then there's the other fact that the Wii U's GPU is a low-end desktop card that's downclocked, yet it's still able to match the current gen. That just shows what 6 years can do in terms of tech.

I strongly believe that if the PS4 does not have a gimmick that adds significant cost to the console (like the GamePad for the Wii U) then there's simply no way it won't be a hell of a lot more powerful than the current gen. And that's just by default.



Intel Core i7 3770K [3.5GHz]|MSI Big Bang Z77 Mpower|Corsair Vengeance DDR3-1866 2 x 4GB|MSI GeForce GTX 560 ti Twin Frozr 2|OCZ Vertex 4 128GB|Corsair HX750|Cooler Master CM 690II Advanced|

ethomaz said:
Now it's just a matter of time until we know the Wii U's final GPU specs.

The console is already hacked.

http://www.ps3news.com/console-news/video-wii-u-teaser-the-homebrew-channel-on-wii-u-by-fail0verflow/

Sony really did a good job with the PS3's security o.O.

But the system is in Wii mode. The homebrew is technically running off a Wii, so the hard work was pretty much already done. So the system hasn't been hacked yet; that will happen when homebrew runs on the normal Wii U side.



Nintendo and PC gamer

Saw an interesting reply on NeoGAF; he gave data on three possible ways to do a 1.6GHz nextbox CPU (if the 1.6GHz rumor is true):
http://www.neogaf.com/forum/showpost.php?p=44997411&postcount=573
"Just to go into a little more detail, I feel from various (and conflicting) rumours there are three possibilities:

IBM PowerPC A2 based

This fits the four-cores, four-threads-per-core rumour, and the 1.6GHz rumour. The cores are about 6.58mm² on a 45nm process, so on a 32nm process you could fit four cores and 8MB eDRAM cache within 30mm² or so, which is pretty small for a console CPU (about the same die size as Wii U's 45nm "Espresso" CPU). Power draw would be around 10W at 1.6GHz.

AMD Jaguar based

The Jaguar architecture is designed to go up to 2GHz, so 1.6GHz would be a reasonable clock for it in a console environment. It's a single-threaded architecture. At 28nm each core (including 512KB cache) is about 3.2mm². Designed for 2-4 cores, but an eight-core chip would come to about 30mm² or so as well. I can't find data on power draw, but ~10W would probably be a good guess here also.

AMD Bulldozer based

I'm including Piledriver, Steamroller, etc. here. The "eight-core"* Bulldozer is 315mm² at 32nm, which is fucking huge for a console CPU (you'll notice that it's literally ten times the size of eight Jaguar cores). It pulls 125W at its stock speed of 3.6GHz, and if they were using it in Durango they'd have to clock it down massively to prevent it melting the console (possibly even to 1.6GHz). In theory they could use a "four core" variant at about half the size, which would put it at roughly the same size as Xenon was at 90nm, but still be somewhat of a power-hog.

*I put eight-core in quotation marks because they aren't really eight-core chips. They have four modules on-board, and each module is something half-way between a dual-threaded core and two independent cores. A "four-core" variant would then be a dual-module variant, in reality.

There are also the rumours of an Intel chip, but I don't put much faith in it, as the logic seemed to be "It has AVX support, therefore it must be Intel" (not true, both Jaguar and Bulldozer support AVX), and it claimed it was an 8-core chip. Intel's only 8-core chips are extremely expensive Xeon server processors, and the only architecture they could use to cram 8 cores in a console-friendly die is Cedarview (Atom 32nm), which doesn't support AVX."
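
To make the die-size arithmetic in that post easier to follow, here's a back-of-envelope sketch using the numbers quoted above; the assumption that core area scales with the square of the node ratio is the same simplification the post itself is making, and real shrinks don't scale that cleanly:

```c
/* Back-of-envelope version of the die-area numbers in the quoted post.
 * Assumption (the post's own simplification): core area scales with the
 * square of the process-node ratio. */
#include <stdio.h>

int main(void) {
    /* IBM PowerPC A2: ~6.58 mm^2 per core at 45nm, four cores shrunk to 32nm */
    double shrink_45_to_32 = (32.0 / 45.0) * (32.0 / 45.0);   /* ~0.51    */
    double a2_four_cores   = 4.0 * 6.58 * shrink_45_to_32;    /* ~13 mm^2 */

    /* AMD Jaguar: ~3.2 mm^2 per core (incl. 512KB cache) at 28nm, eight cores */
    double jaguar_eight    = 8.0 * 3.2;                       /* ~26 mm^2 */

    printf("A2, 4 cores at 32nm:     ~%.0f mm^2 (plus 8MB eDRAM -> ~30 mm^2)\n",
           a2_four_cores);
    printf("Jaguar, 8 cores at 28nm: ~%.0f mm^2 (the post rounds to ~30 mm^2)\n",
           jaguar_eight);
    printf("Bulldozer 8-\"core\":      ~315 mm^2 at 32nm, an order of magnitude bigger\n");
    return 0;
}
```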



D-Joe said:

*snip*

^ All of those seem possible, although 4 cores seems too low IMO, that is if Kinect 2.0 takes more than one core. Or maybe Kinect will use another CPU.



Nintendo and PC gamer

osed125 said:
D-Joe said:

*snip*

^ All of those seem possible, although 4 cores seems too low IMO, that is if Kinect 2.0 takes more than one core. Or maybe Kinect will use another CPU.

Well, back in July a NeoGAF member said the nextbox will use eight Jaguar cores at 1.6GHz (^the reply above also mentions eight Jaguar cores), so this new rumor matches the old one.

And the new Kinect won't take up any CPU cores, so please don't overestimate Kinect. Not only that, Kinect won't be the most important thing about the nextbox; yes, they need the casual market, but that doesn't mean only Kinect can deliver it.