
Wii U vs PS4 vs Xbox One FULL SPECS (January 24, 2014)

Yes, that's something that everyone can see in solid numbers instead of speculation. Of course we will have to see the actual hardware running to know what that difference will entail, but it really helps to see this.



Before the PS3 everyone was nice to me :(

HoloDust said:
Why, thanks guys :)

I'll keep adding entries if new rumours appear.


I like.

Once the WiiU is in the wild and we know the exact specs... I'd like to steal your work for the 2nd post in the OP in regards to the GPU comparison.



superchunk said:
HoloDust said:
Why, thanks guys :)

I'll keep adding entries if new rumours appear.


I like.

Once the WiiU is in the wild and we know the exact specs... I'd like to steal your work for the 2nd post in the OP in regards to the GPU comparison.

Please do - I made it so anyone can easily compare rumoured next-gen tech vs current-gen, so I think your 2nd post is the perfect place for quick reference.



That sounds great! Instead of scanning through the whole thread it'll be easy to find =D



I'm on Twitter @DanneSandin!

Furthermore, I think VGChartz should add a "Like"-button.

The table is cool, but remember that these numbers alone do not mean much.

Edit - It seems the Orbis devkits have an additional AMD Radeon HD 7670 too: http://wewantashrubbery.blogspot.jp/2012/11/ps4-secrets-leaked-and-its-gonna-kick.html



ethomaz said:

The table is cool, but remember that these numbers alone do not mean much.

Edit - It seems the Orbis devkits have an additional AMD Radeon HD 7670 too (http://wewantashrubbery.blogspot.jp/2012/11/ps4-secrets-leaked-and-its-gonna-kick.html).


The 7670 is a rebranded 6670, so that's covered in the table - I was suspecting the PS4 might end up with that sort of performance, which honestly is not that great. BTW, your link caught the ) and . at the end, so it doesn't work; I'm reposting it:

http://wewantashrubbery.blogspot.jp/2012/11/ps4-secrets-leaked-and-its-gonna-kick.html



drkohler said:
The AMD FX8... series of processors is ideally suited for XBoxNext due to its design peculiarities - I must remind everyone that XBoxNext has to perform two entirely separate things in parallel: run games and run Kinect2. What GPU MS will use is anyone's guess at this point (unless you have the latest dev kit). They have all the money to go 78xx if they want to.

Yes, I know nobody will like my estimates. You'd rather like to read that PS4/NextBox will be 6 times more powerful than WiiU. Not going to happen. Ever.

I don't think you read my post carefully. As I stated before, there are no power consumption limitations preventing an HD7950M mobile chip from fitting into a PS4/Xbox next. You started talking about die sizes, which I already addressed when I said the primary reason we may not see a faster GPU such as the 7950M is cost. On that point, the die size of the 7950M is not 365 mm^2, since it is just a mobile version of Pitcairn, not a mobile version of the desktop HD7900 series.

Pitcairn has a die size of 212 mm^2.

http://techreport.com/review/22573/amd-radeon-hd-7870-ghz-edition

Your other point is that an FX8000 CPU is perfect for the Xbox, but that would be the exact same mistake Sony made with the PS3 - spending too much $ on the CPU and not enough on the GPU. The FX8000 series consumes way too much power compared to the A10-5800K or even the FX-4350.

Without the video card, the FX8150/8350 draws more than 200W of power at load.

Even if you drop the clocks to 3.0GHz, I bet you are still going to be at 150W. The other issue with going with the FX8000 series is that you'd blow a huge chunk of $ on a component that has almost no effect on gaming performance, and then what, strap it to a low-end GPU like the HD7770? Most modern games have been GPU limited over the last 5 years, and especially so on consoles, since consoles do not have MMOs or strategy games, the two most CPU-limited gaming genres. What that means is that a smart console designer would spend more $ on the GPU and less on the CPU within a given limited budget. For most of the PS3's life, even though it had a faster CPU than the 360, the console was GPU limited. This is why so many games run faster and have better textures on the 360.
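As a rough sanity check on that 150W guess, here's a hedged first-order sketch in Python. The stock clock, core voltages and 200W baseline below are assumptions for illustration, not measured figures; the point is that dynamic CPU power scales roughly with frequency times voltage squared:

    # First-order dynamic power model: P ~ C * V^2 * f
    # All figures below are assumptions for illustration, not official specs.
    p_stock = 200.0               # W, assumed FX-8150 load power (no GPU)
    f_stock, f_down = 3.6, 3.0    # GHz, stock clock vs. hypothetical console clock
    v_stock, v_down = 1.35, 1.20  # V, assumed stock vs. undervolted core voltage

    p_down = p_stock * (f_down / f_stock) * (v_down / v_stock) ** 2
    print(f"Estimated load power at 3.0GHz: {p_down:.0f} W")  # ~132 W

Add leakage and platform overhead on top of that ~132W and a 150W ballpark for a downclocked FX8000 looks about right - still far too hot for a console CPU budget.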

If MS wants more cores on the cheap, they'd be better off going with a 4-core / 4-threads-per-core PowerPC 7+ architecture. Of course, in games this CPU would still lose to a 4-core A10, but at least it is a viable alternative if they really want more cores without blowing the power consumption out of the water the way an FX8000 would.

HoloDust said:

7670 is rebranded 6670, so that's covered in table - I was suspecting PS4 might end up with that sort of performance, which is not that great honestly. BTW, your link caught ) and . at the end, so it dosn't work, so I'm reposting it

http://wewantashrubbery.blogspot.jp/2012/11/ps4-secrets-leaked-and-its-gonna-kick.html

 

The rumor of an APU + HD7670/6670 has floated around for a long time now. The HD6670/7670M rumor is nearly 1 year old!

Jan 27, 2012 -- http://www.pcgamer.com/2012/01/27/xbox-720-to-feature-radeon-hd6670/

Would you believe a 1-year-old rumor when the Xbox 720 is still nearly 1 year away? (That would make this rumor 2 years old by the time it launches.)

The problem with this setup is that A10 + HD6670 is only roughly equal to an HD6770 in performance. Such a GPU would only be ~50% faster than the HD4770/4830 (assuming that's what's in the Wii U). That would be a huge disappointment from Sony and MS, as the HD6770 cannot play modern games at 1080P maxed out, and it definitely cannot do it at 1080P @ 60 fps.
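To put rough numbers on that claim - a hedged sketch, using commonly cited desktop shader counts and clocks, with peak GFLOPS as only a crude proxy for game performance:

    # Peak single-precision throughput for these AMD parts:
    # GFLOPS = shaders * 2 FLOPs per clock (multiply-add) * clock in GHz
    def gflops(shaders, clock_ghz):
        return shaders * 2 * clock_ghz

    hd4770 = gflops(640, 0.75)  # ~960 GFLOPS (RV740)
    hd6770 = gflops(800, 0.85)  # ~1360 GFLOPS (rebadged HD5770)
    print(f"HD6770 / HD4770 = {hd6770 / hd4770:.2f}x")  # ~1.42x

So on paper the step up is only ~40-50%, which is exactly the complaint above.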

 

If MS and Sony are serious about giving the next generation consoles a real graphical improvement, anything less than an HD7770 would be a failure.

Since other rumors have floated around the idea of a "custom A10 APU", it's not unreasonable that AMD could design an A10 + GCN APU specifically for next generation consoles. The unified shader architecture in the Xenos GPU of the Xbox 360 was 1 year ahead of ATI's PC roadmap, so we cannot discount the possibility that AMD can provide a purpose-built APU with GCN. There are issues with using the VLIW4 architecture of the current Trinity APU, since it is poor at tessellation and lacks the proper compute architecture that many new games already use to accelerate complex visuals such as contact hardening shadows, global illumination and ambient occlusion (Sleeping Dogs, Sniper Elite V2, Dirt Showdown and Hitman Absolution all use compute shaders to accelerate graphics).

It would be a major mistake not to have GCN in a next generation PS4/Xbox, simply because if future games use even more compute features to accelerate graphics/visuals, a VLIW-based console will be hopelessly inefficient for next-gen graphics effects. For that reason I am going to stick with my bold prediction that the PS4, at least, will have a GPU based on GCN.

And finally, another reason GCN fits much better is that it is already manufactured on the latest and most efficient 28nm node, while the HD7670M (~ HD6670) is a 40nm GPU:

http://www.notebookcheck.net/AMD-Radeon-HD-7670M.69483.0.html

What that means is that it would cost millions of dollars to re-spin this GPU onto a modern 28nm node. Why would a company spend millions of dollars taking a rebadged HD6670 GPU and shrinking it to 28nm when it would be more cost effective to use an off-the-shelf low-end mobile GCN part? And if the APU is custom-built, we'd end up with APU (GCN) + GCN CrossFire. That kills two birds with one stone - you end up with a far faster and more future-proof GPU architecture and save $ by not shrinking the HD7670M to 28nm.
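For a sense of what that re-spin money would actually buy - a hedged sketch, where the ~118 mm^2 Turks die size is an approximate public figure and ideal geometric scaling overstates what a real shrink achieves:

    # Ideal area scaling for an optical shrink from 40nm to 28nm:
    turks_area_40nm = 118.0      # mm^2, approx. die size of Turks (HD6670)
    scale = (28.0 / 40.0) ** 2   # linear shrink factor squared, ~0.49
    print(f"Ideal 28nm die: {turks_area_40nm * scale:.0f} mm^2")  # ~58 mm^2

You would pay the full mask-set and validation cost just to get a small, old-architecture die, while Cape Verde (HD7700-class GCN, roughly 123 mm^2) is already shipping on 28nm.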

Of course I am just speculating, as I do not have insider info, but to me the HD7670M/6670 rumor doesn't sound realistic, since that GPU is way too slow and ancient by today's standards.

Given that Sony picked a very powerful mobile GPU for the PS Vita, why would the PS4 use such a budget low-end GPU all of a sudden, especially with Sony making an even greater push to turn the PS4 into one of its 3 core businesses? The HD6770 is barely 4.5x faster than the GPUs in the PS3 / Xbox 360. Such a performance increase is too weak to carry a 2013 console for another 7-8 years.
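A quick sanity check on that multiplier - a hedged sketch, using the commonly cited ~240 GFLOPS peak for Xenos, and remembering that paper FLOPS flatter the newer chip:

    # HD6770 peak vs. Xbox 360 Xenos peak (single precision):
    hd6770_gflops = 800 * 2 * 0.85  # ~1360 GFLOPS
    xenos_gflops = 240.0            # commonly cited Xenos peak
    print(f"HD6770 / Xenos = {hd6770_gflops / xenos_gflops:.1f}x")  # ~5.7x

On paper it's closer to 5.7x; after efficiency and memory-bandwidth caveats, a ~4.5x real-world gain is in the right ballpark - either way, a small jump for a new generation.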



Wrong thread.



BlueFalcon said:

drkohler said:
The AMD FX8... series of processors is ideally suited for XBoxNext due to its design peculiarities [...]

[...]

Without the video card, the FX8150/8350 draws more than 200W (1) of power at load.

[...]

The rumor of an APU + HD7670/6670 has floated around for a long time now. The HD6670/7670M rumor is nearly 1 year old! (2)

[...]

(1) The FX-8100 is rated at 95W, so it can easily fit into the power envelope (the others are rated at 125W). And if you look at highly multi-threaded tests where all their cores are fully used, the FX parts do quite well against Intel.

(2) I know the 6670 is an old rumour, but it's a rumour nevertheless, so it's covered in the table - I said I suspect, not hope, that they might go that way, which in my opinion would be quite bad. I'm hoping for at least a 7770 in the PS4 (in addition to the 7660D inside the A10) for a combined equivalent of around 7850 performance.
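That combination does roughly add up on paper - a hedged sketch, noting that dual-graphics setups never scale perfectly, so the sum is a best case:

    # Peak GFLOPS = shaders * 2 FLOPs per clock * clock in GHz
    hd7770 = 640 * 2 * 1.00    # ~1280 GFLOPS (GCN, Cape Verde)
    hd7660d = 384 * 2 * 0.80   # ~614 GFLOPS (VLIW4 iGPU in the A10-5800K)
    hd7850 = 1024 * 2 * 0.86   # ~1761 GFLOPS (GCN, Pitcairn)
    print(f"7770 + 7660D = {hd7770 + hd7660d:.0f} GFLOPS vs 7850 = {hd7850:.0f} GFLOPS")

About 1.9 TFLOPS combined against the 7850's ~1.76 TFLOPS, so "around 7850 performance" is a fair best-case reading.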

As for the Xbox, I think they can afford to put whatever they like inside - the 7970m always looked like the perfect candidate to me, both in processing power and TDP, but if Sony did actually go with lower specs, Microsoft wouldn't be that hard pressed to go in guns blazing - on the other hand, there were rumours of even Tahiti-based GPUs, so who knows.



More bullshit lol

http://ja.scribd.com/doc/112824858/アセンブリ