
Wii U vs PS4 vs Xbox One FULL SPECS (January 24, 2014)

D-Joe said:

More cost? In my country, even a 1.5TB SATA 3 drive is only HK$668 (around US$86).

It's pretty simple: "useful" or not, they will provide SATA 3, and SATA 2 and SATA 3 prices are almost the same.

More cost to implement it into Durango... I guess the SATA 3 controller is more expensive but that's just a guess.



HoloDust said:
superchunk said:
More tweaking of info... feel free to assist me with specific info, and especially ranking/comparison info, now that the parts are starting to become clearer.

 

System                      Approx. GPU Performance   Config (SP:TMU:ROP)   VP Rating   vs 360
Xbox 360                    2 x HD2600XT * 0.625      240:24:8 (VLIW5)      14.6        1.0
WiiU (Redwood LE@550MHz)    HD5550 (DDR3)             320:16:8 (VLIW5)      21.8        1.5
WiiU (Redwood@550MHz)       HD5570 (DDR3) * 0.85      400:20:8 (VLIW5)      25.5        1.7
WiiU (Turks@550MHz)         HD6570 (DDR3) * 0.85      480:24:8 (VLIW5)      28.9        2.0
Xbox 720 (Rumoured)         HD7770GHz                 768:48:16 (GCN)       94          6.4
PS4 (Rumoured)              HD7870 * 0.78             1152:72:32 (GCN)      135         9.2
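
For anyone who wants to double-check the "vs 360" column, it's just each system's VP rating divided by the Xbox 360 baseline (14.6). A minimal Python sketch using only the numbers from the table above:

```python
# "vs 360" = VP rating / Xbox 360 baseline VP rating (14.6).
# VP ratings are taken straight from the table above.
vp_ratings = {
    "Xbox 360": 14.6,
    "WiiU (Redwood LE@550MHz)": 21.8,
    "WiiU (Redwood@550MHz)": 25.5,
    "WiiU (Turks@550MHz)": 28.9,
    "Xbox 720 (Rumoured)": 94.0,
    "PS4 (Rumoured)": 135.0,
}

baseline = vp_ratings["Xbox 360"]
for system, vp in vp_ratings.items():
    print(f"{system:26s} {vp:6.1f} VP   {vp / baseline:.1f}x Xbox 360")
```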

Nice. Can you throw the Wii in there?



ethomaz said:

D-Joe said:

More cost? In my country, even a 1.5TB SATA 3 drive is only HK$668 (around US$86).

It's pretty simple: "useful" or not, they will provide SATA 3, and SATA 2 and SATA 3 prices are almost the same.

More cost to implement it into Durango... I guess the SATA 3 controller is more expensive but that's just a guess.

It doesn't, and even if this rumored spec is 100% true, it should still be ready to support SATA 3, not 2.

Right, only very few SSDs can reach the SATA 2 limit, but it's not like MS will only sell Durango for 2.5 years and then suddenly stop. Yeah, we don't need SATA 3 now, but no one knows the future.



So, chunk, if I'm not mistaken, you were assuming (don't know if you still are) that the Wii U - Durango/Orbis gap would be like the one we saw with PS2 - GC/Xbox, with the Wii U being 'the PS2' while Durango/Orbis would be 'GC/Xbox'... now, if the latest info is correct, would the gap be larger than you thought?

I'm not talking about a Wii - PS3/X360 gap here, but something larger than the PS2 - GC/Xbox gap was, somewhere in the middle of those two situations.



JGarret said:
So, chunk, if I'm not mistaken, you were assuming (don't know if you still are) that the Wii U - Durango/Orbis gap would be like the one we saw with PS2 - GC/Xbox, with the Wii U being 'the PS2' while Durango/Orbis would be 'GC/Xbox'... now, if the latest info is correct, would the gap be larger than you thought?

I'm not talking about a Wii - PS3/X360 gap here, but something larger than the PS2 - GC/Xbox gap was, somewhere in the middle of those two situations.

At this point I still think it's PS2 to Xbox, yes. But this is also hard for some to grasp, as many give too much credit to the PS2 and even try to argue it was at or better than the GameCube... it clearly wasn't.

Also, keep in mind the Wii to PS3/360 gap was almost generational, as it included not only a raw power gap but also a lack of certain technological features. The WiiU, by contrast, will have all the same technical features and a similar architecture, just less raw power.

This should mean differences more like the settings on an average PC game: WiiU would be medium and the others max... or WiiU would be the required spec and the others the recommended spec, and so on.



drkohler said:
Note that this list is only valid once the manufacturing lines are optimised (maximum yields). Initial prices for CPUs/GPUs are considerably higher, so there is a reason why the next console parts are not as new as you wish them to be.

The HD7000 series has been out for more than a year. The HD7770 GHz Edition is not exactly setting the world on fire (and neither is the 768 Stream Processor, 800MHz HD8000-series part rumored for the next Xbox). Also, we still have almost a year before the Xbox 720/PS4 launch, which means prices on these GPUs will only fall further. Here is my point: the Xbox 360's Xenos GPU cost MS $141 to purchase/manufacture. The console retailed for $399, despite manufacturing costs roughly estimated at $525. Now, the current rumor that the Xbox 720 will have a 768 Stream Processor, 800MHz (1.2 Tflop) AMD GPU tells us the performance is roughly similar to an HD7770 GHz Edition, which retails at places like Newegg for only $100-110 (http://www.newegg.com/Product/Product.aspx?Item=N82E16814202011).

OK, now look at the 2 charts I linked that show median retail prices vs. the total cost for AIBs to purchase the graphics cards. If the HD7770 GHz Edition retails for roughly $100-110, the total cost for MS to purchase this GPU directly by Q4 2013 will likely be no more than $50-60. Essentially, if MS prices the Xbox 720 at $350-400, we are getting the short end of the stick this time. Either MS is investing the budget in other features like Kinect 2.0, screens in the controllers, etc., or they are pulling a Nintendo and not taking huge losses on the hardware. Either way, the rumored specs for the Xbox 720 on the GPU side are disappointing:

http://www.eurogamer.net/articles/df-hardware-next-gen-xbox-specs-leak



D-Joe said:
It doesn't, and even if this rumored spec is 100% true, it should still be ready to support SATA 3, not 2.

Right, only very few SSDs can reach the SATA 2 limit, but it's not like MS will only sell Durango for 2.5 years and then suddenly stop. Yeah, we don't need SATA 3 now, but no one knows the future.

SATA 2 seems like a huge oversight for a high-end console that's sticking around for 6-8 years. Even now, modern SSDs are coming very close to the 600MB/sec SATA 3 limit, which means they'd be completely bottlenecked by SATA 2.

http://images.anandtech.com/graphs/graph6553/52540.png

http://images.anandtech.com/graphs/graph6553/52541.png

But if MS sticks with their idiotic policy of proprietary HDDs, then it's a moot point anyway if the HDDs they sell for the Xbox 720 are mechanical ones. If Sony's PS4 allows the same level of flexibility as the PS3 did in terms of HDD swapping, then SATA 3 support on the PS4 will be a lot more important than SATA 3 support on the Xbox 720. The proprietary nature of MS's HDD solution could mean SATA 3 offers no performance advantage at all, unless mechanical drives get to 300MB/sec reads/writes in the next 6-7 years (and looking at how slowly they have evolved in the last 10 years, it doesn't look like that's going to happen). I think if you want a very fast storage subsystem, the PS4 will be the only way to go.
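
As a rough sanity check on those limits: SATA uses 8b/10b line encoding, so the usable ceiling in bytes is about a tenth of the line rate. A minimal sketch (protocol overhead ignored, so real drives land a bit lower):

```python
# Effective throughput ceiling of a SATA link, assuming 8b/10b line encoding
# (10 bits on the wire per byte of payload). Protocol overhead is ignored.
def sata_ceiling_mb_s(line_rate_gbit_s: float) -> float:
    return line_rate_gbit_s * 1e9 / 10 / 1e6  # bytes/s expressed in MB/s

for name, gbit in [("SATA 2 (3 Gbit/s)", 3.0), ("SATA 3 (6 Gbit/s)", 6.0)]:
    print(f"{name}: ~{sata_ceiling_mb_s(gbit):.0f} MB/s")
# -> ~300 MB/s and ~600 MB/s, the limits referenced above
```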



I posted this in another thread, but I thought it's worth consideration in this thread too.

DDR3 in the Xbox 720 vs. GDDR5 in the PS4 - the impact on GPU performance.

 

If you take the rumored level of performance of the Xbox 720, estimated at around an HD7770 GHz Edition in floating point, recall that this GPU normally has 4500MHz (effective) GDDR5 over a 128-bit bus feeding it. Replace that GDDR5 with DDR3-2133 over the same 128-bit bus and the GPU's memory bandwidth falls from 72GB/sec to just 34GB/sec. Guess what happens? The GPU's performance will fall 40-50% with that reduction in memory bandwidth.
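
For reference, peak memory bandwidth is just the effective transfer rate times the bus width. A minimal Python sketch using the clocks and bus widths quoted in this post (treat the console entries as rumor-derived assumptions, not confirmed specs); the same formula also reproduces the 68GB/sec and 154GB/sec figures cited further down:

```python
# Peak memory bandwidth = effective transfer rate (MT/s) * bus width (bits) / 8.
# Clocks and bus widths are the rumored/retail figures quoted in this post.
def bandwidth_gb_s(effective_mt_s: float, bus_bits: int) -> float:
    return effective_mt_s * 1e6 * bus_bits / 8 / 1e9

configs = [
    ("GDDR5-4500, 128-bit (HD7770 GHz Ed.)", 4500, 128),  # ~72 GB/s
    ("DDR3-2133, 128-bit",                   2133, 128),  # ~34 GB/s
    ("DDR3-2133, 256-bit (rumored Durango)", 2133, 256),  # ~68 GB/s
    ("GDDR5-4800, 256-bit (HD7850)",         4800, 256),  # ~154 GB/s
]
for name, rate, bus in configs:
    print(f"{name}: {bandwidth_gb_s(rate, bus):.1f} GB/s")
```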

This likely explains why MS is going to use eSRAM/eDRAM for the GPU: the 8GB of DDR3 is going to be shared with the GPU (i.e., the GPU is going to be severely memory-bandwidth bottlenecked).

In simplest terms, you can take a 1.2Tflop GPU and compare it to a 1.84Tflop GPU on the same architecture, but if the first GPU's memory bandwidth just got neutered by DDR3, its performance will drop like a rock. Effectively, you are no longer comparing a 1.2Tflop GPU to a 1.84Tflop one, because the former can no longer work at full capacity as it's memory-bandwidth bottlenecked.

Example for an Nvidia GPU -- the GT 640's floating-point performance is fixed, but look at the dramatic effect of swapping DDR3 for GDDR5 on actual gaming performance.

GT 640 DDR3 = 46 VP

GT 640 GDDR5 = 68 VP (+48% faster) by just swapping out DDR3 for GDDR5. 

http://alienbabeltech.com/abt/viewtopic.php?p=41174

Example for an AMD GPU:

3DMark11

HD6670 GDDR3 = 1594 marks

HD6670 GDDR5 = 2479 marks (+55%)

http://www.overclockers.com/forums/showthread.php?t=710062
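
Quick arithmetic check on the two gains quoted above, using the scores from the linked benchmarks:

```python
# Percent speedup from swapping DDR3 for GDDR5, using the scores quoted above
# (VP score for the GT 640, 3DMark11 score for the HD6670).
pairs = {
    "GT 640 (VP)":       (46, 68),       # (DDR3 score, GDDR5 score)
    "HD6670 (3DMark11)": (1594, 2479),
}
for name, (ddr3, gddr5) in pairs.items():
    print(f"{name}: +{(gddr5 / ddr3 - 1) * 100:.0f}%")
# -> +48% and +56% (the latter quoted as +55% above)
```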

Memory bandwidth for the GPU is like vital performance enhancing nutrients for a sports athlete. 

 

On paper, things are looking MUCH worse for the Xbox 720's GPU. Not only is its floating-point throughput rumored to be well below the PS4's (roughly 1.2Tflops vs 1.84Tflops), but the memory subsystem that feeds the GPU is dramatically inferior to the PS4's rumored 4GB GDDR5 setup. The performance difference between an HD7770 GHz Edition with DDR3 and an 85% HD7870 with GDDR5 is going to be more than double in graphical capability, not 50%, because memory bandwidth is required to keep the GPU working at full capacity (specifically, memory bandwidth feeds the GPU's ROPs).

The Render Output Unit, often abbreviated as "ROP" and sometimes called (perhaps more properly) the Raster Operations Pipeline, handles one of the final steps in the rendering process on modern 3D accelerator boards. If you neuter the GPU's memory bandwidth, you neuter the ROPs and thus the final stage of the rendering process. It's like putting Toyota Prius tires on a 700hp rear-wheel-drive supercar. 32MB of eSRAM is not going to be the answer to this problem because it is not sufficient in size.

 

In practice, the Xbox 720's DDR3 memory bandwidth seems to be estimated at just 68GB/sec, while the 1.76Tflop HD7850 has 154GB/sec:

http://www.gpureview.com/show_cards.php?card1=678&card2=677

If Sony retains GDDR5 for its GPU, the Xbox 720's GPU is going to be significantly slower. Also, if we look at the diagram of the rumored Xbox 720 specs, DDR3 has to communicate through the NorthBridge before reaching the GPU, which introduces additional latency. If the PS4 has GDDR5 and at least some of it is dedicated to the GPU, not only will the memory subsystem feeding the GPU be miles faster than the Xbox 720's, but the GPU will be able to access it much more quickly (much like on a real modern graphics card, where the GPU communicates with GDDR5 directly on the same PCB).

Now take the Xbox 720's GPU, with its rumored lower floating-point power, and consider that it also appears crippled by shared DDR3 memory; it stands to reason that the PS4's GPU, with its dedicated GDDR5, will mop the floor with it. There is a reason all high-end GPUs on the PC use dedicated GDDR5... so far I am not impressed with these Xbox 720 specs.

@ SuperChunk, 

In your PS4/Orbis chart, the GPU is rumored to be an 800MHz, 18 Compute Unit HD7970M-class part (or, put another way, an HD7870 desktop part downclocked 20% with 10% of its Compute Units cut). You have it at 850MHz and 20 Compute Units, which are the actual specs of the full-fledged HD7970M.
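
For context on where the 1.2Tflop and 1.84Tflop figures in this post come from: GCN single-precision throughput is simply stream processors x 2 ops per clock (fused multiply-add) x clock speed. A minimal sketch using the rumored and retail configurations discussed in this thread (the console entries are rumor-derived assumptions):

```python
# Single-precision throughput for a GCN GPU:
#   stream processors * 2 ops per clock (fused multiply-add) * clock (GHz) = GFLOPS.
# Console entries below use the rumored configurations discussed in this thread.
def gcn_gflops(stream_processors: int, clock_ghz: float) -> float:
    return stream_processors * 2 * clock_ghz

parts = [
    ("Durango (rumoured): 768 SP @ 800MHz", 768, 0.80),    # ~1.23 Tflops
    ("Orbis (rumoured): 1152 SP @ 800MHz",  1152, 0.80),   # ~1.84 Tflops
    ("HD7970M: 1280 SP @ 850MHz",           1280, 0.85),   # ~2.18 Tflops
    ("HD7870 desktop: 1280 SP @ 1000MHz",   1280, 1.00),   # ~2.56 Tflops
]
for name, sp, ghz in parts:
    print(f"{name}: {gcn_gflops(sp, ghz) / 1000:.2f} Tflops")
```

Note that 18 Compute Units x 64 stream processors per CU gives the 1152 figure, and 20 CUs gives 1280.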



BlueFalcon said:

[snip: full post quoted just above]

Yeah, so I don't know why NeoGAF people think it will be HD7770 level; in fact, even if they expect mid/low range, it should at least be a retail HD8000 part.

I don't know about Sony, but Durango will clearly use an HD8000-series GPU.

And the most wtf thing is that some people on NeoGAF said Durango will have some magic assist hardware to make their "rumored HD7770" on par with, or even a little better than, the PS4's "rumored 85% HD7870/7970M" lol



D-Joe said:
BlueFalcon said:

[snip: full post quoted further above]

Yeah, so I don't know why NeoGAF people think it will be HD7770 level; in fact, even if they expect mid/low range, it should at least be a retail HD8000 part.

I don't know about Sony, but Durango will clearly use an HD8000-series GPU.

And the most wtf thing is that some people on NeoGAF said Durango will have some magic assist hardware to make their "rumored HD7770" on par with, or even a little better than, the PS4's "rumored 85% HD7870/7970M" lol


Yeah, I hope so. Because according to BlueFalcon, DDR3 paired with the GPU sounds like a VERY VERY bad combination, to the point where the GPU itself can't even run at full power. Personally, I think MS will develop two consoles: one with Kinect included (a mildly slower, casual-targeted console) and one higher-spec'd, more expensive console.

But hey, it's all rumors. I mean, a few months ago everyone was saying that the Xbox is going to be much, much faster than the PS4. Now it's saying the exact opposite. It's all a rollercoaster ride, and the ride hasn't finished yet.



Yay!!!