
Wii U vs PS4 vs Xbox One FULL SPECS (January 24, 2014)

JEMC said:

To put some perspective here, let me post this wikipedia info

http://en.wikipedia.org/wiki/Xbox_360_hardware#List_of_revisions

 

Codename           | CPU   | GPU                           | HDMI | Power Supply | In Production | Date Released
Xenon              | 90 nm | 90 nm                         | No   | 203 W        | No            | November 2005
Zephyr             | 90 nm | 90 nm                         | Yes  | 203 W        | No            | July 2007
Falcon             | 65 nm | 90 nm                         | Yes  | 175 W        | No            | Late September 2007
Opus               | 65 nm | 90 nm                         | No   | 175 W        | No            | June 2008
Jasper             | 65 nm | 65 nm                         | Yes  | 150 W        | No            | September 2008
Trinity (Valhalla) | 45 nm | 45 nm (combined "Vejle" chip) | Yes  | 135 W        | No            | June 2010
Corona             | 45 nm | 45 nm (combined chip)         | Yes  | 115 W        | Yes           | August 2011

The first Xbox 360 used more than 200 W, and the newest one, with all the optimizations, the single chip for both the CPU and GPU, etc., still uses more than 100 W.

Even accounting for the efficiency of the power supply, that's still a lot.

I'd also like to note that just because the power supply is rated at a certain spec doesn't mean that's what the entire system actually draws. In fact, you really don't want it running too close to peak at all. About 20% headroom is ideal for a console.
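As a rough illustration of that 20% rule, here's a minimal sketch with made-up numbers (nothing here is a measurement of a real console):

    # Hypothetical sizing: leave ~20% headroom between peak system draw
    # and the power supply's rating.
    def recommended_psu_rating(peak_draw_watts, headroom=0.20):
        return peak_draw_watts * (1.0 + headroom)

    peak = 170.0  # assumed peak system draw in watts
    print(f"{peak:.0f} W peak -> ~{recommended_psu_rating(peak):.0f} W supply")
    # 170 W peak -> ~204 W supply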



The rEVOLution is not being televised

Viper1 said:
JEMC said:

To put some perspective here, let me post this wikipedia info

http://en.wikipedia.org/wiki/Xbox_360_hardware#List_of_revisions

 

Codename           | CPU   | GPU                           | HDMI | Power Supply | In Production | Date Released
Xenon              | 90 nm | 90 nm                         | No   | 203 W        | No            | November 2005
Zephyr             | 90 nm | 90 nm                         | Yes  | 203 W        | No            | July 2007
Falcon             | 65 nm | 90 nm                         | Yes  | 175 W        | No            | Late September 2007
Opus               | 65 nm | 90 nm                         | No   | 175 W        | No            | June 2008
Jasper             | 65 nm | 65 nm                         | Yes  | 150 W        | No            | September 2008
Trinity (Valhalla) | 45 nm | 45 nm (combined "Vejle" chip) | Yes  | 135 W        | No            | June 2010
Corona             | 45 nm | 45 nm (combined chip)         | Yes  | 115 W        | Yes           | August 2011

The first Xbox 360 used more than 200 W, and the newest one, with all the optimizations, the single chip for both the CPU and GPU, etc., still uses more than 100 W.

Even accounting for the efficiency of the power supply, that's still a lot.

I'd also like to note that just because the power supply is rated at a certain spec doesn't mean that's what the entire system actually draws. In fact, you really don't want it running too close to peak at all. About 20% headroom is ideal for a console.

Power consumption of current-gen consoles:

https://docs.google.com/viewer?a=v&q=cache:xGJhVr3FGYkJ:https://wpweb2.tepper.cmu.edu/ceic/pdfs/CEIC_11_01.pdf+&hl=en&pid=bl&srcid=ADGEEShXys6w-mmVsQMjoEebGXWxq_dypu1mJzeC64P4TR6KRBqli6ZK5b3eJ673NhuXXp98CZu5qJJAezdB9rl_gm0nK2Udgrei1b6Q1XDUTq-ySBwkUzRVpo-LJPhN1_3yDI8GquCN&sig=AHIEtbTFeF34BsguPfmcNVn2k6Wi5D73dg

That 20% is just about right, judging from that table (172 W measured for the original 360, so with 20% headroom it's right in the area of its 203 W rated power supply). The "Fat" PS3 has a 380 W power supply, replaced with a 250 W unit in the "Slim" and 190 W in the "Superslim". That's quite a lot of watts in that "Fat" model, even for the hungrier GPU/CPU combos of today.
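Just to spell out that arithmetic with the figures quoted above (the 20% headroom factor is the rule of thumb from the earlier post):

    xenon_peak_w = 172   # measured peak draw of the original Xbox 360 (from the PDF above)
    xenon_psu_w = 203    # its rated power supply
    ps3_psu_w = {"Fat": 380, "Slim": 250, "Superslim": 190}   # rated supplies by model

    print(f"{xenon_peak_w} W peak + 20% headroom = {xenon_peak_w * 1.2:.0f} W "
          f"(rated supply: {xenon_psu_w} W)")
    print("PS3 rated supplies (W):", ps3_psu_w)
    # 172 W peak + 20% headroom = 206 W (rated supply: 203 W)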



HoloDust said:

Power consumption of current-gen consoles:

https://docs.google.com/viewer?a=v&q=cache:xGJhVr3FGYkJ:https://wpweb2.tepper.cmu.edu/ceic/pdfs/CEIC_11_01.pdf+&hl=en&pid=bl&srcid=ADGEEShXys6w-mmVsQMjoEebGXWxq_dypu1mJzeC64P4TR6KRBqli6ZK5b3eJ673NhuXXp98CZu5qJJAezdB9rl_gm0nK2Udgrei1b6Q1XDUTq-ySBwkUzRVpo-LJPhN1_3yDI8GquCN&sig=AHIEtbTFeF34BsguPfmcNVn2k6Wi5D73dg

That 20% is just about right, judging from that table (172 W measured for the original 360, so with 20% headroom it's right in the area of its 203 W rated power supply). The "Fat" PS3 has a 380 W power supply, replaced with a 250 W unit in the "Slim" and 190 W in the "Superslim". That's quite a lot of watts in that "Fat" model, even for the hungrier GPU/CPU combos of today.

Sony went nuts with their original PS3 model power supply.  The system at peak would pull about 205-210 watts but they gave the thing a 380 watt power supply.  Talk about lost efficiency there.  That thing probably wasted more watts than the entire PS3 consumes in its current format.



The rEVOLution is not being televised

regin2005 said:
While all these numbers are nice and all (8 cores this, 3 GB DDR3 that), what REAL difference will these super-powered consoles make that the current consoles (PS3, X360) cannot?

Is the so-called power of the PS3 already tapped out?
Can visuals really get any better than they already are on the 360/PS3? If the consoles will be at LEAST twice as strong, will anyone be able to tell the difference?


We'll have to wait and see how good the new MGS game is; the new trailer, which ran off a PC and looked better than anything out there today, is apparently supposed to be pretty close to what it will look like on the PS3.



Viper1 said:

Sony went nuts with their original PS3 model power supply.  The system at peak would pull about 205-210 watts but they gave the thing a 380 watt power supply.  Talk about lost efficiency there.  That thing probably wasted more watts than the entire PS3 consumes in its current format.

Huh? Check EVERY power supply for PCs. Check the efficiency vs. load curves and learn. If you do, you might realise why on every forum people tell you to buy a PSU that runs between 40-60% of max capacity...
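To illustrate why the load point matters, here's a toy example; the efficiency curve below is an assumed, typical-looking shape, not data for any particular PSU:

    # Assumed efficiency at various fractions of the PSU's rated capacity:
    # low near idle, best around mid load, drooping slightly toward full load.
    assumed_efficiency = {0.10: 0.78, 0.20: 0.84, 0.50: 0.88, 0.80: 0.86, 1.00: 0.83}

    system_draw_w = 100.0  # watts the system actually needs (assumed)
    for load, eff in assumed_efficiency.items():
        psu_rating = system_draw_w / load   # PSU size that puts us at this load point
        at_wall = system_draw_w / eff       # watts pulled from the wall
        print(f"{load:.0%} load ({psu_rating:.0f} W PSU): "
              f"{at_wall:.0f} W at the wall, {at_wall - system_draw_w:.0f} W lost as heat")

The exact numbers don't matter; the point is that a grossly oversized supply spends its life on the inefficient part of its curve.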



drkohler said:
Viper1 said:

Sony went nuts with their original PS3 model power supply.  The system at peak would pull about 205-210 watts but they gave the thing a 380 watt power supply.  Talk about lost efficiency there.  That thing probably wasted more watts than the entire PS3 consumes in its current format.

Huh? Check EVERY power supply for PCs. Check the efficiency vs. load curves and learn. If you do, you might realise why on every forum people tell you to buy a PSU that runs between 40-60% of max capacity...

Console vs. PC. A PC often has a very widely ranging operating wattage, from idle to peak. A console has a much smaller operating range.

Look at the PSUs of every single other console in history (including the different models of the PS3 itself) and you'll find a huge difference between the original PS3 and ALL other consoles or models ever.

 

By the way, I build PCs as a side job. I am that guy on the forums telling you to get a 90% efficiency rated PSU at 150% of peak load (you don't want 40-60% of peak load... that would be very bad).
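For what it's worth, the arithmetic behind that sizing rule (illustrative numbers only):

    peak_load_w = 300.0                 # assumed peak system draw in watts
    psu_rating_w = peak_load_w * 1.5    # the "150% of peak load" rule
    print(f"{psu_rating_w:.0f} W PSU sits at {peak_load_w / psu_rating_w:.0%} "
          f"of capacity at peak, and lower than that at typical loads")
    # 450 W PSU sits at 67% of capacity at peak, and lower than that at typical loads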



The rEVOLution is not being televised

drkohler said:
JEMC said:

I agree. Those 8 GB of RAM, the HDD and the optical drive will use at least 20 W.

And yes, some people are going to be very disappointed.

Look, the newest devkit has an 8-core AMD processor in it. You can't ship dev kits with such a processor and then release a box with a CPU that has fewer cores and yell "Fooled you!" (regardless of the fact that current PC software rarely uses even 4 cores to the fullest). Add the point that Kinect 2 probably requires 2-4 cores on its own, and an FX8xxx processor is a logical choice (the PS4 could probably get away with a 4-6 core FXxxx processor). A 125 W TDP means all 8 cores are fully running, something that probably is not happening in real life at all. Add a refresh generation, clock it a little slower, and you are _way below_ 100 W of actual use.

The rule is simple: if you want something that is 2-3 times as fast as an Xbox 360/PS3, you will generate more heat than an Xbox 360/PS3.

Any 'XBox 720' development kit that has been released by Microsoft at this point in time is likely a pre-alpha development kit, and the hardware in it will not be similar to what is actually released.

If I were to guess, I would suspect that the pre-alpha development kit runs a virtual machine to roughly approximate the performance of the 'XBox 720', and any advanced features that would be built into the GPU would likely be emulated on the CPU (as best as they could be). With this kind of approach you could provide a development kit 2+ years before you released your system, while most development kits that are based on the actual system hardware only become available 6 to 12 months prior to the release of the system. Of course, the VM would (likely) not be a true emulation of the hardware, and lower-level development would have to wait for more final hardware.

Ultimately, the real downside to this approach is that your CPU has to be dramatically more powerful than what you're actually going to release, because it has to run the VM and emulate GPU features at the same time. If this were the approach Microsoft was taking, they might choose a CPU like the AMD FX-8350 as their pre-alpha CPU even though their final CPU could be a quad-core PowerPC processor with far less processing power, because you could get development kits produced and on the desks of developers for a little over $1000 per system. In fact, if this software development kit were well made, it would enable indie and small developers to target the performance of the 'XBox 720' without ever needing to pay for an expensive hardware development kit from Microsoft.
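A minimal sketch of the kind of shim such a VM-style kit could use; everything here is hypothetical (the names, numbers, and approach are my illustration, not anything Microsoft has confirmed). The idea is simply that the overpowered host finishes each frame early and then pads it out, so the game only ever sees roughly the frame time the slower target hardware would deliver:

    import time

    TARGET_FRAME_MS = 33.3   # assumed frame budget of the final hardware (~30 fps)
    HOST_SPEEDUP = 3.0       # assumed: the dev-kit CPU is ~3x faster than the target

    def run_throttled_frame(simulate_frame):
        # Run one frame on the fast host, then pad to the estimated target cost.
        start = time.perf_counter()
        simulate_frame()                            # game logic + emulated GPU features
        host_ms = (time.perf_counter() - start) * 1000.0
        target_ms = host_ms * HOST_SPEEDUP          # rough cost on the real hardware
        if target_ms > TARGET_FRAME_MS:
            print(f"frame would miss budget on target: ~{target_ms:.1f} ms")
        time.sleep(max(0.0, target_ms - host_ms) / 1000.0)

A real kit would do far more than this (API shims, memory caps, and so on), but the throttling idea is the core of how you could hand out "representative" hardware years early.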



HappySqurriel said:
drkohler said:
JEMC said:

I agree. Those 8 GB of RAM, the HDD and the optical drive will use at least 20 W.

And yes, some people are going to be very disappointed.

Look, the newest devkit has an 8-core AMD processor in it. You can't ship dev kits with such a processor and then release a box with a CPU that has fewer cores and yell "Fooled you!" (regardless of the fact that current PC software rarely uses even 4 cores to the fullest). Add the point that Kinect 2 probably requires 2-4 cores on its own, and an FX8xxx processor is a logical choice (the PS4 could probably get away with a 4-6 core FXxxx processor). A 125 W TDP means all 8 cores are fully running, something that probably is not happening in real life at all. Add a refresh generation, clock it a little slower, and you are _way below_ 100 W of actual use.

The rule is simple: if you want something that is 2-3 times as fast as an Xbox 360/PS3, you will generate more heat than an Xbox 360/PS3.

Any 'XBox 720' development kit that has been released by Microsoft at this point in time is likely a pre-alpha development kit, and the hardware in it will not be similar to what is actually released.

If I were to guess, I would suspect that the pre-alpha development kit runs a virtual machine to roughly approximate the performance of the 'XBox 720', and any advanced features that would be built into the GPU would likely be emulated on the CPU (as best as they could be). With this kind of approach you could provide a development kit 2+ years before you released your system, while most development kits that are based on the actual system hardware only become available 6 to 12 months prior to the release of the system. Of course, the VM would (likely) not be a true emulation of the hardware, and lower-level development would have to wait for more final hardware.

Ultimately, the real downside to this approach is that your CPU has to be dramatically more powerful than what you're actually going to release, because it has to run the VM and emulate GPU features at the same time. If this were the approach Microsoft was taking, they might choose a CPU like the AMD FX-8350 as their pre-alpha CPU even though their final CPU could be a quad-core PowerPC processor with far less processing power, because you could get development kits produced and on the desks of developers for a little over $1000 per system. In fact, if this software development kit were well made, it would enable indie and small developers to target the performance of the 'XBox 720' without ever needing to pay for an expensive hardware development kit from Microsoft.

This is absolutely correct.

Practically all early alpha kits are actually virtual machine environments and often have very little to do with the actual target hardware.



The rEVOLution is not being televised

@HappySqurriel: Just let me say thanks for such an informative post.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k @ stock (for now), 16 GB RAM at 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

You should check this: http://www.ign.com/articles/2012/11/01/report-ps4-dev-kits-surface-details-inside

A new version of the PlayStation 4 dev kit is currently being distributed to developers with the final version expected next year, according to a new report.

Anonymous sources have reportedly told VG247 that new versions of the Orbis kit are winging their way to developers, replete with Blu-ray support and housed in the humble cases of normal PCs.

This is apparently the second iteration of the dev kit; the first, which appeared earlier this year, was in essence just a graphics card, while this version is now a "modified PC". The report asserts that the next update will come in January, when it'll be close to final specifications, with the ultimate version landing with devs next summer.

The shipping of the Orbis kit apparently follows a series of meetings held by Sony in the US this week, where the company explained what the machine was designed to do and how to get the most out of it. Interestingly, at these meetings it's been claimed that Sony didn't refer to the machine as "the PlayStation 4" at all, instead opting to use the "Orbis" title at all times.

The dev kits are apparently based on AMD's A10 APU series and come with either 8 GB or 16 GB of RAM, as well as the Blu-ray drive already mentioned and a 256 GB hard drive as standard. This is to ensure that the console will be able to run 1080p60 games in 3D.

The Orbis kits have both Wi-Fi and Ethernet connectivity, as well as HDMI out slots; so pretty much exactly what you'd expect to find on your current PlayStation 3. However, the big reported difference comes with the UI, which has been designed to be more fluid and allow extensive navigation anywhere on the system simply by pressing the PS button mid-game. This was demoed to the assembled masses by purchasing DLC from the PS Store without quitting the game.

 



Nintendo and PC gamer