
Wii U vs PS4 vs Xbox One FULL SPECS (January 24, 2014)

D-Joe said:
JEMC said:
D-Joe said:
JEMC said:

As far as I know, the lowest 8 core FX chip is the FX-8120 and is rated at 125W.

Well, the FX-8300 is 95W.

From AMD's site

http://www.amd.com/us/products/desktop/processors/amdfx/Pages/amdfx-model-number-comparison.aspx

There are only two FX 83xx models, and both feature a TDP of 125W.

But I agree that I was wrong: the FX-8100 (which wasn't in the Anandtech chart I took the numbers from) is an 8-core CPU with a TDP of 95W. My bad.

Check "FX-8300"
http://en.wikipedia.org/wiki/List_of_AMD_FX_microprocessors

Is the FX-8300 not released in the US?

It doesn't appear on the Spanish site either.

And to be honest, I trust AMD more than Wikipedia.




JEMC said:
D-Joe said:
JEMC said:
D-Joe said:
JEMC said:

As far as I know, the lowest 8 core FX chip is the FX-8120 and is rated at 125W.

Well, the FX-8300 is 95W.

From AMD's site

http://www.amd.com/us/products/desktop/processors/amdfx/Pages/amdfx-model-number-comparison.aspx

There are only two FX 83xx models, and both feature a TDP of 125W.

But I agree that I was wrong: the FX-8100 (which wasn't in the Anandtech chart I took the numbers from) is an 8-core CPU with a TDP of 95W. My bad.

Check "FX-8300"
http://en.wikipedia.org/wiki/List_of_AMD_FX_microprocessors

Is the FX-8300 not released in the US?

It doesn't appear on the Spanish site either.

And to be honest, I trust AMD more than Wikipedia.

I get it: the 8300 is OEM-only (at least right now), which is why it doesn't appear on AMD's official site.
http://wccftech.com/amd-ships-fx-8300-fx6350-oem-systems/



D-Joe said:
JEMC said:
D-Joe said:

Check "FX-8300"
http://en.wikipedia.org/wiki/List_of_AMD_FX_microprocessors

Is the FX-8300 not released in the US?

It doesn't appear on the Spanish site either.

And to be honest, I trust AMD more than Wikipedia.

I get it: the 8300 is OEM-only (at least right now), which is why it doesn't appear on AMD's official site.
http://wccftech.com/amd-ships-fx-8300-fx6350-oem-systems/

OK, it exists... but it's a 125W chip.

Still, I was wrong because there are 95W 8-core CPUs from AMD, but they are from the FX 81xx series, the one the FX 83xx series replaces.




JEMC said:
D-Joe said:
JEMC said:
D-Joe said:

Check "FX-8300"
http://en.wikipedia.org/wiki/List_of_AMD_FX_microprocessors

Is the FX-8300 not released in the US?

It doesn't appear on the Spanish site either.

And to be honest, I trust AMD more than Wikipedia.

I get it: the 8300 is OEM-only (at least right now), which is why it doesn't appear on AMD's official site.
http://wccftech.com/amd-ships-fx-8300-fx6350-oem-systems/

OK, it exists... but it's a 125W chip.

Still, I was wrong because there are 95W 8-core CPUs from AMD, but they are from the FX 81xx series, the one the FX 83xx series replaces.

The news article just made a mistake: the 6350 should be the 125W one, and the 8300 is 95W.

http://support.asus.com/cpusupport/detail.aspx?SLanguage=en&p=1&m=m5a97%20pro&cpu=fx-8300(fd8300wmw8khk,3.2ghz,8c,95w,rev.c0,am3+)&pcb=all&sincebios=1604&memo=

http://support.asus.com/cpusupport/detail.aspx?SLanguage=en&p=1&m=M5A97%20PRO&cpu=FX-6350(FD6350FRW6KHK,3.9GHz,6C,125W,rev.C0,AM3+)&pcb=ALL&sincebios=1604&memo=

http://technewspedia.com/amd-introduces-its-new-microprocessors-fx-8300-and-fx-6350/

And I don't think they will use Bulldozer if they choose an AMD FX for the next-gen Xbox.



@D-Joe: OK, you win. Besides the FX 81xx series, the 83xx ones also draw 95W.

And, as pointed out by Zarx and Viper1 in other posts, I agree that they won't use them. Too power hungry.

I imagine a console with a CPU using 40-50W (because of lower clocks and all that) and a GPU* that uses about 70-80W. Put that together in a box along with the RAM, HDD, optical drive, etc. Add the limited space between them, which restricts airflow, and the impracticality of using high-speed fans to cool it, and you are pushing it to its limits.

*I get the 70-80W for the GPU by applying the not-always-true rule that each new generation's second-best card matches the performance of the previous generation's best (HD 6970 = HD 7870, GTX 580 = GTX 660 Ti, etc.). Since they were using a GTX 570 as a base, that is roughly an HD 7850, which would translate to an HD 8750 at roughly 100-110W. Reduce the clocks a bit to make it more manageable and get better yields, and you end up with around 70-80W.
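
As a quick sanity check on those numbers (every figure below is my own rough guess, not anything official), the budget adds up like this:

# Back-of-the-envelope console power budget, in watts (all values are guesses)
cpu = (40, 50)      # downclocked CPU, per the estimate above
gpu = (70, 80)      # cut-down GPU, per the estimate above
other = (25, 35)    # RAM, HDD, optical drive, fans, etc. (pure assumption)

low = cpu[0] + gpu[0] + other[0]
high = cpu[1] + gpu[1] + other[1]
print(f"Estimated total system draw: {low}-{high} W")   # 135-165 W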





Well, I don't believe it either, the FX is just too hot for consoles lol



For Sony and Microsoft I would expect their entire systems to run in the 80-120 Watt range, with the CPU accounting for (roughly) 1/3 of that and the GPU accounting for the other (roughly) 2/3. This would put the CPU using roughly 25 to 40 Watts, and the GPU using roughly 55 to 80 watts.
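
In plain numbers (using the 80-120 W envelope and the rough 1/3 : 2/3 split from above, nothing measured):

# Split a hypothetical 80-120 W console power envelope between CPU and GPU
for total in (80, 120):
    cpu = total / 3        # roughly one third for the CPU
    gpu = total * 2 / 3    # roughly two thirds for the GPU
    print(f"{total} W total -> CPU ~{cpu:.0f} W, GPU ~{gpu:.0f} W")
# 80 W total  -> CPU ~27 W, GPU ~53 W
# 120 W total -> CPU ~40 W, GPU ~80 W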



Wow, that's a lot of information. I will be updating neXtBox soon with generalizations surrounding this rumor. Nothing specific, as it still seems too advanced for a console when you consider heat, cost, and power usage.



HappySqurriel said:
For Sony and Microsoft I would expect their entire systems to run in the 80-120 Watt range, with the CPU accounting for (roughly) 1/3 of that and the GPU accounting for the other (roughly) 2/3. This would put the CPU using roughly 25 to 40 Watts, and the GPU using roughly 55 to 80 watts.

To put some perspective here, let me post this Wikipedia info:

http://en.wikipedia.org/wiki/Xbox_360_hardware#List_of_revisions

 

Codename | CPU | GPU | HDMI | Power Supply | In Production | Date Released
Xenon | 90 nm | 90 nm | No | 203 W | No | November 2005
Zephyr | 90 nm | 90 nm | Yes | 203 W | No | July 2007
Falcon | 65 nm | 90 nm | Yes | 175 W | No | Late September 2007
Opus | 65 nm | 90 nm | No | 175 W | No | June 2008
Jasper | 65 nm | 65 nm | Yes | 150 W | No | September 2008
Trinity (Valhalla) | 45 nm (combined "Vejle" CPU+GPU chip) | Yes | 135 W | No | June 2010
Corona | 45 nm (combined CPU+GPU chip) | Yes | 115 W | Yes | August 2011

The first Xbox 360 used over 200W, and the newest one, with all the optimizations, the single chip for both the CPU and GPU, etc., still uses more than 100W.

Even accounting for the efficiency of the power supply, that's still a lot.
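
Just comparing the first and last revisions in that table:

# Power supply ratings taken from the Wikipedia table above
xenon, corona = 203, 115
drop = 100 * (1 - corona / xenon)
print(f"Reduction after six years of revisions: ~{drop:.0f}%")   # ~43%, and still over 100 W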




JEMC said:
HappySqurriel said:
For Sony and Microsoft I would expect their entire systems to run in the 80-120 Watt range, with the CPU accounting for (roughly) 1/3 of that and the GPU accounting for the other (roughly) 2/3. This would put the CPU using roughly 25 to 40 Watts, and the GPU using roughly 55 to 80 watts.

To put some perspective here, let me post this Wikipedia info:

http://en.wikipedia.org/wiki/Xbox_360_hardware#List_of_revisions

 

Codename | CPU | GPU | HDMI | Power Supply | In Production | Date Released
Xenon | 90 nm | 90 nm | No | 203 W | No | November 2005
Zephyr | 90 nm | 90 nm | Yes | 203 W | No | July 2007
Falcon | 65 nm | 90 nm | Yes | 175 W | No | Late September 2007
Opus | 65 nm | 90 nm | No | 175 W | No | June 2008
Jasper | 65 nm | 65 nm | Yes | 150 W | No | September 2008
Trinity (Valhalla) | 45 nm (combined "Vejle" CPU+GPU chip) | Yes | 135 W | No | June 2010
Corona | 45 nm (combined CPU+GPU chip) | Yes | 115 W | Yes | August 2011

The first Xbox 360 used over 200W, and the newest one, with all the optimizations, the single chip for both the CPU and GPU, etc., still uses more than 100W.

Even accounting for the efficiency of the power supply, that's still a lot.

That is a lot, but that wattage is for the entire system, not just the CPU or GPU. You have to figure all the other components (HDD, DVD drive, ports, memory, etc.) use AT LEAST 25% to 30% of that. So at launch, the MAX I'd see for the CPU and GPU combined is 150W.

Some of the GPUs and CPUs being tossed out in this thread would have no conceivable way to achieve that without a dramatic increase in total wattage.
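
Worked through against the launch 360 from the table above (using the 203 W supply rating as a rough stand-in for total launch draw, which is an approximation on my part):

# If non-CPU/GPU components take at least 25-30% of the total budget...
total = 203
for overhead in (0.25, 0.30):
    left = total * (1 - overhead)
    print(f"{overhead:.0%} overhead -> ~{left:.0f} W left for CPU + GPU")
# 25% overhead -> ~152 W
# 30% overhead -> ~142 W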