superchunk said:
JEMC said:

To put some perspective here, let me post this wikipedia info

http://en.wikipedia.org/wiki/Xbox_360_hardware#List_of_revisions

 

Codename | CPU | GPU | HDMI | Power Supply | In Production | Date Released
Xenon | 90 nm | 90 nm | No | 203 W | No | November 2005
Zephyr | 90 nm | 90 nm | Yes | 203 W | No | July 2007
Falcon | 65 nm | 90 nm | Yes | 175 W | No | Late September 2007
Opus | 65 nm | 90 nm | No | 175 W | No | June 2008
Jasper | 65 nm | 65 nm | Yes | 150 W | No | September 2008
Trinity (Valhalla) | 45 nm (combined "Vejle" chip)[27] | Yes | 135 W | No | June 2010
Corona | 45 nm (combined chip)[28] | Yes | 115 W | Yes | August 2011

The first Xbox 360 used over 200 W, and the newest one, with all the optimizations (the single chip for both the CPU and GPU, etc.), still uses more than 100 W.

Even accounting for the efficiency of the power supply, that's still a lot.

That is a lot, but that wattage is for the entire system, not just the CPU or GPU. You have to figure all the other components (HDD, DVD drive, ports, memory, etc.) use at least 25% to 30% of that. So at launch, the maximum I'd see for the CPU and GPU is 150 W.

Some of the GPUs and CPUs being tossed around in this thread would have no conceivable way to achieve that without a dramatic increase in total wattage.

I agree. Those 8 GB of RAM, the HDD, and the optical drive will use at least 20 W.
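For anyone who wants to check the math, here's a quick back-of-the-envelope sketch. The 203 W figure is the quoted launch power-supply rating; the 25-30% share for the other components is superchunk's estimate, not a measured split:

```python
# Rough Xbox 360 launch power budget.
# Assumptions (not measured figures): the quoted 203 W launch power-supply
# rating, and 25-30% of system power going to everything besides the
# CPU and GPU (HDD, DVD drive, ports, memory, fans, ...).

PSU_RATING_W = 203

for other_share in (0.25, 0.30):
    cpu_gpu_budget = PSU_RATING_W * (1 - other_share)
    print(f"Other components at {other_share:.0%}: CPU + GPU budget ~{cpu_gpu_budget:.0f} W")

# Prints roughly:
#   Other components at 25%: CPU + GPU budget ~152 W
#   Other components at 30%: CPU + GPU budget ~142 W
# Power-supply efficiency (well under 100%) would push the real CPU + GPU
# draw even lower than that ~150 W ceiling.
```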

And yes, some people are going to be very disappointed.

But hey! Look at what they are achieving with that 2005-2006 hardware. Imagine what they will be able to do.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670K @ stock (for now), 16 GB RAM at 1600 MHz, and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.