
Wii U vs PS4 vs Xbox One FULL SPECS (January 24, 2014)

superchunk said:
JEMC said:

To put some perspective here, let me post this wikipedia info

http://en.wikipedia.org/wiki/Xbox_360_hardware#List_of_revisions

 

Codename | CPU | GPU | HDMI | Power Supply | In Production | Date Released
Xenon | 90 nm | 90 nm | No | 203 W | No | November 2005
Zephyr | 90 nm | 90 nm | Yes | 203 W | No | July 2007
Falcon | 65 nm | 90 nm | Yes | 175 W | No | Late September 2007
Opus | 65 nm | 90 nm | No | 175 W | No | June 2008
Jasper | 65 nm | 65 nm | Yes | 150 W | No | September 2008
Trinity (Valhalla) | 45 nm combined "Vejle" CPU+GPU chip | Yes | 135 W | No | June 2010
Corona | 45 nm combined CPU+GPU chip | Yes | 115 W | Yes | August 2011

The first Xbox 360 used over 200W, and the newest one, with all the optimizations, the single chip for both the CPU and GPU, etc., still uses more than 100W.

Even accounting for the efficiency of the power supply, that's still a lot.

That is a lot, but that wattage is for the entire system, not just the CPU or GPU. You have to figure all the other components (HDD, DVD drive, ports, memory, etc.) use AT LEAST 25% to 30% of that. So at launch, the MAX I'd see for the CPU and GPU is 150W.

Some of the GPUs and CPUs being tossed around in this thread would have no conceivable way to achieve that without a dramatic increase in total wattage.
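A rough back-of-the-envelope version of that budget (the 203 W figure is the launch unit's supply from the table above; the 27% midpoint for the non-CPU/GPU parts is my own assumption):

```python
# Rough version of the budget above: the launch Xbox 360's 203 W supply,
# minus the 25-30% guessed for everything that is not the CPU or GPU
# (HDD, DVD drive, ports, memory, fans). The 27% midpoint is my own pick.
total_supply_w = 203
other_components_share = 0.27   # midpoint of the 25-30% guess

cpu_gpu_budget_w = total_supply_w * (1 - other_components_share)
print(f"Left for CPU + GPU: {cpu_gpu_budget_w:.0f} W")   # ~148 W, close to the 150 W above

# A real power brick is not 100% efficient, so the usable figure is lower still.
```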

I agree. Those 8GB of RAM, the HDD and the optical drive will use at least 20W.

And yes, some people are going to be very disappointed.

But hey! Look at what they are achieving with that 2005-2006 hardware. Imagine what they will be able to do.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

JEMC said:

I agree. Those 8GB of RAM, the HDD and the optical drive will use at least 20W.

And yes, some people are going to be very disappointed.

Look, the newest devkit has an 8-core AMD processor in it. You can't hand out dev kits with such a processor and then release a box with a CPU that has fewer cores and yell "Fooled you!" (never mind that current PC software rarely uses even 4 cores to the fullest). Add the point that Kinect2 probably requires 2-4 cores on its own, and an FX8xxx processor is a logical choice (the PS4 could probably get away with a 4-6 core FXxxx processor). A 125W TDP means that all 8 cores are fully running, something that probably never happens in real life. Add a refresh generation, clock it a little slower, and you are _way below_ 100W of actual use.

The rule is simple: if you want something that is 2-3 times as fast as an Xbox 360/PS3, you will generate more heat than an Xbox 360/PS3.
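A minimal sketch of why a down-clocked part can land well under its desktop TDP (the 125 W TDP is the figure mentioned above; the 20% clock and 10% voltage reductions are illustrative assumptions, not leaked console specs):

```python
# Why a down-clocked 8-core part can land well under its desktop TDP:
# dynamic power scales roughly with frequency * voltage^2. The 125 W TDP
# is the figure discussed above; the 20% lower clock and 10% lower
# voltage are illustrative assumptions, not leaked console specs.
desktop_tdp_w = 125.0
clock_scale = 0.8       # e.g. run at 80% of the desktop clock
voltage_scale = 0.9     # lower clocks usually allow a lower voltage

estimated_power_w = desktop_tdp_w * clock_scale * voltage_scale ** 2
print(f"Estimated power at reduced clock/voltage: {estimated_power_w:.0f} W")
# -> about 81 W, before accounting for games that rarely load all 8 cores
```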



So I guess the Xbox Infinity will be kicking the Wii U's ass quite nicely. Good.



drkohler said:
JEMC said:

I agree. Those 8GB of RAM, the HDD and the optical drive will use at least 20W.

And yes, some people are going to be very disappointed.

Look, the newest devkit has an 8-core AMD processor in it. You can't hand out dev kits with such a processor and then release a box with a CPU that has fewer cores and yell "Fooled you!" (never mind that current PC software rarely uses even 4 cores to the fullest). Add the point that Kinect2 probably requires 2-4 cores on its own, and an FX8xxx processor is a logical choice (the PS4 could probably get away with a 4-6 core FXxxx processor). A 125W TDP means that all 8 cores are fully running, something that probably never happens in real life. Add a refresh generation, clock it a little slower, and you are _way below_ 100W of actual use.

The rule is simple: if you want something that is 2-3 times as fast as an Xbox 360/PS3, you will generate more heat than an Xbox 360/PS3.

I think you didn't get my comment. What I said is that people expect (re-read this thread and others on this site to see that it's true) some kind of hardware that is outside the realm of what is possible for a console (GTX 680s and the like).

Yes, the latest rumors say that the nextbox will use an 8-core CPU. Fine. If you look back you'll see that I find it odd to swap the 8-core Intel chip they were using for an AMD one, because of the difference in performance: you can get a lot more from the Intel chip running slower than from an AMD one. And while using less power!

And another thing: modern GPUs use a lot more power than CPUs. If they are mad enough to go with a 100W CPU, what kind of GPU will they use? Remember that the important thing here is balance: getting a CPU-GPU combo where neither becomes a bottleneck for the other.




drkohler said:

Add the point that Kinect2 probably requires 2-4 cores on its own

No chance



D-Joe said:

drkohler said:

Add the point that Kinect2 probably requires 2-4 cores on its own

No chance

I think it's safe to say 1 core is dedicated to Kinect.



JEMC said:

And another thing: modern GPUs use a lot more power than CPUs. If they are mad enough to go with a 100W CPU, what kind of GPU will they use? Remember that the important thing here is balance: getting a CPU-GPU combo where neither becomes a bottleneck for the other.


Well, the 7970M (which is a mobile, slightly downclocked version of the HD 7870) is rated at 75W (though some sources state up to 100W). How they achieve that in the mobile versions of their chips has always eluded me. But if that number is correct, CPU+GPU would give a TDP of 170W (95+75), which I think would put it (once all the other parts are added) well within the bounds of the original Xbox 360. But yeah, as you said, I wouldn't bet on a 7970/680 equivalent inside any next-gen console.
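A quick sanity check of that sum against the launch Xbox 360's supply (the ~30 W for everything else is my own assumption):

```python
# Sanity check of the TDP sum above against the launch Xbox 360's supply.
# The 95 W CPU and 75 W GPU figures are the TDPs discussed in this thread,
# not confirmed console parts; the ~30 W for everything else is my guess.
cpu_tdp_w = 95
gpu_tdp_w = 75
other_w = 30                      # HDD, optical drive, RAM, fans (assumed)

system_estimate_w = cpu_tdp_w + gpu_tdp_w + other_w
launch_xbox360_psu_w = 203        # Xenon power supply from the table above

print(f"Estimated system draw: {system_estimate_w} W")   # 200 W
print(f"Launch Xbox 360 PSU:   {launch_xbox360_psu_w} W")
print(f"Fits in that envelope: {system_estimate_w <= launch_xbox360_psu_w}")
```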



JEMC said:

Yes, the latest rumors say that the nextbox will use an 8-core CPU. Fine. If you look back you'll see that I find it odd to swap the 8-core Intel chip they were using for an AMD one, because of the difference in performance: you can get a lot more from the Intel chip running slower than from an AMD one. And while using less power!

And another thing: modern GPUs use a lot more power than CPUs. If they are mad enough to go with a 100W CPU, what kind of GPU will they use? Remember that the important thing here is balance: getting a CPU-GPU combo where neither becomes a bottleneck for the other.

I think people forget that MS/Sony started designing their next consoles years ago. Intel makes its own chips and never did any customizing (or "whoring out"). So the path for MS was either continue with PPC or go AMD; for Sony it was either continue with Cell or go AMD. Now add the fact that around that time, AMD started pitching their "NextGenCPUsThatBlowIntelOutOfTheWater" technology. And AMD needed/needs customers to survive, unlike Intel; they currently survive by keeping prices as low as possible. In hindsight, seeing the rumoured CPUs, it is clear that both MS and Sony joined the AMD bandwagon years ago. They were probably aiming at quad-core CPUs, but since the "..." chips turned out to be duds, they now plan on 8-core AMD CPUs (which are/will be way, way, way cheaper than Intel parts).

As for the GPU, I think you are wrong here. An AMD 7770 GPU (80 watts) has about 3 times the shader count of the Xbox chip (640, i.e. 160*4, vs 240, i.e. 48*5) and runs at twice the clock, so it is around 6 times faster than what the Xbox 360/PS3 have. That is more than enough power for 1080p/60Hz. Add a refresh, toss out all the PC stuff consoles don't need, and you have a powerful GPU that draws less than 60 watts.
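A rough sketch of that "around 6 times faster" figure in terms of peak shader throughput (the 7770 numbers are its public specs; the Xenos numbers are the 240 ALUs at half that clock quoted above; a multiply-add is counted as 2 FLOPs):

```python
# Peak shader throughput comparison behind the "around 6 times faster"
# estimate. The HD 7770 numbers are its public specs (640 ALUs at 1 GHz);
# the Xbox 360 (Xenos) numbers are the 240 ALUs at half that clock quoted
# above. A multiply-add is counted as 2 FLOPs per ALU per clock.
def peak_gflops(shader_alus, clock_ghz, flops_per_alu_per_clock=2):
    """Peak single-precision throughput in GFLOPS."""
    return shader_alus * flops_per_alu_per_clock * clock_ghz

hd7770 = peak_gflops(640, 1.0)    # ~1280 GFLOPS
xenos = peak_gflops(240, 0.5)     # ~240 GFLOPS

print(f"HD 7770: {hd7770:.0f} GFLOPS, Xenos: {xenos:.0f} GFLOPS")
print(f"Ratio:   {hd7770 / xenos:.1f}x")   # roughly 5-6x
```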

AMD's 8-core architecture is actually ideally suited for stuff like Kinect2, so expect to see 1 group (2 cores) or even 2 groups (4 cores, if the camera resolution is very high) reserved for Kinect2. Of course, not using Kinect2, and hence freeing all the cores, will give the XBoxNext enormous power.



HoloDust said:

Well, the 7970M (which is a mobile, slightly downclocked version of the HD 7870) is rated at 75W (though some sources state up to 100W). How they achieve that in the mobile versions of their chips has always eluded me. But if that number is correct, CPU+GPU would give a TDP of 170W (95+75), which I think would put it (once all the other parts are added) well within the bounds of the original Xbox 360. But yeah, as you said, I wouldn't bet on a 7970/680 equivalent inside any next-gen console.

(Numbers taken from Anandtech)

Spec | HD 7970 | HD 7900M
Stream Processors | 2048 | 1280
Texture Units | 128 | 80
ROPs | 32 | 32
Core Clock | 925 MHz | 850 MHz
Memory Clock | 1.375 GHz (5.5 GHz effective) | 4.8 GHz
Memory Bus Width | 384-bit | 256-bit
Memory | 3 GB GDDR5 | 2 GB GDDR5

As you see, the cards are not exactly the same.
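To put a number on one of those differences, the memory bus and clock figures work out to very different bandwidth (both results match the public specs for these cards; the effective GDDR5 clocks are taken from the table above):

```python
# Effective memory bandwidth implied by the table: bus width (bits) / 8
# gives bytes per transfer, times the effective GDDR5 clock in GHz gives GB/s.
def bandwidth_gb_s(bus_width_bits, effective_clock_ghz):
    return bus_width_bits / 8 * effective_clock_ghz

hd7970 = bandwidth_gb_s(384, 5.5)     # desktop HD 7970
hd7970m = bandwidth_gb_s(256, 4.8)    # the mobile part discussed above

print(f"HD 7970:  {hd7970:.1f} GB/s")    # 264.0 GB/s
print(f"HD 7970M: {hd7970m:.1f} GB/s")   # 153.6 GB/s
```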




ethomaz said:
D-Joe said:

drkohler said:

Add the point that Kinect2 probably requires 2-4 cores on its own

No chance

I think it's safe to say 1 core is dedicated to Kinect.

That's just overestimated.

Not only the power; people even overestimate the next-gen Kinect's cost.