Volterra_90 said:
JEMC said:

But I wasn't talking about the electricity cost; I meant the amount of heat the components produce.

The PS4 and X1 use more than 100W while gaming (source: extremetech.com), but that is for the whole device: CPU, GPU, RAM, HDD and optical drive. With the 8350, just the CPU would already generate more heat than a PS4/X1, and you'd have to add a similar amount for the rest of the console.

You can't put 200W of hardware into a shoe-sized box and expect it to run cool without big heatsinks and noisy fans, which no one wants.

Yeah, I mean, if you already have one component that consumes up to 100W, the entire box would probably consume more than 200W, and, well, you'd need a big box with really noisy fans. And it would consume double what the current consoles already do. Am I right? (As I said, I'm not a hardware expert.)

Actually, it has nothing to do with hardware but rather physics. It's the law of conservation of energy: "Energy cannot be created or destroyed, only transformed" (or something like that).

In this case, it means that a box drawing 100W of electricity is also putting out 100W of heat, because almost none of that energy leaves the box as anything else (light, sound and so on are negligible).
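To make the energy-balance point concrete, here's a minimal sketch in Python. The power figures are illustrative assumptions (roughly a PS4 under gaming load, and a hypothetical FX-8350 build), not measurements; the point is just that in steady state, heat output equals power draw.

```python
# Minimal sketch: by conservation of energy, essentially all electrical power
# a console/PC draws ends up as heat in the room. Numbers are illustrative.

def heat_output_watts(power_draw_watts: float) -> float:
    """Steady-state heat output equals electrical power draw."""
    return power_draw_watts

def session_heat_kwh(power_draw_watts: float, hours: float) -> float:
    """Total heat dumped into the room over a gaming session, in kWh."""
    return power_draw_watts * hours / 1000.0

ps4_draw = 140.0  # rough PS4 gaming power draw in W (assumption)
pc_draw = 250.0   # hypothetical FX-8350 build under load, in W (assumption)

print(heat_output_watts(pc_draw))      # 250.0 -> 250W of heat, same as draw
print(session_heat_kwh(pc_draw, 3.0))  # 0.75  -> 0.75 kWh of heat in 3 hours
```

So a 250W build heats the room almost twice as fast as the ~140W console, which is why the cooling (and the noise) has to scale with it.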



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16GB RAM 1600MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.