
Forums - General Discussion - Question about energy usage of electronics

Hey, hopefully someone can help me out with this.

I'm going to make it pretty simple, and assume 100% load all of the time (not because it is realistic, but because I'm curious about how energy usage is actually measured).

Imagine I have a PS3 with a 250W PSU. If it's working at 100% load, 24 hours a day, how much energy am I using? Does the 250W refer to the amount used in an hour, or some other time period, or does it work out differently?

I'm just wondering about how I would calculate the running cost of a device. While it's obviously more difficult with something that would vary wildly (for example, a gaming PC, which would run at a much lower load when it is only being used for checking emails vs. gaming), I'm just wondering how the cost would be calculated, assuming 100% load (or basically, assuming the maximum usage at all times).

If I'm paying 20c per kWh with the above PS3, does that mean I'd be paying 5c (250W/1kWh) per hour?




1) That assumption is not particularly valid. I would estimate closer to half that, but never mind.

2) Actually answering your question.

There are two separate concepts to keep track of:
power (measured in watts)
and energy (measured in watt-hours).

You could refer to the total amount of 'juice' stored in a battery, which is more technically the total energy in the battery and is thus measured in watt-hours. It is meaningless to ask whether the total energy stored in a battery is 'per hour'. Whether you drain your battery quickly or slowly, it contains the same energy.

You could also talk about how quickly the battery delivers that energy; that rate is the power drawn by whatever the battery is running, and it is measured in watts. Energy consumption in watt-hours (Wh) is then simply the average power (in watts) times the number of hours.

So if you run a device at 250 W for 1 hour, you use 250 Wh of energy.
If you use a 1 W device for 250 hours, you also use 250 Wh of energy.

tl;dr: 'power used per hour' is meaningless; what you want is energy, which you get by multiplying power by time, so one hour at 250 W is simply 250 Wh.

As for cost, you are correct that you would be paying 5c per hour assuming that is the cost of energy in your area.
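If it helps, the arithmetic above is simple enough to sketch in Python (the 250 W and 20c figures are just the example numbers from this thread):

```python
def energy_wh(avg_power_w, hours):
    """Energy (Wh) = average power (W) x time (h)."""
    return avg_power_w * hours

def cost_cents(avg_power_w, hours, cents_per_kwh):
    """Running cost: convert Wh to kWh, then multiply by the tariff."""
    return energy_wh(avg_power_w, hours) / 1000 * cents_per_kwh

print(energy_wh(250, 1))       # 250 Wh
print(energy_wh(1, 250))       # also 250 Wh
print(cost_cents(250, 1, 20))  # 5.0 c per hour, matching the 5c above
```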

You can get devices that measure the power and energy used by an appliance, such as the ones reviewed here: http://www.choice.com.au/reviews-and-tests/household/energy-and-water/saving-energy/power-meters-review-and-compare.aspx

Any questions?



I believe 250W means it's using that many watts a second. So let me do the math....I got 21,600,000 watts a day if it really is using that much a day. Now if you were to really do this tell me how high your electric bill got.



Yep, that's right in practice: a device's wattage tells you how many watt-hours it uses per hour of running. But only light bulbs run at their full rated wattage constantly.

http://news.cnet.com/8301-17938_105-10318727-1.html

Power draw in watts:

                PS3 Slim   PS3 (60GB, first gen)
Standby             0.36     1.22
Idling             75.19   171.35
Blu-ray movie      80.90   172.79
Game / idling      95.35   203.84
Game / playing     96.24   206.90
YouTube            85.08   181.28

So playing a game on a PS3 Slim at 20c per kWh costs you less than 2c per hour.
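A quick Python check of that figure, using a few of the measured draws from the table above and the thread's 20c/kWh rate:

```python
cents_per_kwh = 20
draw_watts = {              # PS3 Slim figures from the table above
    "Standby": 0.36,
    "Idling": 75.19,
    "Game / playing": 96.24,
}
for mode, watts in draw_watts.items():
    cost = watts / 1000 * cents_per_kwh  # c per hour of running
    print(f"{mode}: {cost:.3f} c/hour")
# "Game / playing" comes out around 1.92 c/hour, i.e. under 2c
```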

You can get simple energy meters to stick between the wall and the power bar. It's fun to put one before the amplifier and see how high you can get the wattage by turning up the music :)



techhunter80 said:

I believe 250W means it's using that many watts a second. So let me do the math....I got 21,600,000 watts a day if it really is using that much a day. Now if you were to really do this tell me how high your electric bill got.


No, the phrase "using 250 watts per second" is meaningless.

 

A watt is a joule per second, and is a measure of power, not energy.

 

You can say "using 1 joule per second", but you can't say "using 1 watt per second"



GreyianStorm said:

If I'm paying 20c per kWh with the above PS3, does that mean I'd be paying 5c (250W/1kWh) per hour?

You can see from the units alone that that's true:

(250 * watts * 20 * cents ) / (1000 * watts * hour ) = (5 * cents ) / hour

Treat units as another thing to be multiplied and cancelled, it works out cleanly.



I just remembered what the device in question was: a kettle.

I saw a kettle in a local shop, and on the box they had 3000W in big writing (does that really sell kettles to anyone?), and I was curious about what exactly it meant. I assumed it meant that, if the kettle was boiling water for 1 hour straight (working at full capacity the entire time, which wouldn't actually happen in the real world), it would use 3000Wh, or 3kWh, so 60c to run for 1 hour (at the 20c rate given above).
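In Python, the same worst-case sum, plus a more realistic per-boil cost (the 2-minute boil time is just my guess for a full kettle, not something from the box):

```python
power_kw = 3.0   # the kettle's rated power
tariff_c = 20    # c per kWh, as in the example above
full_hour_c = power_kw * 1.0 * tariff_c      # 60 c for a solid hour at full power
one_boil_c = power_kw * (2 / 60) * tariff_c  # ~2 c for an assumed 2-minute boil
print(full_hour_c, one_boil_c)
```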

I'm aware that my 100% load example isn't realistic; I just thought it would make it easier to get an answer (although, I suppose as long as I'd chosen any simple enough percentage, 50%, 25%, etc., it would have been pretty simple still).

It's nice to know for sure how this stuff works, though. I always assumed it measured the hourly usage (at 100% load), but it's nice to have it confirmed. A friend of mine is looking into building his own computer and has been asking what kind of power costs to expect, so it's nice to be able to give him an absolute maximum for daily usage (PSU wattage * 24 hours, converted to kWh and multiplied by the rate), and then obviously explain that it won't be running at 100% load and (most likely) won't even be turned on 24 hours per day.
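To put numbers on that upper bound (the 500 W PSU here is just a made-up example, not my friend's actual build):

```python
def max_daily_cost_cents(psu_watts, cents_per_kwh):
    # Absolute worst case: PSU at full rated load for all 24 hours.
    return psu_watts * 24 / 1000 * cents_per_kwh

print(max_daily_cost_cents(500, 20))  # 240.0 c, i.e. at most $2.40 a day
```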

Thanks for your help!



A kettle is basically a heating rod inside a jug, so a 3kW kettle really will run at close to 3kW, but only for the couple of minutes it takes to boil the water.

With computers, make sure you tell him to get a PSU with a high efficiency, like 90%; anything less (e.g. 75%) is just unacceptable, and you're wasting energy and money haha.




Just get a Kill A Watt meter and plug your device into it to see after a few days of usage. It also comes in handy if you're into building your own PCs, to see how big a power supply you need.