Hey, hopefully someone can help me out with this.

I'm going to make it pretty simple, and assume 100% load all of the time (not because it is realistic, but because I'm curious about how energy usage is actually measured).

Imagine I have a PS3 with a 250W PSU. If it's working at 100% load, 24 hours a day, how much energy am I using? Does the 250W refer to the amount of power used over an hour, or some other time period, or does it work differently?

I'm just wondering how I would calculate the running cost of a device. It's obviously more complicated for something whose load varies wildly (for example, a gaming PC runs at a much lower load when it's only being used to check email than when gaming), but I just want to know how the cost would be calculated assuming 100% load (i.e., maximum usage at all times).

If I'm paying 20c per kWh, does that mean the above PS3 would cost me 5c per hour (250W for one hour = 0.25 kWh, times 20c/kWh)?
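
Here's my back-of-envelope reasoning written out as a quick Python sketch (the 250W draw and 20c/kWh tariff are just my example numbers, and I'm assuming the PSU actually pulls its full rated wattage the whole time), in case it helps show where I might be going wrong:

# Running cost assuming constant draw at the PSU's rated 250W
# (a worst-case ceiling; real draw would usually be lower)
watts = 250               # assumed constant draw in watts
rate_cents_per_kwh = 20   # my example tariff: 20c per kWh

kwh_per_hour = watts / 1000                         # 250W for 1h = 0.25 kWh
cost_per_hour = kwh_per_hour * rate_cents_per_kwh   # 0.25 * 20 = 5c
cost_per_day = cost_per_hour * 24                   # 24h at full load

print(f"{cost_per_hour:.1f}c per hour, {cost_per_day:.0f}c per day")
# -> 5.0c per hour, 120c per day

Is that the right way to think about it, or am I misunderstanding what the wattage rating means?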