Is your computer running up your electric bill?

By Dave Wallace - RIVER BENDER - October 2011

We recently received a notice from Progress Energy that we're using far more electricity than our neighbors. The notice said our neighbors (approximately 100 in the survey) used 1218 kWh per month on average while we used 1856 kWh. It also said that 70% of our neighbors set their thermostats to 78 degrees or turned them off completely before leaving home. If that's true, our solution is probably simple. All we have to do is raise our thermostat from its 72-degree setting and start turning off the air conditioning when we leave home. We'll try 75 degrees as a starter and see if it reduces our bill.

But my wife insists that leaving the computer on all day isn't helping our electric bill and I should turn it off because every little bit helps. Is she right?

Electric energy usage is measured in kilowatt-hours (1000 watts used for one hour is one kilowatt-hour). The formula is: Wattage x Hours Used per Day ÷ 1000 = daily kilowatt-hour (kWh) consumption. For example, a 100-watt light bulb burning for 10 hours uses one kilowatt-hour (kWh). To estimate the cost from Progress Energy you can figure $0.10 per kilowatt-hour, so it will cost you 10 cents to burn that 100-watt bulb for 10 hours. Now let's talk about computers.
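
If you'd rather let a computer do the arithmetic, here's a rough Python sketch of that same formula. The $0.10 rate is just the approximation above, not an actual Progress Energy tariff.

    # Rough sketch of the formula above: watts x hours / 1000 = kWh per day.
    def daily_cost(watts, hours_per_day, rate_per_kwh=0.10):
        kwh = watts * hours_per_day / 1000   # daily kilowatt-hours
        return kwh * rate_per_kwh            # daily cost in dollars

    # The 100-watt bulb burning for 10 hours: 1 kWh, about 10 cents.
    print(daily_cost(100, 10))   # 0.1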

The amount of power consumed by computers varies. If your desktop PC is new with high-end graphics and a dual-core processor, it probably has a 500-watt power supply. But the average power supply is probably around 300 watts, so let's use that for an example and throw in another 80 watts in case you're still using an old 17" CRT monitor. Let's say you leave it on from 6AM to 10PM every day, or 16 hours/day. Using the formula we have (380 x 16) ÷ 1000 = 6.08 kWh per day. At $0.10 per kWh our cost is 61 cents per day, or about $18/month. Hey, that doesn't look good. Maybe my wife is right. Maybe I should turn the computer off.
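
Plugging that worst-case scenario into the same sketch:

    # Worst case from above: a 300 W PC plus an 80 W CRT monitor, assumed
    # to draw full power for 16 hours a day (a pessimistic assumption,
    # as the next paragraph explains).
    watts = 300 + 80
    hours = 16
    kwh_per_day = watts * hours / 1000      # 6.08 kWh
    cost_per_day = kwh_per_day * 0.10       # about $0.61
    print(cost_per_day * 30)                # about $18 a month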

The problem with the above analysis is that just because a computer or monitor has a power supply of a given wattage doesn't mean it consumes that wattage constantly. It draws the most power when it's actually computing. A computer with a 300-watt power supply probably uses only about 60 watts routinely and 200 watts at peak under heavy load. And if you're smart, you're using Windows power management to automatically turn off your monitor and put your computer in standby or hibernate mode after so many minutes of non-use. That drops power consumption dramatically, to under 10 watts, which is trivial. Most laptops use only 15-60 watts, so I wouldn't even worry about turning them off as long as they're plugged in.
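
To see just how trivial that standby draw is, here's the same arithmetic using 5 watts (my own round number; all we know from above is "under 10 watts"):

    # Even a full month of nonstop standby barely registers on the bill.
    standby_watts = 5                                # assumed; "under 10 watts"
    kwh_per_month = standby_watts * 24 * 30 / 1000   # 3.6 kWh
    print(kwh_per_month * 0.10)                      # about $0.36 a month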

Conclusion: Let's say I use my computer for only 8 hours a day instead of 16 and average 100 watts plus 80 watts for the monitor during that time. The rest of the time the computer switches to standby with the monitor off, so we can ignore the small power drain. Using the formula we now have (180 x 8) ÷ 1000 = 1.44 kWh. At $0.10 per kWh the cost is 14 cents a day, or $4.32/month. I've probably exaggerated my use for the example. The point is that unless you happen to be a heavy full-time user, your computer will have little effect on your power bill. I do turn mine off at night. If you're trying to reduce power consumption in your home, look at the appliances that use lots of power (watts) like air conditioning/heating, the water heater, the clothes dryer, etc. Google "reduce electric bill."
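
And here are the conclusion's numbers in the same sketch form, side by side with the worst case from earlier:

    # Realistic use: 100 W computer + 80 W monitor for 8 hours a day,
    # standby the rest of the time (ignored as negligible).
    realistic = (100 + 80) * 8 / 1000 * 0.10 * 30    # about $4.32 a month
    worst_case = (300 + 80) * 16 / 1000 * 0.10 * 30  # about $18.24 a month
    print(realistic, worst_case)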