Last updated: 06/26/2020

The watt as a unit of power was first proposed in 1882 by the German-born British engineer and inventor Carl Wilhelm Siemens, within the then-existing system of practical units. It was officially adopted as the so-called "international watt" in 1908. In 1948, the international watt was replaced by the "absolute watt," the definition still in use today: one watt is the power at which energy is transferred at a rate of one joule (J) per second. In 1960, the absolute watt was adopted by the International System of Units (SI) as the unit of power. (One international watt from before 1948 equals 1.00019 absolute watts.)

According to the U.S. Energy Information Administration (EIA), the average retail price for a kWh of electricity across the U.S. was 10.48 cents in 2017. The actual price paid per kWh varied widely, from the cheapest rates in Louisiana at 7.79 cents to the most expensive in Hawaii at 26.05 cents. In New York, residential electricity customers paid an average rate of 14.74 cents for every kWh they used that year. On top of these retail prices, electricity customers face other charges, such as a base charge, delivery and distribution charges, and other surcharges, taxes and fees.

By definition, energy (E) equals power (P) multiplied by time (t). Hence, one kilowatt-hour (E) represents the energy delivered at a rate of 1 kilowatt (P) for a period of 1 hour (t). In other commonly used units, 1 kWh represents an energy expenditure of 3,600,000 joules or 3,412 British thermal units (Btu).
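The definition and conversions above can be sketched in a few lines of Python. The function names here are illustrative, not from any standard library:

```python
def energy_kwh(power_kw: float, hours: float) -> float:
    """E = P * t: energy in kWh delivered at `power_kw` for `hours`."""
    return power_kw * hours

def kwh_to_joules(kwh: float) -> float:
    # 1 kWh = 1,000 W * 3,600 s = 3,600,000 J
    return kwh * 3_600_000

def kwh_to_btu(kwh: float) -> float:
    # 1 kWh is about 3,412 Btu
    return kwh * 3412

print(energy_kwh(1, 1))    # 1.0 (one kilowatt for one hour)
print(kwh_to_joules(1))    # 3600000
print(kwh_to_btu(1))       # 3412
```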

A common way to express the work a kilowatt-hour can do is that it can power a 100-watt light bulb for ten hours. Since 100-watt incandescent bulbs are no longer common, consider their 10-watt LED equivalent, which will run for about 100 hours on the same kWh. In other words, you can run 10 LED bulbs on the same energy (and at the same cost) as one incandescent light bulb.

In the U.S. it can be hard to estimate the operating cost of your appliances, since manufacturers typically don't advertise the power consumption of their devices. Below is a list of some major household appliances and their power ratings, together with the time they're typically used on an average day. The estimated cost to run the appliances is based on the average New York electricity rate of 14.74 cents (about $0.15) per kWh.
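The cost estimates that follow all use the same arithmetic: watts, converted to kilowatts, multiplied by hours of use and the per-kWh rate. A minimal sketch (the function name is mine, and the rate is the New York average quoted above):

```python
NY_RATE_PER_KWH = 0.1474  # 2017 average New York residential rate, USD/kWh

def operating_cost(power_watts: float, hours: float,
                   rate: float = NY_RATE_PER_KWH) -> float:
    """Estimated cost in dollars to run an appliance for `hours`."""
    kwh = power_watts / 1000 * hours
    return kwh * rate

# The article's first example: a 100-watt bulb left on for 24 hours
print(round(operating_cost(100, 24), 2))  # 0.35 (i.e. about 35 cents for 2.4 kWh)
```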

Using the above example, leaving a 100-watt light bulb on for 24 hours will cost a New York resident an average of 35.4 cents ($0.35) for using 2.4 kWh.

Running an average 3,800-watt (3.8 kW) 24,000 Btu central air conditioning unit will cost $2.24 a day if used for 4 hours each day, using 15.2 kWh. A more efficient 16 SEER rated 24,000 Btu air conditioner needs about 2,800 watts of power and uses only 11.2 kWh or $1.65 to run for 4 hours.

A standard washing machine uses up to 1,300 watts or 1.3 kW of power. On average, it runs for about an hour per load, using up to 1.3 kWh. At that rate, a New York resident will pay a little over 19 cents to wash a load of clothes. An Energy Star rated washing machine can use as little as 500 watts or 0.5 kW, consuming only 0.5 kWh at a cost of about 7.4 cents ($0.07) per load.

An electric clothes dryer pulls up to 4,000 watts or 4 kW, while a gas-heated dryer only needs around 350 watts of electricity. The operating cost is 59 cents ($0.59) and about 5.2 cents ($0.05), respectively, per one-hour load.

A standard dishwasher has a power rating of 1,300 watts or 1.3 kW and usually runs for about 4 hours per load. That means one load will need about 5.2 kWh of energy, which will cost you 76.7 cents ($0.77).

An average 65-inch LED TV consumes about 130 watts and so uses 0.13 kWh per hour. According to a Nielsen report, the average American adult watches a little over five hours per day, using about 0.65 kWh. That comes to roughly 9.6 cents (less than $0.10) a day.

Looking at it the other way round, one kWh lets you run your 100-watt light bulb for 10 hours, the 1.3-kW dishwasher for about 46 minutes, and the 3.8-kW AC unit for almost 16 minutes. A ceiling fan drawing 70 watts could run for about 14.3 hours before using up one kWh of electricity.
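These runtimes come from inverting the energy formula: time equals energy divided by power, so one kWh buys 1,000 watt-hours of runtime. A quick sketch (the helper name is mine):

```python
def hours_per_kwh(power_watts: float) -> float:
    """How long an appliance can run on exactly 1 kWh of energy."""
    return 1000 / power_watts  # 1 kWh = 1,000 watt-hours

print(round(hours_per_kwh(100), 1))       # 10.0 hours for the 100 W bulb
print(round(hours_per_kwh(1300) * 60))    # 46 minutes for the 1.3 kW dishwasher
print(round(hours_per_kwh(70), 1))        # 14.3 hours for the 70 W ceiling fan
```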

When you look at your natural gas bill, you’ll see a different unit is being used for billing. In the U.S., utility companies and retail energy providers typically bill their customers per *therm* of natural gas. One U.S. therm equals 100,000 Btus or 29.3 kWh. The formula to convert therms to kilowatt-hours is the number of therms multiplied by 29.30011. Natural gas is sometimes also measured in cubic feet (cf or cu. ft.), especially when it’s delivered in small quantities. The energy equivalent of 100 cf (commonly referred to as 1 CCF) equals one therm. The average retail price of natural gas in New York was $1.204 per therm in 2017.
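The therm conversion above is a single multiplication; the same factor, divided into the gas price, also gives an effective price per kWh of gas energy. A brief sketch using the article's figures (function names are mine):

```python
KWH_PER_THERM = 29.30011  # conversion factor quoted in the article

def therms_to_kwh(therms: float) -> float:
    """Convert U.S. therms of natural gas to kilowatt-hours."""
    return therms * KWH_PER_THERM

def gas_price_per_kwh(price_per_therm: float) -> float:
    """Effective natural gas price in USD per kWh of energy content."""
    return price_per_therm / KWH_PER_THERM

print(round(therms_to_kwh(1), 1))          # 29.3 kWh in one therm
print(round(gas_price_per_kwh(1.204), 3))  # 0.041 USD/kWh at NY's 2017 gas price
```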

The use of different units for the two types of energy makes them hard to compare. To compare the efficiency of electric appliances with natural gas-powered ones, you can convert therms to kWh using the formula above. This matters, for example, when deciding between a new electric or natural gas furnace: you can then compare both the energy efficiency of the two devices and the cost to operate each.
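As a rough illustration of such a comparison, the sketch below divides each fuel's per-kWh price by an appliance efficiency to get a cost per kWh of delivered heat. The prices are the 2017 New York figures quoted in the article; the efficiency values are illustrative assumptions of mine, not figures from the article:

```python
ELECTRIC_RATE = 0.1474       # USD per kWh (article's NY average)
GAS_PRICE_PER_THERM = 1.204  # USD per therm (article's NY average)
KWH_PER_THERM = 29.30011

def heat_cost_per_kwh(fuel_price_per_kwh: float, efficiency: float) -> float:
    """Cost of one kWh of delivered heat, given appliance efficiency (0-1)."""
    return fuel_price_per_kwh / efficiency

# Assumed efficiencies: resistive electric heat ~100%, gas furnace ~90% AFUE
electric = heat_cost_per_kwh(ELECTRIC_RATE, 1.00)
gas = heat_cost_per_kwh(GAS_PRICE_PER_THERM / KWH_PER_THERM, 0.90)

print(round(electric, 3))  # 0.147 USD per kWh of heat
print(round(gas, 3))       # 0.046 USD per kWh of heat
```

Under these assumed efficiencies, gas heat comes out several times cheaper per kWh of heat at these prices, which is why converting both fuels to a common unit first is worth the effort.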