Electricity Used to Remove A Watt of Heat with the AC

apostolakisl

Senior Member
I have been trying to figure out how much it actually costs to run electrical items in your house during the cooling season.

I have assumed that every watt of electricity used by something contained within the house ultimately turns into heat in the house and thus needs to be removed.

How much electricity it takes to remove this heat is the question. This is a topic of great significance in server rooms, since they use tons of electricity to get rid of the heat the servers create, but I cannot find anything beyond guesstimates.

I have read that one watt burned takes between 0.3 and 1 watt to remove via the AC. That is a big spread! The only factor that should affect it is the efficiency of the AC. Insulation is not relevant, since I am not trying to figure out the electricity used to remove heat that leaked in from outside, just heat created inside.

Somewhere there must be an equation that uses the SEER value or something similar to calculate the electricity used to remove the heat from 1 watt. It is quite easy to find the equation to convert watts into BTUs, and BTUs moved per ton of AC is out there, but the watts used by the AC to move those BTUs elude me.
 
When you say "burned", are you taking into account the efficiency of the power supply?
May I ask why you're trying to calculate this?
 

I am curious to know: if I switch to a 10-watt LED instead of a 65-watt incandescent, aside from saving 55 watts, what am I saving in reduced cooling costs?

The efficiency of the item using the watt isn't relevant. The LED is more efficient, so it uses 10 watts to make the same amount of light, but it will still make the same amount of heat as a 10-watt incandescent. A watt consumed turns into a fixed number of BTUs of heat regardless of what used it or how efficient it was. Efficiency just means fewer watts used, not less heat per watt. This assumes that the item and any of its byproducts are completely contained within the conditioned space of the house, and that you haven't converted some of the electrical energy into some form of energy besides heat. With light bulbs, all the energy ends up as heat inside the house (unless, I suppose, you are pointing the bulb out a window so some of the light escapes and turns into heat when it hits something outside, but that is pretty insignificant).
 
I would have to say that, depending on where you live, the lower heat dissipation will help you when the A/C is on but hurt you when the heat is on. It might balance out.
 

I live in Texas. We have the AC on for nine months of the year. The waste heat does help with heating, but it isn't efficient: the heat pump makes about three times as much heat from the same amount of electricity.
 
Depends on lighting efficiency. If the light is an LED, its efficiency can be as high as 75%, so only 25% becomes heat directly. If the power is consumed by an LED TV, then around 50% of the power becomes light, some of which escapes through the windows.
 
Lou,

Coefficient of performance (COP) of an A/C unit is defined as the ratio of energy moved to energy used to move it. Many A/C units are rated in SEER values. From Wikipedia, COP = SEER/3.792. Typical SEER values for high-efficiency units are 16 to 20, so let's say 18. So COP is 18/3.792, or 4.74.

If a light bulb is giving off 1 watt, it will take a high-efficiency A/C unit 1/COP (1/4.74), or 0.21 watts, to remove that energy. You can plug in your own values.
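
Here is a minimal sketch of that arithmetic in Python, assuming only the SEER-to-COP conversion cited above (SEER 18 is just the example value from this post):

def watts_to_remove(device_watts, seer):
    # Wikipedia conversion cited above: COP = SEER / 3.792
    cop = seer / 3.792
    # A/C input power needed to pump the device's heat back outside
    return device_watts / cop

print(watts_to_remove(1, 18))   # ~0.21 W, matching the example above
print(watts_to_remove(65, 18))  # a 65 W incandescent: ~13.7 W of A/C power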
 
Depends on lighting efficiency. If the light is an LED, its efficiency can be as high as 75%, so only 25% becomes heat directly. If the power is consumed by an LED TV, then around 50% of the power becomes light, some of which escapes through the windows.

That is true initially, but eventually 100% turns into heat. When they say a bulb is 75% efficient at turning electricity into light, that means 75% of the energy leaves the bulb as light. But that light ends up hitting objects in the house and being converted to heat. Of course, some of the light may go out the window, so this isn't exactly perfect, but it's pretty close. 100% of the energy consumed has to be accounted for, and with light bulbs of any sort it ends up as heat.

Lou,

Coefficient of performance (COP) of an A/C unit is defined as the ratio of energy moved to energy used to move it. Many A/C units are rated in SEER values. From Wikipedia, COP = SEER/3.792. Typical SEER values for high-efficiency units are 16 to 20, so let's say 18. So COP is 18/3.792, or 4.74.

If a light bulb is giving off 1 watt, it will take a high-efficiency A/C unit 1/COP (1/4.74), or 0.21 watts, to remove that energy. You can plug in your own values.

Thanks Sandpiper. That is exactly what I was looking for; I wasn't googling the right terms. I'll have to look at it more closely, since that COP value must depend on outside temperature as well. SEER is an average of sorts based on typical seasonal temperatures, so converting SEER to COP presumably gives you an average COP based on whatever seasonal values were used to compute the SEER (which are probably kinder than my local temps). On a 110-degree day the heat pump is less efficient at moving heat out than on an 80-degree day.
 
I found that SEER is calculated using the following outside-temperature breakdown during the cooling season. I clearly cannot use the SEER-to-COP conversion, since where I live it is regularly 105 degrees. In fact, last summer we had 80 days of 100-degree temps, the temperature never dropped below 80 for two months, and it would be more than 100 degrees for 12 hours a day.

67F ---- 21.4%
72F ---- 23.1%
77F ---- 21.6%
82F ---- 16.1%
87F ---- 10.4%
92F ---- 5.2%
97F ---- 1.8%
102F --- 0.4%
--------------
Total = 100%
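
As a quick sanity check, here is a small Python sketch using just the bin percentages above; the percentage-weighted average outdoor temperature the SEER rating assumes comes out around 77 degrees:

bins = {67: 21.4, 72: 23.1, 77: 21.6, 82: 16.1,
        87: 10.4, 92: 5.2, 97: 1.8, 102: 0.4}
# percentage-weighted mean outdoor temperature assumed by the SEER test
mean_temp = sum(t * pct for t, pct in bins.items()) / 100
print(mean_temp)  # ~76.8 F, nowhere near a 105 F Texas afternoon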
 
You are on the right track. I knew the SEER-to-COP conversion was an approximation, but I didn't have time to research it. But yes, the COP will vary with the temperature difference (inside vs. outside). A Google search for "COP versus temperature" will show what I'm talking about.
 
I managed to come up with a detailed spec sheet on my unit. It is fairly complicated and there are a million variables, but most of them don't change the numbers that much. My SEER 19 unit works out to this.

Assumptions:
The house is at 76 degrees with 50% humidity inside in the summer.
The house is at 70 degrees in the winter.

Heating mode
Tout (F)   COP
17 2.9
27 3.3
37 3.7
47 4.1
57 4.5
67 4.8

Cooling mode
Tout (F)   COP
75 4.9
85 4.3
95 3.7
105 3.1
115 2.7

So on a typical summer afternoon when it is 105 outside, burning 100 watts less inside the house on the lights or whatever reduces the electric consumption of the house by about 100 + 100/3.1 = 132 watts

On a typical dead of winter day when it is 37 outside, burning 100 watts less inside the house on the lights reduces the electric consumption of the house by about 100 - 100/3.7 = 73 watts.

If our weather were equally distributed between cooling and heating, it would seem that 100 watts saved in lighting, is, well, 100 watts saved since the summer and winter would more or less balance out. But since I am running the AC about 2x as much as the heat, I am probably on average saving about 110 or 115 watts for every 100 watts less I burn on light bulbs.
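
To tie the numbers together, here is a rough Python sketch using the COP tables above. It linearly interpolates the COP from the outdoor temperature, then computes the net electricity saved per 100 watts of indoor load removed; the 2:1 cooling-to-heating weighting is the rough usage split described above:

cooling = [(75, 4.9), (85, 4.3), (95, 3.7), (105, 3.1), (115, 2.7)]
heating = [(17, 2.9), (27, 3.3), (37, 3.7), (47, 4.1), (57, 4.5), (67, 4.8)]

def cop_at(table, t_out):
    # linear interpolation between adjacent table points
    for (t1, c1), (t2, c2) in zip(table, table[1:]):
        if t1 <= t_out <= t2:
            return c1 + (c2 - c1) * (t_out - t1) / (t2 - t1)
    raise ValueError("temperature outside table range")

def net_savings(load_w, t_out, mode):
    if mode == "cooling":
        # save the load itself plus the A/C power that removed its heat
        return load_w + load_w / cop_at(cooling, t_out)
    # heating: the heat pump has to replace the lost heat, at its COP
    return load_w - load_w / cop_at(heating, t_out)

summer = net_savings(100, 105, "cooling")  # ~132 W, as computed above
winter = net_savings(100, 37, "heating")   # ~73 W, as computed above
print((2 * summer + winter) / 3)           # 2:1 AC-to-heat mix -> ~112 W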
 