I know there are some pretty scientific types here, and I'm wondering if anyone has any math to go with this question, or has done any experiments along these lines.
I have to air condition almost year round. I now have Home Assistant installed, so I know the inside and outside temperatures and the set point on the A/C, and can control it all in real time.
So... now what?
Here's my theory. Make that hypothesis. First some assumptions (well, facts without quantification):
- Cooling is more efficient when the outside air is cooler. Exactly how much I do not know (I know the theoretical formula, but not how it translates for a real A/C; see the sketch after this list).
- (Side note: For some people electricity may be less expensive at night as well; it is not here).
- Heating and cooling are more about the solid mass of the interior than just the air, i.e. there is a reservoir of heat (or lack thereof) which has, for want of a better term, inertia.
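For reference, the theoretical formula I mean is the Carnot limit: COP = T_cold / (T_hot - T_cold), with absolute temperatures. A quick sketch of where numbers like the 53 and 28 below come from (the 75/85/94F figures are illustrative, not measurements from my house):

```python
# Carnot COP for cooling: T_cold / (T_hot - T_cold), absolute temps.
# Rankine is used so Fahrenheit differences carry through directly.

def carnot_cop(indoor_f: float, outdoor_f: float) -> float:
    """Theoretical upper-bound COP when pumping heat from indoor_f
    out to outdoor_f (both in degrees Fahrenheit)."""
    t_cold_rankine = indoor_f + 459.67
    return t_cold_rankine / (outdoor_f - indoor_f)

print(carnot_cop(75, 85))  # ~53 near the overnight low
print(carnot_cop(75, 94))  # ~28 near the afternoon high
```

Real units run at a small fraction of Carnot (a COP of 3-4 is typical), and that fraction varies with conditions too, so the real night/day ratio is presumably much less dramatic than 53/28; that is exactly the part I cannot quantify.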
The plan (a sketch of the schedule follows the list):

1. Once the outside air temp is near its minimum, crank the set point way down and cool the house substantially below the normal temperature while the cost (due to thermal efficiency) is lowest. Most days there's a 15F difference, give or take, between the outside high and low. By the formula above, that swing moves the theoretical COP from something like 53 to 28, which is pretty big. What is the actual change? Not a clue.
2. Go into the heat of the day a few degrees cooler than normal, and try to run the A/C less, letting the temperature climb maybe a couple of degrees above ideal.
3. As evening comes, bring it back down a bit, with moderate efficiency (it's a bit cooler outside), so people are comfortable for dinner, relaxing, etc.
4. Then repeat, crashing the temperature again overnight.
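Concretely, the schedule boils down to something like this; every set point and hour boundary is just a knob I have been tuning, not a recommendation:

```python
def setpoint_for_hour(hour: int) -> float:
    """Rough shape of the schedule described above; the numbers
    are placeholders I keep adjusting."""
    if hour < 6:        # overnight: crash it while COP is best
        return 71.0
    elif hour < 12:     # morning: coast back up
        return 75.0
    elif hour < 18:     # afternoon: let it drift high, run less
        return 78.0
    else:               # evening: moderate recovery for comfort
        return 75.0
```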
I have done this with various parameters for some time, and find the swings are not really noticeable. Cold temperatures while sleeping (like 70-71F) I find pleasant, and I do not notice much as I wake. With it then coasting, it is mid-70s by mid-morning. The problem is mid-afternoon, when I let it get to 78 or so, but it is tolerable.
I am thinking of adding fan control to this, so I can equalize the temperature in peripheral parts of the house without extra cooling. The 78 itself is not really the issue, but 78 at the thermostat means 80-81 in some rooms on the sunny side.
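The fan logic I have in mind is just a deadband comparison between a remote room sensor and the thermostat, something like the following (the 1.5F deadband is a guess):

```python
def fan_should_run(thermostat_f: float, room_f: float,
                   deadband_f: float = 1.5) -> bool:
    """Run a circulation fan when a peripheral room reads more than
    deadband_f degrees warmer than the thermostat."""
    return (room_f - thermostat_f) > deadband_f

# e.g. 78 at the thermostat but 81 in a sunny-side room:
assert fan_should_run(78.0, 81.0)
```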
But... here's the problem... I have no idea if it works, where "works" is defined as being more efficient overall. After all, I am excessively cooling (in a sense) during part of the day. Does the reduced cooling mid-day pay it back?
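One way to frame it, to first order: each hour the A/C must remove the heat conducted in through the envelope, plus extra whenever it is pulling the thermal mass down (while coasting back up, the mass absorbs heat for free), and the electricity used is that heat divided by the COP at that hour. A toy model along those lines, where every constant is a made-up illustration and the indoor temperature is assumed to track the set point exactly:

```python
import math

UA = 1.0               # envelope conductance (made up, per degree F)
C = 5.0                # thermal mass of the interior (made up)
CARNOT_FRACTION = 0.1  # real A/C as a fraction of the Carnot bound (guess)

def cop(t_in_f: float, t_out_f: float) -> float:
    """Crude real-world COP: a fixed fraction of the Carnot bound."""
    return CARNOT_FRACTION * (t_in_f + 459.67) / max(t_out_f - t_in_f, 1.0)

def daily_energy(outdoor: list, indoor: list) -> float:
    """Electricity to hold the hourly indoor profile, given outdoor temps."""
    total = 0.0
    for h in range(24):
        d_mass = indoor[(h + 1) % 24] - indoor[h]    # interior temp change
        heat = UA * (outdoor[h] - indoor[h]) - C * d_mass
        if heat > 0:                                 # A/C only runs to remove heat
            total += heat / cop(indoor[h], outdoor[h])
    return total

# Assumed outdoor profile: low ~80F near 5am, high ~95F near 5pm.
outdoor = [87.5 + 7.5 * math.sin(math.pi * (h - 11) / 12) for h in range(24)]

flat    = [75.0] * 24
precool = [71.0] * 6 + [75.0] * 6 + [78.0] * 6 + [75.0] * 6

print(daily_energy(outdoor, flat), daily_energy(outdoor, precool))
```

Whether pre-cooling wins in this model depends entirely on UA, C, and how the real COP varies with the temperature spread, which is the whole question.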
I really have no idea. I have tried a variety of ways to test it, but day-to-day variations caused by sun vs. shade (even at the same temperature), other unrelated power usage (I am not monitoring the A/C specifically for power draw, only run time), cooking, etc. have pretty large impacts. I'm also told that power draw is not constant, e.g. an hour of runtime at night may use a different amount of power than an hour during the day (motor loading varies with the temperature spread, or so I believe). So just comparing cumulative runtime on two days, with and without the algorithm applied, is apparently not a valid comparison either.
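On the measurement side, one standard normalization is cooling degree-hours: divide each day's consumption (or runtime) by how hot that day actually was, then compare algorithm-on days against algorithm-off days. A sketch, assuming hourly outdoor temps and a daily kWh (or runtime) figure exported from Home Assistant; 65F is the conventional base:

```python
def cooling_degree_hours(hourly_outdoor_f: list, base_f: float = 65.0) -> float:
    """Sum of (T_out - base) over all hours above the base temperature."""
    return sum(max(t - base_f, 0.0) for t in hourly_outdoor_f)

def normalized_usage(daily_kwh: float, hourly_outdoor_f: list) -> float:
    """kWh per cooling degree-hour; compare this across days instead
    of raw usage or raw runtime."""
    cdh = cooling_degree_hours(hourly_outdoor_f)
    return daily_kwh / cdh if cdh else float("nan")
```

It will not remove the sun-vs-shade or cooking noise, but alternating the algorithm on and off day by day and averaging over a few weeks should beat eyeballing two individual days.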
Most related postings online speak to presence sensors and such, not to timing and the differential cost of cooling.
Anyone done this? Seen math or studies?
Linwood