I have been battling with low-voltage lighting for a week now.
The on-topic part is that I am controlling them with Intermatic's HA04 outdoor switches (which work, but don't play as nicely as the all-Vizia-RF dimmers, switches and controllers I use everywhere else).
Now for the OT stuff:
I have fairly long runs of 200-300 ft, which means a fairly significant voltage drop over the length. I have been doubling up 12-gauge wire and I am getting there.
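Just to put rough numbers on it, here is a quick back-of-the-envelope sketch (in Python only because it's handy); the 250 ft run, the ~1.6 ohm/1000 ft figure for 12 AWG and the 140W at 12.5V load are round-number assumptions, not measurements:

  r_per_ft = 1.588 / 1000              # ohm per foot for a single 12 AWG conductor (assumed)
  run_ft   = 250                       # one-way run length, a round number for 200-300 ft
  r_loop   = r_per_ft * (2 * run_ft) / 2   # out-and-back, halved because the wire is doubled up
  i_load   = 140.0 / 12.5              # amps if the whole 140W sat at 12.5V
  print(round(r_loop, 2), "ohm loop resistance")
  print(round(i_load * r_loop, 1), "V drop with the load lumped at the far end")
  print(round(i_load * r_loop / 2, 1), "V drop with the load spread evenly along the run")

The halving in the last line is the usual rule of thumb for a load distributed evenly along a single run, so even with doubled 12 gauge I'd expect a couple of volts of drop at best.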
I have by now tried various transformers. I have easy access to three brands:
- Malibu (Home Depot)
- Hampton Bay (Home Depot)
- TDC Power (Chinese), which came as part of a cheap Lamps Plus bundle.
I have tried the 300W Malibu, the 600W Hampton Bay and the 150W TDC Power.
The totally baffling thing is that I get very different voltage drops from the different brands. I have about 140W of load, evenly distributed. All transformers (with the Hampton Bay on "high") manage to drive around 12.5V at the transformer. At the end of the line, I get about:
- 10V from the TDC
- 9V from the Malibu
- 7V from the Hampton Bay
This is as measured with my BK voltmeter. It is somewhat corroborated by my Kill A Watt, which shows the TDC pulling 145W while the Hampton Bay pulls around 95W.
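To put the Kill A Watt numbers next to the wire estimate above, here is another rough sketch; it ignores transformer losses and power factor, and the 0.2 ohm effective resistance is just the doubled-12-gauge estimate from earlier, so treat it as order-of-magnitude only:

  r_eff = 0.2                          # ohm, rough effective resistance seen by an evenly spread load
  for name, watts in (("TDC", 145.0), ("Hampton Bay", 95.0)):
      amps = watts / 12.5              # assumes nearly all of the metered power reaches the 12.5V side
      print(name, round(amps, 1), "A ->", round(amps * r_eff, 1), "V of wire drop")

That comes out around 2.3V for the TDC and 1.5V for the Hampton Bay, which lines up with the ~2.5V drop the TDC actually shows but not with the ~5.5V from the Hampton Bay. That's exactly the part I can't explain.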
I won't be able to do much more experimentation, as I have already returned the Malibu and will return the Hampton Bay pretty soon.
I am a bit confused by the results. I could understand different driving voltages under load, but that's not really what I'm seeing. The only explanation I can come up with is the power efficiency (I/V phase difference) of the various transformers, although since the loads are basically purely resistive, that is confusing too.
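For what it's worth, here is what an I/V phase difference would look like in numbers; the voltage and current values are just placeholders to show the relationship between real and apparent power:

  v_rms, i_rms = 12.5, 11.2            # placeholder secondary-side figures
  for pf in (1.0, 0.9, 0.7):           # power factor = cos(phase angle between I and V)
      real_w      = v_rms * i_rms * pf # what a true-power meter would register
      apparent_va = v_rms * i_rms      # what separate V and I readings suggest
      print(pf, round(real_w), "W real vs", round(apparent_va), "VA apparent")

With a basically resistive string the power factor should be close to 1, which is why this explanation doesn't sit well with me either.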
Another possible (maybe the same) reason is that the Malibu and Hampton Bay are both higher power, so they may be operating in a less efficient range (but what does that mean?). I do remember seeing warnings that these transformers should be used at more than 1/4 or 1/2 of rated load, but I am still missing the why. Another funny thing is that the Hampton Bay docs say that with 12 gauge you can go 80 ft at 300W, while the Malibu says 200 ft at 300W... as if Hampton Bay were aware that their transformers have a worse IR drop than Malibu's.
The only thing I can think of is to put a scope on it and look at the I vs. V waveforms, but I don't really want to go that far. Does anyone know what's going on, or can someone point me to some related info? I have searched the web, but all I have found is manufacturers boasting about the drop _at the transformer_, not at the end of the line.
Thanks in advance.
Laurent