Calculating low voltage wire size


I don't trust myself on this, so help is appreciated. This is related to my post about using LED strip lights in my home's crawlspace...

The LED strip light (24Vdc) uses 5W/ft. To keep it simple, assume a 100' total wiring distance, with ten one-foot LED strips spaced 10 feet apart (so 10 LED strip "lights", each one foot long). The strips will be connected in parallel, so I will use a T-tap every 10' to wire each one-foot strip into the bus. What wire gauge do I need to keep the brightness close to the same for all the lights? FYI... I don't know the maximum voltage drop the strips allow. The voltage drop calculators I've found give the drop for a single load at the end of the run, but I don't know how the other lights in parallel affect the overall circuit.

If I instead use two 50' runs with five lights per run (both runs wired in parallel to the same transformer), does that cut the voltage drop in half?

Bottom line is I'm trying to find the most efficient way to wire up a grid that is about 35' x 100', using one or maybe two transformers.
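One way to answer both questions is to model the run as a ladder: each segment of wire between taps carries the sum of all the downstream strip currents. Here's a minimal sketch under my own assumptions (not from any datasheet): each 1' strip is treated as a constant 5W / 24V ≈ 0.21A load, and the resistances are standard copper AWG values in ohms per 1000 ft of a single conductor.

```python
# Ladder model of a 24Vdc bus with a strip tapped in every 10 ft.
# Assumed numbers: each strip modeled as a constant 5 W / 24 V load;
# resistances are standard copper AWG values (ohms per 1000 ft, one conductor).

R_PER_1000FT = {12: 1.588, 14: 2.525, 16: 4.016, 18: 6.385}

def tap_voltages(awg, n_taps=10, spacing_ft=10, v_source=24.0, watts_per_tap=5.0):
    """Voltage at each tap, nearest to farthest from the supply."""
    r_seg = 2 * spacing_ft * R_PER_1000FT[awg] / 1000   # x2: out and back
    i_tap = watts_per_tap / v_source                    # ~0.21 A per strip
    v, volts = v_source, []
    for k in range(n_taps):
        v -= (n_taps - k) * i_tap * r_seg   # this segment carries all downstream current
        volts.append(v)
    return volts

for awg in sorted(R_PER_1000FT, reverse=True):
    one_run = 24 - tap_voltages(awg, n_taps=10)[-1]   # one 100' run, 10 taps
    two_runs = 24 - tap_voltages(awg, n_taps=5)[-1]   # two 50' runs, 5 taps each
    print(f"AWG {awg}: worst drop {one_run:.2f} V on one run, {two_runs:.2f} V split in two")
```

On these assumptions, splitting into two 50' runs cuts the worst-case drop to roughly a quarter of the single-run figure, not half: each run carries half the current over half the length.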

Even if you run the LED strips in a daisychain down a 100' run (which works out to 9 x 1' strips, not 10), the total load is only 9 x 5W = 45 Watts.

I would wire a few main bus runs of standard 14/2 cable (#14 AWG) down the middle or at each end and interlace the cross wiring from each end. 45 Watts on a daisychain at 24 Vdc is less than 2 amperes. You shouldn't experience any voltage drop you would be able to detect.
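For a quick upper bound you can lump the whole 45 W at the far end of 100' of #14 and apply plain Ohm's law; the real distributed load drops less than this. The resistance figure is the standard ~2.525 ohms per 1000 ft for #14 copper (an assumed value, check your cable):

```python
# Worst-case bound: pretend the entire 45 W load sits at the far end of 100 ft.
r_per_ft = 2.525 / 1000          # ohms/ft for #14 AWG copper (assumed standard value)
loop_r = 2 * 100 * r_per_ft      # 0.505 ohm round trip (both conductors)
amps = 45 / 24                   # 1.875 A total load current
drop = amps * loop_r             # well under a volt on a 24 V supply
print(f"{drop:.2f} V drop worst case")
```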

Run the whole grid with standard 120V AC wiring cable and octagon boxes, mark a few with red warning stickers, and connect the 24Vdc supplies to it. That would likely be your cheapest wiring method, with very low voltage losses. You could even zip-tie the LED strips to the 14/2 cables for a cheap mount.

If you have 9 x 1' strips every 10' on three spans across the 35' width, the total is only 9 x 3 x 5W = 135 Watts. At 24Vdc the total current would only be 135W / 24V ≈ 5.6 amperes.
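Spelled out, assuming all 27 strips draw their full rated power:

```python
strips = 9 * 3            # 27 one-foot strips across the grid
total_w = strips * 5.0    # 135 W total
amps = total_w / 24.0     # total supply current at 24 Vdc
print(amps)               # 5.625
```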