DIY Home Improvement Forum
1 - 11 of 11 Posts

· Registered
Joined
·
25 Posts
Discussion Starter · #1 ·
Hello. I’m trying to finalize my plan to install LED strip lighting under a covered porch. The porch is 36 feet long, with a post every 6 feet. I’m planning to have a separate strip for each of the 6-foot “sections”, each with its own controller so that I can change the color independently for each one. Since each strip will be less than 2 meters, I’m planning to go with 12V strips. I have a 12V power supply picked out that can handle the total wattage.
My question is about the RGBW extension cables. In the basic diagram below, I have the PSU in the upper right. The strips are shown in red, and the extension cables in blue. How do I calculate the voltage drop through the extension cables? The longest run would be about 35 feet of extension before it reaches the strip. Most extension cables I see are 22 AWG. Would that be sufficient? Do I need to find something at 20 or 18 AWG?
 

Attachments

· Very Stable Genius
Joined
·
4,665 Posts
Too lazy to do the math, but you'll use E = IR, and the resistance of 22 AWG copper
is about 16 ohms per 1000 ft.
Actually... I don't think I could do the math if I wanted to. I don't think the OP gave
the wattage/current of the lights. You'd likely need to find the current first, using 12V and
the wattage spec of the light(s).
 

· Very Stable Genius
Joined
·
4,665 Posts
OK, got it.....and I'll try to be less lazy......

35 ft of 22 AWG @ 16 ohms/1000 ft = 0.56 ohms
A 2 meter strip draws 2 amps.
E = IR, therefore Vd = 2 × 0.56 = 1.12V
So the last strip will receive about 10.9 Vdc rather than its intended
supply of 12 Vdc.

Unfortunately I don't know if that'd cause a problem. My suggestion
would be to either contact the manufacturer with this info, or increase
the conductor size, or (and this would be my favourite) give it a try with the
wire and light just piled next to the controller as an experiment.

Make sense?
 

· Registered
Joined
·
25 Posts
Discussion Starter · #8 ·
Thank you CodeMatters!! This is incredibly helpful. I was following your math and re-ran the numbers with a 24V strip. I originally thought that with the relatively short length of the actual LED strip 12V would make sense. But taking the long extension cable into consideration, there would be less voltage drop on a 24V strip. At least I know how to run the numbers and can try various combinations of strips and AWG on the extension. This community is always so helpful!
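To make the 12V-vs-24V comparison concrete, here's a quick sketch of the drop as a percentage of supply voltage. It assumes the same total load (24W, i.e. the 2A at 12V mentioned above), the same 35 ft run of 22 AWG, and counts the resistance of both conductors; the function name and structure are just illustrative.

```python
# Compare voltage drop (as % of supply) for 12V vs 24V strips carrying
# the same wattage over the same 35 ft run of 22 AWG extension cable.
OHMS_PER_1000FT_22AWG = 16.0  # approximate resistance of 22 AWG copper

def drop_percent(supply_v, watts, run_ft, ohms_per_1000ft=OHMS_PER_1000FT_22AWG):
    amps = watts / supply_v                            # I = P / V
    resistance = 2 * run_ft * ohms_per_1000ft / 1000   # supply + return conductors
    return 100 * amps * resistance / supply_v

print(round(drop_percent(12.0, 24, 35), 1))  # 12V strip: ~18.7% drop
print(round(drop_percent(24.0, 24, 35), 1))  # 24V strip: ~4.7% drop
```

Doubling the voltage halves the current *and* doubles the voltage headroom, so the percentage drop falls by roughly a factor of four.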
 

· Very Stable Genius
Joined
·
4,665 Posts
Pretty sure I made a mistake yesterday.
The value of 16ohms/1000ft represents the resistance of a single conductor,
not the combined supply + return resistance.
Therefore everything has to be doubled.
35' run uses 70' of conductor

70 ft of 22 AWG @ 16 ohms/1000 ft = 1.12 ohms
A 2 meter strip draws 2 amps.
E = IR, therefore Vd = 2 × 1.12 = 2.24V
So the last strip will receive about 9.8 Vdc rather than its intended
supply of 12 Vdc.
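The corrected calculation above (round-trip conductor length, then E = IR) can be sketched as a small script; the constants match the numbers in the post, and the function name is just illustrative.

```python
# Voltage remaining at the far strip after drop in both conductors,
# following the corrected math: 35 ft run = 70 ft of conductor.
R_PER_1000FT_22AWG = 16.0  # approximate ohms per 1000 ft of 22 AWG copper

def voltage_at_strip(supply_v, run_ft, load_amps, ohms_per_1000ft=R_PER_1000FT_22AWG):
    conductor_ft = 2 * run_ft                        # supply + return
    resistance = conductor_ft * ohms_per_1000ft / 1000
    return supply_v - load_amps * resistance         # E = IR drop subtracted

print(round(voltage_at_strip(12.0, 35, 2.0), 2))  # ~9.76V at the far strip
```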


Apologies for previous error.
 

· Registered
Joined
·
8,252 Posts
I quite agree that off-the-shelf 22 AWG extension cable is not appropriate. 18 AWG thermostat cable makes more sense for the longer runs.

The little I know about LEDs, anything longer than 16' needs 24V. I got this today from an online LED company.
That may be their preferred practice, but it's not a *truth*. There's more than one way to solve the voltage drop problem. They just don't want to tell you about it because they fret that it's not customer friendly. A feeder is another method, but I can see why an LED company might not like that one.

For the OP's purposes, there are only 6' sections, since they'll be driven independently.
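To see what the 18 AWG suggestion buys you, here's a quick re-run of the earlier far-strip estimate with both gauges. It assumes 18 AWG copper is roughly 6.4 ohms per 1000 ft (standard wire tables put it near that); same 35 ft run and 2A load as above.

```python
# Far-strip voltage for 22 AWG vs 18 AWG over the same 35 ft run / 2A load.
def volts_at_strip(supply_v, run_ft, amps, ohms_per_1000ft):
    # Round-trip conductor length: supply + return.
    return supply_v - amps * (2 * run_ft * ohms_per_1000ft / 1000)

print(round(volts_at_strip(12.0, 35, 2.0, 16.0), 2))  # 22 AWG: ~9.76V
print(round(volts_at_strip(12.0, 35, 2.0, 6.4), 2))   # 18 AWG: ~11.1V
```

The heavier gauge cuts the drop from about 2.2V to under 1V, which is likely inside most strips' tolerance.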
 

· Registered
Joined
·
25 Posts
Discussion Starter · #11 ·
I hadn’t considered using thermostat cable, but I see it’s 18 AWG. Since thermostat wire is solid and all of the RGBW extension cable I’ve seen is stranded, will that make any difference? I’m planning to solder the wires to the strips. Thank you.
 