Hello all. I am new to this forum but am very active in several others. I'm a big believer in forums and have saved a lot of time and money using them, so I am reaching out to the experts here on this one.
Before I posted this, I searched this forum to make sure I am not re-posting something that has already been answered. I was able to string together a few bits and pieces, but I hope I can get further help by explaining exactly what I am doing.
I am about to install low voltage lighting around my driveway. To paint a picture of the layout, imagine a U-shaped driveway with the two entrances running perpendicular to the street. At the opposite end are my house and deck. I plan on running 12 gauge wire from the transformer on my deck in TWO SEPARATE runs, run A and run B.
Run A will total 192 feet, with the first light at about the 40 foot mark. I will be installing a total of six spreader lights at 10 watts each and one 50 watt spot.
Run B will also begin at the transformer on the deck and will total 228 feet of 12 gauge wire, with the first light at approximately the 55 foot mark. Again, on this run I will be using six spreader lights at 10 watts apiece and one 50 watt spot.
I have calculated an approximate voltage drop of 1.53 volts for run A and 1.82 volts for run B.
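In case it helps to check my numbers, below is a rough sketch of the rule-of-thumb calculation I've seen quoted for this: voltage drop = (total watts on the run x one-way distance in feet) / cable constant, with 7500 being the constant usually given for 12 gauge wire. The part I'm least sure about is what distance to plug in, since the fixtures are spread out between the first light and the end of the run rather than sitting at one point; the distances in the example below are only illustrative, not my final spacing.

```
# Rule-of-thumb voltage-drop estimate for a low voltage lighting run.
# drop = (total watts on the run * one-way distance in feet) / cable constant
# Commonly quoted constants: 16 AWG = 2200, 14 AWG = 3500, 12 AWG = 7500.

CABLE_CONSTANT_12AWG = 7500

def voltage_drop(total_watts, distance_ft, cable_constant=CABLE_CONSTANT_12AWG):
    """Treat the whole load as concentrated at one effective distance."""
    return total_watts * distance_ft / cable_constant

# Both runs carry 6 x 10 W spreaders + 1 x 50 W spot = 110 W.
total_watts = 6 * 10 + 50

# The effective distance is the judgment call: somewhere between the first
# fixture and the end of each run. These values are only illustrative.
print(f"Run A (~116 ft effective): {voltage_drop(total_watts, 116):.2f} V drop")
print(f"Run B (~142 ft effective): {voltage_drop(total_watts, 142):.2f} V drop")
```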
My question is whether there will be sufficient power to all the lights on each run if I use a multi tap transformer, putting run A on the 14 volt tap and run B on the 16 volt tap (or the next higher one). My main concern is that run B is such a long run that, even if I compensate for the voltage drop by using the higher tap, there will still be a difference in brightness between the two strings of lights.
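For what it's worth, here is the very rough way I'm picturing the tap math: just subtracting my calculated drop for each run from its tap voltage to guess what the far fixtures would see. This ignores how the drop is spread along the run (the first fixture on each run would sit somewhat higher), so please correct me if this is the wrong way to think about it.

```
# Very rough tap check: far-end fixture voltage ~= tap voltage minus the
# drop I calculated for that run. Ignores how the drop is distributed
# along the run, so fixtures nearer the transformer would sit higher.

runs = {
    "A": {"tap_volts": 14, "calculated_drop": 1.53},
    "B": {"tap_volts": 16, "calculated_drop": 1.82},  # or whichever higher tap I end up using
}

for name, run in runs.items():
    far_end = run["tap_volts"] - run["calculated_drop"]
    print(f"Run {name}: {run['tap_volts']} V tap - {run['calculated_drop']} V drop "
          f"~= {far_end:.2f} V at the far end")
```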
Any advice or suggestions would be greatly appreciated.
Thank you