Resistor/Led And Basic Electricity Questions - Electrical - DIY Chatroom Home Improvement Forum


#1 - 04-14-2012, 08:04 PM - Khivar (Newbie)


Hello Everyone,

First, let me say that I am a newbie in the electrical world, but I try my best to improve my knowledge.
I have some questions; most of them may be dumb, so you've been warned!


1) I read that with an LED you must ALWAYS use a resistor. Say we have a 2 V 1 A power supply and the LED is a 2 V 20 mA part: why put a resistor at all here? The LED will only draw the 20 mA it needs from the power supply; it will never be "hit" by the full 1 A that the power supply can deliver. Or am I wrong?

2) If the circuit voltage is bigger than the LED voltage, say a 12 V 1 A power supply and still a 2 V 20 mA LED, we need to drop some voltage, so we put in a resistor. U = RI, so R = 10 / 0.02 = 500, meaning we need a 500 ohm resistor to leave 2 V for the LED. Also, the power rating of the resistor should be at least 10 x 0.02 = 0.2 W, and I read that for safety we should double that, so we need a 0.4 W, 500 ohm resistor (hope I am right so far :D; see the quick check after question 6). So here the resistor is there to drop the voltage and not the current? I mean, yes, only 20 mA can pass through the resistor, but the LED would only draw 20 mA anyway?

3) If we take the same configuration as in 2), but in place of the resistor we put 5 more LEDs (2 V, 20 mA), so six LEDs in all. That will do exactly the same job as the resistor, will it not?

4) I read that if you parallel a number of LEDs powered directly from a voltage source, they won't all have the same light output. I don't understand why; each LED should draw the amount of current it needs, so they should glow exactly the same if they are identical LEDs, shouldn't they?

5) If we still have a 12 V 1 A power supply and we put in series first 5 LEDs (2 V, 20 mA) and then one LED (2 V, 100 mA) at the end: since the current is the same everywhere in a series circuit, will it be 20 mA and the last LED be dim (or simply not light at all), or will the current be 100 mA and the 5 other LEDs die?

6) Power supply 12 V 1 A. We put in series only 5 LEDs at 2 V, or 4 LEDs at 2 V plus 1 LED at 3 V. There will be too much voltage; how will it be spread across the LEDs?
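
Here is the arithmetic from question 2 written out as a small Python sketch, just as a sanity check (the 12 V / 2 V / 20 mA numbers are the ones from the question above):

Code:
# Resistor value and power for one 2 V / 20 mA LED on a 12 V supply
supply_v = 12.0
led_vf = 2.0            # LED forward voltage from the question
led_i = 0.020           # 20 mA target current

r = (supply_v - led_vf) / led_i      # Ohm's law across the resistor
p = (supply_v - led_vf) * led_i      # power dissipated in the resistor
print(f"R = {r:.0f} ohms")                        # R = 500 ohms
print(f"P = {p:.2f} W, pick >= {2 * p:.2f} W")    # P = 0.20 W, pick >= 0.40 W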

Thanks in advance,
Khivar

#2 - 04-14-2012, 08:44 PM - curiousB (Member)


1) Yes, you always need a resistor. Something has to limit the current or the LED junction (basically a diode) will overheat and fuse (weld) together. The forward voltage of a diode varies with temperature, current, and batch-to-batch variation. However, the voltage won't vary much with current, so you can't think of the LED as a resistor. In other words, a 1.5 V LED running at 20 mA is not a 75 ohm resistor. The point of a series resistor is to limit the current despite these variables.

3) No. You can put several LEDs in series, but you still need a resistor to limit the current, for the same reasons as in #1 above. You don't need a resistor per LED, but you do need one in each series leg.

5) The current will be set by your series resistor. Take the supply voltage minus the total forward voltage of all the series diodes, and divide that by 20 mA; the result is the series resistor to use (see the quick calculation after #6 below).

4) Don't parallel LEDs, ever. The junction characteristics vary as noted in #1, so you might be severely stressing one and not the other. Series is OK with a series resistor, but avoid parallel.

6) The voltage will split based on the characteristics of each LED. They should all be close to the same if they are the same type of LED, but some of these high-output LEDs have multiple dies on the substrate.
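
To put the rule in #5 into code, here is a quick calculation in Python (the 12 V supply and the three 2 V LEDs are just example numbers, not from the thread):

Code:
# Series resistor for a string of LEDs:
# R = (supply voltage - sum of the LED forward voltages) / target current
def series_resistor(v_supply, forward_voltages, i_target):
    v_resistor = v_supply - sum(forward_voltages)
    if v_resistor <= 0:
        raise ValueError("not enough supply voltage for this string")
    return v_resistor / i_target

# Example: 12 V supply, three 2 V LEDs, 20 mA target -> 300 ohms
print(series_resistor(12.0, [2.0, 2.0, 2.0], 0.020))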



Last edited by curiousB; 04-14-2012 at 08:47 PM.
#3 - 04-14-2012, 10:14 PM - a7ecorsair (I=E/R)


As pointed out, diodes cannot be thought of as resistors when analyzing current flow. Diodes are more like a switch: when reverse biased they are off, and when forward biased they are on. When forward biased, a diode has a fairly constant voltage drop across the junction, so the circuit has to be designed to limit the current to the manufacturer's specification. Germanium diodes have a forward voltage drop of about 0.2 volts and silicon diodes about 0.7 volts. LEDs have different voltage drops depending on several factors.
In your example you are using LEDs that exhibit a 2 volt drop when forward biased and a design current of 20 mA. You now have to choose a resistor that gives the correct circuit resistance so that 20 mA flows. If you choose the wrong size resistor you could end up with 30 mA flowing; the LED will tolerate that for a while, but it will run hotter than its design point and have a short life. If the resistor is too large, the LED will barely forward bias and will be dim or dark.
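
To put rough numbers on that (a sketch only, assuming the 12 V supply and 2 V LED used elsewhere in this thread, and treating the LED drop as fixed):

Code:
# Current through an LED + series resistor, treating the 2 V LED drop as constant
def led_current(v_supply, v_led, r):
    return (v_supply - v_led) / r

for r in (500, 333, 10_000):   # ohms
    i_ma = led_current(12.0, 2.0, r) * 1000
    print(f"{r} ohm -> {i_ma:.1f} mA")
# 500 ohm   -> 20.0 mA  (the design value)
# 333 ohm   -> 30.0 mA  (runs hot, shortened life)
# 10000 ohm ->  1.0 mA  (barely conducts, dim or dark)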
#4 - 04-14-2012, 10:38 PM - Khivar (Newbie)


Thanks for your answers!

1) If I understand correctly, if the voltage rises then the current rises and can damage the LED. Does that mean that, in a circuit with no resistor, applying exactly 2 V to a diode whose specification is 2 V / 20 mA will not automatically make it draw 20 mA? For example, depending on the batch (or the heat), at 2 V this diode could draw, say, 25 mA, and that would damage it, and that is why we need a resistor to regulate the current?

3) Okay makes sense.

4) OK, I understand what you are saying if you mean this:

+ --- R ---+--- LED1 ---+--- -
           |            |
           +--- LED2 ---+

If LED1 and LED2 are 20 mA LEDs, then in theory R would be sized for 40 mA, but that would mean either LED could end up drawing up to 40 mA and be damaged.

+ ---+--- R1 --- LED1 ---+--- -
     |                   |
     +--- R2 --- LED2 ---+

But in parallel like that, with a resistor in each branch, it should work, shouldn't it?
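
For what it's worth, sizing the per-branch resistors in that second drawing would look something like this (a rough sketch, assuming the 12 V supply and 2 V / 20 mA LEDs from before):

Code:
# One resistor per LED branch; each branch is sized independently
v_supply, v_led, i_led = 12.0, 2.0, 0.020
r_branch = (v_supply - v_led) / i_led   # 500 ohms in each branch
i_total = 2 * i_led                     # the supply sees both branches
print(f"{r_branch:.0f} ohms per branch, {i_total * 1000:.0f} mA total draw")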

6)

+ --- R --- LED1 --- LED2 --- LED3 --- -

LEDs = 2 V, 20 mA
12 V power supply; the resistor drops (wastes) 6 V.

Let's say LED1 comes from a bad batch and somehow requires 2.5 V to work, where the others require 2 V.
Since only 6 V is available after the resistor, what voltage will each LED get?
And since they all carry the same 20 mA, will they glow exactly the same despite having different voltages?

Thanks again!

Last edited by Khivar; 04-14-2012 at 10:48 PM.
#5 - 04-14-2012, 10:47 PM - a7ecorsair (I=E/R)


You did not understand what I wrote.
Draw yourself two circuits fed by 12 volts with a LED that is forward biased at 2 volts and a resistor. In one circuit use a 500 ohm resistor and in the other use a 400 ohm resistor. Now, calculate the current flow in the circuit and voltage drop across the resistor.
#6 - 04-14-2012, 11:13 PM - Khivar (Newbie)


Quote:
Originally Posted by a7ecorsair View Post
You did not understand what I wrote.
Draw yourself two circuits fed by 12 volts with a LED that is forward biased at 2 volts and a resistor. In one circuit use a 500 ohm resistor and in the other use a 400 ohm resistor. Now, calculate the current flow in the circuit and voltage drop across the resistor.
The resistor takes 10V then with the 500 ohm resistor there is 20mA in the circuit and with the 400 ohm there is 25mA in the circuit. In which part of my post am I wrong ?
#7 - 04-15-2012, 12:06 AM - mpoulton (Semi-Pro Electro-Geek)


You're not thinking about this right. LED's function differently than most loads that you're used to thinking about. Most loads are constant-voltage devices, where you apply a fixed voltage and it "draws" current "as needed" to satisfy its power requirements. These loads can be modeled as a positive resistance which results in a certain current flow at a certain voltage. LED's are not like this. They are constant-current devices. You supply a fixed current, and the voltage across the LED varies according to factors beyond your control. The rated voltage for an LED is a very rough estimate and varies tremendously depending on manufacturing tolerances and operating conditions. LED's do have positive resistance, but it's very low and does not account for most of the voltage drop across them.* The resistor in series with an LED cannot be modeled as a voltage dropping resistor, it needs to be modeled as a current limiting resistor. Your calculations are correct but the approach is different.

If you connect an LED to a constant voltage source and slowly increase the voltage from zero while monitoring current, you will see that the current remains essentially zero until you reach very close to the operating voltage of the LED. Then it suddenly increases way beyond the ratings. Connecting a "2V" LED to a 2.0V power supply will likely result in either nothing or a blown LED.

Understanding that the voltage across an LED is unpredictable and varies quite a bit, and that the ohmic resistance is very low, it should be obvious why they can't be connected in parallel. They do not share current well. Each LED or series of LED's must be fed by a current source. The answer to your question 6 should also be apparent: there will be essentially no current flow, so the voltage across each LED will be indeterminate.

Another issue should also become apparent: The power supply voltage needs to be substantially higher than the voltage across the LED's if you are going to use a resistor to limit current. Otherwise the unpredictable nature of the LED voltage will result in highly variable current. Example: 6 LED's rated 20mA each at Vf=2V in series, with a 14V power supply. The predicted voltage across the resistor is 14-2*6=2V. The resistor value should be 2/0.02=100 ohms. But what happens if the LED voltage is 10% lower than the rated value? Now the voltage across the resistor is 3.2V and the current rises to 32mA, an increase of 60%. This will increase heating of the LED's, which reduces their voltage drop, which increases current flow, which heats them more... For high power LED applications the solution to this is to use a constant current power supply rather than a current limiting resistor. For low power applications where efficiency isn't a concern, the voltage across the current limiting resistor should probably be around half the supply voltage (maybe 1/3 absolute minimum).
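
That sensitivity is easy to see numerically. A short Python sketch of the same 6-LED / 14 V example (a rough model that treats each LED as a fixed voltage drop):

Code:
# Current sensitivity to forward-voltage error when the resistor has little headroom
v_supply, n_leds, vf_rated, i_target = 14.0, 6, 2.0, 0.020
r = (v_supply - n_leds * vf_rated) / i_target   # 100 ohms, as in the example

for vf in (2.0, 1.8):                           # rated, and 10% low
    i = (v_supply - n_leds * vf) / r
    print(f"Vf = {vf} V -> {i * 1000:.0f} mA")
# Vf = 2.0 V -> 20 mA
# Vf = 1.8 V -> 32 mA  (a 60% jump from a 10% Vf error)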


*Most of the voltage across an LED results from the semiconductor bandgap - the potential difference required to push ANY current through the device. This is directly related to the color of the LED. Why? Because each electron that goes through the LED must have enough energy to create one photon out of the LED. Visible photon energies are in the range of 1-4 electron-volts, so each electron must have at least that much energy in order to make it through the LED. Blue photons have more energy than red, thus blue LED's require higher voltage than red ones. White LED's are not really white, they're blue with a fluorescent phosphor that converts blue light to white - so they need the same voltage as blue LED's. The semiconductor bandgap is temperature dependent too, with higher temperatures resulting in less voltage (and longer wavelengths of light produced) so the LED voltage will vary as the device heats up or cools down. This characteristic bandgap voltage is the reason LED's behave differently than many other loads.
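
As a rough illustration of the color/voltage link in that footnote: photon energy in eV is about 1240 divided by the wavelength in nanometers (the wavelengths below are typical values I'm assuming, not figures from the post):

Code:
# Approximate minimum energy per photon, which sets a floor on LED forward voltage
for color, wavelength_nm in (("red", 630), ("green", 525), ("blue", 465)):
    ev = 1240 / wavelength_nm
    print(f"{color}: about {ev:.2f} eV per photon")
# red ~2.0 eV, green ~2.4 eV, blue ~2.7 eV -- one reason blue and white LEDs need ~3 V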

Last edited by mpoulton; 04-15-2012 at 12:12 AM.
#8 - 04-15-2012, 01:10 AM - Khivar (Newbie)


Quote:
Originally Posted by mpoulton View Post
You're not thinking about this right. LED's function differently than most loads that you're used to thinking about. Most loads are constant-voltage devices, where you apply a fixed voltage and it "draws" current "as needed" to satisfy its power requirements. These loads can be modeled as a positive resistance which results in a certain current flow at a certain voltage. LED's are not like this. They are constant-current devices. You supply a fixed current, and the voltage across the LED varies according to factors beyond your control. The rated voltage for an LED is a very rough estimate and varies tremendously depending on manufacturing tolerances and operating conditions. LED's do have positive resistance, but it's very low and does not account for most of the voltage drop across them.* The resistor in series with an LED cannot be modeled as a voltage dropping resistor, it needs to be modeled as a current limiting resistor. Your calculations are correct but the approach is different.

If you connect an LED to a constant voltage source and slowly increase the voltage from zero while monitoring current, you will see that the current remains essentially zero until you reach very close to the operating voltage of the LED. Then it suddenly increases way beyond the ratings. Connecting a "2V" LED to a 2.0V power supply will likely result in either nothing or a blown LED.
Thanks, I think I got it!

Quote:
Understanding that the voltage across an LED is unpredictable and varies quite a bit, and that the ohmic resistance is very low, it should be obvious why they can't be connected in parallel. They do not share current well. Each LED or series of LED's must be fed by a current source. The answer to your question 6 should also be apparent: there will be essentially no current flow, so the voltage across each LED will be indeterminate.
I think I understood that; here's a quote of myself from earlier in this thread:

Quote:
4) OK, I understand what you are saying if you mean this:

+ --- R ---+--- LED1 ---+--- -
           |            |
           +--- LED2 ---+

If LED1 and LED2 are 20 mA LEDs, then in theory R would be sized for 40 mA, but that would mean either LED could end up drawing up to 40 mA and be damaged.

+ ---+--- R1 --- LED1 ---+--- -
     |                   |
     +--- R2 --- LED2 ---+

But in parallel like that, with a resistor in each branch, it should work, shouldn't it?
Quote:
Another issue should also become apparent: The power supply voltage needs to be substantially higher than the voltage across the LED's if you are going to use a resistor to limit current. Otherwise the unpredictable nature of the LED voltage will result in highly variable current. Example: 6 LED's rated 20mA each at Vf=2V in series, with a 14V power supply. The predicted voltage across the resistor is 14-2*6=2V. The resistor value should be 2/0.02=100 ohms. But what happens if the LED voltage is 10% lower than the rated value? Now the voltage across the resistor is 3.2V and the current rises to 32mA, an increase of 60%. This will increase heating of the LED's, which reduces their voltage drop, which increases current flow, which heats them more... For high power LED applications the solution to this is to use a constant current power supply rather than a current limiting resistor. For low power applications where efficiency isn't a concern, the voltage across the current limiting resistor should probably be around half the supply voltage (maybe 1/3 absolute minimum).
Never thought of that! So basically, when calculating a resistor you must work out its ohm value and its wattage (doubling the wattage just to be safe), and also make sure that the voltage across the resistor is about half of the power supply voltage.
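
Putting that whole recipe in one place (a sketch only; the function and the margins are just the rules of thumb from this thread, not a standard):

Code:
# Rule-of-thumb LED resistor chooser: resistance, doubled wattage, headroom check
def choose_resistor(v_supply, vf_total, i_target):
    v_r = v_supply - vf_total                 # voltage left for the resistor
    r = v_r / i_target
    p_rated = 2 * v_r * i_target              # double the dissipation for safety
    enough_headroom = v_r >= v_supply / 3     # ~1/2 of supply ideal, ~1/3 minimum
    return r, p_rated, enough_headroom

print(choose_resistor(12.0, 2.0, 0.020))      # (500.0, 0.4, True)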

Thanks
#9 - 04-15-2012, 09:01 AM - a7ecorsair (I=E/R)


Quote:
Originally Posted by Khivar View Post
The resistor takes 10V then with the 500 ohm resistor there is 20mA in the circuit and with the 400 ohm there is 25mA in the circuit. In which part of my post am I wrong ?
Looks good to me; it is the resistor that sets the current flow. Maybe it is the way you have worded your statements, saying the LED is 20 mA and always uses 20 mA. Once the LED is forward biased the voltage drop is pretty constant, but as you increase the current, the effective junction resistance decreases.
In the first example the junction resistance is 2 V / 20 mA = 100 ohms.
In the second example it is 2 V / 25 mA = 80 ohms.
P = I²R:
0.02 × 0.02 × 100 = 0.04 watts
0.025 × 0.025 × 80 = 0.05 watts
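
The same arithmetic as a quick Python check (just re-running the two cases above):

Code:
# Effective junction resistance and LED power for the 500 ohm and 400 ohm cases
for i in (0.020, 0.025):            # amps
    r_j = 2.0 / i                    # V / I, treating the 2 V drop as fixed
    p = i * i * r_j                  # P = I^2 * R (same as V * I here)
    print(f"{i * 1000:.0f} mA: {r_j:.0f} ohms, {p:.2f} W in the LED")
# 20 mA: 100 ohms, 0.04 W
# 25 mA:  80 ohms, 0.05 W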
#10 - 04-15-2012, 09:47 AM - joed (Member)


If you have a 2 volt rated LED then there is a resistor already built into it.
#11 - 04-15-2012, 09:54 AM - a7ecorsair (I=E/R)


Quote:
Originally Posted by joed View Post
If you have a 2 volt rated LED then there is a resistor already built into it.
I've not worked directly with LEDs. PN junctions of germanium and silicon have fairly fixed voltage drops when forward biased.
Reading through this: http://en.wikipedia.org/wiki/Led
it looks like LEDs have somewhat different physics. I don't see any mention of integrated resistance.
#12 - 04-16-2012, 01:23 AM - Khivar (Newbie)


Quote:
Originally Posted by a7ecorsair View Post
Looks good to me; it is the resistor that sets the current flow. Maybe it is the way you have worded your statements, saying the LED is 20 mA and always uses 20 mA. Once the LED is forward biased the voltage drop is pretty constant, but as you increase the current, the effective junction resistance decreases.
In the first example the junction resistance is 2 V / 20 mA = 100 ohms.
In the second example it is 2 V / 25 mA = 80 ohms.
P = I²R:
0.02 × 0.02 × 100 = 0.04 watts
0.025 × 0.025 × 80 = 0.05 watts
No, you're right; that's what I thought when I first posted this thread, that a 20 mA LED will always draw 20 mA!

Can anyone confirm whether this is right or wrong:
Quote:
4) OK, I understand what you are saying if you mean this:

+ --- R ---+--- LED1 ---+--- -
           |            |
           +--- LED2 ---+

If LED1 and LED2 are 20 mA LEDs, then in theory R would be sized for 40 mA, but that would mean either LED could end up drawing up to 40 mA and be damaged.

+ ---+--- R1 --- LED1 ---+--- -
     |                   |
     +--- R2 --- LED2 ---+

But in parallel like that, with a resistor in each branch, it should work, shouldn't it?
#13 - 04-16-2012, 02:34 AM - mpoulton (Semi-Pro Electro-Geek)


Quote:
Originally Posted by joed View Post
If you have a 2 volt rated LED then there is a resistor already built into it.
No. Why would you say this?
#14 - 04-16-2012, 03:24 AM - Khivar (Newbie)


Hello again, I have some more questions to fully understand how this works.

+ --- LED --- -

Power supply: 1.8 V, 200 mA (I made that up, I don't know if it actually exists)
LED: 2 V, 20 mA
From my understanding we don't need a resistor here: since our power supply only delivers 1.8 V, there is no way the LED can be pushed above its normal voltage, and no higher voltage means no higher current. So the current it receives will be a little less than its specification, and the light output will be a little dimmer. Or am I wrong, and the LED will sit at 1.8 V, be hit by the full 200 mA, and die right away?

+ --- R --- LED --- -

Power supply: 12 V, 1 A
R: 500 ohms
LED: 2 V, 20 mA

We have a 12 V power supply, so we need a resistor of (12 - 2) / 0.02 = 500 ohms.
We take an LED with the specification 2 V, 20 mA. Does that mean that, if the specification were exact, the LED will draw exactly 20 mA when the voltage across it is exactly 2 V?

The current through the LED is set by U = RI for the resistor. So the current through the LED is determined by the voltage across the resistor, which is itself determined by the voltage across the LED. But how is the voltage across the LED determined? In other words, how does the LED "decide" how many volts to take from the power supply? Is it fixed at manufacture, independent of anything else in the circuit? And what does that number correspond to: if the LED takes 2.1 V, does that mean it would need 2.1 V to draw 20 mA if there were no resistor? And if another LED from another batch takes 1.9 V, does it likewise mean it would need 1.9 V to draw 20 mA with no resistor in the circuit?

+ --- R ---+--- LED1 ---+--- -
           |            |
           +--- LED2 ---+

If LED1 and LED2 are 20 mA LEDs, then in theory R would be sized for 40 mA, but that would mean either LED could draw up to 40 mA and be damaged. And if one LED dies, what happens: will the other one be hit with the full 40 mA and die too?

The voltage across R will be the power supply voltage minus the voltage across the parallel pair of LEDs. Since the voltage across each branch of a parallel circuit is the same, if LED1 "wants" 2.1 V and LED2 "wants" 1.9 V, what voltage will actually appear across the branches?
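
One way to see why a fixed voltage is so touchy, and why two paralleled LEDs with slightly different forward voltages won't share current: the sketch below uses the standard exponential diode model, which nobody in this thread has spelled out, so the constants are made-up, typical-order values and the output is illustrative only.

Code:
import math

# Idealized exponential diode model: current grows exponentially with voltage.
N_VT = 0.05   # ~50 mV "steepness" factor (assumed, typical order of magnitude)

def led_current(v, v_nominal=2.0, i_nominal=0.020):
    # scaled so the model passes through 20 mA at exactly 2.0 V
    return i_nominal * math.exp((v - v_nominal) / N_VT)

for v in (1.8, 1.9, 2.0, 2.1):
    print(f"{v} V -> {led_current(v) * 1000:.1f} mA")
# 1.8 V ->   0.4 mA  (much dimmer, not destroyed)
# 1.9 V ->   2.7 mA
# 2.0 V ->  20.0 mA
# 2.1 V -> 147.8 mA  (far above the rating)

So, under that model, a 0.1 V difference between two paralleled LEDs can mean several times more current in one branch than in the other.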

Thanks again,
Khivar

Last edited by Khivar; 04-16-2012 at 03:28 AM.
#15 - 04-16-2012, 08:24 AM - joed (Member)


Quote:
Originally Posted by mpoulton View Post
No. Why would you say this?
It is designed to be used on a 2 volt supply. A true bare LED has a voltage of .7 volts; this one has been designed for 2 volts. They also make them for other voltages. I have used the 12 volt ones often, and they don't need any resistors.

Now, if you want to use the 2 volt LED on a voltage other than 2 volts, then a resistor is going to be needed.
