DIY Chatroom Home Improvement Forum (http://www.diychatroom.com/)
-   Electrical (http://www.diychatroom.com/f18/)
-   -   Resistor/Led and Basic Electricity Questions (http://www.diychatroom.com/f18/resistor-led-basic-electricity-questions-140293/)

Khivar 04-14-2012 08:04 PM

Resistor/Led and Basic Electricity Questions
 
Hello Everyone,

First, let me say that I am a newbie in the electricity world, but I try my best to improve my knowledge.
I have some questions, most of them may be dumb; you've been warned!


1) I read that with an LED you must ALWAYS put a resistor. Say we have a 2V 1A power supply and the LED is a 2V 20mA part; why put a resistor at all here? The LED will only draw the 20mA it needs from the power supply, it will never be "hit" by the full 1A that the power supply can deliver, am I wrong?

2) If the voltage of the circuit is bigger than the voltage of the LED, let's say a 12V 1A power supply and still a 2V 20mA LED, we need to drop some voltage, so we put a resistor. U = RI, so R = 10/0.02 = 500, so we need a 500 ohm resistor to get 2V at the LED. Also, the power rating for the resistor should be at least 10 x 0.02 = 0.2W, and I read that for safety we should double that, so we need a 0.4W 500 ohm resistor (hope I am right so far :D). Alright, so here the resistor is for dropping the voltage and not the current; I mean, yes, only 20mA will be able to pass through the resistor, but the LED would only draw 20mA anyway? (A quick sketch of this calculation is shown after question 6 below.)

3) If we take the same configuration as 2) but instead of the resistor we put five 2V 20mA LEDs, it will do exactly the same thing as the resistor, will it not?

4) I read that when paralleling a number of LEDs powered directly from a voltage source, they won't all have the same light output. I don't understand why; each LED should draw the amount of current it needs, so they will glow exactly the same if they are the same LEDs, won't they?

5) If we still have a 12V 1A power supply and we put in series first five 2V 20mA LEDs and then a 2V 100mA LED at the end. Since the current is the same everywhere in series, will it be 20mA and the last LED will be dim or simply won't light, or will the current be 100mA and the five other LEDs will die?

6) 12V 1A power supply. We only put in series five 2V LEDs, or four 2V LEDs + one 3V LED. There will be too much voltage; how will it be spread across the LEDs?
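
For reference, here is the resistor arithmetic from question 2) as a small Python sketch; the 12V / 2V / 20mA figures are simply the hypothetical values used in that question.

```python
# Series resistor sizing for a single LED: a sketch of the arithmetic in question 2).
V_SUPPLY = 12.0   # power supply voltage (V)
V_LED = 2.0       # nominal LED forward voltage (V)
I_LED = 0.020     # desired LED current (A)

v_resistor = V_SUPPLY - V_LED   # voltage the resistor has to drop
r = v_resistor / I_LED          # Ohm's law: R = U / I
p = v_resistor * I_LED          # power dissipated in the resistor: P = U * I
p_rated = 2 * p                 # rule of thumb from the question: double it for margin

print(f"R = {r:.0f} ohm, dissipation = {p:.2f} W, pick a rating >= {p_rated:.2f} W")
# -> R = 500 ohm, dissipation = 0.20 W, pick a rating >= 0.40 W
```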

Thanks in advance,
Khivar

curiousB 04-14-2012 08:44 PM

1) Yes, you always need a resistor. Something has to limit the current or the LED junction (basically a diode) will overheat and fuse (weld) together. The forward voltage of a diode varies with temperature, current, and batch-to-batch variances. However, the voltage won't vary much based on current, so you can't think of it as a resistor. In other words, a 1.5V LED running at 20mA is not a 75 ohm resistor. The idea of a series resistor is to limit current despite these variables.

3) No. You can put several LEDs in series but you need a resistor to limit the current for the same reasons you need it in #1 above. You don't need a resistor per LED but you need one in each series leg.

5) Current will be established by your series resistor. The supply voltage minus the forward voltages of all the series diodes, divided by 20mA, gives the series resistor to use (see the calculation sketch after these answers).

4) Don't parallel LEDs, ever. The junction characteristics vary as noted in #1, so you might be severely stressing one and not the other. Series is OK with a series resistor, but avoid parallel.

6) Voltage will be split based on the characteristics of the LED. They should all be close to the same if they are the same type of LED but some of these high output LEDs have multiple die on the substrate.
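
A quick sketch of the rule in answer 5) above; the 12V supply and roughly 2V LEDs in the example call are assumed values for illustration, not figures from curiousB's post.

```python
# Series-string resistor: R = (supply voltage - sum of LED forward voltages) / target current.
def series_resistor(v_supply, led_forward_voltages, i_target):
    v_drop = v_supply - sum(led_forward_voltages)
    if v_drop <= 0:
        raise ValueError("supply voltage is too low for this LED string")
    return v_drop / i_target

# Example: 12V supply, three LEDs at roughly 2V each, 20mA target current.
print(series_resistor(12.0, [2.0, 2.0, 2.0], 0.020))  # -> 300.0 ohms
```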

a7ecorsair 04-14-2012 10:14 PM

As pointed out, diodes cannot be thought of as resistors when analyzing current flow. Diodes are more like a switch: when reverse biased they are off, and when forward biased they are on. When forward biased, diodes have a fairly constant voltage drop across the junction, so the circuit has to be designed to limit the current to the manufacturer's specification. Germanium diodes have a forward-bias voltage drop of .2 volts and silicon diodes have a drop of .7 volts. LEDs have different voltage drops depending on several factors.
In your example you are using LEDs that exhibit a 2 volt drop when forward biased and a design current of 20 mA. You now have to choose a resistor that will provide the correct circuit resistance so there is 20 mA flowing. If you choose the wrong-sized resistor you could end up with 30 mA flowing; the LED will allow this for a while, but it will run hotter than its design and have a short life. If the resistor is too large the LED will not forward bias.

Khivar 04-14-2012 10:38 PM

Thanks for your answers !

1) If I understand correctly, somehow if the voltage rises then the current rises and will damage the LED... Does that actually mean that in a circuit with no resistor, when giving exactly 2V to a diode whose specifications are 2V 20mA, the diode will not automatically draw 20mA? For example, depending on the batch (or the heat), at a voltage of 2V this diode could draw, say, 25mA of current, and that would mess it up, and that's why we need a resistor to regulate the current?

3) Okay makes sense.

4) Ok I understand what you say if you're talking about :

+ ---- R ----+---- LED1 ----+---- -
             |              |
             +---- LED2 ----+

If LED1 and LED2 are 20mA, then theoretically the R would need to be sized for 40mA, but that would mean that either LED could draw up to 40mA, therefore damaging them.

+ ----+---- R ---- LED1 ----+---- -
      |                     |
      +---- R ---- LED2 ----+


But in parallel like that it should work, shouldn't it ?

6)

+ ==== R ==== LED1 ====== LED2 ======== LED3 ===== -

LEDS = 2V 20mA
12V power supply; the resistor drops the remaining 6V.

Let's say that LED1 comes from a bad batch and somehow requires 2.5V to work, where the others require 2V.
Since only 6V will be available, what voltage will each LED get?
Since they all carry 20mA, will they glow exactly the same despite having different voltages?

Thanks again !

a7ecorsair 04-14-2012 10:47 PM

You did not understand what I wrote.
Draw yourself two circuits fed by 12 volts with a LED that is forward biased at 2 volts and a resistor. In one circuit use a 500 ohm resistor and in the other use a 400 ohm resistor. Now, calculate the current flow in the circuit and voltage drop across the resistor.

Khivar 04-14-2012 11:13 PM

Quote:

Originally Posted by a7ecorsair (Post 899130)
You did not understand what I wrote.
Draw yourself two circuits fed by 12 volts with a LED that is forward biased at 2 volts and a resistor. In one circuit use a 500 ohm resistor and in the other use a 400 ohm resistor. Now, calculate the current flow in the circuit and voltage drop across the resistor.

The resistor takes 10V then with the 500 ohm resistor there is 20mA in the circuit and with the 400 ohm there is 25mA in the circuit. In which part of my post am I wrong ?

mpoulton 04-15-2012 12:06 AM

You're not thinking about this right. LED's function differently than most loads that you're used to thinking about. Most loads are constant-voltage devices, where you apply a fixed voltage and it "draws" current "as needed" to satisfy its power requirements. These loads can be modeled as a positive resistance which results in a certain current flow at a certain voltage. LED's are not like this. They are constant-current devices. You supply a fixed current, and the voltage across the LED varies according to factors beyond your control. The rated voltage for an LED is a very rough estimate and varies tremendously depending on manufacturing tolerances and operating conditions. LED's do have positive resistance, but it's very low and does not account for most of the voltage drop across them.* The resistor in series with an LED cannot be modeled as a voltage dropping resistor, it needs to be modeled as a current limiting resistor. Your calculations are correct but the approach is different.

If you connect an LED to a constant voltage source and slowly increase the voltage from zero while monitoring current, you will see that the current remains essentially zero until you reach very close to the operating voltage of the LED. Then it suddenly increases way beyond the ratings. Connecting a "2V" LED to a 2.0V power supply will likely result in either nothing or a blown LED.

Understanding that the voltage across an LED is unpredictable and varies quite a bit, and that the ohmic resistance is very low, it should be obvious why they can't be connected in parallel. They do not share current well. Each LED or series of LED's must be fed by a current source. The answer to your question 6 should also be apparent: there will be essentially no current flow, so the voltage across each LED will be indeterminate.
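
To put rough numbers on how badly LEDs share current, here is a sketch using a simplified exponential diode model; the saturation current, ideality factor, and 50 mV spread below are made-up illustration values, not anything from mpoulton's post.

```python
import math

# Simplified exponential diode model: I = I_S * exp(V / (n * V_T)).
I_S = 1e-13    # saturation current (A), assumed for illustration
N = 3.0        # ideality factor, assumed for illustration
V_T = 0.0259   # thermal voltage at room temperature (V)

def led_current(v_applied, curve_shift=0.0):
    # curve_shift models a part whose I-V curve sits a few tens of mV lower
    # because of batch or temperature variance.
    return I_S * math.exp((v_applied + curve_shift) / (N * V_T))

v = 2.0  # two paralleled LEDs are forced to the same voltage
i_typical = led_current(v)                     # "typical" part
i_low_vf = led_current(v, curve_shift=0.05)    # part with ~50 mV lower forward voltage
print(f"typical: {i_typical*1e3:.1f} mA, low-Vf part: {i_low_vf*1e3:.1f} mA")
# A ~50 mV spread roughly doubles the current in one of the LEDs -> poor current sharing.
```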

Another issue should also become apparent: The power supply voltage needs to be substantially higher than the voltage across the LED's if you are going to use a resistor to limit current. Otherwise the unpredictable nature of the LED voltage will result in highly variable current. Example: 6 LED's rated 20mA each at Vf=2V in series, with a 14V power supply. The predicted voltage across the resistor is 14-2*6=2V. The resistor value should be 2/0.02=100 ohms. But what happens if the LED voltage is 10% lower than the rated value? Now the voltage across the resistor is 3.2V and the current rises to 32mA, an increase of 60%. This will increase heating of the LED's, which reduces their voltage drop, which increases current flow, which heats them more... For high power LED applications the solution to this is to use a constant current power supply rather than a current limiting resistor. For low power applications where efficiency isn't a concern, the voltage across the current limiting resistor should probably be around half the supply voltage (maybe 1/3 absolute minimum).
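
The same sensitivity calculation as a short sketch, re-running the numbers from the example above (14V supply, six nominally 2V LEDs, 100 ohm resistor):

```python
# How much the string current moves when the LED forward voltage comes in 10% low.
V_SUPPLY = 14.0
N_LEDS = 6
R = 100.0
VF_NOMINAL = 2.0

def string_current(vf_per_led):
    return (V_SUPPLY - N_LEDS * vf_per_led) / R

i_nominal = string_current(VF_NOMINAL)        # -> 0.020 A
i_low_vf = string_current(0.9 * VF_NOMINAL)   # forward voltage 10% low
print(f"nominal: {i_nominal*1e3:.0f} mA, Vf 10% low: {i_low_vf*1e3:.0f} mA")
# -> nominal: 20 mA, Vf 10% low: 32 mA (a 60% current increase)
```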


*Most of the voltage across an LED results from the semiconductor bandgap - the potential difference required to push ANY current through the device. This is directly related to the color of the LED. Why? Because each electron that goes through the LED must have enough energy to create one photon out of the LED. Visible photon energies are in the range of 1-4 electron-volts, so each electron must have at least that much energy in order to make it through the LED. Blue photons have more energy than red, thus blue LED's require higher voltage than red ones. White LED's are not really white, they're blue with a fluorescent phosphor that converts blue light to white - so they need the same voltage as blue LED's. The semiconductor bandgap is temperature dependent too, with higher temperatures resulting in less voltage (and longer wavelengths of light produced) so the LED voltage will vary as the device heats up or cools down. This characteristic bandgap voltage is the reason LED's behave differently than many other loads.

Khivar 04-15-2012 01:10 AM

Quote:

Originally Posted by mpoulton (Post 899215)
You're not thinking about this right. LED's function differently than most loads that you're used to thinking about. Most loads are constant-voltage devices, where you apply a fixed voltage and it "draws" current "as needed" to satisfy its power requirements. These loads can be modeled as a positive resistance which results in a certain current flow at a certain voltage. LED's are not like this. They are constant-current devices. You supply a fixed current, and the voltage across the LED varies according to factors beyond your control. The rated voltage for an LED is a very rough estimate and varies tremendously depending on manufacturing tolerances and operating conditions. LED's do have positive resistance, but it's very low and does not account for most of the voltage drop across them.* The resistor in series with an LED cannot be modeled as a voltage dropping resistor, it needs to be modeled as a current limiting resistor. Your calculations are correct but the approach is different.

If you connect an LED to a constant voltage source and slowly increase the voltage from zero while monitoring current, you will see that the current remains essentially zero until you reach very close to the operating voltage of the LED. Then it suddenly increases way beyond the ratings. Connecting a "2V" LED to a 2.0V power supply will likely result in either nothing or a blown LED.

Thanks I think I got it ! ;)

Quote:

Understanding that the voltage across an LED is unpredictable and varies quite a bit, and that the ohmic resistance is very low, it should be obvious why they can't be connected in parallel. They do not share current well. Each LED or series of LED's must be fed by a current source. The answer to your question 6 should also be apparent: there will be essentially no current flow, so the voltage across each LED will be indeterminate.
I think I understood that, here's a quote of myself from earlier in this post :

Quote:

4) Ok I understand what you say if you're talking about :

+ ---- R ----+---- LED1 ----+---- -
             |              |
             +---- LED2 ----+

If LED1 and LED2 are 20mA, then theoretically the R would need to be sized for 40mA, but that would mean that either LED could draw up to 40mA, therefore damaging them.

+ ----+---- R ---- LED1 ----+---- -
      |                     |
      +---- R ---- LED2 ----+


But in parallel like that it should work, shouldn't it ?
Quote:

Another issue should also become apparent: The power supply voltage needs to be substantially higher than the voltage across the LED's if you are going to use a resistor to limit current. Otherwise the unpredictable nature of the LED voltage will result in highly variable current. Example: 6 LED's rated 20mA each at Vf=2V in series, with a 14V power supply. The predicted voltage across the resistor is 14-2*6=2V. The resistor value should be 2/0.02=100 ohms. But what happens if the LED voltage is 10% lower than the rated value? Now the voltage across the resistor is 3.2V and the current rises to 32mA, an increase of 60%. This will increase heating of the LED's, which reduces their voltage drop, which increases current flow, which heats them more... For high power LED applications the solution to this is to use a constant current power supply rather than a current limiting resistor. For low power applications where efficiency isn't a concern, the voltage across the current limiting resistor should probably be around half the supply voltage (maybe 1/3 absolute minimum).
Never thought of that! So basically when calculating a resistor you must calculate its ohm value and its wattage value (and double it just to be safe), and be sure that the voltage across the resistor is about half the power supply's.

Thanks :)

a7ecorsair 04-15-2012 09:01 AM

Quote:

Originally Posted by Khivar (Post 899166)
The resistor takes 10V then with the 500 ohm resistor there is 20mA in the circuit and with the 400 ohm there is 25mA in the circuit. In which part of my post am I wrong ?

Looks good to me; it is the resistor that sets the current flow. Maybe it is the way you have worded your statements saying the LED is 20 ma and always uses 20ma. Once the LED is forward biased the voltage drop is pretty constant but as you increase current, junction resistance decreases.
In the first example the junction resistance is 100 ohms
In the second example it is 80 ohms.
P = I²R
.02 * .02 * 100 = .04 watts
.025 * .025 * 80 = .05 watts
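
The same arithmetic as a short sketch:

```python
# Effective junction resistance (V / I) and dissipation for the two cases above.
for i, v_led in [(0.020, 2.0), (0.025, 2.0)]:   # (current in A, LED forward drop in V)
    r_eff = v_led / i      # resistance the forward-biased LED presents
    p = i * i * r_eff      # P = I^2 * R (equal to V * I here)
    print(f"I = {i*1e3:.0f} mA: R_eff = {r_eff:.0f} ohm, P = {p:.3f} W")
# -> 100 ohm / 0.040 W and 80 ohm / 0.050 W, matching the figures above
```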

joed 04-15-2012 09:47 AM

If you have a 2 volt rated LED then there is a resistor already built into it.

a7ecorsair 04-15-2012 09:54 AM

Quote:

Originally Posted by joed (Post 899379)
If you have a 2 volt rated LED then there is a resistor already built into it.

I've not worked directly with LEDs. PN junctions of germanium and silicon have fairly fixed voltage drops when forward biased.
Reading through this: http://en.wikipedia.org/wiki/Led
it looks like LEDs have somewhat different physics. I don't see any mention of integrated resistance.

Khivar 04-16-2012 01:23 AM

Quote:

Originally Posted by a7ecorsair (Post 899357)
Looks good to me; it is the resistor that sets the current flow. Maybe it is the way you have worded your statements saying the LED is 20 ma and always uses 20ma. Once the LED is forward biased the voltage drop is pretty constant but as you increase current, junction resistance decreases.
In the first example the junction resistance is 100 ohms
In the second example it is 80 ohms.
P = I²R
.02 * .02 * 100 = .04 watts
.025 * .025 * 80 = .05 watts

No, you're right, that's what I thought when I first posted this thread; I thought a 20mA LED will always draw 20mA!

Can anyone confirm for me whether this is right or wrong:
Quote:

4) Ok I understand what you say if you're talking about :

+ ---- R ----+---- LED1 ----+---- -
             |              |
             +---- LED2 ----+

If LED1 and LED2 are 20mA, then theoretically the R would need to be sized for 40mA, but that would mean that either LED could draw up to 40mA, therefore damaging them.

+ ----+---- R ---- LED1 ----+---- -
      |                     |
      +---- R ---- LED2 ----+


But in parallel like that it should work, shouldn't it ?

mpoulton 04-16-2012 02:34 AM

Quote:

Originally Posted by joed (Post 899379)
If you have a 2 volt rated LED then there is a resistor already built into it.

No. Why would you say this?

Khivar 04-16-2012 03:24 AM

Hello again, I have some more questions to fully understand how it works :)

+ ======== LED =========== -

Power Supply : 1.8V 200mA ( I made that up, dunno if it actually exists )
LED : 2V 20mA
From my understanding, here we don't need a resistor, because since our power supply only delivers 1.8V, there is no way the LED would see more than its normal voltage, and no higher voltage means no higher current, so the current it receives will be a little less than its specification and the light output will be a little dimmer. Or am I wrong, and the LED will drop 1.8V and be hit by the full 200mA and die right away?

+ ===== R ====== LED ====== -

Power Supply : 12V 1A
R : 500ohm
LED : 2V 20mA

We have a 12V power supply, so we need to put in a resistor with a value of 10/0.02 = 500 ohms.
We take an LED with specifications 2V 20mA. Does that mean that, if these were perfect specifications, when the LED's voltage is exactly 2V it will draw exactly 20mA?

The current going through the LED will be determined by U = RI for the resistor. So the current going through the LED is determined by the voltage across the resistor, which is itself determined by the voltage across the LED. But how is the voltage across the LED determined? In other words, how does the LED "decide" how many volts it will take from the power supply? Is it fixed when it was manufactured, independent of anything in the circuit? And what does it correspond to: if the LED takes 2.1V, does it mean that it would require that much voltage to draw 20mA if there were no resistor? If another LED from another batch takes 1.9V, does it also mean that it would require that much voltage to draw 20mA if there were no resistor in the circuit?

+ ---- R ----+---- LED1 ----+---- -
             |              |
             +---- LED2 ----+


If LED1 and LED2 are 20mA, then theoretically the R would need to be sized for 40mA, but that would mean that either LED could draw up to 40mA, therefore damaging them. If one LED dies, what will happen? Will the other one be hit by 40mA and die too?

The voltage across R will be the power supply voltage minus the voltage across the parallel section. Since the voltage in each branch of a parallel circuit is the same, if LED1 takes 2.1V and LED2 takes 1.9V, which voltage will actually appear across each branch?

Thanks again,
Khivar

joed 04-16-2012 08:24 AM

Quote:

Originally Posted by mpoulton (Post 899892)
No. Why would you say this?

It is designed to be used on a 2 volt supply. A true bare LED has a voltage of .7 volts. This one has been designed for 2 volts. They also make them for other voltages. I have used the 12 volt ones often. They don't need any resistors.

Now, if you want to use the 2 volt LED on a voltage other than 2 volts, then a resistor is going to be needed.
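
For a rough sense of scale, a sketch of what such a built-in resistor would have to be, assuming a roughly 2V bare junction and a 20mA design current (both assumptions, not figures from joed's post):

```python
# Implied internal resistor in a "12 V" indicator LED (illustrative assumptions only).
V_RATED = 12.0     # rated supply voltage of the packaged LED
V_JUNCTION = 2.0   # assumed forward drop of the bare junction
I_DESIGN = 0.020   # assumed design current

r_internal = (V_RATED - V_JUNCTION) / I_DESIGN
print(f"implied internal resistor ~ {r_internal:.0f} ohm")  # -> ~500 ohm
```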

