"Native" voltage for LEDs and current limiting

SamTexas

For lack of better terminology I'm using "native". What are the native voltages for white and red LEDs? I understand that different color LEDs have different voltages, and that the voltage range for each is quite narrow. Where can I find a reliable listing of these voltages?

Current limiting: Let's say that y.xV is the native voltage for red LEDs, and that I have a 10A, y.xV power supply. Can I connect the red LED to that power supply directly, or do I need an inline resistor to limit current? My guess is that I don't need a resistor because the voltages of the power source and the load match, but I'm not sure, hence the question.
 
LEDs do need current limiting, through resistors or current sources. How it's done depends on how much current the LED needs and what the supply voltage is.

For small LEDs (like round 5mm diameter ones), which take around 20mA, you can use a resistor, calculated using Ohm's Law: R = (V - Vf) / 0.02, where V is the supply voltage and Vf is the LED's forward voltage drop. Some typical Vf values: high-brightness blue, ~3.1V; high-brightness white, 3.0V to 4V depending on manufacturer (most usually 3.4V); normal-brightness red/green/yellow/orange/infrared, 1.8V / 2.2V / 2V / 2V / 1.1V.

If the LEDs are fed from a high voltage, then you need to take into account the power dissipated as heat in the resistor and use a resistor with an appropriate power rating (rated for at least twice the calculated dissipation). This power (Pr) can be calculated as Pr = 0.02^2 x R (i.e. P = I^2 x R), where R is the resistor's value (assuming it was calculated for around 20mA of LED current).
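
To make the arithmetic concrete, here's a minimal C sketch of the two formulas above, R = (V - Vf) / I and Pr = I^2 x R. The example values (12V supply, a 3.4V white LED at 20mA) are just illustrative, not from any particular datasheet:

```c
#include <stdio.h>

int main(void)
{
    /* Illustrative values: 12 V supply, white LED with Vf ~ 3.4 V at 20 mA */
    double v_supply = 12.0;  /* supply voltage, V   */
    double vf       = 3.4;   /* LED forward drop, V */
    double i_led    = 0.020; /* target current, A   */

    double r  = (v_supply - vf) / i_led; /* R = (V - Vf) / I */
    double pr = i_led * i_led * r;       /* Pr = I^2 * R     */

    printf("Resistor: %.0f ohm\n", r);
    printf("Dissipation: %.3f W (pick a resistor rated >= %.3f W)\n",
           pr, 2.0 * pr);
    return 0;
}
```

For these numbers it prints 430 ohm and about 0.172 W, so a common 1/2 W resistor would satisfy the "twice the needed dissipation" rule of thumb.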

For high-power LEDs, you need a switched (switch-mode) current source; otherwise the power dissipated in a resistor would be impractically large.
 
Current limiting should be used almost 100% of the time when dealing with LEDs. A power supply that says 3.3V @ 20mA does not mean it will only put out 20mA; the actual current depends on the load and many other things.

You should avoid running with no current-limiting resistor, even when it seems to work fine without one. Lower-power LEDs are generally not binned, so ideally you should do this yourself: closely match the Vf of the LEDs so that they all turn on at the same voltage and all draw the same current at the same voltage. Avoid running more than one LED in parallel, since it's not guaranteed that they will split the current evenly. 30mA into an LED rated for 20mA generally means a short lifespan. They are extremely sensitive devices.

Even if you have a well-regulated 12.0V supply that never moves from 12.0V, you should still use a resistor. You want enough resistance that minor voltage fluctuations don't affect the current much; the more voltage the resistor drops, the less supply fluctuations will affect the current fed into the LEDs. If you determined the Vf of all four of your LEDs to be exactly 3.4V @ 20mA, and you planned on putting this series string of four LEDs in a car where the running voltage might be 13.6V or 13.8V, you would not need much, if any, resistance to limit the current to 20mA. However, a minor voltage swing would then have a substantial impact on the current driven through the LEDs, since the resistor is not dropping any significant voltage. In this situation it's best to actually run three LEDs in series, so that you can put a higher-value resistor in place, which allows for better current regulation across voltage transients.
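
A quick C sketch of why that headroom matters, using the car example above (the per-LED Vf of 3.4V and the 12.0-14.4V automotive swing are assumed values; Vf's own dependence on current is ignored, which is good enough to show the trend):

```c
#include <stdio.h>

/* Current through a series string of n_leds LEDs and a resistor r,
 * from a supply voltage v, assuming a fixed per-LED drop vf. */
static double string_current_ma(double v, int n_leds, double vf, double r)
{
    double i = (v - n_leds * vf) / r * 1000.0;
    return i > 0.0 ? i : 0.0; /* below the total Vf, the string is off */
}

int main(void)
{
    const double vf = 3.4;                        /* assumed per-LED drop, V */
    const double supplies[] = {12.0, 13.8, 14.4}; /* automotive swing        */

    /* 4 LEDs: resistor sized for 20 mA at 13.8 V -> (13.8-13.6)/0.02 = 10 ohm
     * 3 LEDs: (13.8-10.2)/0.02 = 180 ohm */
    for (int k = 0; k < 3; k++) {
        double v = supplies[k];
        printf("V=%.1f  4 LEDs/10R: %5.1f mA   3 LEDs/180R: %5.1f mA\n",
               v, string_current_ma(v, 4, vf, 10.0),
               string_current_ma(v, 3, vf, 180.0));
    }
    return 0;
}
```

The four-LED string swings from 0mA (dark) at 12.0V to 80mA (destructive) at 14.4V, while the three-LED string with its 180 ohm resistor stays between about 10mA and 23mA over the same range.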

The better the voltage regulation is, the less this matters, but a resistor should just about ALWAYS be used. You have to ask how many LEDs you want to run, and what you want to run them off. Binning them yourself for Vf is the best way to go; you just need a well-regulated constant-current (CC) source, or an LED tester. If you only want to run a single LED off 12V+, it won't matter much, as the resistor will be in the 300-500+ ohm range. If you want a large array, you'd better have tight regulation if you don't want a string dying quickly.

To put it simply, LEDs don't have a 'voltage' that is just what they take. Typically, white LEDs run around 3.2V, but most of the cheap ones you get from China tend to be 3.4-3.6V, and can swing 0.2V or more within a single batch. You look at what the seller claims, then you characterize the batch yourself to make SURE they are all closely matched, throw out duds, and ensure the colors are closely matched as well. With high-end or high-power LEDs this is generally done for you already, but it still won't hurt to bin them yourself into tight Vf groups. You generally don't care much about Vf itself; it's just a voltage used for calculations (lower is better). You limit current, not voltage.
 
Are you talking about "standard" LEDs, that might be used for indicators, or high-power LEDs like for a headlight?

For regular, low-power LEDs, an in-line resistor is the usual way. Something like 220-330 ohms is typical for an LED off of 5V. The current is low, so it doesn't matter that a few mW will be lost in the resistor. You should use one resistor per LED, or you can string a few LEDs in series with a single resistor.

For high-power LEDs, a current-controlled regulator is required. It's NOT sufficient to use a standard voltage-regulated supply with the voltage calibrated to the nominal Vf of the LED; you need a supply that will drive a specific amount of current through the LED. There are lots of LED drivers on the market designed to do exactly this. You also don't want to try to drive these LEDs in parallel: you can string several in series and power them with one driver, but parallel LEDs will not share current well, so you need one driver per LED or per series string to ensure the proper current.
 
SamTexas said:
So when V and Vf are the same, R is zero and thus no current limiting is needed, right?

When current flows through an LED, it causes a voltage drop across the LED, called Vf; you can measure it by placing a voltmeter across the LED's legs. The Vf examples I gave above are Vf values at around 20mA.

The relation between V (the power supply voltage) and Vf is simply that V must be higher than Vf: if the LED is going to drop Vf, it must be fed from a voltage higher than that. I think that's easy to understand.

You can put LEDs in series as long as the total of the Vf's is still below V.
 
Guys, forgive my ignorance if I'm wrong, but as far as I was aware (and I currently do this), an LED only needs resistance if the supply voltage is above the nominal voltage for the LED. For instance, if you have a 3.3V LED and you feed the LED 3.3V via a constant-voltage driver, then no current limiting should be needed? Isn't the resistor only there to burn off the voltage above the LED voltage? For instance, running a 3.3V LED on a 5V power supply, the resistor would be in place to burn off the remaining 1.7V that the LED does not drop.

The way I understood it is that LED current is in relation to voltage. If you supply 3.3V and the LED uses all 3.3V (i.e., the diode drops all the voltage across it), then it will only take the current it needs. If, however, there is voltage left after the diode has dropped what it needs, then the diode is forced to burn more current to drop the excess voltage, resulting in the LED burning out.
 
The thing is that, although we say "it's a 3.3V LED", it's not. It's a 3.32V LED; another one is a 3.16V LED; the next one is a 3.29V LED; and so on. Seems like a tiny difference? Let's say your power supply is exactly 3.3V (it isn't, either!) and the LED is 3.29V; how much current will go through the LED if it's connected directly to the power supply? Ohm's Law says it's a huge current, since the only resistance is that of the cables; if it's, say, 0.001 ohm (and it is probably smaller), then you'd have (3.3V - 3.29V) / 0.001 = 10A! In practice your power supply probably won't have 10A to give, so the current will be lower; and the LED's Vf isn't 3.29V at 10A anyway (if it were even possible to reach that without the LED burning right away). If the power supply happens to be a little below the LED's Vf, the current will be smaller but not zero, because at lower currents the LED's Vf is lower too.

And there's also temperature: Vf decreases as temperature increases (~2.6mV/°C), so as more current goes through, the LED heats up and Vf drops, and the lower Vf becomes, the more current goes through (sometimes until destruction, if a thermal equilibrium is not reached - a situation called "thermal runaway", which can take anywhere from seconds to months)!
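
Here's a toy C model of that electro-thermal feedback loop. All the numbers are assumptions chosen for illustration (a -2.6mV/°C Vf coefficient, a Vf of 3.3V at 25°C, and a made-up 100°C/W thermal resistance); real LEDs are more complicated, but the shape of the behavior is the point:

```c
#include <stdio.h>

/* Iterate the feedback loop: hotter LED -> lower Vf -> more current
 * -> more heat -> hotter LED ... */
static void simulate(const char *label, double v_supply, double r_series)
{
    const double vf25 = 3.3;    /* assumed Vf at 25 C                */
    const double tc   = 0.0026; /* assumed Vf tempco, V per degree C */
    const double rth  = 100.0;  /* assumed junction-ambient, C/W     */
    double t = 25.0;

    printf("%s\n", label);
    for (int step = 0; step < 8; step++) {
        double vf = vf25 - tc * (t - 25.0);      /* Vf falls as it heats  */
        double i  = (v_supply - vf) / r_series;  /* Ohm's law for current */
        if (i < 0.0) i = 0.0;
        t = 25.0 + vf * i * rth;                 /* new junction temp     */
        printf("  step %d: Vf=%.3f V  I=%.0f mA  T=%.0f C\n",
               step, vf, i * 1000.0, t);
    }
}

int main(void)
{
    /* Direct connection: only ~0.5 ohm of wiring in series -> runs away */
    simulate("3.35 V supply, 0.5 ohm wiring:", 3.35, 0.5);
    /* Higher V plus a real resistor -> settles at roughly 20 mA */
    simulate("5 V supply, 85 ohm resistor:", 5.0, 85.0);
    return 0;
}
```

In the first case the temperature and current grow without bound from one iteration to the next; in the second, the 85 ohm resistor dominates and the loop converges near 20mA.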

When conditions are marginal, LEDs can still light up and work well for seconds, days, weeks, months or even years, maybe with some variation in output intensity. Sometimes it's enough for an unusually hot day to come by to push it into thermal runaway. You know when someone says "geee, I had this working for so long and then one day it just mysteriously stopped working, I have no clue why"? That's the typical result of not using parts properly, of not setting the right working conditions.

So it's not as simple a scenario as it might look. That's why it's better to set good working conditions instead of relying on "it's working fine *now*": use a higher V and an appropriate resistor, or, even better (but usually not necessary for "normal" low-power LEDs), a current source - a small circuit that maintains a constant current through the LED no matter what temperature, V or Vf variations there are (one can be built from one transistor, two resistors and a zener diode).
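
The design math for that one-transistor, zener-referenced current source is short enough to show. A sketch with illustrative values (the 2.7V zener and the 0.65V Vbe are assumptions; 0.65V is a typical silicon BJT base-emitter drop): the zener holds the base at Vz, so the emitter sits at roughly Vz - Vbe, and the emitter resistor sets I = (Vz - Vbe) / Re, largely independent of V and Vf, as long as the supply leaves enough headroom for the transistor.

```c
#include <stdio.h>

int main(void)
{
    /* Illustrative design: NPN transistor, base held at Vz by a zener,
     * emitter resistor Re sets the LED current. */
    double vz    = 2.7;   /* zener voltage, V                      */
    double vbe   = 0.65;  /* typical silicon BJT base-emitter drop */
    double i_led = 0.020; /* wanted LED current, A                 */

    double re = (vz - vbe) / i_led; /* emitter resistor, ohms */
    printf("Re = %.0f ohm -> I = (%.2f - %.2f) / Re = %.1f mA\n",
           re, vz, vbe, (vz - vbe) / re * 1000.0);
    return 0;
}
```

That works out to roughly 100 ohm for 20mA; the nearest standard value would do.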
 
Put simply, an LED is not a voltage-operated device. It is a current-operated device.

So instead of worrying about what voltage to put across it, make a supply for it that will guarantee whatever that particular LED's safe operating current is, while also supplying the minimum voltage necessary to overcome its voltage drop at that current.

Don't power it by a constant-voltage supply. Power it by a constant-current source. ;)
 
Njay said:
In practice your power supply probably won't have 10A to give, so the current will be lower; and the LED's Vf isn't 3.29V at 10A (if that were possible to have without it burning right away).

Not at all. There is a V/I curve for LEDs. It tends to be rather steep, but if you are careful it is workable. Also, almost all lower-end LED flashlights are direct drive off the battery. I run some 60W Bridgelux arrays directly off of A123 LiFePO4 packs; the nominal cell voltage is very close to the LED forward voltage. Works well. I do have an AVR-based dimmer control circuit on most of them. One thing that I do is start backing off the PWM duty cycle if the cell voltage is over 3.2V (the upper 5% of the LiFePO4 cell's charge).
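
The firmware itself isn't posted, but the backoff logic described might look something like this minimal C sketch. Note that read_cell_voltage_mv() and set_pwm_duty() are hypothetical placeholders standing in for the AVR's ADC read and PWM timer compare register, not real API calls, and the backoff slope is invented:

```c
#include <stdint.h>

/* Hypothetical hardware hooks -- stand-ins for an AVR's ADC read and
 * PWM timer compare register, not a real library API. */
extern uint16_t read_cell_voltage_mv(void);
extern void     set_pwm_duty(uint8_t duty); /* 0..255 */

#define BACKOFF_MV 3200u /* start limiting above 3.2 V per cell */

void dimmer_update(uint8_t requested_duty)
{
    uint16_t mv   = read_cell_voltage_mv();
    uint8_t  duty = requested_duty;

    /* Above 3.2 V (the top ~5% of a LiFePO4 cell's charge), scale the
     * PWM duty cycle back so the average LED current stays in bounds. */
    if (mv > BACKOFF_MV) {
        uint16_t over = mv - BACKOFF_MV;      /* mV above the threshold */
        uint16_t drop = (uint16_t)(over / 2); /* crude, assumed slope   */
        duty = (drop >= duty) ? 0 : (uint8_t)(duty - drop);
    }
    set_pwm_duty(duty);
}
```

The steep V/I curve is exactly why this works: trimming the duty cycle a little near the top of the charge keeps the average current from climbing into the danger zone.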
 
Voltage is not a big concern for an LED; the rated voltage is only given so you know roughly what voltage it will sit at. Ideally, LEDs should be current limited. With lower-power stuff this is easy: throw a resistor on it and you can easily get 20mA or so, and 10-20% variations are generally fine.

With high-power stuff, you often want to put as many LEDs in a series string as you can, and get a driver to feed them all constant current. Flashlights and the like really only get away with no regulation because the batteries used have high internal resistance, which limits the current fairly well. The voltage still needs to be close to the Vf of the LED, unless you are dealing with something like button cells. Those are incapable of delivering much power at all, so they often use two 3V cells in series to drive a single ~3.5V white LED. It's not really being driven at 6V; there's huge voltage sag effectively limiting the current.
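
The button cell case is easy to sanity-check numerically. A sketch with assumed values (coin cell internal resistance varies a lot with chemistry, size and state of charge; ~20 ohm per cell is just a ballpark):

```c
#include <stdio.h>

int main(void)
{
    /* Two 3 V lithium coin cells in series driving one white LED.
     * The per-cell internal resistance is an assumption (~20 ohm);
     * real cells vary widely and it rises as they discharge. */
    double v_open = 2.0 * 3.0;  /* open-circuit voltage, V         */
    double r_int  = 2.0 * 20.0; /* total internal resistance, ohm  */
    double vf     = 3.5;        /* white LED forward drop, V       */

    /* The cells' own sag does the current limiting, no resistor.
     * (The real current is lower still, since Vf rises above 3.5 V
     * as the current climbs.) */
    double i = (v_open - vf) / r_int;
    printf("I = (%.1f - %.1f) / %.0f ohm = %.0f mA\n",
           v_open, vf, r_int, i * 1000.0);
    return 0;
}
```

So the LED never actually sees anything like 6V: the internal resistance eats most of the difference, which is why these direct-drive "throwie" setups survive at all.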

The current through an LED tends to go up VERY VERY rapidly as you increase the input voltage. If the Vf is 3.5V, 3.6V might be enough to blow it. It all depends on the LED, but you don't care about Vf much at all. Current is what matters here.
 
texaspyro said:
Njay said:
In practice your power supply probably won't have 10A to give, so the current will be lower; and the LED's Vf isn't 3.29V at 10A (if that were possible to have without it burning right away).

Not at all. There is a V/I curve for LEDs. It tends to be rather steep, but if you are careful it is workable. Also, almost all lower-end LED flashlights are direct drive off the battery. I run some 60W Bridgelux arrays directly off of A123 LiFePO4 packs; the nominal cell voltage is very close to the LED forward voltage. (...)

I was talking about normal low-power LEDs. "Very close" to which LED? Even white ones have different Vf "levels" depending on manufacturer and so on. Too many ifs. I don't want to be careful; I want to have it working and forget about it. There are too many better things to worry about ;)
 