Will using a resistor in series with a LED to control its voltage increase the total energy expenditure?

by Exocytosis   Last Updated August 13, 2019 18:25 PM

This might sound like a stupid question, but I want a confirmation. I watched a video on YouTube about using LEDs. Those LEDs required a forward voltage of around 2 volts at 20 mA.

In order to power one LED from a 5 volt power supply, the author used a resistor in series. He calculated he needed around 150 ohms (using U = RI: 5 - 2 = 3 volts, 3 V / 20 mA = 150 ohms).

What I find disturbing is that the resistor, in order to control the voltage, must be consuming energy too. P = UI, so 3 V × 20 mA = 60 mW in the resistor, on top of the LED's 2 V × 20 mA = 40 mW. In other words, that adds 150% to the energy actually needed to light the LED.
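Here is the same arithmetic as a quick Python sketch, using the numbers from the video (5 V supply, 2 V / 20 mA LED):

    # Series-resistor sizing and power split for the numbers above.
    V_SUPPLY = 5.0   # volts
    V_LED = 2.0      # volts (LED forward voltage)
    I_LED = 0.020    # amps (20 mA)

    # Ohm's law on the voltage the resistor has to drop: R = U / I
    r_series = (V_SUPPLY - V_LED) / I_LED     # 150 ohms

    # P = U * I for each element
    p_resistor = (V_SUPPLY - V_LED) * I_LED   # 0.060 W = 60 mW wasted as heat
    p_led = V_LED * I_LED                     # 0.040 W = 40 mW in the LED
    overhead = p_resistor / p_led             # 1.5, i.e. +150 %

    print(f"R = {r_series:.0f} ohm, resistor: {p_resistor*1e3:.0f} mW, "
          f"LED: {p_led*1e3:.0f} mW, overhead: {overhead:.0%}")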

Am I missing something, or is it typical to spend extra energy just to be able to use electronic components that require a lower voltage? And a second question: is there a way to avoid doing this for this type of circuit (5 V source, 2 V LED)?



4 Answers


No, you are not missing anything. The energy consumed by the resistor is wasted, but if you were contemplating a circuit that used tens or hundreds of LEDs, you might consider a buck regulator to step the LED circuit's supply voltage down to maybe 3 volts and make a significant net power saving per LED drive.

You'll still need a 50 ohm resistor, but it will only be dropping around 1 volt and dissipating only 20 mW.
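As a rough sketch of that saving, per LED, ignoring the buck converter's own losses (a simplification a real design would have to account for):

    # Resistor value and loss per LED at the original 5 V rail versus a
    # bucked-down 3 V rail, for a 2 V / 20 mA LED. Converter losses ignored.
    V_LED, I_LED = 2.0, 0.020

    for v_rail in (5.0, 3.0):
        r = (v_rail - V_LED) / I_LED      # 150 ohm at 5 V, 50 ohm at 3 V
        p_r = (v_rail - V_LED) * I_LED    # 60 mW at 5 V, 20 mW at 3 V
        print(f"{v_rail:g} V rail: R = {r:.0f} ohm, "
              f"resistor loss = {p_r*1e3:.0f} mW per LED")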

The good news is that many modern LEDs need only a couple of mA to obtain sufficient brightness for “standard” applications.

Andy aka
August 13, 2019 16:45 PM

Yes, that resistor wastes power.

If the author is using an LED for an indicator light, then they're wasting a lot more power through their choice of LED. An LED that needs 20 mA to show up in a brightly lit room is typical of 1970s technology. If you shop for higher-brightness LEDs, you'll blast your eyeballs out at 20 mA, and you'll find yourself throttling the current back to 1 mA or so. One such LED, with a matching resistor, would use 3.3 mW from a 3.3 V supply, whereas a 20 mA, 1.5 V LED alone (never mind the resistor) would use 30 mW.
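The numbers in that comparison work out like this, taking the answer's own figures (a modern LED branch at about 1 mA from 3.3 V, versus a 20 mA, 1.5 V LED counted on its own):

    # Power comparison using the figures quoted above.
    p_modern_branch = 0.001 * 3.3     # 1 mA from a 3.3 V rail, LED + resistor: 3.3 mW
    p_old_led_only  = 0.020 * 1.5     # 20 mA through a 1.5 V LED, LED alone: 30 mW
    print(f"modern branch: {p_modern_branch*1e3:.1f} mW, "
          f"old-style LED alone: {p_old_led_only*1e3:.0f} mW")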

The ultimate way to reduce the circuit's power consumption would be to use the most efficient LEDs you can find and power them with a switching converter. A decent switching converter has somewhere between 80% and 95% efficiency, so you'd use between 5% and 25% more power than the LED alone. But you'd need one converter per LED (or LED string), and it's hard to justify a super-efficient switching converter for each indicator light.
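A quick way to see where the 5% to 25% figure comes from: input power is LED power divided by efficiency, so the overhead is 1/efficiency - 1 (the 3.3 mW LED power below is just an illustrative value):

    # Converter overhead at the efficiencies mentioned above (80 % and 95 %).
    p_led_w = 0.0033   # example LED power, 3.3 mW; purely illustrative
    for eff in (0.80, 0.95):
        p_in = p_led_w / eff
        print(f"{eff:.0%} efficient converter: input {p_in*1e3:.2f} mW "
              f"({1/eff - 1:.0%} above the LED power)")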

TimWescott
August 13, 2019 16:53 PM

Without the resistor to limit current flow, the LED heats up, draws even more current, and burns out. For high-power LEDs, an active circuit that controls the current is used instead, acting much like a switching power supply.

CrossRoads
August 13, 2019 16:54 PM

One point I would like to mention is dedicated LED driver circuits. Their constant-current circuitry can come in handy for saving power when multiple LEDs need to be driven.

Umar
August 13, 2019 17:05 PM
