Re: Luniversity studies
From: Tom Austin
Date: 13 Nov 2008 10:13:41
Message: <491c4425$1@news.povray.org>
Invisible wrote:
>> "normal" ones work at all sorts of voltages, it even varies from piece 
>> to piece quite significantly.  But you never drive an LED directly by 
>> applying a fixed voltage, you always drive it by regulating the 
>> current to a fixed amount (like 20 mA or whatever).  Adding a series 
>> resistor to a raw LED is a quick and crude method of fixing the 
>> operating current.
> 
> This doesn't make sense to me.
> 
> Presumably the resistance of the LED is finite and fixed. How does 
> adding another resistor help? There are several schematics in my 
> electronics kit that involve LEDs and no resistors at all.
> 


Sorry I've been away and missed the fun.


I hook up an LED this way:

An LED will always drop or *consume* a roughly fixed voltage - again, 
that's voltage, not current.
An LED is NOT a resistor - in the most basic sense it does very little 
to limit the current flowing through it once current starts to flow - 
and current starts to flow at its rated voltage drop.

The typical voltage drop for a basic red LED is 1.7 volts.
For most LEDs you can use this number for calculations.
Be aware that different colors may have a higher voltage drop.
An LED will generally not light up unless the voltage across its leads 
goes above its drop.
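
For reference, here are rough ballpark forward drops by color.  These 
figures are mine, not from the post, and real parts vary - check the 
datasheet for your actual LED.  A quick Python sketch:

# approximate forward voltage drop by color, in volts
# (illustrative ballpark figures only - check your datasheet)
LED_DROP_V = {
    "red":    1.7,
    "yellow": 2.0,
    "green":  2.2,
    "blue":   3.2,
    "white":  3.2,
}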

An LED likes to have between 10mA and 20mA through it - to be safe you 
can use 10mA in calculations.  The current will affect brightness, but 
too much current will cause it to burn out.  You are better off 
selecting a *bright* LED with a high light-output rating than trying to 
get more light by putting more current through it.


So some easy calculations:

let's say a 5v power source
1.7v LED
10mA current

the LED drops 1.7v so the rest of the circuit will drop 3.3v
(5v)-(1.7v)=(3.3v)

If you don't put a resistor in, then the current through the circuit 
will be:
I=V/R   from V=IR

I=(3.3v)/(0 ohm)
not good - effectively unlimited current - things get hot
	(note:  LEDs should not get hot)

we want the current (I) to be about 10mA
so R=V/I   again from V=IR

R=(3.3v)/(10mA)
R=330 ohm
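
If you like, here is the same arithmetic as a minimal Python sketch 
(the function and names are mine, just for illustration):

def series_resistor(supply_v, led_drop_v, current_a):
    """Ohm's law: R = V/I, where V is what's left after the LED's drop."""
    excess_v = supply_v - led_drop_v   # voltage the resistor must drop
    return excess_v / current_a        # resistance in ohms

print(round(series_resistor(5.0, 1.7, 0.010)))   # -> 330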


330 ohm happens to be a common resistor value
so put one in

If your source is 12v then you need a 1,030 ohm resistor - not a common 
value.  But 1,000 ohm is, so use it instead.
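
Picking the nearest common value can be automated too.  Here's a 
hypothetical helper (my code, not from the post) that snaps to the 
standard E12 (10% tolerance) series, which is where values like 330 
and 1,000 ohm come from:

E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

def nearest_e12(r_ohms):
    """Closest standard E12 resistor value to r_ohms."""
    decade = 1.0
    while decade * 10 <= r_ohms:
        decade *= 10
    candidates = [v * decade for v in E12] + [10 * decade]
    return min(candidates, key=lambda c: abs(c - r_ohms))

print(round(nearest_e12(330)))    # -> 330
print(round(nearest_e12(1030)))   # -> 1000

Going a little below the calculated value (1,000 instead of 1,030) just 
means slightly more current - about 10.3mA here - still well in range.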

Typical resistor values are only +-10% anyway - so a 330 ohm resistor 
could really be as low as 297 ohm or as high as 363 ohm and you 
wouldn't know it.

So picking something slightly off from the *calculated* value is OK.
After all, you are just trying to make an LED light up - not trying to 
maximize its brightness.
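
You can sanity-check that with the tolerance extremes (same 5v / 1.7v / 
330 ohm numbers as above):

excess_v = 5.0 - 1.7   # voltage across the resistor
for r in (330 * 0.9, 330, 330 * 1.1):   # -10%, nominal, +10%
    print(f"{r:.0f} ohm -> {excess_v / r * 1000:.1f} mA")

# 297 ohm -> 11.1 mA
# 330 ohm -> 10.0 mA
# 363 ohm -> 9.1 mA

Anywhere from about 9mA to 11mA - the LED lights fine either way.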



Tom