Old 25-07-07, 03:02 PM   #19
embee
Default Re: LED indicator advice

Quote:
Originally Posted by 2mths View Post
What I don't understand (seriously) is why so much power needs to be dissipated by the resistor?
In very simple terms, normal indicator lamps are rated around 21 W nominal, which on a (again nominal) 12 V supply means a current draw of 21/12 = 1.75 A per lamp, so 3.5 A total when two lamps are lit.

On this nominal basis each lamp works like a resistor of R = V/I = 12/1.75, or about 6.8 Ohms. Note that if you measure the resistance of a lamp filament when cold it will be a lot less than 6.8 Ohms; the resistance increases as the filament heats up.
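To put numbers on the above, here's a quick sketch (my own illustration, using the nominal 21 W / 12 V figures from the post; a running bike is really nearer 13-14 V):

```python
# Back-of-envelope figures for one 21 W indicator lamp on a nominal 12 V supply.
LAMP_POWER_W = 21.0
SUPPLY_V = 12.0

current_a = LAMP_POWER_W / SUPPLY_V        # I = P / V, about 1.75 A per lamp
hot_resistance_ohm = SUPPLY_V / current_a  # R = V / I, about 6.9 ohms hot

print(f"Current per lamp: {current_a:.2f} A")
print(f"Hot filament resistance: {hot_resistance_ohm:.1f} ohms")
```

The 6.9 ohms rounds down to 6.8, which conveniently is a standard resistor value.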

An LED takes minimal current (and therefore power).

Flasher relays are intentionally designed so that if the current draw isn't what it should be, they flash at a fast rate to let you know something is wrong (i.e. a lamp has failed).

With LEDs on a standard relay you need to provide the extra load, so an extra 6.8 Ohm resistor (which happens to be a standard value, often written 6R8) at each lamp position will do it.

Remember that each lamp was rated at 21 W, so the resistor has to dissipate that sort of heat whenever it is powered. But of course the lamps are only on about half the time while flashing, so the average power is around 10 W. That will load a 10 W rated resistor pretty heavily, but it will probably survive; 25 W would be a safer choice if the extra cost isn't too much.
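The dissipation figures work out like this (my own sketch, assuming a nominal 12 V supply and a roughly 50% on-time while flashing):

```python
# Heat dissipated by a 6.8 ohm dummy-load resistor across a nominal 12 V supply,
# derated by the flash duty cycle (roughly half on, half off).
SUPPLY_V = 12.0
R_OHM = 6.8
DUTY = 0.5  # assumed on-time fraction while the indicators are flashing

peak_w = SUPPLY_V ** 2 / R_OHM  # P = V^2 / R while the lamp side is powered
avg_w = peak_w * DUTY           # average over the whole flash cycle

print(f"Peak dissipation: {peak_w:.1f} W")    # roughly 21 W
print(f"Average dissipation: {avg_w:.1f} W")  # roughly 10-11 W
```

Hence the suggestion that a 10 W resistor is marginal and 25 W gives comfortable headroom.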

The alternative is a flasher relay intended for LEDs, i.e. one without the failed-lamp detection that expects a 3.5 A load.
__________________
"Artificial Intelligence is no match for natural stupidity"

Last edited by embee; 25-07-07 at 03:05 PM.