LED Lights: A Few Facts


Let's try to keep this thread on point, which is to discuss facts about lights that others who haven't been as interested in lights may not be aware of.

There is a place for graceful degradation. It can be useful in a backup light. You're free to disagree or to have other preferences, but it's not a fact that there is no place for it. Again, there is a useful place for unregulated lights as well as regulated ones. Anything beyond that is opinion or preference.

Also, it's good that changes to industry standards have been brought up, but again, let's not make this thread primarily a debate about that.

I think everyone has provided valuable contributions to help clear up some of the confusion on this subject.

I don't see how anything we have discussed here is off topic. We have touched on ratings, regulations, testing labs, fundamentals of operation... maybe we should keep this limited to LED light output ratings and not touch on battery runtime ratings? I think both are huge topics.

Also, just to clarify: in my earlier post I mentioned that I agree that graceful degradation is good, but only as long as you get your runtime at the specified output first. I would not plan a dive around graceful degradation.

Some of these LED canister lights are $1000+... that is no cheap flashlight.
 

I'm not trying to stifle the conversation, just making a few suggestions.

I don't think you can have a specified output with degradation, can you? I suppose you could look at a chart, see where the output would be after an hour, and plan around that.

For a backup light you aren't really planning around output anyway, just planning for it to be there when you unexpectedly need it.

Canister lights are generally constant current (aren't they?) and aren't usually considered backup lights.

I don't have any problem with any proposed standard, even if it's a standard of max lumens at the end of the stated runtime. That's just not something that exists at the moment, and I'm just trying to use this thread to explain light issues (including battery issues) as they exist.
 

That suggestion would work. Then you would get the runtime at the minimum specified output for that runtime. LEDs are also much brighter for the first 10 minutes after you turn them on; they need about 10-20 minutes of stabilization to reach thermal equilibrium. This would address all those issues.
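
As a rough illustration of that rule, here is a minimal Python sketch, assuming you have logged (minutes, lumens) samples for a light; the function name, warm-up window, and all numbers are hypothetical, not from any standard:

# Hypothetical sketch: find the runtime under the proposed rule -- the time
# at which output first falls below the minimum specified level -- while
# ignoring an initial warm-up window before the LED reaches thermal
# equilibrium.

def runtime_at_spec(samples, spec_lumens, warmup_min=15):
    """samples: list of (minutes, lumens) tuples, sorted by time.
    Returns the first time (after warm-up) the output drops below spec,
    or the last sample time if it never does."""
    for minutes, lumens in samples:
        if minutes < warmup_min:
            continue  # skip the bright initial period before stabilization
        if lumens < spec_lumens:
            return minutes
    return samples[-1][0]

# Made-up log for a light specified at 800 lumens:
log = [(0, 950), (10, 900), (20, 860), (60, 830), (110, 810), (125, 760)]
print(runtime_at_spec(log, spec_lumens=800))  # -> 125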

Also, there is one other area I think we should discuss regarding LEDs: reliability.

Here is another industry misconception and lie: LEDs last for 50,000 hours.

This is not true; that figure is the MTTF (Mean Time To Failure) for an LED to reach its 70% lumen maintenance level (L70).

This is not based on an individual LED; it is based on the aggregate number of LEDs in operation in the field. That is to say, if you have 50,000 LEDs in operation, one will fail every hour, on average. Failure is defined as the point at which the LED can only put out 70% of its rated output. The MTTF number has no bearing on an LED's total useful life, just on how many random failures you would see during its useful life period.
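
A quick arithmetic check of that interpretation, as a minimal Python sketch assuming a constant (exponential) failure rate, which is the usual simplification behind MTTF figures:

# MTTF of 50,000 hours implies a failure rate of 1/50,000 per LED per hour.
MTTF_HOURS = 50_000
failure_rate = 1 / MTTF_HOURS  # failures per LED per hour

# With 50,000 LEDs in the field, the expected failures per hour of operation:
fleet_size = 50_000
print(fleet_size * failure_rate)  # -> 1.0 failure per hour, on average

# Note what this does NOT say: it is not a promise that any individual LED
# runs 50,000 hours before dropping to 70% of rated output.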

To anticipate how long an LED will last, a test such as IESNA LM-80 would need to be performed on the entire luminaire at the actual operating current; it is entirely lighting-system specific (the better the heatsink and the lower the drive current, the longer the life).

An LED will degrade over its life based on the operating temperature it is subjected to (failure mechanisms in LEDs are another huge topic which I won't get into).

Hope that helps clarify another industry obfuscation of the truth.
 
In practical terms, however, an LED operated within reasonable specs will last longer than most other alternatives, correct?

By the time an LED operating in such an environment reaches 70%, the dive light it is installed in is likely to have failed for other reasons, or the combined system is likely to be functionally obsolete, correct?
 
Some folks in the flashlight forum have spent considerable effort/time trying to quantify the differences between actual and "claimed" lumens, and even made their own DIY Integrating Sphere to gather some useful comparison data.

Here is the post describing the building and usage of the IS:
Building an Integrating Sphere ... - CandlePowerForums


and here is a post, very similar to this one here, talking about these differences:
Lumen Readings - Actual vs Factory claimed - CandlePowerForums


Will
 
In practical terms, however, an LED operated within reasonable specs will last longer than most other alternatives, correct?

By the time an LED operating in such an environment reaches 70%, the dive light it is installed in is likely to have failed for other reasons, or the combined system is likely to be functionally obsolete, correct?

For your first question:
You don't know until you look at the Tj (junction temperature of the LED). Then you can figure out the L70. The Tj will depend on the thermal design and the operating current of the system.

There is no answer to your question unless you look at the operating environment and use conditions. If the assumption is dive lighting, and we assume the light is under water and the heatsink has a low thermal resistance, then yes, that would be a correct assumption.
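
For illustration only, here is a minimal sketch of that Tj estimate as a simple series thermal-resistance stack; every component value below is made up, not a spec for any real light:

# Hypothetical steady-state estimate: Tj = T_ambient + P * (sum of thermal
# resistances from junction to ambient). All values are illustrative.

def junction_temp(t_ambient_c, power_w, r_junction_case, r_case_sink,
                  r_sink_ambient):
    """Steady-state junction temperature in deg C for a series stack."""
    r_total = r_junction_case + r_case_sink + r_sink_ambient  # deg C per watt
    return t_ambient_c + power_w * r_total

# Diving case: ~10 C water and a low sink-to-ambient resistance because the
# housing is effectively water-cooled; 10 W dissipated in the LED.
print(junction_temp(10.0, 10.0, r_junction_case=3.0, r_case_sink=1.0,
                    r_sink_ambient=2.0))  # -> 70.0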

For your second question:
Again, the MTTF has no bearing on the end of life for the product. This is application specific. You need an ALT (Accelerated Life Test) or IESNA LM-80 to see what the EOL (End Of Life) would be. There is no way to make that statement absent information on the EOL for the luminaire, the control electronics, and the mechanical housing.
 
Some folks in the flashlight forum have spent considerable effort/time trying to quantify the differences between actual and "claimed" lumens, and even made their own DIY Integrating Sphere to gather some useful comparison data.

Here is the post describing the building and usage of the IS:
Building an Integrating Sphere ... - CandlePowerForums


and here is a post, very similar to this one here, talking about these differences:
Lumen Readings - Actual vs Factory claimed - CandlePowerForums


Will

Wow, that is interesting and inventive. Any idea what testing standard they are using? I do not see a reference to one. IESNA LM-79 would be my preferred method.
 

Further down in those threads they do talk about the need for an accurate reference light source, but I don't know if it was implemented. One of the guys following that thread also created his own IS, but had access to fancier equipment, and his measured values were within a small percentage of known lumen values:
Inspired by Precisionworks, I built a 12" Isphere ...


Maybe the resulting numbers were not "spot on", but it seems to have been a valuable resource to compare lights and make generalized comments about their relative output.
 
While your comments make sense for a direct-drive or conventional halogen light, they make no sense for DC-DC converter-driven LED drivers that run at constant current. As the battery voltage drops, the current can stay constant.

In an ideal theoretical world, yes... but in the real world, any constant-current regulated light will eventually drop out of regulation, because batteries are not an infinite power source.

How long a light will run out of regulation depends mostly on the power source. If you use protected li-ion cells, they will just cut off with no warning. Unprotected li-ion cells fall off a short, steep voltage curve. Alkaline cells decline down a very long, shallow curve. Lithium primaries and NiMH cells lie between these extremes.

The important thing is that when most lights drop out of regulation, they continue to deliver imperceptibly lower levels of light for a significant time. And the amount of light that we see is what matters.

When you suggest "-15% rated lumen output" as the runtime cutoff, I'm guessing that you actually mean "-15% perceived light output". They are not the same: the human eye responds non-linearly to light intensity.

Put simply, your eyes/brain cannot see the difference between 100% lumens and 85% lumens in most cases. Lumens must be doubled (200%) or halved (50%) before you will really notice a difference. As a general rule of thumb, the lumens must be about five times higher for a light to look twice as bright: a 1000-lumen light looks about twice as bright as a 200-lumen light. Non-intuitive, but that's the way our eyes work.

So...assuming you were talking about perceived brightness, 50% of lumen output is right on target.
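
If you want to play with that rule of thumb, here is a minimal Python sketch that expresses it as a power law, with the exponent fitted to the "5x lumens looks twice as bright" rule above rather than to any photometric standard:

import math

# Perceived brightness ~ lumens**k, with k chosen so that a 5x lumen ratio
# gives a 2x perceived-brightness ratio (the rule of thumb above).
k = math.log(2) / math.log(5)  # ~0.43

def perceived_ratio(lumen_ratio):
    """Approximate perceived-brightness ratio for a given lumen ratio."""
    return lumen_ratio ** k

print(perceived_ratio(0.85))  # ~0.93 -- a 15% lumen drop looks like ~7%
print(perceived_ratio(5.0))   # ~2.0  -- 1000 lm vs 200 lm looks twice as bright
print(perceived_ratio(0.50))  # ~0.74 -- half the lumens is clearly dimmer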

By the way, this is not my idea. The guys in the flashlight communities like candlepowerforums came to the same conclusion long ago...and they demand this rating from the cutting-edge flashlight manufacturers:

runtime = time to 50% of rated lumens

It makes good sense. Think about it. :)

J
 
Further down in those threads they do talk about the need for an accurate reference light source, but I don't know if it was implemented. One of the guys following that thread also created his own IS, but had access to fancier equipment, and his measured values were within a small percentage of known lumen values:
Inspired by Precisionworks, I built a 12" Isphere ...


Maybe the resulting numbers were not "spot on", but it seems to have been a valuable resource to compare lights and make generalized comments about their relative output.

Seems like a lot of work for a relative comparison. Why not just use a lux meter at a set distance (say 3 m) and take measurements after a set amount of settling time (say 10 minutes)?

You can use the lux meter to find the peak intensity (in the center), then measure out to where the reading is half that on each side to get the FWHM beam divergence angle. Then plot the results. That would give you a better idea of how to compare lights using only the useful light. What do you think?
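
As a sketch of what that could look like, assuming you sweep the lux meter laterally across the beam at a fixed distance and log (offset, lux) pairs after the settling period; all readings below are hypothetical:

import math

# Estimate the FWHM beam angle from a lateral lux sweep at a fixed distance.
# readings: list of (lateral offset in meters, lux), swept through the
# beam center.

def fwhm_angle_deg(readings, distance_m):
    peak = max(lux for _, lux in readings)
    inside = [offset for offset, lux in readings if lux >= peak / 2]
    width = max(inside) - min(inside)  # half-maximum width at this distance
    # Convert that width into a full divergence angle.
    return math.degrees(2 * math.atan2(width, 2 * distance_m))

# Hypothetical sweep at 3 m:
sweep = [(-0.6, 200), (-0.4, 900), (-0.2, 1700), (0.0, 2000),
         (0.2, 1650), (0.4, 850), (0.6, 180)]
print(round(fwhm_angle_deg(sweep, 3.0), 1))  # -> 7.6 degrees

With only a handful of sample points the estimate is coarse; interpolating between the samples on either side of the half-maximum would sharpen it.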
 
