I'm working on a personal project which will involve pulsing a number of LEDs from a TI MSP430 microcontroller (I'm thinking 5-10 LEDs). Visibility is, naturally, one concern, but keeping power consumption low is also important: I'd like to have this display operate from a single, solar-recharged 1.2V NiMH cell and a "boost driver" like the Zetex ZXLD383.
Which brings me to my question: What pulsing pattern would yield the most efficient use of my limited power while providing a high level of brightness to the human eye?
As for the LED itself, the light intensity it gives off is directly related to the forward current (If) flowing through it. The human eye, on the other hand, seems to be a lot less "picky" about what it considers "bright", as most discussions of LED dimming point out: if you can pulse an LED (or paint a CRT trace) rapidly enough, the LED (or CRT trace) appears to stay lit (the "POV", or "persistence of vision", effect).
I have used PWM in the past for dimming LEDs, but until now I have always had "enough" power so that it wasn't a specific concern. As I wandered around the 'Web trying to find out how to minimize my project's power consumption I ran across descriptions of three methods for pulsing (modulating) LEDs: PWM (pulse-width modulation), FM (frequency modulation), and BAM/BVM (Bit-Angle/Bit Voltage Modulation). These are described in, among other places, Application Notes 9 and 11 from this location:
Artistic License (UK), Ltd
What I haven't been able to find, or perhaps what I skimmed past and failed to recognize, is some sense of how short a pulse the human eye can detect as "bright", and how frequent this pulse needs to be in order to "refresh" or "recharge" the cones/rods/whatever involved in this process. This information might not directly solve my problem, but it might at least give me some guidance on where to explore further.
Discussions of PWM, for example, tend to stop at the minimum frequency required to achieve persistence -- say, 100Hz. What I'm wondering is whether I can get a similar apparent brightness from much shorter pulses -- say, on a 100usec (10kHz) period -- at a lower duty cycle, thus minimizing the average current drawn from my poor, overloaded NiMH cell.

This is not a new concept. The typical IR remote control unit, as I understand it, uses short-pulse, high-current modulation to maximize the range of LED emission (and minimize battery drain), but for that to work the matching IR detector must be capable of detecting and acting on those same short pulses. How short a pulse can the human eye detect?
Is anyone aware of any research in this area that I could consult?
This morning I found a 2008 article which suggests that 60Hz, 5%-duty pulses could significantly improve apparent brightness (though I haven't had a chance to test this):
Human Perception Studied to Double LED Brightness
However, there appears to be some controversy regarding the universality of the "Broca-Sulzer effect" which the article mentions.
Frank McKenney