I'm trying to fade some lights off slowly using a MOSFET (see my separate posting "Fading Lights" from 22 July) driven from a PIC and 74HC595 shift registers, and I've run into some strange results.
I actually have 7 identical MOSFETs with 7 strings of lights attached, each string drawing a different amount of current. Each
74HC595 output runs via a diode to one MOSFET gate. The lights get turned on at different times, then I want them all to fade off together.
To create the fade I added a small cap (0.001uF) to each MOSFET gate. The first thing I noticed was that the higher the load current, the faster the lights turned off, so I had to vary the cap sizes to get a uniform fade time across the 7 strings.
The smallest string of lights, however, was still turning off too slowly even with a 0.0001uF cap, so I removed the cap completely. To my surprise that INCREASED the turn-off delay - so much so that the MOSFET overheated badly. Why the increase in turn-off time? Could it be that leakage through the external cap was actually what drained the MOSFET's own gate capacitance, once the external cap had discharged?
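To put a rough number on that theory (every value here is a guess for illustration, not a measurement - I don't know my actual Ciss or leakage):

```python
# Back-of-envelope check: with the external cap removed, only the MOSFET's
# own input capacitance holds the gate charge, and with the diode blocking
# the 74HC595, it can only bleed off via leakage.
Ciss = 1e-9        # assumed ~1 nF MOSFET input capacitance (guess)
I_leak = 1e-9      # assumed ~1 nA total leakage off the gate node (guess)
dV = 3.0           # assumed drop from 5 V logic down to a ~2 V threshold

# Constant-current discharge of a capacitor: t = C * dV / I
t_off = Ciss * dV / I_leak
print(t_off)  # 3.0 -> whole seconds spent in linear mode, plenty to overheat
```

If numbers anywhere near these are right, a leaky external cap would indeed be a much faster discharge path than the bare gate has on its own, which would explain the longer turn-off without it.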
Next problem: increasing the size of the gate caps does not lengthen the fade itself. It simply keeps the lights at almost full brightness for longer, after which they die very rapidly. The actual fade time stays constant at about 0.7 seconds no matter how big the caps are (I need about 1.5 seconds). I am therefore thinking of reverting to transistors instead to try to get a slower fade.
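For what it's worth, this constant fade time is not what a simple RC discharge predicts, which is what puzzles me. A quick sketch (the resistance and gate voltages are illustrative assumptions, since I haven't measured the real discharge path):

```python
import math

V0 = 5.0       # assumed 5 V gate drive from the 74HC595
R = 1e6        # assumed 1 Mohm effective bleed resistance (illustrative)
C = 0.001e-6   # the 0.001uF external gate cap

def t_to(v, c=C):
    """Time for an ideal RC discharge to fall from V0 to v: t = R*c*ln(V0/v)."""
    return R * c * math.log(V0 / v)

# "Fade" window: gate falling through the region where the lights visibly
# dim, assumed here to be 3 V down to a 2 V threshold.
fade_1x = t_to(2.0) - t_to(3.0)
fade_2x = t_to(2.0, c=2 * C) - t_to(3.0, c=2 * C)
print(fade_2x / fade_1x)  # 2.0: ideal RC says doubling C doubles the fade
```

So if the discharge were a clean RC, doubling the cap should double the visible fade as well as the delay - the fact that only the delay grows suggests the discharge near threshold is dominated by something other than the cap.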
(I am driving it all from an 8-pin PIC (Picaxe), but I can't use PWM for various reasons.)