I have some 1 W LEDs that I want to flash on and off.
I have an LM317 current-regulator supply set at 300 mA for three of them in series.
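For reference, here's the arithmetic behind that 300 mA setting. In constant-current mode the LM317 regulates its ~1.25 V reference across the set resistor, so I = 1.25 V / R. This is just a sketch of that calculation; the 1.25 V figure is the datasheet typical, and the resistor wattage suggestion is my own margin call:

```python
# LM317 constant-current mode: I_out = V_ref / R_set
V_REF = 1.25    # LM317 reference voltage, volts (datasheet typical)
I_OUT = 0.300   # target LED string current, amps

r_set = V_REF / I_OUT   # current-set resistor, ohms
p_r   = V_REF * I_OUT   # power dissipated in that resistor, watts

print(f"R_set = {r_set:.2f} ohms")  # about 4.17 ohms
print(f"P_R   = {p_r:.3f} W")       # about 0.375 W, so a 1 W resistor gives margin
```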
Now, I know there are a lot of professional designers in this group, and I was wondering if I could get an opinion on my approach, as it makes sense to me.
I know many decisions are made based on parts cost, but either way I do this the cost is about the same, so cost isn't what I'm concerned with. It's just whether I'm making the right choice.
I initially thought I'd use a 555 timer to flash an optoisolator's LED at a constant rate, control on/off via the opto LED's ground, then use the opto output and a sink transistor to switch the DC path.
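For anyone checking the flash rate, these are the standard 555 astable timing equations. The R1/R2/C values below are example numbers I picked for roughly a once-per-second flash, not anything from my actual breadboard:

```python
# Standard 555 astable equations:
#   t_high = 0.693 * (R1 + R2) * C
#   t_low  = 0.693 * R2 * C
R1 = 10e3    # ohms, example value
R2 = 100e3   # ohms, example value
C  = 10e-6   # farads, example value

t_high = 0.693 * (R1 + R2) * C   # output-high time, seconds
t_low  = 0.693 * R2 * C          # output-low time, seconds
freq   = 1.0 / (t_high + t_low)  # flash rate, hertz

print(f"flash rate = {freq:.2f} Hz")  # about 0.69 Hz with these values
```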
Cons: what if something changed in the transistor and disturbed the DC series circuit? Could I get an on-resistance that would be low enough and stay within a specific window? With 300 mA flowing through, there's some potential for disaster.
Why not this instead, which makes more sense to me:
Control the AC to the transformer with the same setup: flash a triac-driver opto's LED with a 555, control on/off via the opto LED's ground, and use a triac to switch the transformer primary on and off.
That's what I have breadboarded now, and it works fine.
Don't you think that's the proper way to go as a design decision? It lets the constant-current supply stay stable and avoids the risk.
The only risk with the AC approach is an AC line surge, but I'll use a 400-600 V triac rated for 1 A or so.
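A rough sanity check on that 1 A rating: the steady-state primary current for this load is tiny. The forward drop per LED, line voltage, and overall supply efficiency below are my assumptions, not measured values:

```python
# Rough primary-current estimate for the triac-switched transformer.
# Assumptions (not measured): ~3.5 V forward drop per 1 W white LED,
# 120 V RMS line, ~70% overall transformer + regulator efficiency.
V_F    = 3.5     # volts per LED, assumed
N_LEDS = 3       # LEDs in series
I_LED  = 0.300   # amps, from the LM317 setting
V_LINE = 120.0   # volts RMS, assumed
EFF    = 0.70    # assumed overall efficiency

p_load    = V_F * N_LEDS * I_LED     # power in the LED string, watts
i_primary = p_load / (EFF * V_LINE)  # RMS primary current, amps

print(f"LED load      = {p_load:.2f} W")          # about 3.15 W
print(f"primary I_rms = {i_primary*1000:.0f} mA") # tens of mA, well under 1 A
```

Note the steady-state number isn't the whole story: switching a transformer primary on at an arbitrary point in the AC cycle can draw a magnetizing inrush many times the running current, so the triac's surge (I_TSM) rating matters more here than the 1 A steady rating.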