Actually, you still need the resistor even when the supply voltage matches the LED's rated voltage, albeit a small one. It limits current in addition to dropping voltage.
Right. It's the correct way to get uniform brightness among the set.
It serves the safety purpose, but you get variability in brightness because each LED will not draw exactly the same current. So some will be dimmer than others.
The second thing is that the total current through the resistor is the sum of the currents through each LED. So for your example, the resistor above would need to pass 60mA to feed the LEDs downline. The problem is that if one LED burns out, that current gets pushed through the remaining working LEDs, which can cause them to fail as well.
Use the single-LED resistor value divided by the number of LEDs across it.
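To make the parallel case concrete, here's a minimal sketch of that rule. The supply voltage and Vf values are assumptions for illustration, not from the thread:

```python
# Sketch: sizing one shared resistor for N identical LEDs in parallel.
# Supply of 5V and Vf of 3.6V are assumed example values.
def shared_resistor(v_supply, v_f, i_led, n_leds):
    """Resistor for n_leds identical LEDs in parallel behind one resistor."""
    single = (v_supply - v_f) / i_led   # value for a single LED
    return single / n_leds              # N LEDs pull N times the current

# 3 LEDs at 20mA each: the resistor must carry 60mA total.
r = shared_resistor(5.0, 3.6, 0.020, 3)
print(round(r, 1))  # 23.3 ohms
```

Note the downside the thread describes: the resistor is sized for the full 60mA, so if one LED fails open, the other two absorb the extra current.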
Right. But see the caveat above.
A better way to do it is to bump the voltage up and put the LEDs in series like this:
+V -> resistor -> LED1 -> LED2 -> LED3 -> GND

You treat the equation the same as for a single LED, except that you add up the Vf voltages of the LEDs. So for your example the total voltage of the LEDs would be 10.8V, and the resistor you'd use with a 12V +V would be:
(12V - 10.8V) / .02A = 60 ohms.
But there are significant differences:
1) Each LED will draw the same current, so you'll get uniform brightness.

2) If a single LED (or the resistor) fails, the whole string goes out with only the single failed component, instead of going into cascade failure from the increasing overload on the survivors.

When you do use a single resistor, that's the better way to handle it.
BTW, thanks for crossposting this message. There are many who don't understand the concept. But I think I'll limit my replies to basics and misc, which are appropriate.
BAJ