Hey,
I'm a relative newbie in the world of practical electronics (although I've been studying electronics at school and college for years). I'm pretty solid on the theory, but I'm having trouble with the practical side of things.
The confusion is over batteries.
When analysing circuits on paper, everything is nice and obeys Ohm's law and the whole lot. But I recently discovered that a real battery can only supply a limited amount of current, so in some circuits it won't deliver the current you'd calculate in theory. Also, its terminal voltage will drop if it's asked to supply high currents, I think.
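Here's how I understand the effect, modelled as an ideal EMF in series with an internal resistance (the 9 V EMF and 0.5 ohm internal resistance are made-up values, just for illustration):

```python
def terminal_voltage(emf, r_internal, r_load):
    """Terminal voltage and current when a battery (EMF in series with
    r_internal) drives a resistive load r_load. All values in V / ohms / A."""
    current = emf / (r_internal + r_load)
    return emf - current * r_internal, current

# Heavier loads draw more current, so more voltage is lost across
# the internal resistance and the terminal voltage sags.
for r_load in (100.0, 10.0, 1.0):
    v, i = terminal_voltage(emf=9.0, r_internal=0.5, r_load=r_load)
    print(f"R_load={r_load:6.1f} ohm -> I={i:.3f} A, V_terminal={v:.3f} V")
```

With a 1 ohm load this nominal "9 V" battery only shows 6 V at its terminals, which is the kind of drop I mean.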
Now I need to know exactly what kind of current I can expect from a battery at its rated voltage. In other words, how will I know whether a battery of the correct voltage can supply enough current for a given circuit to work properly, without voltage drops and the like?
Thanks,
Moikel