I have a matrix of LEDs, six rows and seven columns:
X X X X X X X
X X X X X X X
X X X X X X X
X X X X X X X
X X X X X X X
X X X X X X X
Across the rows, the LEDs' anodes are common, and each row has a single microcontroller pin.
Down the columns, the LEDs' cathodes are common, and each column has a single microcontroller pin.
The row pins have to source 25 mA, which isn't a problem for the µC. The column pins, however, need to sink 150 mA (150 mA is the current that will flow if every LED in a column is lit).
Because the column pins can only sink a maximum of 25 mA, I was going to use a transistor to provide the extra current. Since I have 7 column pins though, I'm going to need 7 transistors, which is why I'm thinking of going with a "transistor array".
I'm just wondering: is there any "bread and butter" chip that people use for this? I've had a quick look at the ULN200X family and they look kind of suitable. If I were picking my own transistors, I'd use FETs as switches (instead of bipolars), but I suppose if the driver chip has a base resistor inside it, and if there's negligible current flow when it's turned off, then maybe a BJT array will be OK.
(Of course I'd be wise to switch the rows and columns so that I only need 6 transistors, but I'll get on to that later)
So what would you use as a "driver chip"? My microcontroller pins work at 5 V for high and 0 V for low.