Just measure the voltage, not the amperage. You are only supposed to measure current with a load attached, with the ammeter wired in series between the transformer and the load. Connecting the multimeter directly across the secondary to measure current effectively short-circuits the winding through the meter, so the readings you get are useless. Whatever load you attach will draw only the current it needs to run; it will not draw the 4.5 or the 3.2 amps you measured. The current rating on a transformer is only there to say "be sure the load you attach doesn't draw more current than this."
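To make the "load draws only what it needs" point concrete, here is a minimal sketch with hypothetical numbers (a 6 W lamp on a 12 V secondary; both values are assumptions, not from the original post):

```python
# Hypothetical example: a 6 W, 12 V lamp on a transformer rated for 4 A.
# The lamp draws only the current it needs, not the transformer's rated maximum.
lamp_power_w = 6.0    # assumed lamp rating
secondary_v = 12.0    # assumed secondary voltage
lamp_current_a = lamp_power_w / secondary_v
print(lamp_current_a)  # 0.5 A -- well under the 4 A the transformer could supply
```

The transformer's rating is a ceiling, not a prediction of what the meter will read.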

Ideally the voltage from the center tap to each end should be exactly half the full secondary voltage. In the real world, the voltage from one side to the center tap will be slightly different from the other side to the center tap, but not by enough to make any real difference.
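As a quick numeric check, here is a sketch using the 12.6-0-12.6 V transformer from the question below; the two slightly unequal half-winding readings are hypothetical illustrative values:

```python
# A 12.6-0-12.6 V secondary: the two halves should sum to the end-to-end
# voltage, and each half should be close to half the total.
end_to_end_v = 25.2
half_a = 12.7  # hypothetical measurement, one end to center tap
half_b = 12.5  # hypothetical measurement, other end to center tap

assert abs((half_a + half_b) - end_to_end_v) < 0.1
imbalance_pct = abs(half_a - half_b) / end_to_end_v * 100
print(f"imbalance: {imbalance_pct:.1f}%")  # well under 1%, negligible in practice
```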

If the transformer has any markings or part numbers on it, you can search Yahoo or Google for them to find its specifications on a data sheet. If it doesn't, see if it has a power rating printed on it, something like 300W for 300 watts or 100VA for 100 volt-amps (not quite the same thing as watts). That rating is the most the transformer can supply to the load. From it and a little math you can get the maximum current you can safely draw, though you may not even need to do this if the load you attach uses only a little current.
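The "little math" is just dividing the VA rating by the secondary voltage. A minimal sketch, assuming a hypothetical 100 VA transformer with the 25.2 V full secondary (12.6-0-12.6) discussed below:

```python
# Sketch: maximum secondary current from a VA rating (assumed values).
# For a center-tapped winding, use the full end-to-end voltage when
# computing the full-winding current limit.
va_rating = 100.0   # e.g. "100VA" printed on the transformer (assumed)
secondary_v = 25.2  # full secondary voltage for a 12.6-0-12.6 winding
max_current_a = va_rating / secondary_v
print(round(max_current_a, 2))  # ~3.97 A maximum the load may draw
```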

If this is all new to you I recommend studying a little more before you proceed with your transformer project.

Josh

> My apologies for having to ask the same question again, but despite
> the several helpful responses, I am still stumped.
>
> I have many different power transformers and am trying to determine
> their current ratings so I can select an appropriate one for a
> project.
>
> As I have not been able to get consistent results using various
> methods, I decided to try measuring a power transformer that I do have
> the specifications for. It is a Radio Shack (part number 273-1366A)
> center tapped transformer, 12.6-0-12.6V 0.45A.
>
> What I don't understand is when I apply ~120VAC to the primary, the
> secondary voltage reads approximately 14.0V, 4.5A between secondary
> and center and 28.15V, 3.2A between secondaries.
>
> So, with this transformer being rated at 0.45A, how and why am I
> seeing >4A on a multimeter?
>
> Again, the whole point of my exercise is to find a method of
> determining the approximate current rating of different power
> transformers (generally small transformers, so approximating by weight
> is not too practical).
>
> Thanks again,
>
> Sean Mathias