I have a quick question that has been puzzling me. I am trying to calculate the internal power dissipated inside an op amp. One way of doing this is to determine the power dissipated in the load of the op-amp, then subtract that from the power delivered by the DC supply. The difference is the power dissipated in the output stage of the op-amp.
I have found several articles that touch on this subject, and they all state that the power delivered by the DC supply is the DC voltage multiplied by the average (not RMS) current.
My question is: why is the average current used here, rather than the RMS current as in the load calculations?
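To make the comparison concrete, here is a quick numerical sketch of the two calculations I am comparing. All the numbers are made up for illustration, and I am assuming the current drawn from the positive rail looks like a half-wave rectified sine (as in a class-B output stage); the point is just that the supply voltage is constant while the current varies:

```python
import math

# Hypothetical values, for illustration only.
V_SUPPLY = 15.0   # constant DC rail voltage (V)
I_PK = 0.010      # peak sinusoidal output current (A)
N = 100_000       # samples over one full signal period

# Assumed current drawn from the positive rail: it only sources current
# on the positive half-cycle (half-wave rectified sine).
i = [max(0.0, I_PK * math.sin(2 * math.pi * k / N)) for k in range(N)]

i_avg = sum(i) / N
i_rms = math.sqrt(sum(x * x for x in i) / N)

# True average supply power: time-average of v(t) * i(t), with v(t) = V_SUPPLY.
p_supply_true = sum(V_SUPPLY * x for x in i) / N

print(f"I_avg = {i_avg * 1e3:.3f} mA, I_rms = {i_rms * 1e3:.3f} mA")
print(f"V * I_avg     = {V_SUPPLY * i_avg:.4f} W")
print(f"avg(V * i(t)) = {p_supply_true:.4f} W")
print(f"V * I_rms     = {V_SUPPLY * i_rms:.4f} W")
```

Running this, V * I_avg matches the true average of v(t) * i(t), while V * I_rms comes out larger, which is exactly the discrepancy I am asking about.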