I got the following brainteaser I can't handle: what power do we get in a resistor driven by a 100 W amplifier set to a level of -40 dB, when the source is a CD player producing white noise at -36 dB? What current flows through that resistor? Assume that all the circuits are linear.
You're not providing enough information to answer the question without making assumptions.
The total power in white noise depends on bandwidth: white noise contains equal power per unit bandwidth.
The decibel is a logarithmic ratio, i.e. it is relative, so we have to make assumptions about what the -40 dB and -36 dB are relative to. Let's call it "line level".
Let's assume a "line level" white noise input produces 100 W across an 8-ohm load. If we then inserted a total of 76 dB of attenuation, the output would fall to 2.5 microwatts. That corresponds to 0.56 mA (RMS) in an 8-ohm load.
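The arithmetic above can be checked in a few lines. The 100 W reference and the 8-ohm load are this reply's assumptions, not part of the original question:

```python
import math

# Assumptions from the reply above: "line level" white noise drives
# the amp to 100 W into 8 ohms; the amp setting (-40 dB) and the
# source level (-36 dB) combine to 76 dB of total attenuation.
P_REF = 100.0              # W, assumed full-scale power
R_LOAD = 8.0               # ohms, assumed load
atten_db = -40.0 + -36.0   # dB values add when stages are cascaded

p_out = P_REF * 10 ** (atten_db / 10)   # power after 76 dB attenuation
i_rms = math.sqrt(p_out / R_LOAD)       # RMS current, from P = I^2 * R

print(f"{p_out * 1e6:.2f} uW")   # ~2.51 uW
print(f"{i_rms * 1e3:.2f} mA")   # ~0.56 mA
```

Note that the two dB figures simply add before converting back to a power ratio, which is the whole point of working in decibels.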
Assume that setting the amplifier to -40 dB is done with an attenuator or pot at the input; this keeps the amplifier well away from saturation. -40 dB with respect to 100 W is 10 mW.
An ideal white noise source produces a flat spectrum from near enough DC up to the highest measurable frequency. In that case the amplifier's bandwidth would pass only a tiny fraction of the total power, so the power in the load would be negligible. It's more likely that the noise source is flat only over the bandwidth of the amplifier under test. In that case the CD player's full output (0 dB) would produce 10 mW from the amplifier, but at -36 dB it would produce only 2.5 microwatts. The current is whatever is necessary to dissipate 2.5 microwatts in the load resistor.
(I was always taught that dBs is dBs!)
--
Jim Backus OS/2 user since 1994
bona fide replies to j backus jita
demon co uk