I need to calculate the idealized (theoretical) maximum transfer of power into a dipole antenna with various loads.
I thought this would be a simple physics-book lookup, but all of the answers were more complicated than that, and I didn't want to go down a rabbit hole of polarization, impedance matching, etc. I'm looking for the textbook optimal/theoretical best case that you never achieve in real life.
Here is the situation: I have a 1 ohm resistor between two wires that can each be 1" to 12" long.
I measure an electric field from a cell phone or WiFi (2.4 GHz) at a strength of 100 V/m. If the wires were cut to tune the dipole antenna for maximum sensitivity, how much power could be delivered to the 1 ohm resistor? There is no matching network, just a plain old 1 ohm resistor. The wire lengths should be ~1.2" each for this frequency.
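In case it helps show what I'm after, here's my rough attempt at the calculation, treating the dipole as a Thevenin source. My assumptions (please correct me if the model is wrong): the 100 V/m is RMS, the dipole is a resonant half-wave with radiation resistance ~73 Ω and zero reactance, and its effective length is λ/π, so V_oc = E · λ/π and the 1 ohm load just forms a voltage divider with the 73 Ω:

```python
import math

E = 100.0        # field strength, V/m (assumed RMS)
f = 2.4e9        # frequency, Hz
c = 3e8          # speed of light, m/s
R_load = 1.0     # load resistance, ohms

lam = c / f                      # wavelength, ~0.125 m at 2.4 GHz
l_eff = lam / math.pi            # effective length of a resonant half-wave dipole (assumed)
V_oc = E * l_eff                 # open-circuit (Thevenin) voltage
R_rad = 73.0                     # half-wave dipole radiation resistance (assumed, ohms)
X_ant = 0.0                      # assume wires trimmed to resonance, so reactance ~0

# power actually delivered to the mismatched 1 ohm load
P_load = V_oc**2 * R_load / ((R_rad + R_load)**2 + X_ant**2)

# for comparison: maximum available power with a conjugate-matched load
P_max = V_oc**2 / (4 * R_rad)

print(f"V_oc   = {V_oc:.3f} V")
print(f"P_load = {P_load * 1000:.2f} mW into 1 ohm")
print(f"P_max  = {P_max * 1000:.2f} mW with a matched load")
```

If I did that right, the unmatched 1 ohm resistor gets only a few mW even though tens of mW are available, which is presumably why everyone kept bringing up matching networks.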
Would it be the same calculation at a lower frequency where the 12" wires are resonant? I assume this would be 1/10th the frequency, so 240 MHz.
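My rough check on that scaling, using the same assumed model (half-wave dipole, R_rad ≈ 73 Ω, effective length λ/π, 100 V/m RMS, no matching). If those assumptions hold, the only thing that changes with frequency is the wavelength, so V_oc goes up 10x and the delivered power 100x:

```python
import math

E = 100.0        # V/m (assumed RMS)
R_load = 1.0     # ohms
R_rad = 73.0     # assumed half-wave dipole radiation resistance, ohms

powers = []
for f in (2.4e9, 240e6):
    lam = 3e8 / f                              # wavelength, m
    V_oc = E * lam / math.pi                   # E times assumed effective length lambda/pi
    P = V_oc**2 * R_load / (R_rad + R_load)**2 # power into the mismatched 1 ohm load
    powers.append(P)
    print(f"{f / 1e6:7.0f} MHz: V_oc = {V_oc:6.2f} V, P = {P * 1e3:7.2f} mW")

print(f"power ratio: {powers[1] / powers[0]:.1f}x")
```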
I also assume that as you move away from the tuned frequency, less energy will be coupled for a given wire length. It's been a while. :)