I'm going to pretend that I have a grasp on the significance of the dBm sensitivity (I don't, but let's pretend). The transceiver I'm using says it has a sensitivity of -85 dBm. If I'm understanding correctly, a lower (more negative) dBm sensitivity is better because it will pick up weaker signals than a receiver with a higher dBm sensitivity... right?
So, a sensitivity of -85 dBm would make my receiver about 32 times less sensitive than yours (I understand that dropping dBm by 3 results in about half the power), correct?
I can set the same device to transmit as low as -18 dBm. Am I right in calculating that this is about 15 microwatts? (If power is halved every time dBm drops by three, then -18 dBm would be 1 mW / 64 ≈ 15.6 µW... is that right?) So I'm guessing my device will perform similarly to the setup you've described, in which case this may work. Thank you. Please feel free to correct any errors I've made.
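For what it's worth, the "halve every 3 dB" rule is an approximation of the exact conversion P(mW) = 10^(dBm/10). A quick sanity check of the figures above (a sketch, not specific to any particular transceiver):

```python
def dbm_to_mw(dbm):
    """Convert a power level in dBm to milliwatts: P(mW) = 10**(dBm/10)."""
    return 10 ** (dbm / 10)

# Exact conversion of -18 dBm:
exact_uw = dbm_to_mw(-18) * 1000          # ~15.85 microwatts

# The "halve every 3 dB" shortcut: 18 dB = six halvings, so 1 mW / 2**6
shortcut_uw = 1 / 2 ** 6 * 1000           # 15.625 microwatts

# A 15 dB difference in sensitivity is a power ratio of 10**1.5,
# i.e. roughly the factor of 32 mentioned above.
ratio = 10 ** (15 / 10)                   # ~31.6

print(exact_uw, shortcut_uw, ratio)
```

So the ~15 µW estimate and the factor-of-32 figure both check out.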
And thank you for taking the time to explain this to me. I've been very impressed with how helpful everyone has been in answering such a complex question. I half expected to be told to sod off! Thank you all.
If I get this working, I will certainly let you all know.
Cheers!