We've got some Intercel GSM / GPRS modems in one of our products.
The modems have a couple of solder pads for connecting to the antenna via coax. We've been trying different termination routes, seeing different power consumption each time, and we're wondering how this might relate to the matching of the antenna connection. We are using good quality Belden RG-178 cable.
We are terminating as recommended in the manufacturer's data sheets and it all works OK, but we find the variation in power consumption (estimated at ~50%) interesting.
Our layout is:

    modem ----[ coax, 12 to 80 mm ]---- SMA ----[ 30 mm stubby antenna ]
Is varying the interconnecting coax length between, say, 12 mm and 80 mm likely to significantly change the antenna matching, and hence the radiated power / received signal?
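For context, here's our back-of-envelope electrical-length estimate. The 900 MHz frequency and the ~0.70 velocity factor are assumptions on our part (the exact VF should be in the Belden RG-178 datasheet), so treat this as a rough sketch:

```python
# Rough electrical-length check for the two coax lengths.
# Assumptions: 900 MHz GSM band, RG-178 velocity factor ~0.70.
C = 299_792_458.0   # speed of light, m/s
FREQ = 900e6        # assumed GSM 900 carrier, Hz
VF = 0.70           # assumed velocity factor of RG-178

# Wavelength inside the cable, in mm
wavelength_in_cable_mm = (C / FREQ) * VF * 1000

for length_mm in (12, 80):
    fraction = length_mm / wavelength_in_cable_mm
    print(f"{length_mm} mm of coax = {fraction:.3f} wavelengths "
          f"({fraction * 360:.0f} deg of phase rotation)")
```

By this estimate the in-cable wavelength is roughly 230 mm, so going from 12 mm to 80 mm adds somewhere near a quarter wavelength of line. If the antenna itself isn't a clean 50 ohm load, that much extra line rotates the impedance the modem's PA sees quite a long way around the Smith chart, which we suspect is relevant here.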
Any advice for some RF dummies is welcome!
thanks rob