The latest issue of EDN (Oct 14, 2004) has an article on RFID that includes a line that has me puzzled: "With all other things being equal, high-frequency RFIDs have longer range than their low-frequency counterparts, fundamentally because near-field effects don't degrade high-frequency RFIDs' signals. If a tag is less than one wavelength away from a reader, the signal decays with the cube of the distance; beyond one wavelength the signal decays with the square of the distance."
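For concreteness, here's a rough sketch of how I read that claim (the 1/d^3 and 1/d^2 exponents and the one-wavelength crossover are taken straight from the quote; the 125 kHz / 13.56 MHz / 900 MHz figures are just typical LF/HF/UHF RFID frequencies I've assumed for illustration):

    # Decay law as described in the EDN quote: cube-law inside one
    # wavelength, square-law beyond it. Frequencies are assumed typical
    # LF/HF/UHF RFID values, not from the article.
    C = 3.0e8  # speed of light, m/s

    def relative_signal(d, freq_hz):
        """Relative signal at distance d (m), normalized to 1 at d = wavelength."""
        wavelength = C / freq_hz
        if d < wavelength:
            return (wavelength / d) ** 3   # near field: cube of distance
        return (wavelength / d) ** 2       # far field: square of distance

    for freq, label in [(125e3, "LF 125 kHz"),
                        (13.56e6, "HF 13.56 MHz"),
                        (900e6, "UHF 900 MHz")]:
        print(f"{label}: wavelength = {C / freq:.2f} m")
        for d in (0.1, 1.0, 10.0):
            print(f"  d = {d:5.1f} m -> relative signal {relative_signal(d, freq):.3g}")

At LF the wavelength is kilometers, so a tag at any practical distance sits deep in the near field (cube-law, per the article), while at UHF the far field starts a fraction of a meter out. That's how I understand the article's argument, anyway, which brings me to my question.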
I don't recall any "cube of the distance" stuff from my school days, but I haven't done any RF since then. It sounds counterintuitive to me, and it's certainly not the way sound waves behave in the near field. (Where sound wave fronts are nearly parallel, the decay is greatly reduced.) Can somebody explain why RF should be different? Or is the article wrong?
Bob Masta
dqatechATdaqartaDOTcom
D A Q A R T A
Data AcQuisition And Real-Time Analysis