signal strength on a GPS

I'm asking this here in the hope someone knows something about GPS receivers.

I'm a decent tech, but not an engineer...

I have a Tomtom 2535 gps. Nice unit blah blah.

Some ( software? ) engineer decided to change the gps info display. It now contains a pic of earth with dots for satellites instead of a bar graph. Ok. No biggie.

Except now there is a signal strength percent! Percent of what???

Full quieting? 20 dB quieting? 10 dB SINAD? Or what?

I ask because at less than about "80%" I get incomplete data ( speed, direction, etc ). This is with 7 or more "locked" satellites.

This occurs quite often and I'm trying to figure out if it's bad or "normal" operation. If it's weak signal ( 100% is 10 dB SINAD ), I can understand it. If 80% is 80% of full quieting, then it must be broken.

Does anyone have a clue what 100% is relative to?

Thanks

Jim

Reply to
Jim Whitby

Can I guess? 100% is full scale on some manner of RSSI (received signal strength indication) buried inside whatever chipset is used in the TomTom 2535. I run into the same thing in cell phones and wi-fi chipsets. The relationship between receive signal level in -dBm and whatever the chipset produces is not linear. The output is usually in the form of an 8-bit number from 0 to 255. In order to convert that into signal level in -dBm, a lookup table would be needed.

For cell phones, they just coarsely quantize it into 4 or 5 "bars" for the user, but produce the real signal level in -dBm in the "service mode" page. Looks like TomTom took the easy way out and just produced a percentage of full scale. I suspect that if you dive deep into the well-hidden GPS diagnostic pages in your TomTom, you might find the signal levels in -dBm. Wi-Fi is more of the same. Some GPS and Wi-Fi chipsets now have an internal lookup table for the conversion.

Either way, the number produced is still bogus if you have an external GPS antenna amplifier. The indicated signal level will be what appears at the receiver input, not what is at the antenna.
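The lookup-table conversion described above can be sketched in a few lines. The calibration pairs below are invented purely for illustration (a real table would come from the chipset vendor's characterization data), and `rssi_to_dbm` is a hypothetical name, but piecewise-linear interpolation between calibration points is the usual shape of such a conversion:

```python
# Hypothetical calibration table: (raw 8-bit RSSI, level in dBm).
# These numbers are made up for illustration only.
CAL_TABLE = [
    (0, -120.0),
    (64, -105.0),
    (128, -85.0),
    (192, -60.0),
    (255, -30.0),
]

def rssi_to_dbm(raw):
    """Convert a raw 0-255 RSSI reading to dBm by linear
    interpolation between calibration points."""
    raw = max(0, min(255, raw))  # clamp to the 8-bit range
    for (x0, y0), (x1, y1) in zip(CAL_TABLE, CAL_TABLE[1:]):
        if raw <= x1:
            return y0 + (y1 - y0) * (raw - x0) / (x1 - x0)
    return CAL_TABLE[-1][1]

print(rssi_to_dbm(128))  # -85.0
print(rssi_to_dbm(160))  # -72.5
```

The nonlinearity Jeff mentions lives entirely in the table: the raw-to-dBm mapping can bend however the chipset's detector bends, and the code never needs to know why.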

Disclaimer: I couldn't find the chipset used and therefore could not RTFM to see if my guess is correct.

You might get a more specific answer in sci.geo.satellite-nav, on the TomTom support site, or on multiple TomTom user forums.

--
Jeff Liebermann     jeffl@cruzio.com
150 Felker St #D    http://www.LearnByDestroying.com
Reply to
Jeff Liebermann

sci.geo.satellite-nav

Reply to
miso

Well, the way a GPS receiver works, there is no way to directly measure the signal. It is below the noise floor of the receiver! What they do is correlate the signal with a local copy of the satellite's spreading code, and if the data passes a checksum, it is good. So, they must be measuring the ratio of good to bad decoded data blocks.
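The "below the noise" correlation trick is easy to demonstrate with a toy model. The sketch below uses a random ±1 sequence as a stand-in for a real PRN code (not the actual GPS code) and buries it in noise with four times the chip amplitude; the correlator still finds a clear peak at the correct lag:

```python
import random

random.seed(1)
N = 1023
# Stand-in for a PRN code: a random +/-1 chip sequence.
code = [random.choice((-1, 1)) for _ in range(N)]

# Received signal: the code delayed by 200 chips, buried in noise
# whose RMS is four times the chip amplitude (strongly negative SNR).
delay = 200
rx = [code[(i - delay) % N] + random.gauss(0, 4) for i in range(N)]

def correlate(rx, code, lag):
    """Average product of the received samples with a lag-shifted
    local replica of the code."""
    return sum(rx[i] * code[(i - lag) % N] for i in range(N)) / N

peaks = {lag: correlate(rx, code, lag) for lag in (0, 100, 200, 300)}
# Only the correct lag (200) yields a correlation near 1;
# wrong lags stay near 0.
```

Averaging over all 1023 chips is what digs the signal out: the signal terms add coherently while the noise terms cancel.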

Jon

Reply to
Jon Elson

A software engineer probably doesn't know what a dB is.

--
John Larkin                  Highland Technology Inc
www.highlandtechnology.com   jlarkin at highlandtechnology dot com   
Reply to
John Larkin

dataBase?

Best regards, Spehro Pefhany

--
"it's the network..."                          "The Journey is the reward"
speff@interlog.com             Info for manufacturers: http://www.trexon.com
Reply to
Spehro Pefhany

Thanks, will check there.

Not much help there, from what I've seen. Can't get to anyone at TomTom who might know the answer ( I suspect (s)he is locked up in a Faraday cage ).

Again, thanks for the lead to sci.geo.satellite-nav.

Jim

Reply to
Jim Whitby

dBm as in "don't bother me".

Using Occam's Razor, I suspect your simple answer may also be the correct answer to the question. It may also have been something equally simple, such as running out of program space or running out of time to code the output in dBm.

--
Jeff Liebermann     jeffl@cruzio.com
150 Felker St #D    http://www.LearnByDestroying.com
Reply to
Jeff Liebermann

You're an idiot.

Reply to
MrTallyman

They measure timing flag packet arrival times from three satellites.

Reply to
SoothSayer

It's worse than just the signal being below the noise: the way a GPS receiver works, the received signal is always relative to the received noise. All but the most expensive GPS receivers pass the signal through a comparator (basically a "1-bit ADC" or a "slicer"; it's all the same thing). It works because unless there's a jammer in-band, the satellite signal is buried in noise, so the noise works to linearize the "ADC".

So there is no way to measure the signal strength as such, which means there is no way to compare it against any absolute reference.

And, that means that the question "100% of what?" is a good one, and (I'll bet) one that you'll not find an answer to easily.
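The 1-bit argument can be demonstrated numerically. In the toy model below (a random ±1 sequence standing in for a real PRN code), the received samples are hard-limited to ±1 before correlation. Because the limiter throws away amplitude, two signals with identical SNR but 20 dB apart in absolute power produce essentially the same correlator output:

```python
import random

random.seed(2)
N = 1023
code = [random.choice((-1, 1)) for _ in range(N)]  # stand-in PRN code

def one_bit_corr(amp, noise_std):
    """Correlate a hard-limited (1-bit) noisy signal with the code."""
    rx = [amp * c + random.gauss(0, noise_std) for c in code]
    hard = [1 if s >= 0 else -1 for s in rx]  # the comparator / 1-bit ADC
    return sum(h * c for h, c in zip(hard, code)) / N

a = one_bit_corr(1.0, 4.0)    # weak signal
b = one_bit_corr(10.0, 40.0)  # 20 dB more power, identical SNR
# a and b come out nearly equal: the 1-bit front end preserves
# SNR information but destroys absolute signal level.
```

That is exactly why a percentage derived from such a correlator can only ever express a signal-to-noise quantity, never dBm at the antenna.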

--
Tim Wescott
Control system and signal processing consulting
Reply to
Tim Wescott

On a sunny day (Sun, 16 Sep 2012 00:03:50 +0000 (UTC)) it happened Jim Whitby wrote:

Probably signal-to-noise ratio, in fact bit error rate. In GPS (the US system) all sats transmit at the same frequency, their data xored with a pseudo-random code that is different for each sat. So they likely use the bit error rate for each satellite. That is how 'clean' the detection of each sat's signal is in the total signal after comparing it with its code.

Google for example: IS-GPS-800B.pdf for a lot of technical details.

Reply to
Jan Panteltje

Percent of 1-bit correlator output vs ideal full scale output. This is a measure of SNR.

Vladimir Vassilevsky DSP and Mixed Signal Consultant


Reply to
Vladimir Vassilevsky

OK, explain to us what a dB is.

--
John Larkin                  Highland Technology Inc
www.highlandtechnology.com   jlarkin at highlandtechnology dot com   
Reply to
John Larkin

If you use a 4-quadrant (Gilbert cell) multiplier (mixer) instead of a XOR gate, the despread signal will definitely have a positive SNR in the 1000/50 Hz bandwidth, and the absolute signal power should be easily measurable.

Reply to
upsidedown

On a sunny day (Sun, 16 Sep 2012 19:48:12 +0300) it happened snipped-for-privacy@downunder.com wrote:

No, you misunderstand the GPS system. The satellites all transmit at the same frequency; their digital signals are xored before modulation with a pseudo-random code, a so-called 'Gold code'. This code is different for each satellite. Basically this is done with a shift register. In the receiver, AFTER demodulation, the digital signal is run through a similar shift register and tested one by one against the correct Gold codes (each sat has its own code). In case of noise, you do not get an exact match against the known codes, and the number of wrong bits is then a measure of the signal-to-noise ratio.

100%: good signal, no bit errors. Weak signal: lots of bit errors.
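The shift-register scheme described above matches the actual C/A-code generator in IS-GPS-200: two 10-stage LFSRs, G1 (feedback taps 3 and 10) and G2 (feedback taps 2, 3, 6, 8, 9, 10), both seeded with all ones, with each satellite's code formed as G1's output xored with two satellite-specific phase-selector taps of G2. A minimal sketch:

```python
def ca_code(tap1, tap2):
    """One 1023-chip period of a GPS C/A Gold code.

    tap1, tap2 are the G2 phase-selector taps for a given satellite,
    e.g. (2, 6) for PRN 1 per IS-GPS-200."""
    g1 = [1] * 10
    g2 = [1] * 10
    chips = []
    for _ in range(1023):
        chips.append(g1[9] ^ g2[tap1 - 1] ^ g2[tap2 - 1])
        fb1 = g1[2] ^ g1[9]                                  # G1 taps 3, 10
        fb2 = g2[1] ^ g2[2] ^ g2[5] ^ g2[7] ^ g2[8] ^ g2[9]  # G2 taps 2,3,6,8,9,10
        g1 = [fb1] + g1[:9]  # shift right, feedback into stage 1
        g2 = [fb2] + g2[:9]
    return chips

prn1 = ca_code(2, 6)
# The ICD's sanity check: PRN 1's first 10 chips are 1100100000
# (octal 1440).
```

The documented self-test is that PRN 1's first ten chips read 1440 in octal, which this generator reproduces.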
Reply to
Jan Panteltje

If it's Trimble, the measurement is relative to AMU (Arbitrary Mystery Units).


Reply to
qrk


The GPS signal is an ordinary direct sequence spread spectrum (DSSS) signal. It can be received with a mixer/multiplier/xor gate by multiplying the received signal with the same chip clock (in this case the 1.023 MHz satellite-specific PRN sequence) _synchronized_ with the transmitter modulator chip clock. After the demodulator, the roughly 1 MHz wide spread spectrum signal is despread to something about 1 kHz. At this point you could also make power measurements. After this, the 50 bit/s data signal is extracted.
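The despreading step described above can be put in numbers: averaging over one 1023-chip code period after code wipe-off reduces the noise power by a factor of 1023, a processing gain of about 30 dB, which is what lifts the signal from below the noise floor to a usable level. A toy sketch (random ±1 code standing in for the PRN sequence):

```python
import math
import random

random.seed(3)
N = 1023
code = [random.choice((-1.0, 1.0)) for _ in range(N)]

# One data bit (+1) spread over a full code period, buried in noise.
noise_std = 8.0
rx = [1.0 * c + random.gauss(0, noise_std) for c in code]

# SNR per chip: signal power 1 against noise power noise_std**2.
snr_chip_db = 10 * math.log10(1.0 / noise_std**2)    # about -18 dB

# Code wipe-off, then average over the period: the signal adds
# coherently while the noise averages down.
despread = sum(r * c for r, c in zip(rx, code)) / N  # close to +1

# Averaging N independent noise samples cuts noise power by N:
gain_db = 10 * math.log10(N)                         # about 30.1 dB
```

With roughly -18 dB per chip plus ~30 dB of processing gain, the despread signal sits comfortably above the noise, which is where a power measurement first becomes meaningful.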

The shift register and bit error detection you are talking about has to do with signal acquisition _before_ the receiver chip clock generator has been synchronized with the transmitter chip clock.

Reply to
upsidedown

I didn't understand a damn thing in that document!

Reply to
G. Morgan


Do they x-mit a different freq. for military use?

Reply to
G. Morgan
