If a UART baud rate is out by 1.6%, will this cause errors? I.e., if the baud rate is 9600 ±1.6%, will a receiver get any erroneous bytes? What about if the receiver has some kind of auto-baud? I would guess that if the transmitter baud rate is a little slow then the receiver might get overrun errors and try for a slower rate?
: If a UART baud rate is out by 1.6% will this cause errors? i.e. if the : baud rate is 9600 +- 1.6% will a receiver get any erroneous bytes?
If you are running async then that sort of error will not cause problems. The async start bit causes a resync, and after the usual 10 bits of async data the drift has not accumulated far enough to cause problems. You can have more error than that and still work.
: ......What : about if the receiver has some kind of auto-baud... I would guess that if : the transmitter baud rate is a little slow then the receiver might get an : overrun errors and try for a slower rate?
The limit for async is 2%. If the sender and receiver are off by 2% in opposite directions, the sample point for the last bit is pushed near the edge of the usable limit.
If you know that one side or the other is bang on, you can have the other side be 4% off. In general, as long as the two sides are within about 4% (or a smidge more) of each other you'll be OK, but you'll obviously not be assured of reliable communications with other RS-232 devices in general.
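To see where the 2%-each-way figure comes from, here is a minimal sketch (my own arithmetic, assuming a 10-bit frame and mid-bit sampling after a start-edge resync) of how far the last bit's sample point drifts:

```python
# Sketch: cumulative sampling drift across one async UART frame.
# Assumes 1 start + 8 data + 1 stop bit (10 bits) and that the receiver
# resynchronises on the start edge, then samples mid-bit thereafter.

def last_bit_sample_offset(rate_error_pct):
    """Offset of the final (10th) bit's sample point, in bit periods,
    relative to that bit's nominal centre."""
    # The 10th bit is nominally sampled 9.5 bit periods after the start
    # edge; a fractional rate mismatch scales that whole interval.
    return 9.5 * rate_error_pct / 100.0

# 2% mismatch each way (4% total): the last sample point drifts 0.38 of
# a bit period, approaching the 0.5-bit edge of the final bit cell.
print(last_bit_sample_offset(4.0))
```

At 0.5 bit periods of drift the sample lands on a bit boundary, which is why roughly 4% total mismatch (2% each side) is the practical ceiling.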
Thanks, I'm trying to solve a problem for someone remotely. Symptoms:
HyperTerminal connection to GPRS modem at 9600 baud works fine. Messages come through fine at server.
Embedded device connection to GPRS modem at 9600 baud doesn't work properly. AT commands have to be sent one byte at a time slowly, and once connected messages sent come out as garbage at server end.
Sounds to me like the embedded device's baud rate is way off, although I am assured it's only 1.6% out. Maybe the modem is out by 3% as well, and hence the error occurs.
Are you sure that the embedded device generates true RS-232 voltage levels and not some "RS-232" compatible levels from a 5 V inverter ?
Is autobauding enabled at the GPRS end? Try disabling it. With autobauding, the A and T characters at the start of each command are used to determine the line speed, and if the speed is too far from a standard rate, erratic speed settings can occur.
This would suggest that the transmit clock is more than 5% higher than the receiver clock. When the final stop bit is being received, the receiver's sample pulse should fall in the middle of the stop bit. After detecting a proper stop bit, the receiver enables the edge trigger (1->0) to wait for the next start bit. However, with a receiver clock that is too slow, the stop-bit sample pulse lands at the end of the stop bit, or perhaps after the end of the character.
When sending characters individually, the Mark state remains stable long after the stop bit, so it will get a stable detection of the Mark state (=equivalent to a stop bit) even if the sample pulse is long after the end of the character.
When sending a string of characters, the next start bit follows immediately after the stop bit of the previous character. If the stop-bit sample point has been delayed into the next start bit, the Space state is detected, the receiver declares a framing error ("missing stop bit"), and the next 1->0 transition (now inside the next character) is incorrectly interpreted as a start bit. Thus a long string of garbage is received once character sync is lost.
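The mechanism above can be sketched numerically (my own arithmetic, assuming a 10-bit frame, mid-bit sampling, and positions measured in transmitter bit times from the start edge), showing why single characters survive while strings fall apart once the mismatch passes roughly 5%:

```python
# Sketch: where the stop-bit sample point lands when the receiver clock
# is slower than the transmitter's. If it slides past the frame end
# (10 transmitter bit times), it samples the next character's start bit
# and sync is lost -- but only when characters are sent back to back.

def stop_bit_sample_position(rx_slow_pct, n_bits=10):
    """Stop-bit sample point in transmitter bit times, measured from the
    start edge. Nominal is n_bits - 0.5 (mid stop bit)."""
    return (n_bits - 0.5) * (1 + rx_slow_pct / 100.0)

for err in (0, 2, 5, 6):
    pos = stop_bit_sample_position(err)
    status = "inside frame" if pos < 10.0 else "past frame end"
    print(f"{err}% slow: sample at {pos:.2f} bit times ({status})")
```

The crossover sits at about 5.3% (9.5 × 1.053 ≈ 10 bit times), consistent with the "more than 5%" estimate above.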
Try setting two stop bits when sending a string of characters. If this works, this would be a clear indication of this kind of problem.
Are you sure that the embedded transmitter is actually transmitting a decent stop bit? For instance, in the QUICC coprocessor (in the Motorola MC68360/MPC860 etc.) there is an option to shorten the stop bit or even eliminate it. Make sure that this option has not been selected.
Which GPRS modem are you using? - I have used some GSM/GPRS modems that are *very* picky about getting the baud rate spot on. Which embedded processor are you using and what's the clock source?
There are some microprocessors with a built-in UART that generate the baud rate clock from a 20 MHz clock. For 9600 baud there is an error of +1% or -1.4%, but this error does not affect the communication. What about measuring the signals with a scope? Some care is necessary to measure UART signals with a precision better than 1%.
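That error comes from the integer baud-rate divisor. Here is a generic sketch (not any specific part's register formula; divider schemes vary, so the exact figures differ from part to part) of picking the nearest divisor for a hypothetical 16x-oversampling UART:

```python
# Sketch: a UART that derives its bit clock by dividing the system clock
# by 16*N can only hit baud rates where N is an integer, so some rate
# error is usually unavoidable.

def best_divisor_error(f_clk, baud, oversample=16):
    """Return (N, actual_baud, percent_error) for the nearest divisor."""
    n = max(1, round(f_clk / (oversample * baud)))
    actual = f_clk / (oversample * n)
    return n, actual, 100.0 * (actual - baud) / baud

n, actual, err = best_divisor_error(20_000_000, 9600)
print(n, round(actual, 1), round(err, 2))
```

With a "UART-friendly" crystal such as 1.8432 MHz the divisor comes out exact and the error is zero, which is why those odd frequencies exist.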
Thanks for your detailed reply Paul. I'm convinced that it is a baud rate issue too, especially since I have been told that the device uses the internal clock to generate the UART clock! The RS-232 TX voltage is -10 V, so it's not the driver. The modem has an autobaud feature. I can't do any reprogramming as I have only been given a product to test, but I will pass on the comments. Thanks heaps.
Did you have a look at the ATmega specification for the internal RC oscillator? It is horrible: there is a temperature drift of almost 10% from -40 to 85 C. If your voltage changes, e.g. battery driven, the difference between a 5 V supply and 2.7 V is more than 10% (without the temperature drift). Companies that are serious about using the internal RC with a UART specify the max drift over temperature; e.g. the Philips LPC938, which is similar to the ATmega but with a 2-clock core, has less than 2.5% across the temperature range and is trimmed to better than 1% at room temperature.
We used this one without any problems running a UART.
The internal clock is voltage- and temperature-dependent. As a test you can try changing one or the other. As a long-term solution, a more accurate and stable clock is a must.
I use the same processor, and RS-232, in many of our products.
When using the internal clock source and RS-232 on the Atmel parts, you *must* use the clock calibration byte to properly calibrate the internal clock, or your baud rate can be off enough to cause problems.
-Zonn
--
Zonn Moore, Zektor, LLC
www.zektor.com
(Remove the ".AOL" from the email address to reply.)