AT commands: how to detect unsolicited result codes

As you know, modems (GSM, GSM/GPRS, PSTN, ...) are controlled by a host microcontroller through AT commands sent to an asynchronous serial port. The modem answers with a message. Commands and answers are ASCII-based and terminated with <CR> and/or <LF>.

Echo is usually enabled for human interaction, but it is typically disabled when the modem is controlled by a microcontroller.

For example (-> is micro-to-modem, <- is modem-to-micro):

-> AT+IPR=57600
<- OK

Reply to
pozzugno

Just read until timeout. If there are a few character times of idle after the last character, you can assume that the message has ended, especially if the last character was <LF>.

If autobauding is enabled, there can be quite a long delay between the echoed "AT" characters and the rest of the response.

I had a problem with the updated version of the Siemens TC35i GPRS modem (but not with the original TC35), which often failed to handle an ATxyz command sent in a single burst.

Finally I had to modify the command transmission to

1.) Send "A"
2.) Wait for echo of "A"
3.) Send "T"
4.) Wait for echo of "T"
5.) Send xyz in a row.

This worked every time. Autobauding seemed to have slowed down considerably compared with the original modem version.
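A minimal host-testable sketch of that echo-verified transmit, assuming hypothetical `putc_fn`/`getc_fn` hooks standing in for the real UART driver (the loopback stub is only for demonstration, not part of the technique):

```c
#include <stdbool.h>
#include <stddef.h>

typedef void (*putc_fn)(char c);
typedef int  (*getc_fn)(void);          /* next echoed byte, or -1 on timeout */

/* Send the leading "AT" one character at a time, verifying each echo
 * before the next byte, so a slow autobauding modem has locked onto the
 * baud rate before the rest of the command follows back to back. */
bool at_send_with_echo_sync(const char *cmd, putc_fn put, getc_fn get)
{
    size_t i = 0;
    for (; i < 2 && cmd[i] != '\0'; i++) {
        put(cmd[i]);
        if (get() != cmd[i])
            return false;               /* echo missing or wrong */
    }
    for (; cmd[i] != '\0'; i++)         /* rest goes out in a row */
        put(cmd[i]);
    return true;
}

/* Host-side loopback stub standing in for the UART: every byte "sent"
 * is echoed straight back, like a modem with echo enabled. */
static char fifo[64];
static int fifo_r = 0, fifo_w = 0;
static void loop_put(char c) { fifo[fifo_w++ % 64] = c; }
static int  loop_get(void)
{
    return fifo_r == fifo_w ? -1 : (unsigned char)fifo[fifo_r++ % 64];
}
```

On real hardware, `get()` would block with a timeout of a few hundred milliseconds per echoed character.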

Reply to
upsidedown

snipped-for-privacy@downunder.com wrote:

It's a nice idea... does it really work?

The receiving state machine should arm two timeouts: one for no answer at all (5 seconds? I think it could depend on the actual command sent) and another for the inter-character time (100 ms?).
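A sketch of that two-timeout receiver, assuming a hypothetical `getc_ms` driver hook that returns the next byte or -1 when nothing arrives within the given time (the scripted byte source below exists only so the logic can run on a host):

```c
/* Frame a reply with two timeouts: a long one for "no answer at all"
 * and a short inter-character one that marks the end of the message. */
int at_recv_frame(char *buf, int maxlen,
                  int (*getc_ms)(int timeout_ms),
                  int no_answer_ms, int intra_char_ms)
{
    int len = 0;
    int c = getc_ms(no_answer_ms);       /* wait for the reply to start */
    while (c >= 0 && len < maxlen - 1) {
        buf[len++] = (char)c;
        c = getc_ms(intra_char_ms);      /* short idle => frame ended */
    }
    buf[len] = '\0';
    return len;                          /* 0 means no answer */
}

/* Scripted byte source for host-side testing: plays back "\r\nOK\r\n"
 * and then simulates an idle timeout with -1. */
static const char *script = "\r\nOK\r\n";
static int script_pos = 0;
static int fake_getc(int timeout_ms)
{
    (void)timeout_ms;
    return script[script_pos] ? (unsigned char)script[script_pos++] : -1;
}
```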

The only problem I see with your approach is with unsolicited result codes, which can be sent by the modem at any time. Consider the following:

-> AT
<- OK
<- RING

Reply to
giuseppe.modugno

I have used this method with the MC 68360 and various Power PCs with the QUICC coprocessor for more than a decade. Of course, any controller with a free timer with microsecond resolution will do.

Since this is plain 7-bit ASCII, you could insert a special marker like TIMEOUT xxx ms into the reception queue and let the high-level logic make sense of that frame.
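A sketch of that in-band marker idea: since 7-bit ASCII leaves the top bit free, any value >= 0x80 can serve as an out-of-band token. The marker value 0xFF and the ring-buffer layout here are assumptions, not anything the modem sends:

```c
#include <stdint.h>

#define RX_IDLE_MARK 0xFFu   /* assumed marker: "idle gap seen here" */

static uint8_t rx_q[256];
static volatile uint16_t rx_w = 0, rx_r = 0;

static void rx_push(uint8_t b) { rx_q[rx_w++ & 0xFFu] = b; }

/* Called from the UART receive ISR for each byte; mask to 7 bits so
 * received data can never collide with the marker. */
void on_rx_byte(uint8_t b) { rx_push(b & 0x7Fu); }

/* Called from the timer ISR when the line has been idle for a few
 * character times. */
void on_rx_idle(void) { rx_push(RX_IDLE_MARK); }

/* Reader side (application code): pops one byte, -1 when empty. */
int rx_pop(void) { return rx_r == rx_w ? -1 : rx_q[rx_r++ & 0xFFu]; }
```

The high-level parser then treats 0xFF as a frame boundary instead of guessing from content.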

Reply to
upsidedown

Just throw away LFs in the Rx ISR and use CRs for the official termination byte.
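A minimal sketch of that byte-at-a-time filter, with the buffer size and function name as assumptions; each received byte is fed in from the ISR, LFs are dropped, and CR completes a line:

```c
#include <stdbool.h>

static char line_buf[128];
static int  line_len = 0;

/* Feed one received byte; returns true when a complete, non-empty line
 * (terminated by CR) is ready in line_buf. LFs are simply discarded. */
bool rx_feed(char c)
{
    if (c == '\n')
        return false;                    /* throw away LFs */
    if (c == '\r') {
        bool ready = line_len > 0;       /* ignore empty lines */
        line_buf[line_len] = '\0';
        line_len = 0;
        return ready;
    }
    if (line_len < (int)sizeof line_buf - 1)
        line_buf[line_len++] = c;
    return false;
}
```

Feeding "\r\nOK\r\n" byte by byte yields exactly one complete line, "OK".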

JJS

Reply to
John Speth

snipped-for-privacy@downunder.com wrote:

Do you mean pushing a non-ASCII character (like 0xFF) into the receive buffer to mark the end of the previous message? This can work only if the modem *guarantees* a sufficient delay between the last character of one message and the first character of the next.

Consider the previous example (AT command from the micro, OK answer from the modem, and immediately after it the RING unsolicited message). What happens if the RING event is generated *in the middle* of the OK answer transmission? I think the modem firmware buffers the RING event and puts out the unsolicited message as soon as the serial port is free, i.e. immediately after the last character of the OK answer. Does the modem firmware wait for some delay before putting out the next message?

If you're right, how could I know the length of this delay, considering this behaviour isn't documented? It is very difficult to measure, because I would need to reproduce the events above... how?

What delay do you use to mark the end of a message with your modem? I'm using a Simcom.

Reply to
pozzugno

Or even set the highest bit on the last byte :-)

Typically, AT command traffic is just half-duplex request/response, so framing by idle periods is quite reliable. However, if there is full-duplex traffic, just read until the next idle timeout. Let the higher-level logic figure out what is a response to a request and what is a spontaneous "RING" message.

Of course, the receive buffer must be able to hold a few of the longest messages.

In most cases 2-5 character times seem to work reliably, without spending too long waiting after the whole message. At 9600 bit/s the character time is about 1 ms, so an idle timeout of 2-5 ms should be OK.
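That arithmetic can be written down directly. One asynchronous character is 10 bit times (start + 8 data + stop), so character time is 10 / baud; the function names here are just illustrative:

```c
/* Character time in microseconds: 10 bit times per character. */
unsigned long char_time_us(unsigned long baud)
{
    return 10UL * 1000000UL / baud;
}

/* Idle timeout = a small multiple of the character time. */
unsigned long idle_timeout_us(unsigned long baud, unsigned long chars)
{
    return chars * char_time_us(baud);
}
```

At 9600 bit/s this gives about 1041 us per character, so a 4-character idle timeout is roughly 4.2 ms; at 57600 bit/s a character takes only ~174 us.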

So even if the delays between characters are longer than you expected (especially after "AT" if autobauding is used), you get some extra idle timeouts in the buffer, but the higher-level logic can ignore them if they fall within a known response sequence.

Reply to
upsidedown

snipped-for-privacy@downunder.com wrote:

I'm not considering full-duplex traffic; my example is half-duplex.

  1. Host microcontroller sends completely the command AT
  2. Modem answers with OK (during this msg RING event occurs)
  3. Modem transmits unsolicited message RING

My question was: does the modem wait between steps 2 and 3? If the internal firmware of the modem enqueues messages in a global output buffer, it is possible to have two consecutive messages without any delay between them.

Reply to
giuseppe.modugno

If the modem sends "RING" at the same time as you start sending ATxxxxx, then it is definitely full duplex. This is possible on RS-232 and RS-422, but not on 2-wire RS-485.

I have no idea how each modem on the market behaves.

Reply to
upsidedown

On 09/05/2014 19:21, snipped-for-privacy@downunder.com wrote:

Yes, you're right. But my example above isn't full-duplex. The AT command is *completely* sent out (step 1 is finished) when the modem starts answering (step 2). And the modem has *completely* answered with OK (step 2 is finished) when it starts emitting the unsolicited RING message (step 3).

This is definitely a half-duplex communication.

If the modem doesn't *guarantee* a minimum delay (4-5 character times) between two output messages, do you agree that your method (separating frames by means of a delay) doesn't work?

Reply to
pozz

In the interrupt service routine (which should be kept very simple to avoid blocking other interrupts) just use the idle timeout.

It is up to the upper-level logic to find out the actual framing, e.g. by knowing the structure of the expected response (such as ending in <CR><LF>).

If it can't make any sense, just resend the command after a while (100-1000 ms). This works well in practice.
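A sketch of that resend strategy as a retry wrapper; `send_and_check` and `delay_ms` are assumed callbacks standing in for the real transport and timer (the flaky stub below only exists so the loop can be exercised on a host):

```c
#include <stdbool.h>

/* Resend the command up to 'tries' times, waiting backoff_ms between
 * attempts; send_and_check returns true when the answer parsed OK. */
bool at_command_retry(bool (*send_and_check)(void),
                      void (*delay_ms)(int),
                      int tries, int backoff_ms)
{
    for (int i = 0; i < tries; i++) {
        if (send_and_check())
            return true;
        delay_ms(backoff_ms);       /* wait 100-1000 ms, then resend */
    }
    return false;
}

/* Host-side stub: fails twice, then succeeds on the third attempt. */
static int attempts = 0;
static bool flaky_cmd(void) { return ++attempts >= 3; }
static void no_delay(int ms) { (void)ms; }
```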

Reply to
upsidedown

snipped-for-privacy@downunder.com wrote:

I implemented the mechanism you suggested: mark the end of a message after a small timeout (4-5 character times) without receiving any character. In this way I have a buffer with a complete message from the modem, either the answer to a previous command or an unsolicited code... most of the time.

During development I noticed the following. When an incoming SMS arrives, the modem sends an unsolicited +CMTI message to signal the event. This message can appear concatenated to the answer to a previous AT command. In other words, the modem doesn't insert a pause, even a small one, between the answer to a previous AT command and the +CMTI unsolicited message. Detecting the start and end of a message just from the timeout between received characters is therefore a broken approach: I can get a concatenated message (answer plus unsolicited code).

I know the application layer can break this concatenated message apart based on the content and the expected answer, knowing the previous AT command sent, but this approach complicates the problem. I hoped the modem gave the host microcontroller the opportunity to separate the messages at a low level, without parsing the content, by breaking the stream of incoming characters into messages separated by a long pause. Unfortunately this approach doesn't work, at least with my modem (a SimCom SIM900).

IMHO, if the application layer is solely responsible for separating the stream of incoming characters into messages by parsing the content, the low-level timeout mechanism isn't needed at all. It's sufficient to have a FIFO buffer where the incoming characters are pushed in the ISR and popped by the application code.
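A sketch of that application-layer split, assuming the receive buffer holds the glued messages as CR/LF-terminated lines; the splitter and the URC classifier ("+..." prefix or a bare "RING") are illustrative, not a complete AT parser:

```c
#include <string.h>
#include <stdbool.h>

/* Split a raw receive buffer in place on CR/LF and report each
 * non-empty line via a callback, so "OK" glued to a following
 * unsolicited line is still seen as two messages. Returns line count. */
int at_split_lines(char *buf, void (*on_line)(const char *line))
{
    int count = 0;
    char *p = buf;
    while (*p) {
        while (*p == '\r' || *p == '\n') p++;   /* skip terminators */
        if (!*p) break;
        char *start = p;
        while (*p && *p != '\r' && *p != '\n') p++;
        if (*p) *p++ = '\0';
        on_line(start);
        count++;
    }
    return count;
}

/* Rough classifier: unsolicited result codes start with '+' or are a
 * bare "RING" (heuristic, not exhaustive). */
bool at_is_unsolicited(const char *line)
{
    return line[0] == '+' || strcmp(line, "RING") == 0;
}

/* Small collector used to demonstrate the callback on a host. */
static const char *seen[8];
static int seen_n = 0;
static void collect(const char *line) { if (seen_n < 8) seen[seen_n++] = line; }
```

The caveat in the thread still applies: content-based splitting like this needs knowledge of the command that was sent when a URC prefix overlaps a response prefix.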

Reply to
pozzugno
