baud rate autodetection on AVR 8-bit?

Perfectly possible... you just have to write the software...

Reply to
TTman

someone beat you to it...

formatting link

Reply to
amdyer

RS-232 originally only specified how a DCE (Data Communication Equipment, i.e. a modem) should be connected over a short distance (up to 15 m) to a DTE (Data Terminal Equipment), either a central computer or a remote terminal. Thus it made it possible to use remote terminals (DTE to DCE) over a communication channel, e.g. a (leased) phone line, to a central computer (DCE to DTE).

The standard specifies the voltage levels and gives more familiar signal names to the abstract CCITT signal numbers. The original standard did not specify the DB25 connector or pin numbering.

However, the specification includes a secondary channel, clock signals (for synchronous communication), etc.

Later on the standard was "misused" to directly connect terminals to a computer locally, with various modem eliminators (null modems) to skip the DCE-phone_line-DCE part of the remote circuit. In the simplest case, just cross-connect TxD and RxD; however, various tricks are needed to fool the handshakes. For interfacing two synchronous devices, some electronics is actually needed in the "null modem".

The problem was that each manufacturer "misused" the DTE-DCE standard _differently_, causing problems for "null modems" for direct DTE to DTE connections!

It should once more be stressed that the RS-232 standard was not originally designed for direct terminal to computer interfacing.

Much simpler systems existed for local terminal to computer interfacing, such as 20 mA current loop. In a mechanical Teletype, the only semiconductors were the rectifier diodes in the power supply (24-60 V) and a big power transistor to generate the 20 mA current source. A Teletype with RS-232 interface required at least an additional +/-12 V power supply and at least two ICs (e.g. 1488/1489) or a lot of discrete components before those chips were available.

Reply to
upsidedown

Well, I was able to order 10 ATmega8's for $8.80 on eBay. (Hopefully, I'll get them within a week from now.) Besides, it isn't quite "for less money than," as was stated above.

Digi-Key doesn't seem like a sensible choice /for me/, either. FWIW, they're located on a whole different continent.

ACK, thanks.

The point is that should I ever end up designing my own kits, there'd be a whole world of hobbyists that won't be able to tackle anything with pitch finer than that of TQFP. (Or perhaps even finer than SO; but there, ARM seems to be at an advantage, as some LPC111x parts seem to be available in SO as well.)

--
FSF associate member #7257
Reply to
Ivan Shmakov

RS232 doesn't have a protocol definition. And it never will.

--
John Larkin                  Highland Technology Inc 
www.highlandtechnology.com   jlarkin at highlandtechnology dot com    
Reply to
John Larkin

There is a "protocol" in that RTS must be followed by CTS and so on.

Apart from that, there are a lot of embedded systems that don't need "fancy" features. When the data rate is fixed there is no need for autodetection. It adds cost, complexity, rampant bugs, all sorts of nastiness.

A good design is the least complexity to achieve maximum safety and compliance, not a host of sophisticated features that are used for no good reason, only because it is trendy.

Of course, in some cases, you need sophistication, but then, by all means, one should use another protocol.

Reply to
Lanarcam

There is quite a lot of handshaking in the DTE-DCE connection.

Typically the DTE (computer/terminal) sets DTR (Data Terminal Ready) when it is powered up. The modem (DCE) sets DSR (Data Set Ready) when it is powered up and ready to communicate (telephone contact established). Until both DTR and DSR are on, there is not much point in trying to communicate.

In the half-duplex (radio) communication world, raising RTS is an indication that this station wants to transmit. For a radio link, this might involve listening to check whether the radio channel is free and, if so, turning on the transmitter and waiting until the PLL has stabilized, after which CTS is raised and the actual message transmission can begin.

Those signals are there for a reason, not to make it harder to interface ordinary devices.

Reply to
upsidedown

Actually, RS232 specifies signal levels; the cable-length urban myth is a common misreading of RS232A and RS232B, which said that with a very capacitive cable carrying lots of adjacent signals the noise level went up at 15 m. However, the noise level never crossed a noise margin to cause problems.

I have seen RS232 driven down 1km of bell wire and function correctly.

The DTE/DCE split comes from CCITT V24 (now part of the ITU, in particular ITU-T), a telecoms standard particularly for modems, for working out what was an end point and what a mid-point (modem). This is what originally specified the DB25 and the signal naming.

The abstract circuit numbers are a CCITT V24 reference, not part of RS232 until about RS232 Rev. D.

--
Paul Carpenter          | paul@pcserviceselectronics.co.uk 
    PC Services 
Reply to
Paul

That is CCITT V24, and it actually does not specify only RS232; it could be used with all sorts of signalling levels and connectors.

--
Paul Carpenter          | paul@pcserviceselectronics.co.uk 
    PC Services 
Reply to
Paul

As others have mentioned, RS-232 does not come close to defining the stuff you want, and you need several layers higher in the stack. And heck, not even the signal levels required by RS-232 are particularly well respected.

But after doing everything you want, you're going to end up with something largely incompatible with conventional async/serial/RS-232 style connections. At that point, there's no point - that sort of serial connection is used *only* because support for it is ubiquitous, not because it's a particularly good technology. So if you change it, it becomes irrelevant.

And in this day and age serial ports are (slowly) dying anyway. And once you add all that stuff into your link, you'll be back to something with complexity similar to that of USB anyway, so why reinvent it? And for peripherals, you've been able to get single USB chip implementations for a buck or two for years now, which are only minimally more difficult to use than a bare serial port from your device's CPU. Or if you don't like that, throw a buck or two Ethernet port and a TCP/IP stack on the device.

Reply to
Robert Wessel

Here's how I've done it in a commercial product that has to support auto-baud between 4800 and 57600 Baud on a fixed 8N1 format:

Check the framing error bit in the receive interrupt handler. After 20 framing errors, switch to the next baud rate and reset the framing counter.

Works like a charm.
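As a sketch only (the rate table, the limit constant and the names here are my assumptions, not the commercial code), the counting logic can be kept separate from the AVR's UART registers:

```c
#include <stdint.h>

/* Candidate rates to cycle through, as described above (4800..57600, 8N1). */
static const uint32_t baud_table[] = { 4800, 9600, 19200, 38400, 57600 };
#define NUM_BAUDS (sizeof baud_table / sizeof baud_table[0])
#define FE_LIMIT  20          /* framing errors before trying the next rate */

struct autobaud {
    uint8_t idx;              /* index into baud_table              */
    uint8_t fe_count;         /* framing errors at the current rate */
};

/* Call from the receive interrupt with the UART's framing-error flag.
 * Returns the (possibly new) baud rate to program into the UART. */
static uint32_t autobaud_rx(struct autobaud *ab, int framing_error)
{
    if (!framing_error) {
        ab->fe_count = 0;     /* a clean character: stay at this rate */
    } else if (++ab->fe_count >= FE_LIMIT) {
        ab->fe_count = 0;
        ab->idx = (ab->idx + 1) % NUM_BAUDS;   /* try the next rate */
    }
    return baud_table[ab->idx];
}
```

In a real ISR on, say, an ATmega8, you would sample the framing-error flag (FE in UCSRA) before reading the data register and reprogram UBRR whenever the returned rate changes.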

Meindert

Reply to
Meindert Sprang

[...]

Doesn't it mean that the host has to transmit a considerable amount of data for the device to adapt to the baud rate used? Given the possibility of "interactive" use, such a delay doesn't seem all that reasonable.

Might work as a last resort, however.

--
FSF associate member #7257
Reply to
Ivan Shmakov

It takes some time indeed. But in my application (receiving a constant data stream from navigation instruments), this is no problem.

Meindert

Reply to
Meindert Sprang

The time could be improved by keeping track of both well received data, and framing errors. At first, you could try a new baud rate after 2 or 3 framing errors, but as soon as you receive a couple of good chars, increase the tolerance for further framing errors.

To improve detection time, save a good baudrate in non-volatile memory so it can be used as the first guess when the device powers up again.
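One way to sketch that adaptive threshold as a pure state machine (the thresholds and names are illustrative, not from any particular product):

```c
#include <stdint.h>

#define FE_LIMIT_INITIAL 3    /* be eager to hop rates at first        */
#define FE_LIMIT_LOCKED  20   /* tolerate noise once data looks good   */
#define GOOD_LOCK        2    /* clean chars needed to raise tolerance */

struct ab_state {
    uint8_t fe_count;         /* consecutive framing errors            */
    uint8_t good_count;       /* clean characters at this rate         */
    uint8_t fe_limit;         /* current error tolerance               */
};

static void ab_init(struct ab_state *s)
{
    s->fe_count = 0;
    s->good_count = 0;
    s->fe_limit = FE_LIMIT_INITIAL;
}

/* Returns nonzero when the caller should switch to the next baud rate. */
static int ab_on_char(struct ab_state *s, int framing_error)
{
    if (!framing_error) {
        s->fe_count = 0;
        if (++s->good_count >= GOOD_LOCK)
            s->fe_limit = FE_LIMIT_LOCKED;  /* looks right: harden */
        return 0;
    }
    if (++s->fe_count >= s->fe_limit) {
        ab_init(s);                         /* hop: back to eager mode */
        return 1;
    }
    return 0;
}
```

For the non-volatile first guess, avr-libc's `eeprom_read_byte()`/`eeprom_update_byte()` from `<avr/eeprom.h>` would be the obvious place to store the last known-good table index.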

Reply to
Arlet Ottens

Most autobaud designs also send a known character to calibrate, and a pause.

If you want to auto-detect on random data, that is more complex, and you will (usually) discard info while you are deciding. Then, even if your hardware is smart enough to quickly lock onto a bit-time, you next have to decide which is actually the Start bit...

So there is no magic solution, but you can make systems that behave in a known way, reliably.

Another approach is to start a baud dialog at a known low speed, and then exchange information about mutually supported higher rates, and then switch to that. If you want highest speeds, and widest clock-choice tolerance, that is the best approach.
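A minimal sketch of the negotiation step, assuming (my invention, purely illustrative) that both sides exchange a bitmask of supported rates at the known low speed and then pick the highest rate in common:

```c
#include <stdint.h>

/* Bit N set = rate supported; bits ordered from lowest to highest rate. */
#define RATE_9600   (1u << 0)
#define RATE_19200  (1u << 1)
#define RATE_57600  (1u << 2)
#define RATE_115200 (1u << 3)

/* After both sides swap their masks at the known low speed, each picks
 * the highest common rate. Returns that rate's bit index, or 0 (the base
 * rate) when the intersection is empty. */
static uint8_t pick_common_rate(uint8_t mine, uint8_t theirs)
{
    uint8_t common = mine & theirs;
    uint8_t best = 0;
    while (common) {              /* find index of highest set bit */
        common >>= 1;
        best++;
    }
    return best ? (uint8_t)(best - 1) : 0;
}
```

Because both sides compute the same intersection from the same two masks, they agree on the switch without a further round trip; only the switch-over moment still needs to be defined.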

Just to reality check that aspiration, I see the Bus Pirate has moved to using a 256K Flash device.

["We didn't want to run out of space again soon, so we used a PIC 24FJ256GB 106 with 256K of space."]

So you might want to look at the ATXmega parts, as they have some good prices on smaller USB models.

-jg

Reply to
j.m.granville


It's impossible to make baud autodetection 100% reliable without the cooperation of the sender. Consider that at 8-N-1, the single byte 0xB5 is indistinguishable from the pair 0x67, 0x67 at double the baud rate. Hardly the only such example, just a handy one. For a simple doubling of the baud rate at 8-N-1, an indistinguishable pair of bytes exists at the higher baud rate for any byte where the fourth bit (from the high end) is one and the fifth is zero at the lower baud rate.

Reply to
Robert Wessel


I think that's true if you define "without cooperation" to mean that the receiver has absolutely no prior knowledge of the message from the sender and the data stream is continuous with no significant inter-character intervals. If you know that the data is ASCII text in a given language, you probably have enough data to get the baud rate correct, given a large enough sample size.

If the data stream is encrypted binary data in a continuous stream, you've got more to worry about than just the baud rate!

Mark Borgerson

Reply to
Mark Borgerson


I'd consider sending a known (or at least constrained) stream to be cooperation. Obviously the tighter the constraints, the more quickly you can get to a required confidence level in your baud rate detection.

As for inter-character gaps - it depends on the speeds and characters in question - so long as the low speed characters meet the xxx10xxx format requirement, then the double speed stream simply looks like two immediately adjacent characters, and gaps between individual low speed characters are just gaps between high speed pairs.

My point was more that automatic baud rate detection is a hack, albeit a useful one in some circumstances, although it will always have limits. And frankly the use of RS-232/async ports should not be a first choice these days.

Reply to
Robert Wessel

Here

formatting link
is an example of watching the pulse widths to infer the baud rate. It decides on the rate by the end of the NMEA identifier field, leaving the rest of that sentence available for sanity checks (framing errors, reasonable character set, etc.). There are certainly other approaches, but I've been pretty successful with this.
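The classification step of that pulse-width idea can be sketched like this (not the linked code; the rate table and the tolerance band are my assumptions). Over a whole sentence, the shortest low pulse observed approximates one bit time, so the running minimum is fed in:

```c
#include <stdint.h>

/* Classify a measured low-pulse width (in microseconds) against the
 * standard NMEA rates. Feed in the shortest pulse seen so far; it
 * approximates one bit time once a lone start or data bit has gone by. */
static uint32_t baud_from_pulse_us(uint32_t width_us)
{
    static const uint32_t rates[] = { 57600, 38400, 19200, 9600, 4800 };
    for (unsigned i = 0; i < sizeof rates / sizeof rates[0]; i++) {
        uint32_t bit_us = 1000000u / rates[i];
        /* accept roughly +/-25% slop around the nominal bit time */
        if (width_us >= bit_us - bit_us / 4 &&
            width_us <= bit_us + bit_us / 4)
            return rates[i];
    }
    return 0;   /* no plausible match yet: keep measuring */
}
```

On an AVR the width itself would come from an input-capture timer on the RxD pin; checking the fastest rates first keeps a long pulse at a high rate from being mistaken for one bit at a lower rate.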

--
Rich Webb     Norfolk, VA
Reply to
Rich Webb


I still find them useful when connecting to oceanographic instruments. I have been able to get full-speed USB (12 Mb/s) through a pair of waterproof connectors, despite the impedance problems. I haven't yet tried to get USB through the connector and 20 meters from the deck to the dry lab on a research vessel. I've also tried Zigbee radios, but they run into problems with aluminum pressure cases and deckhouses.

One advantage that serial ports have over USB is that, with a properly designed receiving system, you have controlled latency that can allow time-stamping of incoming data. That's more difficult with USB or radio links where the data gets mashed together into packets and the reception time has little relation to the transmission time.

Mark Borgerson

Reply to
Mark Borgerson
