If it's async data, it likely won't work with just a shift register and a digital comparator; you could get false alarms. It probably needs byte framing logic, the start and stop bit thing.
--
John Larkin Highland Technology, Inc
picosecond timing precision measurement
jlarkin att highlandtechnology dott com
http://www.highlandtechnology.com
Certainly it's possible. If you're talking about standard asynchronous serial, you'd probably have to do just as real UARTs do - sample 4x per bit-time, find the center of the bit (so you don't end up actually data-sampling right at an edge), and have enough state to be aware of the asynchronous "start bit, data bits, stop bit(s)" timing. A simple-minded, purely-synchronous shift register approach would probably tend to mis-fire (false positives) on 8-bit patterns which were actually composed of bits from two separate bytes.
With tiny 8-bit micros being as cheap as they are, I suspect you'd find it less expensive to use one of those, rather than discrete logic. Most simple 8-bitters have built-in UART peripherals, but these generally need a fairly accurate crystal-clock to receive accurately. You could probably write a self-synchronizing "software UART" using 4x or 8x sampling rates and a fairly simple state machine to synchronize with the bits - I did this for a software-based synchronous-serial AX.25 receiver - and be able to use a cheap micro's somewhat "sloppy" on-chip clock oscillator without needing a crystal or resonator.
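The sampling scheme described above can be sketched in C as a host-side state machine. This is purely illustrative: the function name and the sample-buffer interface are invented, and on a real MCU the samples would come from a timer ISR reading the RX pin at 4x the bit rate rather than from an array.

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

#define OVERSAMPLE 4  /* samples per bit time */

/* Decode one async frame (8N1, LSB first) from a stream of line samples
 * taken at 4x the bit rate: hunt for the start-bit edge, step to the
 * middle of the start bit so data sampling never lands on an edge, then
 * sample once per bit time. Returns the byte, or -1 on framing error. */
static int uart_rx_4x(const uint8_t *s, size_t n)
{
    size_t i = 0;
    while (i < n && s[i] != 0) i++;        /* hunt for the start-bit edge */
    if (i + OVERSAMPLE * 10 > n) return -1; /* not enough samples left */
    i += OVERSAMPLE / 2;                    /* move to the start-bit center */
    if (s[i] != 0) return -1;               /* glitch, not a real start bit */
    uint8_t byte = 0;
    for (int b = 0; b < 8; b++) {
        i += OVERSAMPLE;                    /* center of each data bit */
        byte |= (uint8_t)((s[i] & 1) << b); /* LSB first */
    }
    i += OVERSAMPLE;
    if (s[i] != 1) return -1;               /* stop bit must be high */
    return byte;
}
```

Because the receiver re-arms on every start-bit edge, it cannot match a "byte" straddling two real characters, which is exactly the false-positive mode of the naive shift-register approach.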
Still, a small ARM Cortex is much cheaper than any programmable logic large enough to fit your needs.
What is "spot-on" in seconds? If the input is an async current loop, the timing accuracy cannot be much better than some tens of µs. A Cortex executes tens of instructions per µs, so it should be more than fast enough for your data.
If it can be done with not too many off-the-shelf logic ICs, it might be worth it. If it starts getting up around 8, 9, 10 packages just for the byte detector, then yeah, not worth it.
Tens of µs is okay; you're right, the intrinsic accuracy of the signal won't be much better than that.
It's nice to not have to program absolutely everything.
With an MCU doing nothing else it would be easy to get within a few instruction cycles: wait for the input going low (the start bit) and time everything with NOP loops.
The ASR33 teletype did this: it detected the XON character and engaged the paper tape reader. A classic UART chip (no FIFO, just parallel output) with a decoder (74HC138) or two can do the job.
In either case, 'hairball of logic' is correct.
A more modern approach might be to leave a terminal window in the background, and listen for the 'BEL' character. Unless you train your cat to react to the bell, it's not a hairball situation.
I feel dirty[1] suggesting it, but any of the many MCUs knocking around could do that up to, say, 100 kb/s to 1 Mb/s.
For ease and simplicity of programming (the only hardware you need is a USB cable), you could look at the Adafruit Trinket or Cypress PSOC 4 prototyping kits.
The former is trivially simple and is based on the 8-pin ATtiny85 device, which includes programmable timers, ADCs, etc. The latter are more complex but also have some internal programmable logic gates.
There are, no doubt, many others. No doubt some have hardware UARTs, but I can't be bothered to filter them for you :)
[1] because it is a classic first logic design task you give a student, expecting them to use a few TTL ICs or a couple of PALs or a CPLD.
For async serial, detecting the start bit is important, so:
- a non-retriggerable monostable (one-shot) that runs for {byte_length}, triggered by a falling edge on the input
- an astable, with its enable fed from the one-shot above
- a shift register clocked by the astable above
- a latch clocked by the falling edge of the one-shot (could be part of the shift register)
- then whatever it takes to compare with the desired value
I'd use a $5 arduino clone before doing that, or a 50c PIC-10 with a software UART if it was needed in quantity.
--
This email has not been checked by half-arsed antivirus software
Not quite there. You need the 16x counter to be preset to a value of 7 (I'm assuming a down counter) when the data register is set to all 1s. It can't start counting until the input is a zero, which sets a run bit. Counting down those 7 clocks gets you to the center of a bit, and the data register is enabled for one clock each time the 16x counter reaches zero.
When the start bit finds its way to the end of the shift register (you only need 9 bits for this, since the last data bit is on the input), the register is set to all 1s, the 16x counter is preset to 7, and the run/stop bit returns to STOP. All this requires combinatorial logic.
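As a sanity check of the scheme above, here is a host-side C simulation of the same structure: a 9-bit shift register idling at all 1s, a down counter preset to 7 that reloads to 15 after each sample, and a run bit set by the start-bit edge. This is only an illustration of the logic; the real thing would be product terms in a PLD, not C, and the names are invented.

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* One 16x-clock receiver pass over a buffer of line samples.
 * Returns the received byte, or -1 if no complete frame is seen. */
static int rx16x(const uint8_t *s, size_t n)
{
    uint16_t sr = 0x1FF;   /* 9-bit shift register, all 1s */
    int ctr = 7, run = 0;  /* down counter preset to 7, run/stop bit */
    for (size_t t = 0; t < n; t++) {
        if (!run) {
            if (s[t] == 0) run = 1;   /* start-bit edge: start counting */
            continue;
        }
        if (ctr > 0) { ctr--; continue; }
        /* ctr == 0: bit center -- clock the register, reload to 15 */
        sr = (uint16_t)((sr >> 1) | ((uint16_t)(s[t] & 1) << 8));
        ctr = 15;
        if (!(sr & 1))                 /* start bit reached the end */
            return (sr >> 1) & 0xFF;   /* 8 data bits, LSB first */
    }
    return -1;
}
```

Seven decrements from the preset of 7 put the first sample 8 clocks after the edge, i.e. at the center of the 16-clock start bit, and the reload of 15 spaces every later sample exactly one bit time apart; the zero emerging from the last register stage is the "character complete" signal the combinatorial logic has to decode.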
This still won't squeeze into a 22V10. But a slightly larger part such as the Atmel ATV750, or using an external data register or counter might do the trick!
I still think the MCU is the way to go. You can get units that have an internal RC that is accurate enough for a UART.
Your bit-center alignment has to happen somehow. That happens exactly once, at the combination of seeing a zero on the input and being in the process of shifting data, i.e. after seeing a start bit and before counting the stop bit. The only other way to detect this is to use a wide AND gate to detect the start bit in the shift register. In a PLD either way is about the same. In SSI/MSI logic I don't care, because I would fire an engineer who proposed discrete logic for this.
I had a design to control a few relays inside a radio. When I looked at how simple it was (essentially an SPI bus) I realized MSI could do it in some five devices and take virtually *no* power and create virtually no noise. But I was told that was too error-prone, in the sense that the interface might change, so they had me throw another MCU at it. Silly, really. This was a hand-held military radio that cost some $10,000. I believe it has some 15 or more processors in it. Good thing they are small.