USART interrupt on transmission

The UART/USART peripheral found in many microcontrollers can trigger several interrupts. Two of them are DRE (Data Register Empty, as named in the AVR documentation) and TXC (Transmit Complete).

The first can be used to feed the TX FIFO while the previously pushed data is still being shifted out.

Documentation usually lacks details on these interrupts, for example exactly WHEN they are triggered. I ran some tests on a SAMD20 (from Atmel/Microchip) and an LPC54628 (from NXP) and found a difference.

When the UART isn't transmitting, the DRE flag is set because the empty TX FIFO can accept data. When you enable the DRE interrupt, the relevant ISR is triggered immediately because the flag is already set. In the ISR you usually push the first byte to write and return. That byte is immediately moved to the shift register, so the TX FIFO becomes empty again in a very short time and the DRE interrupt is triggered again.

There's a subtle difference between the two MCUs above.

On SAMD20 the second DRE interrupt is triggered immediately after the first ISR returns.

On LPC54628 the second DRE interrupt is triggered AFTER a start-bit delay. It seems the first outgoing byte stays in the TX FIFO during transmission of the start bit and is moved out only after that period. The LPC54628 USART has a hardware TX FIFO, but I'm not really using it: I configured the peripheral to trigger the DRE interrupt when the TX FIFO is empty.

The behaviour of the LPC54628 is interesting. It can be used to enable an external transceiver after a few dummy bytes, giving a short delay before replying on the bus and allowing the sender (which could be slower) to change its direction. If the dummy bytes are 0xFF (which appears on the wire as a start bit only), I can enable the driver in the DRE ISR when I'm ready to push the first useful byte. This way, the dummy bytes never really appear on the wire.

On SAMD20 devices this can't be done, because you would enable the driver before the start bit of the last dummy byte, so it would really be transmitted on the wire. Instead, you need to disable the DRE interrupt in the DRE ISR of the last dummy byte, wait for the TXC interrupt to enable the driver, and then re-enable the DRE interrupt. Push all the useful data in the DRE ISR and, after the last byte, disable DRE again and wait for the TXC ISR to change the transceiver direction back.

Reply to
pozz

Thanks for the interesting info. However, it is not clear why you want to send dummy bytes. AFAICS all you need is a little delay, and for a delay I would use a timer.

--
                              Waldek Hebisch
Reply to
antispam

Some years ago I read about the idea of using the same UART peripheral as a "timer" to get a short delay before the reply. It is sufficient to send some dummy bytes, without really putting them on the wire.

The delay must be short (otherwise bus throughput would decrease too much), so you need a timer peripheral. Why waste a timer on this if we can use the UART?

Reply to
pozz

Well, if there is an available hardware timer, then arguably _not_ using it is a waste. If there is a shortage of timers, then multi-purposing a single timer looks like the better solution.

AFAICS the main disadvantage of using the UART is the lack of portability and flexibility. You are tied to the specific behaviour of the UART, and changing to a different mode (like enabling the FIFO) can break your code. And you only get multiples of the character transmit time as the delay.

--
                              Waldek Hebisch
Reply to
antispam

I'm a bit late into this discussion, but this was a standard trick in the days of serial terminals. The first glass terminals used either discrete logic or very primitive CPUs, so operations such as scrolling or even cursor movement could take longer to execute than would be implied by the baud rate. Look up terminal padding, where NULL bytes were inserted after certain control codes to allow the terminal to process what it had just been sent.

Portability was in fact one of the main advantages: the code knew the baud rate and how long processing took on a given terminal, so it could compute the padding to add without external dependencies. That was important at the time, since UNIX (in particular) didn't provide convenient sub-second delays at first. It also benefited from the fact that a NULL byte had known behaviour on most terminals, i.e. a no-op.

On an embedded device you'd have to weigh those benefits against the closer-to-the-hardware nature of firmware, i.e. you have less intervening software isolating you from the hardware. Personally I wouldn't like to say what's best except in relation to the circumstances: like any delay, you might be looking at a timer and interrupt, a delay loop, or even a couple of no-ops (or a shuffling of tasks) to enforce however long is needed.

--
Andrew Smallshaw 
andrews@sdf.org
Reply to
Andrew Smallshaw
