PIC processor interrupt for delayed output

What is the best approach to generate a delayed output from a trigger input to a PIC microcontroller? The idea is to have a constant, known time from trigger input to output, so I don't want to poll the logic level on some pin due to the jitter that would occur. I thought about using the comparator module to generate an interrupt, but it seems complicated setting and clearing all the various bits and reference levels.

Is there an easier way?

-Bill

Reply to
Bill Bowden

You can use the interrupt-on-change facility. However, you can't poll or write to the GPIO port while you are doing this, because if you happen to be accessing it when a change arrives, the interrupt flag will not be set. Arrgh. Very annoying bug in the chip.

See GPIO INTERRUPT in section 12.4.3 of the PIC12F683 datasheet for more information.
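As a sketch (MPASM-style, using the register and bit names from the PIC12F683 include file; saving W and STATUS is omitted for brevity), the service routine should read the port first and only then clear the flag:

        org     0x0004            ; interrupt vector
isr:    btfss   INTCON, GPIF      ; was it the change interrupt?
        retfie
        banksel GPIO
        movf    GPIO, W           ; read the port: this ends the mismatch
        bcf     INTCON, GPIF      ; ...then clear the flag
        ; act on the new pin state here
        retfie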

Regards, Bob Monsen

Reply to
Bob Monsen

Use a timer?

Use a delay circuit that triggers another interrupt. Essentially, use a cap and resistor with the appropriate delay: a PIC pin supplies the voltage charging up the capacitor, and once the cap has reached the appropriate level it triggers an interrupt pin (which needs to be Schmitt-triggered, and the pin that charges the cap needs to be reset). Probably not the best way, but it would work well for very long delays (> 1 s).
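(For reference, the cap charges as v(t) = Vdd*(1 - e^(-t/RC)), so the time to reach a Schmitt-trigger threshold Vth is t = -RC*ln(1 - Vth/Vdd); with Vth around 0.8*Vdd that works out to roughly 1.6*RC.)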

I just don't see why you don't use a timer interrupt, as it is probably the most natural method.

Reply to
Jon Slaughter

"Bill Bowden" schreef in bericht news: snipped-for-privacy@h1g2000prh.googlegroups.com...

Assuming the trigger is not synchronous with the PIC's clock, you will always have some jitter; that is always the case with asynchronous systems.

As for the PIC, you did not mention which PIC you have in mind. The usual way is to use the trigger signal to start an interrupt routine. This routine in turn starts a timer/counter that interrupts when the delay time is over. I see no use for comparators in this scheme.
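A sketch of that scheme (MPASM-style, assuming a PIC16F628 with the trigger on RB0/INT and Timer1 timing the delay; DELAY is the delay in instruction cycles, and bank selection and context saving in the interrupt routine are omitted):

        banksel PIE1
        bsf     PIE1, TMR1IE          ; allow Timer1 overflow interrupt
        banksel INTCON
        bsf     INTCON, INTE          ; enable the RB0/INT edge interrupt
        bsf     INTCON, PEIE          ; enable peripheral interrupts
        bsf     INTCON, GIE           ; enable interrupts globally

        org     0x0004                ; interrupt vector
isr:    btfsc   INTCON, INTF          ; trigger edge arrived?
        goto    start_delay
        btfsc   PIR1, TMR1IF          ; delay expired?
        goto    delay_done
        retfie
start_delay:
        movlw   HIGH(65536 - DELAY)   ; preload so Timer1 overflows
        movwf   TMR1H                 ;   DELAY cycles from now
        movlw   LOW(65536 - DELAY)
        movwf   TMR1L
        bsf     T1CON, TMR1ON         ; start Timer1
        bcf     INTCON, INTF
        retfie
delay_done:
        bsf     PORTB, 1              ; drive the delayed output pin
        bcf     T1CON, TMR1ON         ; stop Timer1
        bcf     PIR1, TMR1IF
        retfie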

petrus bitbyter

Reply to
petrus bitbyter

Yes, I notice in the datasheet for the PIC16F628 that an on-change interrupt can be generated from any of the upper 4 pins of PORTB (RB4-RB7). Any change will set the RBIF interrupt flag. But it's not clear how to set it up so it jumps to the service routine when a change takes place; I imagine the GIE bit needs to be set, and maybe some others. There is a note that the change may be missed if the input is too short, so I guess it requires a minimum pulse width.

I might play around with that idea to see if it works, but it's hard to simulate in MPLAB and single-step the program. How do I simulate an interrupt so the program will single-step into the service routine?

-Bill

Reply to
Bill Bowden

You can set up the stimulus package to wiggle the port value after a timed delay. However, the stimulus package is pretty hard to figure out. I find it easier to simply write the program and try it out.

Regarding the interrupt, all you need to do is set the GIE and GPIE bits, and watch for the GPIF bit being set in the interrupt routine. You can tell the chip which pins you want to watch using the IOC register.
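As a sketch, the setup side on the 12F683 (watching GP1, say) would look something like this; note ANSEL, since the pins wake up analog:

        banksel ANSEL
        clrf    ANSEL             ; all pins digital
        bsf     TRISIO, 1         ; GP1 as input
        bsf     IOC, 1            ; interrupt-on-change on GP1
        banksel GPIO
        movf    GPIO, W           ; read port to clear any mismatch
        bcf     INTCON, GPIF
        bsf     INTCON, GPIE      ; enable the change interrupt
        bsf     INTCON, GIE       ; enable interrupts globally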

If you actually are using a PIC12F683, you can use the GP2/INT interrupt, described in section 12.4.1 of the datasheet. That might be more reliable. I don't know.

Regards, Bob Monsen

Reply to
Bob Monsen

You don't say what else you want the PIC to be doing while it's monitoring the pin.

If you can run it in a tight loop, I think you can test 8 input pins (on one port) for a change in four cycles (16 clocks). If you unroll the loop you can get that down to an average approaching 3 cycles, at the expense of code size.
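For instance, if the pins all idle low (or you only care about any pin going high), a loop like this (MPASM-style sketch) tests the whole port in four cycles per pass, because movf sets the Z flag when the port reads zero:

poll:   movf    PORTB, W          ; 1 cycle: read port, Z=1 if all pins low
        btfsc   STATUS, Z         ; 1 cycle while nothing has changed
        goto    poll              ; 2 cycles
        ; a pin has gone high: handle it here

Unrolling it into repeated movf/btfss/goto triplets drops the unchanged-pass cost to 3 cycles per test.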

You also don't say if you're then going to wait in another loop or whether you'll set up a timer to trigger an interrupt when the output needs to change state. Finally, you don't say how long the delay needs to be. Unless you can afford to let your interrupt routine trash the W and STATUS registers, or you can guarantee that the interrupt will never trigger while anything other than a known bank is selected, there is quite a lot of overhead you'll need in the interrupt routine, and that will limit your minimum delay.

Tim.

Reply to
google

In fact, for a single pin I think you can get this down to 3 cycles, or 2 cycles with the loop unrolled.
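For example, for a single pin idling low (sketch):

wait:   btfss   PORTB, 0          ; 1 cycle while RB0 stays low
        goto    wait              ; 2 cycles
        ; RB0 went high

That's 3 cycles per pass; unrolled into repeated btfsc PORTB,0 / goto found pairs, each test costs 2 cycles while the pin stays low.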

Tim.

Reply to
google

> best approach to generate a delayed output from a trigger

It doesn't do anything, just waits for an interrupt in an infinite loop.

I got it working, and the interrupt routine calls an output routine that sets and resets a couple of pins according to a calculated delay. This is what I want, but I'm still not sure what the jitter will be without putting it on a scope and all that hassle. Using the on-change interrupt requires the system to continually read the state of PORTB and generate an interrupt when a change occurs. I don't know how many clocks that takes, and if the change is too fast (say a 100 ns pulse) it may not be seen at all. I'm going to try overclocking it at 25 MHz to reduce the jitter percentage.

Any idea how many clocks are involved in monitoring the state of PORTB in a very tight loop of one line, where the program continually goes to the same line and waits for an interrupt?

-Bill

Reply to
Bill Bowden

> best approach to generate a delayed output from a trigger

It's all in the datasheet - at least for the 16F627, which is the one I've been playing with.

IIRC the pin is checked on every clock - i.e. provided your pulse is at least 1 clock wide it should be picked up. The interrupt is triggered on the next cycle (a cycle is four clocks) - it's not immediately obvious from the datasheet whether the interrupt will trigger on the same clock as the pin is sampled, so your jitter will be between 0 and 4 clocks (one instruction cycle).

Reply to
google


RTFM

Reply to
Varactor
