simulation tutorial? Asynchronous interrupts?

I'm writing simple programs for a PIC 16F877A. Attempting to simulate with MPLAB.

Linear code...no problem.

But I've never written a program that didn't have multiple asynchronous interrupt inputs and timers.

The MPSIM seems to fall apart in that instance. Yes, there are ways to simulate inputs, but the asynchronous nature is where all the problems happen.

Stated another way: no problem simulating things that can be fixed by inspecting the code. But adding test code messes up the timing.

Is there a tutorial on dealing with asynchronous microcode? Tips for organizing code to minimize those asynchronous problems? Simulation of same?

Better freeware simulator?

Reply to
mike

mike wrote in news:lj0phq$b4a$ snipped-for-privacy@dont-email.me:

MPLAB 8 has a fairly comprehensive Stimulus Control Language (SCL) for scripting the simulator's 'inputs'. As the simulator is a discrete digital simulation with single-instruction-cycle time resolution, any effects on a finer timescale than that can't be simulated.

SCL is a subset of VHDL and you can find out more about it here:

Although MPLAB X supports SCL, its simulator is known to be fairly buggy and I'd stick to using MPLAB 8 SIM for now.
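To give a flavour of it, here is a rough sketch of what an SCL stimulus file looks like. This is only an illustrative fragment from memory, not a verified script: the exact device name string, pin naming, and syntax details should be checked against Microchip's SCL documentation before use. It drives pin RB0 low, then pulses it high for 5 ms after 100 ms, which is the sort of asynchronous input event the OP wants to script:

```
// Hypothetical SCL stimulus sketch for MPLAB SIM (syntax approximate)
testbench for "pic16f877a" is
begin
    process is
    begin
        RB0 <= '0';        -- start with the pin low
        wait for 100 ms;   -- let the firmware run a while
        RB0 <= '1';        -- asynchronous event: pin goes high
        wait for 5 ms;
        RB0 <= '0';        -- pulse ends
        wait;              -- hold until end of simulation
    end process;
end testbench;
```

Because SCL events are scheduled by the simulator itself rather than injected by hand-added test code, they don't perturb the firmware's instruction timing, which addresses the OP's complaint about test code messing up the timing.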

Some people like the PIC simulator in Proteus (a fairly expensive commercial product), but among forum posts mentioning Proteus, a high proportion describe code and circuits that simulate fine yet fail to work in real life.

--
Ian Malcolm.   London, ENGLAND.
Reply to
Ian Malcolm

You should probably read a book about operating systems. Back in my day, Tanenbaum's book was actually ok. I don't know what the current favorites are.

Reply to
Paul Rubin

Oops -- double negative alert! Do you mean that you've never written a program that _does_ have asynchronous inputs?

I don't know of any books -- I learned this stuff on the job.

The basic idea is that you have to write your code as if some interrupt routine will start running at any time unless you explicitly turn interrupts off -- because that's exactly what will happen.

So so-called "non-atomic" reads and writes, where you access something bigger than a byte one byte at a time, will get corrupted if an interrupt writes to or reads from that variable partway through.

You can run out of stack space.

Things can take longer to execute than you thought.

Etc.

I suspect, but do not know, that Jack Ganssle's "The Art of Programming Embedded Systems" covers this quite well. I've never read the book, but I've read a whole lot of Jack's other writing and have spoken with him at the Embedded Systems Conference.

--
Tim Wescott 
Wescott Design Services 
Reply to
Tim Wescott

I stand by my original statement, but it's not worth argument.

At $58.36 that ain't gonna happen.

The question was about simulation. You and I know how to examine code and determine the effects of asynchronous interrupts. Problem is that the person I'm trying to help can't grasp the concept. I'm trying to offload my involvement so he can use simulation. Sometimes, you do stuff for friends that you'd not do for strangers. I can't just tell him to FOAD.

Reply to
mike

I didn't realize you were asking for someone else. The right answer, in my humble opinion, is to get it right by design. But yes, giving someone tools so that they can hopefully succeed while doing it their way is laudable.

I can't help you there -- I just work hard to do it right from the get-go...

--
Tim Wescott 
Wescott Design Services 
Reply to
Tim Wescott
