Looking for thoughts on embedded systems education

Hello all, I've been teaching embedded systems for quite a few years now, and we are looking to make some major changes in what we're doing (going from one class to two, going from PPC to ARM/ATOM/AVR, etc.). I _think_ we've got a pretty good idea about the direction we're going, but I thought I'd come here and try to get feedback about what folks in industry (and anyone else, frankly) think is important and what isn't. In general I'd love to hear any thoughts you all might have, but specifically I thought I'd get feedback on how important it is to spend time on certain topics. Again, any and all feedback would be welcome...

  • ABI (application binary interface) stuff. Specifically caller/callee registers, dedicated registers, stack management, and the like.

  • Writing code in assembly.

** Specifically handling all the issues associated with interrupts by hand (saving all registers used, nested interrupts and saving state for that, identifying interrupt sources, etc.)

  • Processor memory bus interfacing. Timing, having I/O devices respond to external memory requests etc.

  • Creating hardware (via FPGA) to do interfacing for either memory bus or some other protocol.

  • I2C/SPI/other simple serial interfaces

** Is just learning to use them enough? Should they learn to control wires by hand?

  • USB/FireWire or other more complex serial schemes

  • PCB design.
** Is high-speed PCB stuff important?

  • High-level tool sets? Matlab/Simulink, Android development stuff, Arduino?

** What should they learn? Does it matter which?
** Is writing low-level modules to work with these tools important?

  • Writing Linux device drivers?

Reply to
Mark Brehob

It all depends on the topics in your curriculum, I suppose, and I think I would write the curricula for embedded systems teaching slightly differently from most.

It is all very well teaching students about the basic architectures of embedded systems, and you should definitely give them an appreciation of machine code and assembler. Once they have that under their belt, I would look at the other aspects of decent embedded systems design, including the design-management aspects (not many courses seem to cover that well enough): topics such as re-entrant code design, and interrupt service routine design with simple fault-finding and performance-monitoring techniques (ban the use of a logic analyser as a fault-finding tool). I wouldn't cover operating systems in particular, but would certainly cover some of the functions operating systems include.
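
To make the performance-monitoring point concrete, something like the following is enough (a minimal sketch in C, assuming a hypothetical free-running TIMER_COUNT register; any free-running timer on the target would do):

    /* Hypothetical free-running 32-bit up-counter, clocked at 1 MHz. */
    #define TIMER_COUNT (*(volatile unsigned long *)0x40001004u)

    volatile unsigned long isr_worst_us;   /* worst-case ISR duration observed */

    void uart_rx_isr(void)
    {
        unsigned long start = TIMER_COUNT;

        /* ... the actual interrupt work ... */

        unsigned long elapsed = TIMER_COUNT - start;  /* unsigned maths handles wrap */
        if (elapsed > isr_worst_us)
            isr_worst_us = elapsed;        /* inspect later via debugger or console */
    }

Students can then watch the worst case creep up as they add work to the handler, without an analyser in sight.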

I think I would introduce them to the notions of system modelling, task analysis and function point analysis, exploring the risk elements posed by systems and how to architect to reduce those risks, and insist that they ensure the security and survivability of their project work. (I would probably question whether or not to deliberately put a spanner in the spokes for that bit, but you would definitely get to know who had backed up properly and had decent information and code management in place ... I know, it sounds cruel.)

I would definitely cover communication systems and protocols as a segment and make one of the projects a "with comms" one.

--
********************************************************************
Paul E. Bennett...............
Forth based HIDECS Consultancy
Mob: +44 (0)7811-639972
Tel: +44 (0)1235-510979
Going Forth Safely ..... EBA. www.electric-boat-association.org.uk..
********************************************************************
Reply to
Paul E. Bennett

Two of my comments:

Not really; low-speed and low-cost PCBs are more important to me and my customers.

Not really, but porting tools/applications is necessary.

For example, I am right in the middle of GNU hell right now.

In my case, an older version of GCC works for the older chips. However, new chips need a new GCC. So: new M4, MPFR, GMP, MPC, AM, AC and GCC upgrade hell.

Reply to
linnix

Just as a guiding thought, designing embedded systems teaching seems to require some ability at predicting both the needs of students to come as well as industry directions. So what do you see there? (There are some core values, I think, and you've mentioned some good ones below. But suggesting additions to your list would be easier if there were some idea what you see as student need and industry direction.)

So can you talk about that, then?

These two go hand in hand, I think. Should be taught either in the same class or nearby in time. I'd add "computer architecture" generally, here, as well.

And maybe, if you can consider fitting it in, floating point formats and some discussion about numerical methods and possible pitfalls using FP in numerical applications.
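
For instance, a one-liner like this (a generic illustration, nothing course-specific) makes the absorption pitfall visible: single-precision floats silently drop small increments once the running sum is large, which bites naive accumulators and integrators:

    #include <stdio.h>

    int main(void)
    {
        float sum = 16777216.0f;   /* 2^24: the spacing between floats here is 2.0 */
        sum += 1.0f;               /* below the ULP, so the addition is lost */
        printf("%f\n", sum);       /* prints 16777216.000000, unchanged */
        return 0;
    }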

Interrupts, level vs edge triggered, etc. come immediately to mind when talking about interfacing. Also, various bus concepts such as ISA (incident) and PCI (reflection) and direct memory access methods all flood in. Some of this is still meaningful at the very small instrument level, some only comes in with very complex systems (PCI, for example.)

And, on "I2C/SPI/other simple serial interfaces":

Including multi-master and then the few undefined states that can occur there?

You know your limitations better than I do. If I were thinking about embedded systems classes, I'd probably want them to be a 'total immersion' thing -- 15 hours a day with the students captive on some big ship in the ocean where they cannot escape. ;) And there are few things better at teaching than putting hands onto actual situations and real results to help drive home the concepts and theories being discussed. Theory and result go hand-in-hand, I think.

Once sufficiently into multi-master I2C, I think USB will be a logical extension if you can afford the time there. I'm not sure about how you would consider dividing time between host and slave, though. Your 'market' determines that as well as whether or not you cover it at all.

I encounter USB more than I'd like, though. Sometimes, the solution is to just "buy" a USB-RS232 solution and write the traditional serial port code to get by. Total Phase's Aardvark and so on comes to mind here. Other times, that isn't acceptable and more knowledge helps. I don't know where the future is going, here, though. I'm mostly just following along and dealing with what sits in front of me. You need to anticipate, I think.

Not in my general experience. Except PCI, with serpentine clock lines and skews, mostly because "it's so abundant."

Not in my experience. But it is admittedly narrow.

How to find information when they need it, how to interpret what information they find, .... okay, basically how to _imagine_ as well as how to perform when told exactly what to do, because there will be times when imagination is what is needed.

Actually, encouraging imagination is something that can be done as part of the above technologies. For example, before teaching them about multi-master in I2C (if you do that, at all) and before even letting them know about it, take a break and pose the problem to them for their own imagination. Ask them about how they might 'modify' the single master I2C in order to achieve a usable multi-master situation. Let them work in teams on the problem and then present their ideas shortly later to the class. Let them deal with objections, too. It would be a great segue into multi-master, but it would also be a lost opportunity for the students to develop their own imagination skills if you just rolled on immediately into the established multi-master solution without giving them the opportunity to think for themselves here.
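
And once they have presented their ideas, the established answer is small enough to show in a few lines (a bit-banged sketch with assumed board-specific helpers sda_drive()/sda_read()/scl_pulse(); purely illustrative, not any vendor's driver):

    /* Board-specific open-drain pin helpers, assumed provided elsewhere. */
    extern void sda_drive(int bit);  /* 0 = pull SDA low, 1 = release it */
    extern int  sda_read(void);      /* sample SDA                       */
    extern void scl_pulse(void);     /* one SCL high/low cycle           */

    /* Send one bit; detect lost arbitration on the wired-AND SDA line.
       If we release the line (send 1) but read it back low, another
       master is driving it and has won -- we must back off and retry. */
    int i2c_send_bit(int bit)
    {
        sda_drive(bit);
        scl_pulse();
        if (bit == 1 && sda_read() == 0)
            return -1;               /* arbitration lost */
        return 0;
    }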

This should underlie much of the teaching -- don't miss your opportunities to encourage imagination. I think this means breaking students into groups, in many cases, too.

And this gets back to why I would prefer to have ALL of their attention, all day and into the evening, for a concentrated situation.

At some point, not necessarily at first though, I think it would be very good to actually have the teams compete with each other, some of the time, and be given specific challenges with boundary conditions spelled out but also where they are completely free to come up with a solution, so long as those written boundaries are accounted for, and then given very short time scales to come up with a solution.

That's a question for you, not me.

I haven't ever done an embedded Linux thing. But there are some devices out there that use it. However, I suspect the businesses (I'm ignorant, so have no idea) playing in that game probably have their own specific ways of training new hires and unless you are willing to become a vassal of one or more of them, I'm not sure you can focus on that.

Best of luck and let me know where this might be taught.

Jon

Reply to
Jon Kirwan

That's exactly my problem. The latest USB-enabled chip needs the latest GCC toolchain. Updating tools is very time-consuming.

USB-RS232 is too limited in device control. For integrated USB micros, I use libusb very often.

I don't need to write Linux drivers too often. I have to use tools and other people's work. Not just Linux, but public software in general.

I would give the student an A (or even a check) if he/she can upgrade/update to the latest GCC 4.4 or 4.5 using GCC 3.4.

Reply to
linnix

Hi Mark,

I teach a pair of embedded systems courses. I believe both in covering the underlying fundamentals (especially interrupts and memory bus interfacing) and in covering higher-level design approaches. To me, whether you pick SPI or USB or I2C isn't as important as that they get exposure to something that smells like a serial port.

Rather than make a big long post, I'll direct you to the courses I teach. The home pages have a lecture-by-lecture list of exactly what we are teaching. Tastes can reasonably vary quite a bit, but in my case the content is informed by a number of years in industry before I switched to teaching.

Junior-level course:

formatting link
(gives specific section references in a course text)

Senior/MS-level course:
formatting link
(includes slides on-line)

Cheers,

-- Phil

Phil Koopman -- snipped-for-privacy@cmu.edu --

formatting link
Author of: Better Embedded System Software, Drumnadrochit Press 2010

Reply to
Phil

From the litany of topics that follows, it looks like you are trying (perhaps) to cover too much. And probably at a level (complexity) that many won't experience "in industry" (there are a lot of *small* embedded systems to which many of the issues below would never apply).

Are you wanting to address hardware or software? or both?

I would concentrate on what makes an embedded system *different* from a "desktop" system. Most of these issues are obvious -- once you *think* about them.

For example:

USER INTERFACE

Embedded systems tend to have different user interfaces than the traditional "monitor/keyboard/mouse". Thinking about how you *reliably* communicate with a user (who and *what* is your user??) requires an entirely different mindset than the "let's pop up a dialog box and ask him a question..." approach.

You may be confined to a simple keypad with a 7 segment display. Or, maybe a couple of buttons and an indicator light! :-/ Or, you may have hundreds of buttons, levers, knobs, etc.

*AND*, in most cases, *all* you have are those buttons, knobs, indicators, etc. -- i.e., *you* have to interface directly to them (instead of "receiving scan codes from a keyboard").
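
For example, debouncing a raw switch is usually the first shock (a minimal sketch, assuming a hypothetical memory-mapped BUTTONS port read from a 10 ms tick):

    #define BUTTONS (*(volatile unsigned char *)0x40002000u)  /* hypothetical port */

    /* Call from a 10 ms periodic tick; returns the debounced state. */
    unsigned char read_buttons_debounced(void)
    {
        static unsigned char state, count;
        unsigned char sample = BUTTONS;

        if (sample != state) {
            if (++count >= 4) {    /* stable for 4 ticks (~40 ms) */
                state = sample;
                count = 0;
            }
        } else {
            count = 0;             /* bounce: restart the stability count */
        }
        return state;
    }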

You also have to consider how you can convey "errors" (on your part *or* the user's) to the user with these particular mechanisms. And, how the user can remedy those errors without resorting to aboriginal rain dances, etc.

FIELD INTERFACE

Embedded systems usually interface to other physical devices that aren't common in a desktop environment. In addition to "real" switches, you might have to monitor temperatures, pressures, rates, etc. I.e., *you* have to acquire data instead of expecting the user to "type in a value". So, you need to be conversant in various measurement techniques, understand sources of error, unit conversion factors, etc.
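
For example (all numbers hypothetical, just to show the shape of the problem): turning raw ADC counts into engineering units forces students to confront references, offsets and scaling:

    /* Assume a 12-bit ADC with a 3.3 V reference, and a sensor that
       outputs 500 mV at 0 C with 10 mV per degree C. Integer math,
       since there may be no FPU. */
    long adc_to_millidegrees_c(unsigned int counts)
    {
        long mv = ((long)counts * 3300L) / 4095L;  /* counts -> millivolts */
        return (mv - 500L) * 100L;                 /* 10 mV/C -> millidegrees C */
    }

What is the quantization step in degrees? How much does a 1% reference error move the answer? Those questions are the real lesson.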

You may need to interface to mechanisms and understand how they behave, physically. So, for example, you can't expect a motor to be "up to speed" instantly after you have applied power to it.

Often, you have to implement control loops. So, you need to understand classic control theory so you can recognize the issues that are important in controlling a mechanism or process -- and the techniques that you can use to overcome the issues that *will* arise ("Gee, the mechanism overshot its intended stopping point...")
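
Even a bare-bones PI loop (a generic sketch, with illustrative gains) is enough to demonstrate overshoot, windup and the effect of sample rate:

    /* One step of a discrete PI controller; call at a fixed sample rate. */
    float pi_step(float setpoint, float measured)
    {
        static float integral;
        const float kp = 0.8f, ki = 0.1f;   /* illustrative gains only */
        float error = setpoint - measured;

        integral += error;
        if (integral >  100.0f) integral =  100.0f;  /* anti-windup clamp */
        if (integral < -100.0f) integral = -100.0f;

        return kp * error + ki * integral;  /* actuator command */
    }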

COMMUNICATIONS

Rather than focus on the mechanics of particular communication technologies, think about the *content* of exchanges over those media. What do you want to say? How can you be assured that the other party understands the intent of your message (e.g., in the presence of noise)? How can you be sure *you* understand what is being "said" to you? (ECC's, CRC's, protocol design, etc.). How can you be sure the other party is "still there"??
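
For example, even a bitwise CRC (CRC-16/CCITT here, as a generic sketch) gives students something concrete to hang the "how do I know the message is intact?" discussion on:

    /* Bitwise CRC-16/CCITT: polynomial 0x1021, initial value 0xFFFF. */
    unsigned short crc16_ccitt(const unsigned char *p, unsigned int len)
    {
        unsigned short crc = 0xFFFF;
        while (len--) {
            crc ^= (unsigned short)(*p++) << 8;
            for (int i = 0; i < 8; i++)
                crc = (crc & 0x8000) ? (unsigned short)((crc << 1) ^ 0x1021)
                                     : (unsigned short)(crc << 1);
        }
        return crc;
    }

Then corrupt one bit on the wire and let them see it caught -- and discuss what a CRC *doesn't* buy you (authentication, liveness of the other party).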

RELIABILITY

Embedded systems are often throwaway, commodity items (e.g., mice). But, other times, they are expensive, one-of-a-kind devices. Or, are part of much larger *systems* (which can be large, expensive, etc.). Many times, they can have direct or indirect health and safety issues (e.g., the ABS on your vehicle; or the control system that meters insulin into a diabetic). The consequences of "bugs" can be *considerable* -- in terms of lives and treasure. And, the cost of repairing a bug can be astronomical -- many times more than the cost of the product *or* the development effort itself! (I'll be amused if Toyota "discovers" a bug in their engine control system... "oops!")

[Testing is a whole course in itself]

You can't just shrug/curse and give the three finger salute.

So, you have to be able to think about what *can* go wrong. Even the things that you think *can't*! And, how they will affect the performance of your code and product. (e.g., what happens if you *can't* write to the FLASH??)

COST / PERFORMANCE

Many embedded systems are not produced in the huge volumes that are common with desktop machines (though some are produced in even *bigger* volumes!). So, things that seem inexpensive in the desktop world can be prohibitively expensive in the embedded world. (E.g., you can buy a 17" LCD monitor in QTY 1 for about the same price as a 3" QVGA *display* -- which still needs electronics to drive it! -- in the same quantity.)

Most embedded systems have fixed resource complements. You can't just "add more memory" or "a bigger disk". And, being cost constrained, you can't overspecify your resource requirements to "play it safe". So, you have to learn how to come up with *good* estimates given some "nebulous" design goal (from the folks in marketing). And, how to use tools to refine those estimates before you have to commit to them.

Sorry, the list goes on but I haven't the time to try to enumerate all of the issues. My point is: avoid the petty technology issues. Those are easy to pick up. Instead, focus on the philosophical/environmental aspects of embedded system designs. Otherwise, you're just creating more "coders" without the skills to know *what* and *how* they should "code".

HTH,

--don

Reply to
D Yuniskis

Since I taught a course like this in the mid 80's, small processors have added DMA interfaces for serial peripherals. DMA is very useful for some things and should be discussed.

TIMERS TIMERS TIMERS (and how they can be used for PWM)

Mark Borgerson

Reply to
Mark Borgerson

What you are doing seems extremely important, because embedded systems, despite being widely developed and used, desperately lack good education. Below I'll try to back up my comments with real problems that I saw in many projects at different companies (and of different scales!).

This is rarely really interesting, unless you write your own operating system or at least a generic task switcher. Most embedded systems either include an OS already or follow a single-task + interrupts design.

Writing in assembly is rarely needed at all, but it is extremely useful for understanding and "feeling" the architecture. This isn't covered well in academia, so this is a great step for programmers to get in touch with at least one architecture. Moreover, it is enough for your students to get into one particular architecture in order to know how to approach a different one later.

Regarding interrupts - there are some common issues that have to be covered:

  • edge or level triggered interrupts
  • shared / daisy chained interrupts
  • automatic acknowledgement by hardware (part of the bus interface) or explicit software acknowledge needed (the device will keep interrupting until ack'ed) -- see the sketch after this list
  • interrupt controller design: masking, priorities, in-service/EOI, etc. You may use one of the standard IC chips or design your own simple one, but it has to exist on the test system so that students can play with it and feel what happens "here" and "then"
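
On the explicit-acknowledge point, a dispatch sketch like this (hypothetical register names; only the shape matters) shows why a forgotten ack turns into an interrupt storm:

    /* Hypothetical device registers. */
    #define DEV_STATUS (*(volatile unsigned int *)0x40003000u)
    #define DEV_ACK    (*(volatile unsigned int *)0x40003004u)
    #define IRQ_RX     (1u << 0)

    void device_isr(void)
    {
        unsigned int pending = DEV_STATUS;  /* identify the source(s) */
        if (pending & IRQ_RX) {
            /* ... service the receive event ... */
            DEV_ACK = IRQ_RX;  /* omit this and the device interrupts forever */
        }
    }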

Very important. Also describe some common mistakes, like one that I once had to figure out from software and docs: a Spansion NOR flash was connected to a 440EP PPC via one of the chip selects, and its "ready" signal was connected to the CPU's peripheral-polling pin ("leg")... the problem was that, when polling, the CPU's peripheral controller allowed the device to be "not ready" for *max.* ~3 ms, while the flash chip was not ready for *at least* 300-400 ms when erasing a block... so the system panic()'ed with "machine check" by design! :-)

Funnily, the older (and smaller) AMD NOR flash, while otherwise compatible, didn't cause the "machine check exception". The reason was... that the older flash chip just didn't have a "ready" leg.

I think that you need to separate pure hardware and pure software issues and decide which side your class will be on. It can't be both. Embedded engineers are either electrical engineers with "some knowledge of software", or software engineers with "some knowledge of hardware". Hw people will not find anything new in the topic, and sw people will never need it. Maybe they should be two different courses -- "embedded for EE" and "embedded for SW".

Yes, in many systems it is necessary to bit-bang, so such control should be taught. Maybe JTAG software control should also be there (as it is commonly used for flash programming).
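
For example, a bit-banged SPI transfer (mode 0, with assumed board-specific pin helpers) is only a dozen lines, and writing it teaches more than configuring any number of peripheral registers:

    /* Board-specific pin helpers, assumed provided elsewhere. */
    extern void mosi_set(int bit);
    extern int  miso_get(void);
    extern void sck_set(int level);

    /* Clock one byte out and in, SPI mode 0, MSB first. */
    unsigned char spi_xfer(unsigned char out)
    {
        unsigned char in = 0;
        for (int i = 7; i >= 0; i--) {
            mosi_set((out >> i) & 1);   /* data must be valid before the rising edge */
            sck_set(1);
            in = (unsigned char)((in << 1) | miso_get());  /* sample on the rising edge */
            sck_set(0);
        }
        return in;
    }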

USB is complex enough to be the subject of a separate course. But I would suggest covering standard UART design and communication in detail. This will also make for a good communications intro.

IMO, this is subject to the same separation between hw and sw.

I never even participated in a project in which somebody used them. But that's more of a side-note and a "no comment" from my side. Watch how many responses substantially discuss them.

A good idea. The students need to write something working from scratch, and Linux drivers are a good choice due to OS source availability, good informational coverage, and extreme popularity in embedded projects nowadays. I would also suggest customizing a Linux BSP for the platform, so that people get a feel for a "live" system's needs.

I would also suggest covering several topics that nearly any embedded project will need:

  • debugging (serial interface, JTAG, software emulators, etc.). Use of LEDs for debugging in bootstrap / system-critical code (a minimal sketch follows this list)
  • communications - generic Ethernet rings design and TCP/IP stack, application-level protocols availability etc.
  • upgrades, version management, configuration interface etc. (flash / persistent storage interfacing). Use filesystem or design own simplified management?
  • evaluation of software availability (OS available? or too "thick" for the design? TCP/IP stack available? bootstrap / CPU example assembly code available? etc.)
  • risk estimation. This is never covered realistically. Most failed embedded projects fail not because the design was not workable, but because different vendors provide extremely bad support (see "software availability"), are not willing to fix bugs or even cooperate in diagnosing them (they are more interested in "shutting you up" so that nobody else will know), and make bluntly false claims about their products (they will easily lie to you that something works when it doesn't, only to generate sales). Overall there is amazingly bad quality in "dedicated" embedded software. Silent recognition of this fact has led to Linux taking over the embedded OS field, despite it not being RT and never even having been designed as a dedicated embedded system. Several well-known companies are now promoting their custom embedded Linux distros even ahead of their own core products.
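
On the LED point: the classic fallback when nothing else is alive yet is a blink code (a minimal sketch, assuming a hypothetical LED output register and nothing else working):

    #define LED (*(volatile unsigned int *)0x40004000u)  /* hypothetical GPIO */

    static void delay(void)
    {
        for (volatile long i = 0; i < 100000L; i++)
            ;                                  /* crude calibrate-by-eye busy-wait */
    }

    /* Blink an error code forever; needs only a working GPIO and stack. */
    void die_blinking(int code)
    {
        for (;;) {
            for (int i = 0; i < code; i++) {
                LED = 1; delay();
                LED = 0; delay();
            }
            delay(); delay(); delay();         /* long gap between bursts */
        }
    }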

Good luck! Daniel

Reply to
Stargazer

Combine event-driven programming with interrupts; focus on the objectives, not the details of the solutions. For example: an event happens, interrupts the current execution, and context must be saved. Run-to-completion probably doesn't need context to be saved.
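
The objective is easy to show in code (a generic sketch): the ISR only posts an event, and the run-to-completion dispatcher never needs to save context beyond what the hardware already stacked:

    #define QSIZE 16u
    static volatile unsigned char queue[QSIZE];
    static volatile unsigned int head, tail;   /* head: ISR only; tail: main only */

    void some_isr(void)             /* posts the event, never processes it */
    {
        queue[head % QSIZE] = 42;   /* 42: a hypothetical event code */
        head++;                     /* no overflow check -- it's a sketch */
    }

    void main_loop(void)            /* run-to-completion dispatcher */
    {
        for (;;) {
            while (tail != head) {
                unsigned char ev = queue[tail % QSIZE];
                tail++;
                /* handle ev here: each handler runs to completion */
                (void)ev;
            }
            /* idle / sleep until the next interrupt */
        }
    }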

Communication principles. Add LINbus and possibly CAN to the list; LINbus is showing up in many of our customer applications.

Arduino has some distinct teaching advantages by providing some quick systems experience. It has the ability to show the big picture without overwhelming implementation details.

Hosted vs non hosted embedded systems.

My own list of important topics includes:

Reliability. (Also mentioned in Don Yuniskis' list, which has several good ideas.) Principles of reliability: start with basic reliability math and terminology, and work towards applying the principles. MISRA has some good ideas on this.

Timers, Timers, Timers. The core of many non-hosted embedded systems. Event scheduling, PWM (PWM tone generation followed by DTMF with a PWM makes a good lab).
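
The heart of that lab fits in one ISR (a sketch, with a hypothetical 8-bit PWM duty register and an assumed 8 kHz timer interrupt); DTMF is then just two of these phase accumulators with their outputs summed:

    #define PWM_DUTY (*(volatile unsigned char *)0x40005000u)  /* hypothetical */

    static const unsigned char sine[16] = {   /* one cycle, 8-bit, offset 128 */
        128,177,218,245,255,245,218,177,128,79,38,11,1,11,38,79
    };

    /* 16.16 fixed-point phase accumulator.
       step = freq * 16 * 65536 / 8000; 91357 gives ~697 Hz (a DTMF row tone). */
    static unsigned long phase, step = 91357UL;

    void timer_isr(void)                      /* runs at the 8 kHz sample rate */
    {
        phase += step;
        PWM_DUTY = sine[(phase >> 16) & 15u]; /* PWM duty cycle acts as a DAC */
    }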

Multiprocessor applications. Interprocessor protocols. Load sharing. Considerations for logically dividing the application. Heterogeneous and homogeneous systems. Interprocessor calls, shared variables. Possibly use IEC 61131 and IEC 61499 as references. Multiprocessor solutions have been here for 20+ years and need to be formally covered.

Software issues. Fixed-point math: one tight lecture on the principles of fractional math, basic operations and conversions. Trading precision for dynamic range.
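
The whole trade-off can be hung on one function (a generic Q15 sketch, not tied to any compiler):

    /* Q15 fractional multiply: operands are in [-1, 1) scaled by 2^15. */
    short q15_mul(short a, short b)
    {
        long p = (long)a * b;      /* full 32-bit product, Q30 */
        return (short)(p >> 15);   /* back to Q15; truncation loses 15 bits */
    }

    /* Example: 0.5 * 0.25 -> q15_mul(16384, 8192) == 4096, i.e. 0.125. */

The bits thrown away by that shift, and what saturation should have done on overflow, are exactly the precision-versus-dynamic-range lecture.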

Been there, taught that.

Regards,

Walter..

-- Walter Banks Byte Craft Limited

formatting link

Reply to
Walter Banks

It's probably important to give them enough information so they understand what causes problems and when they might encounter them when doing a design.

And let them know where they can get more details if required.

Nial.

Reply to
Nial Stewart

Wow, I knew I'd get a lot of feedback, but I didn't expect that much from so many folks... A lot of you have asked some really good questions, mostly asking for more context, so I'll provide at least some of that in one go.

We've been using a PowerPC/FPGA based system (standard PPC and FPGA board with a custom interconnect) for about a decade and it's clearly time to make a change. We teach one pure embedded systems course which has as pre-reqs a digital logic class, a computer organization class (very strong coverage but uses a very simplified ISA), and one and a half programming classes. We also have two specialized embedded classes (controls and DSP) which are more focused on domain specific issues (Matlab, making fast filters, computer vision, etc.)

Our current set-up for our pure embedded systems class is that the students spend a lot of time in lab designing I/O devices on the FPGA and writing code that uses those devices on the PPC. The major topics in lab are:

  • Introduction to "real" assembly and byte/word issues (needed due to the nature of our organization class)
  • Memory-mapped I/O from both the hardware and software side
  • Memory systems (fairly basic) (The above are all done in assembly on the PPC side; after this it's a mix)
  • Using C and assembly together, ABI stuff
  • Interrupts (edge/level, saving state, debugging; nesting, "synchronous" interrupts (aka faults) vs. asynchronous interrupts (aka external interrupts), and the whole "why we need interrupts" thing are really only in the lecture)
  • Timers (partly to introduce timers, partly for PWM, partly for more interrupt practice, partly for on-chip memory-mapped I/O practice)
  • A2Ds (basic theory and use)

The students then do a project of their own choosing in groups; these are usually fairly impressive (say 100 hours/student put into them on average).

We are go

Thanks again,
Mark

Reply to
Mark Brehob

Snip..... 8<

Nobody seems to have mentioned watchdogs...

Glyn

Reply to
Glyn

Or reset circuits, since you bring that up.

Jon

Reply to
Jon Kirwan

Apart from high-radiation environments, who needs watchdogs?

Reply to
Paul Keinanen

Indirectly I did, with timers. Watchdogs have not been all that useful. As processors get larger, code is now often transferred on reset from slow ROM to fast RAM to be executed. The reset times can be very large, and other reliability strategies need to be used.

Alternatively, watchdogs are used to trigger events that stabilize the system in some safe mode. The additional application complexity may not be worth it.

Regards,

Walter..

-- Walter Banks Byte Craft Limited

formatting link

Reply to
Walter Banks

Agreed.

Starting out, a long time ago before I learned how to do EMC, I used watchdogs all the time. In desperation I even had one system with a free-running 555 connected to the reset pin. They still crashed occasionally, or worse!

Now that I know how to protect against ESD and line transients, I don't bother with watchdogs anymore. And they don't crash.

--

John Devereux
Reply to
John Devereux

As with most things, this depends entirely on the application environment.

When designing *gaming* systems, you have to contend with malicious users deliberately *trying* to subvert your system.

It is almost impossible to protect a *running* system from these kinds of threats.

I have found the opposite tactic to be more effective:

*cause* the system to deliberately reset when attacked (not so the code will safely restart but, rather, so the attacker will forfeit any "winnings" he has accrued).

Amazing how quickly they stop using this means of subversion.

:>

Reply to
D Yuniskis


Given this has quite a high level of pre-requisites and is quite focused, an obvious question is: WHY do you need to change?

Is it that the particular devices and development boards are hard to source?

The PowerPC is still actively used, and sourced from Freescale, ST, and Allied?

If you are intending to make a large change, I'd flip the problem on its head and FIRST look for well-designed development/project platforms.

As an example of doing this, the ST Primer2 development platform has to appeal to students.

formatting link

-jg

Reply to
-jg

Not every system needs one. However, you are right, no one did mention watchdogs.

I tend to use a simple pulse-maintained-relay circuit. The relay is energised within a charge-pump circuit; any single component failure would prevent the relay from remaining energised. The processor output you drive this from has to toggle at a steady rate, and I tend to make that toggle happen as a result of regular system sanity checks.
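
In software, the pattern amounts to this (a sketch with a hypothetical WD_PIN output; the point is that the toggle is earned by the sanity checks, never free-running):

    #define WD_PIN (*(volatile unsigned int *)0x40006000u)  /* hypothetical GPIO */

    extern int sanity_checks_pass(void);  /* stacks, queues, task heartbeats... */

    void main_loop(void)
    {
        static unsigned int level;
        for (;;) {
            /* ... normal application work ... */
            if (sanity_checks_pass())
                WD_PIN = (level ^= 1u);   /* toggle only while the system is sane */
            /* no toggle -> charge pump decays -> relay drops out */
        }
    }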

Still, the topic is a good one for the OP to add to the list.

--
********************************************************************
Paul E. Bennett...............
Forth based HIDECS Consultancy
Mob: +44 (0)7811-639972
Tel: +44 (0)1235-510979
Going Forth Safely ..... EBA. www.electric-boat-association.org.uk..
********************************************************************
Reply to
Paul E. Bennett
