Hi. How much do you embedded guys need to know about analog circuit design? I asked one guy and he said most embedded engineers know little about analog circuitry. Usually they look at application notes on how to build sample circuits. Is this true?
Also, what real-time OS do most of you use? Is it mostly used on single board computers?
Finally, why are there so many different vendors selling single board computers? Do they all custom-build their stuff? Is there no standard?
Depends on what you mean by "single board computers". Almost all the stuff I've worked on for the past 25 years has been on custom boards. Some single-board, some multi-board. One project used a COTS micro-ATX board.
I presume because people are buying them.
You'd have to ask them.
There are tons of standards. That's the great thing about standards: there are so many to choose from. [Apologies to whomever I stole that line from.]
--
Grant Edwards                  Yow! What PROGRAM are they watching?
zalzon wrote in news: snipped-for-privacy@zalll.com:
Depends on what you are trying to do. If it's digital logic only, then you need to know enough to understand what the app notes are talking about in an analog context, because ultimately anything digital must make its way around an analog circuit, i.e. wires that have capacitance, inductance, resistance and so on, or via radio, which has its own issues.
Well, it's true of me; that's why I'm currently doing an electronics degree, so that I can understand more about what's going on. Although having said that, I have not had much problem so far, because analog devices have app notes and example circuits.
I use Windows 2000 on a single board computer; it has a PII 260 chip installed. But I'm sure that's not what you were talking about.
This embedded guy knows a fair amount of analog design, and it's important for what I do - but the necessity of knowing analog depends on the task. Pretty much all embedded projects need some analog design. In larger projects, there may be people who focus purely on the embedded software, while others deal with analog circuitry.
App notes are handy to get started, but relying on them is not a good idea; they are often of marginal quality, and sometimes flat-out bad.
When I've used an OS, I've generally used Windows as a reasonably real-time system. Since Windows is not hard real time, I also use circuitry as appropriate to ensure that things are handled on time.
Note that many small embedded systems need no OS.
I don't know why there are so many folks competing, but I like it! There are some standards, such as the PC/104 form factor, and so forth. Look through catalogs, you'll figger it out.
It is important for embedded systems designers to understand analogue circuitry in quite some detail in order to develop the most appropriate interface to the real world. The real world is a rough, harsh place and knowing how to provide limiting, protection clamping, frequency and spike filtering and matching highly reactive transmission lines is a necessary part of getting a well behaved embedded system. The analogue topic is probably more significant than the digital hardware aspects.
Forth, with either round-robin or pre-emptive scheduling (I have used both simultaneously in a few projects).
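For anyone who hasn't seen one, the round-robin idea can be sketched in a few lines of C (illustrative only; the task names are made up, and the Forth internals Paul describes would look quite different):

/* Minimal cooperative round-robin scheduler sketch.  Each task
   runs to completion, so each must return quickly. */

typedef void (*task_fn)(void);

static void task_poll_adc(void)  { /* read sensor, queue result */ }
static void task_update_ui(void) { /* refresh display           */ }
static void task_comms(void)     { /* service the serial port   */ }

static task_fn tasks[] = { task_poll_adc, task_update_ui, task_comms };
#define NUM_TASKS (sizeof tasks / sizeof tasks[0])

int main(void)
{
    unsigned i = 0;
    for (;;) {                  /* round robin: each task gets a turn */
        tasks[i]();
        i = (i + 1) % NUM_TASKS;
    }
}

A pre-emptive scheduler needs per-task stacks and a context switch, which is one reason many small systems stick with a cooperative loop like the above.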
Look at the details of the cards and you find that the resources available on each one vary quite markedly. It is nice having a choice of architectures and I/O resources. The systems engineers selecting for the application domains they deal in should be the only ones to make the rationalisation choices.
--
Paul E. Bennett
IMHO this is like asking whether there are a lot of big guys here or small ones. There are all sorts of people out there.
You must understand that the embedded market is often highly price-driven. So the OS as well as the hardware are often chosen to match the task "best". It obviously depends on how the individual(s) taking the decision define "best", but still.
There can't be a standard. There are just way too many, too different requirements and applications. A standard would mean huge overkill, demanding 1000% more hardware than needed in one case, and would be too restrictive in another. There may be classes of standards or some such in the future, but luckily there is still a lot of new development in this field.
Not quite. When you design hardware, any hardware, it is necessary to understand analog techniques. The faster the micro runs, the more so. Otherwise something either doesn't work as expected or the EMC test results come as a huge disappointment.
App notes are fine. However, they rarely tell you about the intricacies of layout, decoupling, and other things that are often considered miscellaneous (but which aren't). Then there is grounding. Here app notes can point in the wrong direction.
I never used an OS on embedded stuff myself. Apart from a very little programming in C, it was all assembler. Oh, one exception: we did use QNX on one larger project, but I was not the guy who programmed it. We found that OS to be extremely robust, and it had quite predictable interrupt service times.
Perhaps not "most" but certainly too many. These are apparently the embedded people who come out of computer science. There are still people like me who have an electronics/EE background (both digital and analog) who ended up working with embedded processors, and can do more of the project (sorry, I don't do marketing).
As someone else said, this is always a good idea, regardless of one's knowledge level. But still, it helps to have an analog background. I attended National's most recent Bob Pease tour, and one section was on National's op-amps; ISTR some slides (at least in the book, they may have just mentioned them in the presentation) on op-amp configurations and gain calculations. To knowledgeable analog guys, that's about equivalent to a NAND gate's truth table.
Most of my designs have used a "heartbeat" interrupt as well as one or two other interrupts, and I hang many of the tasks on the heartbeat interrupt. I've written all the code the processor runs. I'm sure I have a big hole in my resume from lack of RTOS experience.
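Roughly that style, as a C sketch (the 1 ms rate and the task bodies are made up, and hooking the ISR to the timer is chip-specific):

/* "Heartbeat" pattern: a periodic timer interrupt counts ticks,
   and the main loop hangs tasks off it at divided-down rates. */

#include <stdint.h>

static volatile uint32_t g_ticks;      /* bumped by the timer ISR */

void heartbeat_isr(void)               /* fires every 1 ms, say */
{
    g_ticks++;
}

int main(void)
{
    uint32_t last = 0;
    /* ... chip-specific timer and interrupt setup omitted ... */
    for (;;) {
        uint32_t now = g_ticks;        /* NB: on 8-bit CPUs this 32-bit
                                          read needs an interrupt lock */
        if (now != last) {
            last = now;
            if (now % 10 == 0)  { /* 10 ms task: scan keypad     */ }
            if (now % 100 == 0) { /* 100 ms task: update display */ }
        }
    }
}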
You may have also noticed a wide range of prices, processor power, electrical power the board takes, various peripherals, and other features. You can prototype a wristwatch with an MSP430 or Atmel "Butterfly" board, but it probably won't run Linux. There are a huge number of different applications under the word "embedded", and the vendors are just selling what people buy.
There are a "few" standards, but lots of stuff is "non-standard."
I was quite horrified by the intensity with which some people in this newsgroup defended the use of watchdog timers. While WDTs may be acceptable in some cases (e.g. in high-radiation space operations), the use of WDTs to cover up inadequate hardware EMC design is not.
You do know that real-world EMC can differ from the test cases? And that a WDT can be used to ensure the safety of a device, not to cover a programmer's ass?
Even if SW and HW are properly designed, you can't catch all (multiple) failure cases. Why not use a WDT as a last line of defense? What is it that horrifies you?
IMHO, even a device that uses a WDT to cover a programmer's ass is, though surely crap, better than a device that trusts its environment. Trust reminds me of foolproof devices, for which there is always a fool ready to prove that nothing can be foolproof.
Andreas
--
I thought I was wrong once, but I was mistaken.
(unknown)
When dealing with System Safety or Mission Critical aspects of a project's design the WDT is very useful as a last line of defense for those situations where unexpected upsets occur. I agree that they should not be used to cover deficiencies in programming skills. In many systems I even supplement with a PMR circuit which is guaranteed to disable high risk outputs if just one component fails.
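One common way to keep the WDT honest, so a single hung task can't be papered over, is to gate the kick on every task checking in. A rough C sketch (wdt_refresh() is a stand-in for whatever register write the real part wants; the task list is made up):

/* "Supervised kick": refresh the hardware watchdog only when every
   monitored task has checked in since the last kick.  One stuck
   task stalls the kick and lets the watchdog force a reset. */

#include <stdint.h>

extern void wdt_refresh(void);          /* hardware-specific hook */

#define TASK_COMMS   (1u << 0)
#define TASK_CONTROL (1u << 1)
#define TASK_SENSORS (1u << 2)
#define ALL_TASKS    (TASK_COMMS | TASK_CONTROL | TASK_SENSORS)

static volatile uint8_t g_alive;        /* bits set by the tasks */

void task_checkin(uint8_t task_bit)     /* each task calls this */
{
    g_alive |= task_bit;
}

void wdt_service(void)                  /* called from the main loop */
{
    if (g_alive == ALL_TASKS) {
        g_alive = 0;
        wdt_refresh();
    }
    /* else: withhold the kick and let the watchdog bite */
}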
Jack Ganssle has some useful info on WDTs (see [link]).
--
Paul E. Bennett
Getting back to the original poster's question: it's necessary for SOMEONE to understand these things, not necessarily the person writing the code. Many companies pigeon-hole people into their job titles, and have "software" people and "hardware" people. I've often had the title "software engineer" even though I designed hardware as well (at SOME of my "software engineer" jobs).
For some reason this became a big polarizing, black-and-white issue. I have no problem with WDTs used as an addition to a well-designed, well-tested, reasonably reliable system.
That (and/or using WDT's to cover up inadequate software as well) was/is the polarizing issue.
I have no argument, except that if the WDT fires off on more than very rare occasions, it tells me there's some problem that needs attention, whether it's code, electromagnetic susceptibility, or what.
At least one poster WAS claiming to use a WDT for more than the "last line of defense" and to "save" a product with flaky/defective software design. A WDT goes off because something bad happened. It WILL save the system from many bad things, but not everything bad that happens will end up with the WDT going off.
It's almost essential to know when a WDT fires off. For testing, and perhaps for production as well, states should be saved both in RAM and non-volatile memory so that after a reset the state of the device (including normal powerdown) can be determined. If there's a problem, the thing should "phone home" (when feasible and appropriate) with as much status info as can reasonably be transmitted. Just blindly adding a WDT to a design may cause it to "work longer" so that a serious problem isn't discovered until there are many units in the field, rather than earlier in testing when the cost to fix it is much less.
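A sketch of that state-saving idea in C (the section attribute shown is GCC-flavored and varies by toolchain, the linker script must provide a .noinit region, and the names are illustrative):

/* Distinguish a watchdog/crash restart from a cold boot using a
   magic value in RAM that the startup code does not zero. */

#include <stdint.h>

#define BOOT_MAGIC 0xC0FFEEu

__attribute__((section(".noinit"))) static uint32_t g_boot_magic;
__attribute__((section(".noinit"))) static uint32_t g_last_state;

void record_state(uint32_t state)       /* call as the program runs */
{
    g_last_state = state;
}

void boot_check(void)                   /* call early in main() */
{
    if (g_boot_magic == BOOT_MAGIC) {
        /* Warm restart: g_last_state says where we were when the
           watchdog (or a crash) hit.  Log it to non-volatile memory
           and/or "phone home" here. */
    } else {
        /* Cold boot: RAM held garbage, so initialize the markers. */
        g_boot_magic = BOOT_MAGIC;
        g_last_state = 0;
    }
}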
When there is no human around to counteract the inevitable application of Murphy's Law, the WDT can serve as a substitute. Normally it should never fire.
--
Chuck F (cbfalconer@yahoo.com) (cbfalconer@worldnet.att.net)
Available for consulting/temporary embedded and systems.
Re-reading it now, THAT is the article that caused a lot of fuss over WDT's. I finally read through the whole thing, and now my overall impression of it is a lot better than my first reaction in this thread:
[link]
or:
[link]
It's a good article, but I still don't like the statement "Well-designed watchdog timers fire off a lot."
I was quite horrified by the intensity with which some people in this newsgroup defended the use of seatbelts and airbags. While SBs & ABs may be acceptable in some cases (e.g. when a drunk driver plows into you), the use of SBs & ABs to cover up inadequate brakes and poor driving skills is not.
I was quite horrified by the intensity with which some people in this newsgroup defended the use of lifeboats. While LBs may be acceptable in some cases (e.g. if the Japanese attack Pearl Harbor again), the use of LBs to cover up inadequate boat maintenance or disregard for bad weather is not.
I was quite horrified by the intensity with which some people in this newsgroup defended the use of parachutes. While Ps may be acceptable in some cases (e.g. if another plane hits you), the use of Ps to cover up inadequate aircraft design is not.
I was quite horrified by the intensity with which some people in this newsgroup defended the 911 telephone system....
--
Guy Macon, Electronics Engineer & Project Manager for hire.
In this situation, hopefully the actual WDT hardware never malfunctions in a way that would pull the processor reset pin; otherwise the reliability of the whole product will be degraded by the use of the WDT :-).