Test driven development?

You can read about how I used testing in a microcontroller-based project here:

formatting link

--
Guy Macon
Reply to
Guy Macon

As the owner of a company that does validation and certification testing for software used in safety-critical applications, I'd say all software development, embedded and otherwise, should be test driven.

It never ceases to amaze me how low the quality of the "finished" software I see is. And this is software headed for products where failure could directly result in loss of life.

Far too many developers limit their testing to ideal cases at the end of development and give no consideration at all to trying anything else.

I strongly encourage all developers to give testing the level of attention it deserves. Even in non-critical applications failures are costly, both to your products and your reputation.

--
Scott
Validated Software Corp.
Reply to
Not Really Me

I'm curious: what do people think of test driven development when applied to embedded (read: small microprocessor and microcontroller) systems?

Mike Harding

Reply to
Mike Harding

On a couple of projects I've worked on recently we developed the testbed during the design phase and then tested the design with minimal/cobbled-together code before getting into implementation. Projects hit reliability, time and cost estimates very nicely.

Three years ago, for a new project for a networking box starting from a blank sheet, we had the team follow through the "extreme programming" method as they decided on their methodology for the project. The team had a couple of experienced C/C++ embedded controller guys, an experienced big system OO/java guy and a new grad postdoc. When we reviewed the project the general comment was "can't see what all the fuss is about - most of it is what we were going to do anyway but described differently".

Stephen

Reply to
Steve Maudsley

What do you mean by test driven development? Do you mean developing in target and fixing all problems as you move forward (bad), or do you mean designing test features in to facilitate testing at the end of the development and integration phases (good)?

Reply to
Ben Midgley

Sorry guys, I didn't make my question clear enough, I was not referring to the practice of testing as such but more to the methodology of Extreme Programming and, particularly, the Unit Testing aspect of same.

formatting link

Mike Harding

Reply to
Mike Harding

We do that in Forth all the time.

--
********************************************************************
Paul E. Bennett ....................
Forth based HIDECS Consultancy .....
Mob: +44 (0)7811-639972 .........NOW AVAILABLE:- HIDECS COURSE......
Tel: +44 (0)1235-811095 .... see http://www.feabhas.com for details.
Going Forth Safely ..... EBA. www.electric-boat-association.org.uk..
********************************************************************
Reply to
Paul E. Bennett

If you had gone to the link provided and read the introduction to the subject matter on that web-site you would have known what the basis of TDD is. See Mike's link (reproduced below).

formatting link

--
********************************************************************
Paul E. Bennett ....................
Forth based HIDECS Consultancy .....
Mob: +44 (0)7811-639972 .........NOW AVAILABLE:- HIDECS COURSE......
Tel: +44 (0)1235-811095 .... see http://www.feabhas.com for details.
Going Forth Safely ..... EBA. www.electric-boat-association.org.uk..
********************************************************************
Reply to
Paul E. Bennett

It seemed quite clear to me. I must admit that I only read the introductory text at the link provided and didn't venture further into the 12 or so links on that page.

With targets that are connected to the development system via a communications port it is very easy to build test scripts for regression testing (just capture the echo as you manually enter the first test on the initial code). Best where development is incremental.
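
The whole regression harness need not amount to much. As a rough host-side sketch (in C purely for illustration, as mine would be in Forth; the file names and device path are placeholders, and the serial port is assumed to have been configured for baud rate and line discipline beforehand), it is little more than replaying the captured commands and diffing the echo:

/* Replay previously captured commands to the target and compare the
   echoed replies against the "golden" capture, one line per exchange.
   All names below are illustrative only. */

#include <stdio.h>
#include <string.h>

int main(void)
{
    FILE *cmds   = fopen("tests/commands.txt", "r");   /* lines sent to the target */
    FILE *golden = fopen("tests/golden.txt", "r");      /* captured known-good echo */
    FILE *port   = fopen("/dev/ttyS0", "r+");           /* target's serial port     */
    char cmd[128], want[128], got[128];
    int failures = 0;

    if (!cmds || !golden || !port) {
        fprintf(stderr, "could not open test files or serial port\n");
        return 1;
    }

    while (fgets(cmd, sizeof cmd, cmds) && fgets(want, sizeof want, golden)) {
        fputs(cmd, port);                      /* replay the command          */
        fflush(port);
        if (!fgets(got, sizeof got, port))     /* read back the target's echo */
            break;
        if (strcmp(got, want) != 0) {          /* diff against the capture    */
            printf("FAIL: %s  expected: %s  got: %s", cmd, want, got);
            failures++;
        }
    }
    printf("%d failure(s)\n", failures);
    return failures != 0;
}

Each time a new increment of code goes onto the target the same capture is replayed, so a regression shows up as a mismatch rather than being discovered by accident later.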

Unlike you I am not so afraid of concurrent development and coding. However, I agree that prototype code should not migrate to becoming production code (as production code should be created direct from a properly organised technical specification). Incremental code development (as required in the TDD method) means that you concentrate on one component at a time and fully develop the production quality code and test tools before you move onto the next component. Sounds like extremely good advice to me.

I know that I am accused of sticking with Forth to the exclusion of all else but that is not really the case. I do read quite a lot of the available material outside of the Forth arena and will admit to stealing any technique that makes sense in a Forth context. I have been doing that for more than 30 years now and can provide an honest certificate of conformity for any of my systems. None of my systems are cheap though.
--
********************************************************************
Paul E. Bennett ....................
Forth based HIDECS Consultancy .....
Mob: +44 (0)7811-639972 .........NOW AVAILABLE:- HIDECS COURSE......
Tel: +44 (0)1235-811095 .... see http://www.feabhas.com for details.
Going Forth Safely ..... EBA. www.electric-boat-association.org.uk..
********************************************************************
Reply to
Paul E. Bennett

I think this is an issue of quality management and control, not coding methodology. Requirements need to be made clear up front. I do agree that, possibly, whoever is coding these devices hasn't really thought things through; but is that their job? Given the nature of these systems, it sounds like a quality systems audit would be in order.

I have yet to meet a coder who can test his own code as well, and as objectively, as a good test technician/engineer. For the most part, asking coders to qc/test their own code is a ticket to hell.

Far too many specifications provide too little in the way of detail and requirements, and it is not uncommon for people to expect too much for too little -- especially those with no skill or experience in working with developers. You must be familiar with the tendency of projects -- especially poorly defined projects -- to balloon. This is caused by the vicious cycle of, "What about this, and what about that...." Who gets screwed in the end is the developer with the "you shoulda" statements. That's bullshit.

(IMHO)

Mike

Reply to
Mike Turco

That's an excellent method. In my experience, you run into a couple of places where you can't figure out how to write the tests. These are the places where your design is flawed; fixing them improves the product and ignoring them is a disaster in the making.
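
A contrived sketch of what I mean, in C (the register address and the scaling are invented): when the conversion arithmetic is welded to a hardware read there is no obvious way to write the test, but pulling the raw value out as a parameter makes the test trivial and the design better at the same time.

#include <assert.h>

/* Hard to test: the logic is welded to the hardware. */
int read_temperature_bad(void)
{
    unsigned raw = *(volatile unsigned *)0x4000A000;  /* hypothetical ADC register */
    return (int)((raw * 330) / 1024) - 50;            /* convert to degrees C      */
}

/* Testable: conversion separated from acquisition. */
int raw_to_celsius(unsigned raw)
{
    return (int)((raw * 330) / 1024) - 50;
}

/* Desk-side unit test, runnable on the host with no hardware at all. */
int main(void)
{
    assert(raw_to_celsius(0) == -50);
    assert(raw_to_celsius(1024) == 280);
    return 0;
}

The moment the test becomes easy to write, the hardware dependency has been pushed out to the edge where it belongs.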

--
Guy Macon
Reply to
Guy Macon

Actually, I am a lot less wary of it when the developer is using Forth than I am if the developer is using C, and I am *really* wary of it if the developer is using assembly language. It also makes a difference whether we are trying to do something that nobody has ever done before or whether we are trying to do something that we have done many times.

What I am afraid of is that the developers will do one of the following:

Ready, Fire, Aim.

Ready, Aim, Aim, Aim, Aim...

Reply to
Guy Macon

I am sure that many of us have had, at some time, projects that were poorly defined or became nightmares (only two in my case but I have seen them). The real question becomes "what do we do to prevent projects falling into such traps?"

Certainly getting the spec tied down properly first will stand you in good stead. Ways of doing this are to:-

  • Take the spec apart and formulate questions about each and every apparent statement of requirements. Formally record all questions.
  • Obtain answers to each and every question formulated above and record these answers against each question complete with the source of the answer (not all the answers will be from the client).
  • Write a fully detailed technical specification based on the original requirements and the answers to questions. Present the document to the client, using any additional presentation aids that may be appropriate to aid understanding by the client.
  • Do not proceed any further until the client has signed off on this technical specification as meeting their requirements. Once they have signed acceptance, any changes that are instigated by them can easily become chargeable.
  • Ensure you have a completely rigorous change management procedure in place to cope with any changes that are required of the specification, contract or design. This is a most important aspect of project management.

Of course, during development you will need to keep your client in the loop and apprised of the impact on project timescale and cost every time they ask "could we have...?". They should also be permitted to observe that your management of the project follows your published procedures in every respect. That way they get a good feeling that you care about the quality of the end result and may even assist in overcoming any difficult problems. After all, system development should be a partnership.

--
********************************************************************
Paul E. Bennett ....................
Forth based HIDECS Consultancy .....
Mob: +44 (0)7811-639972 .........NOW AVAILABLE:- HIDECS COURSE......
Tel: +44 (0)1235-811095 .... see http://www.feabhas.com for details.
Going Forth Safely ..... EBA. www.electric-boat-association.org.uk..
********************************************************************
Reply to
Paul E. Bennett

It's worse than that IMHO. Most software developers don't design for the failure paths. In no particular order, I have seen basic design problems in code as follows:

1) Not checking return codes
2) Not exercising error conditions
3) Not root-causing problems, only putting in "retry loops" to avoid getting errors back
4) Not thinking through the design as potentially being used for more than its current design in the future (expandability)
5) Not thinking about how unit testing can be accomplished manually or in an automated fashion
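
Points 1 through 3 rarely need anything clever. A rough sketch in C (sensor_read() and its error codes are hypothetical stand-ins for a real driver): check the return code, record the actual cause, and bound the retries instead of looping until the error "goes away".

#include <stdio.h>

#define SENSOR_OK       0
#define SENSOR_TIMEOUT  1
#define SENSOR_CRC_ERR  2

/* Stub standing in for the real driver call, purely for this sketch. */
static int sensor_read(int channel, int *value)
{
    (void)channel;
    *value = 42;
    return SENSOR_OK;
}

static int read_with_diagnosis(int channel, int *value)
{
    int tries;

    for (tries = 0; tries < 3; tries++) {       /* bounded, not endless           */
        int rc = sensor_read(channel, value);
        if (rc == SENSOR_OK)
            return 0;
        fprintf(stderr, "sensor_read ch %d failed, rc=%d, try %d\n",
                channel, rc, tries + 1);         /* record the real cause          */
        if (rc == SENSOR_CRC_ERR)
            break;                               /* retrying won't fix a CRC fault */
    }
    return -1;                                   /* propagate failure to the caller */
}

int main(void)
{
    int value;
    if (read_with_diagnosis(0, &value) == 0)
        printf("channel 0 = %d\n", value);
    return 0;
}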

It's a rare thing to run across an engineer who actually thinks in these terms, so if you know one and are in the Portland, OR area, I'd like to hire them. ;-)

-->Neil

Reply to
Neil Bradley

I mostly agree with your comments. Especially that a good test technician/engineer (or team) is a great addition to a project. When a project is designed with a critical application in mind, we generally see good stuff. This primarily includes requirements and design documents that are well thought out and APPROVED before the real coding starts.

Where we see the worst nightmares is in code that was designed for a non-critical app or an app not initially recognized as critical that must be shoe-horned into a previously non-existent process. It often seems to restate the erroneous idea that "anybody can write code". It's a fading idea, but it surely isn't gone yet.

As you alluded to in your last paragraph, the "moving target" is alive and well.

I disagree with your statement about coders qc/testing their own code, unless they are in a company that has sufficient SQA that the tests are designed and coded prior to the actual code. Developers in a limited SQA environment don't really have a choice. It's surely not an ideal situation but it is often unavoidable.

Scott

Reply to
Not Really Me

Comments below.

Doug

That would be dead wrong. Coding methodology is as much a part of quality control as anything else. Incidentally, "coder" is a job position that has been extinct for several decades.

This never happens. Requirements change, thus the failure of the Waterfall approach to project management. Project management must be able to adapt to changing requirements.

Absolutely it is their job!

That too. But an audit is not a solution to the problem. Finding quality problems after the fact is expensive. Quality must be built into the methodology.

That is because the test engineer is focused on comparing performance to requirements and also acts as a second pass to make sure things are right. This is not an indictment of the work that "coders" do. Incidentally, the term "coder" pretty much died in the '60s with the COBOL world. No realtime embedded programmer would be referred to as a coder.

That would be true if this were the only testing. Any responsible organization would not be organized that way.

That is a management issue. Developers do what they are told or allowed to do. If you want better coverage in testing, then make it a part of your process.

Developers are always the victims of poorly defined requirements. Poorly defined requirements are often the norm, since the customer's desires do change and the customer quite often cannot articulate the nuances of the requirements. That is why iterative development methodologies do better.

Reply to
Doug Dotson

An embedded programmer who cannot at least test his/her code to reasonable functionality isn't programming. More like "dumping".

-- Samiam is Scott A. Moore

Personal web site: http:/

formatting link
My electronics engineering consulting site:
formatting link
ISO 7185 Standard Pascal web site:
formatting link
Classic Basic Games web site:
formatting link
The IP Pascal web site, a high performance, highly portable ISO 7185 Pascal compiler system:
formatting link

Being right is more powerful than large corporations or governments. The right argument may not be pervasive, but the facts eventually are.

Reply to
Scott Moore

Many programmers test their code for reasonable functionality. The real world is, however, often unreasonable.

Reply to
Dingo

"Testing code well" is not the same as "Testing code to reasonable functionality."

Reply to
Guy Macon

I remember a DOS application I wrote maybe fifteen or eighteen years ago that was an interface to a mag stripe reader (i.e. a credit card reader). I was in the process of starting this business, and this was my first customer. You can bet that I tested that code up-and-down, back-and-forth and every which way to hell.

Anyway, I demo'd the s/w to the customer and everything worked great. Then the guy takes the machine and a credit card and starts swiping the card back and forth through the reader at a rate of three or four times a second. As you can guess, error trapping be damned, the program crapped out.

I bet that just about everyone here can relate to this story at some level or another. Do I write better, tighter code that is less prone to error than I did fifteen years ago? You bet. Still, though, I make it a point to have another person, who is knowledgeable in the area of the product, test my code for me before I have the balls to say that it's done.

I think one would be hard pressed to find a medium or large sized company that develops and produces products that does not have an independent QC department responsible for both generating quality/reliability specifications and performing tests to ensure conformance. This is the way things are.

So, then, my opinion is this: developing and releasing a product that has only been tested by the developer(s) isn't programming, and it sure isn't good business... it's dumping.

Mike

Reply to
Mike Turco
