C program flow control - Very Basic Question.

:)

Using setjmp()/longjmp() to handle exceptions is far harder to understand (and in my experience causes many more bugs) than using goto for the same thing.
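(For the curious, here is a minimal sketch of the setjmp()/longjmp() pattern being discussed - the function names are invented for illustration. Note that the jmp_buf's stack frame must still be live when longjmp() fires, which is one reason this is easier to get wrong than a local goto.)

```c
#include <setjmp.h>

/* Illustrative only: "parse_field"/"parse_record" are made-up names. */
jmp_buf on_error;

void parse_field(int value)
{
    if (value < 0)
        longjmp(on_error, 1);   /* unwinds straight back to setjmp() */
}

int parse_record(int a, int b)
{
    if (setjmp(on_error) != 0)
        return -1;              /* longjmp() lands here */
    parse_field(a);
    parse_field(b);
    return 0;                   /* normal path */
}
```

The non-local control transfer is invisible at the call site of parse_field(), which is exactly why a plain goto to a cleanup label is often easier to follow.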

"Don't use scanf()" is pretty close to an absolute rule. ;)

--
Grant
Reply to
Grant Edwards

Whew! Thanks for all of the replies guys - the time you put into helping me out really is appreciated.

I think I'll go with Don's suggestion - in hindsight it's a little obvious that this is what I should do. Thanks also Tim, Joseph, Hamilton.

I once read somewhere that if you can write in assembler then C should be easy - either I misunderstood or whoever said it was WRONG! :)

Now to implement it!

cheers Royston

Reply to
Royston Vasey

The problem you are having is that C hides a bit more of the hardware from you than ASM would.

E.g., in ASM, you would be more aware of what's on the stack, etc. You wouldn't do something like:

MAIN:   ...
        CALL FOO
        ...

FOO:    CALL BAR
        ...

BAR:    CALL BAZ
        ...

BAZ:    JUMP MAIN

(because you'll munge the stack!)
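(The structured equivalent, sketched in C with made-up names: instead of jumping back to the top from deep inside nested calls, each function returns a status and the top level loops - every call is balanced by a return, so the stack stays intact.)

```c
/* Illustrative sketch: baz() signals "restart" by return value. */
int baz(void) { return -1; }            /* always asks for a restart here */
int bar(void) { return baz(); }         /* pass the status up */
int foo(void) { return bar(); }

int run(void)
{
    int restarts = 0;
    while (foo() != 0)                  /* "restart MAIN" = loop at top level */
        if (++restarts >= 3)
            break;                      /* give up after 3 tries */
    return restarts;
}
```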

Reply to
D Yuniskis

Although this is c.l.e, I would recommend using Turbo C 2.0 to learn C, if you can get it - no distraction from the language itself.

Groetjes Albert

--

--
Albert van der Horst, UTRECHT,THE NETHERLANDS
Economic growth -- being exponential -- ultimately falters.
Reply to
Albert van der Horst

Thanks Albert, but I'm using C18 as my objective is to create an embedded device and the direct route suits me best.

Reply to
Royston Vasey

That's true. I guess the advantage of C when I become more conversant will be the speed of getting code up & running.

Reply to
Royston Vasey


Changing platforms (different processor etc.) becomes a lot easier, especially if you abstract the hardware properly. (HAL)
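(A minimal sketch of what such a HAL can look like - all names here are invented for illustration. The application talks to an abstract interface; each board supplies its own implementation, so only the implementation changes when the hardware does.)

```c
/* Hypothetical HAL interface: the application never touches registers. */
struct hal_led {
    void (*set)(int on);        /* board-specific implementation */
};

/* A fake "board" used purely for illustration/testing. */
int fake_led_state;
void fake_led_set(int on) { fake_led_state = on; }

const struct hal_led fake_board_led = { fake_led_set };

/* Portable application code, written against the interface only. */
void blink_once(const struct hal_led *led)
{
    led->set(1);
    led->set(0);
}
```

On a real target the struct would point at functions that poke the actual GPIO registers; the application code above stays unchanged.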


Reply to
RockyG

I get that you want to dive into hardware, but you'd really be better off learning the language *before* you try to use it for an embedded project. The problem with C is that it _looks_ simple - the truth is that it will be quite a while before you will be able to write reliable programs.

Compilers for small MPUs, DSPs and PICs (the generic "PIC") tend to have non-standard features, weird limitations and just more plain old bugs than compilers for popular desktop OSes. And cross-compiling for an embedded target creates build and test issues that desktop systems don't have. All these things are confusing distractions that you don't need while you're trying to learn a language.

There are decent, free compilers available for just about any OS. Apart from GCC, most won't be C99 compilers, but any ANSI C compiler will do for learning.

George

Reply to
George Neuner

Up to the 1960s, usually the only way to alter program execution was some kind of jump/branch/goto instruction, plus primitive loop constructs in some high-level languages (such as the DO loop in Fortran IV), so gotos had to be used almost exclusively.

With languages containing some structured features that are easy to use, the need for gotos was significantly reduced, but not eliminated completely.

The C language lacks several features, such as loop naming (allowing exiting multiple nested loops at once) or switch/case-style error handlers at the end of a module, and thus gotos are still required.
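(A sketch of the nested-loop case: since C has no labelled break, a single well-placed goto exits both loops at once. The function and variable names are invented for illustration.)

```c
#include <stddef.h>

/* Searches a 3x3 grid; a plain "break" would only leave the inner loop. */
int find_value(const int grid[3][3], int wanted, size_t *row, size_t *col)
{
    for (size_t r = 0; r < 3; r++)
        for (size_t c = 0; c < 3; c++)
            if (grid[r][c] == wanted) {
                *row = r;
                *col = c;
                goto found;     /* exits both loops at once */
            }
    return 0;                   /* not found */
found:
    return 1;
}
```

The alternatives - a "done" status variable tested in both loop conditions, or wrapping the loops in a function just to use return - are exactly the workarounds being criticised above.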

I would consider the slogan "goto considered harmful" as a harmful statement, since applying it blindly has created a lot of unreadable and hence unmaintainable code (such as weird status variables or very deeply nested if-statements) instead of using one or two well placed gotos to simplify the program structure.

After all the Dijkstra/Wirth slogan "goto considered harmful" was intended to advocate the structured programming model and languages based on that model.

Reply to
Paul Keinanen

I disagree with that advice. Programming C on a "big system" and programming C on an embedded system are very different. People who have learned C by reading books (or doing courses) and programming on Windows, Linux, or whatever, often have a lot of unlearning to do before they can write decent embedded software. They'll use "int" everywhere with no consideration for the underlying cpu and they'll use floating point, memory space, "printf" and "malloc" as though they were as cheap as on a PC. They will miss out all understanding of interrupts, volatiles, hardware access, resource limitations, etc.
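(To illustrate a couple of the habits in question - the register address below is made up purely for illustration: fixed-width types instead of a bare "int", and volatile for anything the hardware can change behind the compiler's back.)

```c
#include <stdint.h>

/* Hypothetical memory-mapped register: volatile so every access is real. */
#define TIMER_COUNT (*(volatile uint16_t *)0x4000u)

/* Fixed-width types: "sum" is exactly 8 bits on every target,
 * instead of whatever "int" happens to be on this CPU. */
uint8_t checksum(const uint8_t *buf, uint16_t len)
{
    uint8_t sum = 0;
    for (uint16_t i = 0; i < len; i++)
        sum = (uint8_t)(sum + buf[i]);  /* wraps at 8 bits by design */
    return sum;
}
```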

I'd agree that the PICs (at least, the small PICs) are awkward devices to learn on and have a lot of idiosyncrasies. My recommendation here is simply to drop the PIC and choose a better microcontroller. But if you want to learn programming microcontrollers, learn with microcontrollers.

Reply to
David Brown

A better slogan is "goto almost always harmful", which basically means that it's a really useful tool after you've gone through a gun safety course on the thing.

--
Tim Wescott
Control system and signal processing consulting
Reply to
Tim Wescott

Your prerogative.

I don't disagree with anything you're saying, but, in my experience it is difficult enough for many (maybe most) people to learn programming _well_ without screwing around with an embedded programming environment. [IMO, many "professional" software developers should find another profession and the majority of people should never try to write software ... but those are topics for other threads.]

Learning is definitely more fun when you can make things light up, sing, move, etc., but too many of the people I've met who learned to program by screwing around are also among the worst software developers I've met.

IMO, learning a programming language and learning to program to the idiosyncrasies of a non-standard target should be separate issues. A good programmer with a solid grasp of the language (usually) can make the switch - a lousy programmer is lousy on any target.

George

Reply to
George Neuner

Yes. There are, too often, quirks/extensions that are present in the tools used for (many) embedded platforms. Sometimes to get around quirks in the processor, sometimes to (ahem) "add value" to the toolchain (read as: tie customer to vendor), etc.

This was particularly true when processors first started seeing use as "logic replacements" many decades ago. Too often, an EE would be assigned the job of writing the code (after all, it *is* an electronic device, eh?). Without any formal training on how to approach this, he'd (eventually) get things to work -- just barely. And, only if you didn't look too hard at them!

Can anyone spell "Therac"?

Agreed. Learn what the language is *intended* to do and then you're in a better position to kick and scream about what the particular toolchain *doesn't* do (or does poorly) -- and you will have a better feel for *why* it had to make those tradeoffs.

E.g., write a "genuine" printf() and you can see why many embedded platforms have "itoa()"-ish alternatives! :>

[N.B. For yucks, try something like: printf("% 0#*.*g\n", INT_MAX, INT_MAX / 5, PI); assuming I have remembered all the proper flags :-/ ]
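(A minimal itoa()-style helper of the kind alluded to above - the name "u16toa" is invented here. A few dozen bytes of code, versus the kilobytes a full printf() with floating-point formatting drags in.)

```c
/* Converts an unsigned value to a decimal string in buf; returns buf.
 * Sized for 32-bit values; digits are generated backwards, then reversed. */
char *u16toa(unsigned value, char *buf)
{
    char tmp[11];               /* up to 10 digits for 32-bit unsigned */
    char *p = tmp;
    do {
        *p++ = (char)('0' + value % 10);
        value /= 10;
    } while (value != 0);
    char *out = buf;
    while (p != tmp)            /* copy digits back in the right order */
        *out++ = *--p;
    *out = '\0';
    return buf;
}
```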

--don

Reply to
D Yuniskis

It would be a boring thread if everyone agreed!

I /almost/ agree with you here. Learning to program should, as you say below, be separate from learning the idiosyncrasies of a particular language, toolset or target. My point of disagreement is that programming on (small) embedded systems is so significantly different from most programming on big systems that I don't think learning on big systems is a help. A clear indication of that is that on big systems, unless you are doing time-critical or low-level programming (not beginners topics), you'd almost certainly use a higher level language than C.

You are not too impressed about the quality of most software developers

- I agree, and I think it also applies to most books and courses that teach software development. I think the average "teach yourself Visual C++" course/book is worse than useless, especially for someone looking at embedded development.

For someone wanting to learn to write good embedded software, I'd recommend starting with assembly. You have to get to understand the low level of your processor. If you want to write C code that will compile into small, fast and efficient target code, you've got to have a feeling for the sort of target code you will get. It doesn't much matter which target you use - learn assembly on a "nice" CPU like the AVR or the MSP430 rather than a PIC. Then you'll have a solid foundation for your skills, rather than forever feeling that your embedded system is a cut-down or limited environment.

Reply to
David Brown

In my sleep. I once threatened to quit a project when the customer demanded the software monitor solvent tank levels to prevent both running dry and overflow spillage. I demanded hardware interlocks shut down the machine if either situation were imminent.

It wasn't that software couldn't handle the situation - it could have easily. The issue was that the solvent was dangerous and I didn't want to have liability in the event of a spill. Also a factor was that this project already needed FDA certification and I didn't want to deal with EPA reviews on top of that. The hardware wasn't my department so all I had to do was write a letter saying there was no software responsibility for solvent handling.

George

Reply to
George Neuner

Yeah, but software is more "brittle" than "wires and switches".

I worked on automating a tablet press. Several tons of moving iron (e.g., 10HP motor to spin the thing). Capable of exerting forces measured in many tons. Lift a "guard" and lose a finger in a few ohnoseconds.

Software could read the switches just as a *courtesy* to the user (i.e., to tell him *which* guard is open). Always rely on the least corruptible technology for safety!

Same reason an "emergency brake" is little more than a cable!

Reply to
D Yuniskis
