Absolute addressing on the ARM

Sorry, but that single quote syntax looks utterly naff. :-)

I think Ada style 123_457_134 would be much more readable than 123'457'134.

[snip]

Strongly agree with using Ada's "_"; do you have any idea why they picked a single quote?
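
For anyone who has not seen the two styles side by side, here is a minimal sketch (assuming a C++14-or-later compiler, which is where the single-quote separator comes from; the Ada form is shown only in a comment, since it is not C++ syntax):

  #include <cstdio>

  int main(void)
  {
      /* C++14 style: single quote as digit separator */
      long grouped = 123'457'134;

      /* Ada style, for comparison (not valid C++):
         Grouped : constant := 123_457_134;          */

      std::printf("%ld\n", grouped);   /* prints 123457134 */
      return 0;
  }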

Simon.

--
Simon Clubley, clubley@remove_me.eisner.decus.org-Earth.UFP 
Microsoft: Bringing you 1980s technology to a 21st century world
Reply to
Simon Clubley

It is not essential in every single case to take this stuff so seriously.

There can be a place for small, informal tools that are not going to be used for mission-critical applications.

When I started creating my own tools (around 1981), there was no viable alternative. When I just wanted a few lines of code to test out some I/O signals on a new board, I needed a language that would compile instantly. At the time, if I wanted to use C (if we could somehow acquire it) it would probably have taken several minutes of grinding floppy disks to compile the simplest program. Too long!

But later I found my own tools were more productive anyway (and for much of the 80s, my naive unoptimised compilers tended to outperform C! At least, for Z80 and 8086. They could also support newer processors sooner, eg. 80186 (with on-chip peripherals) and 80386, so that gave me an edge too).

Anyway, I see examples of bad design and programming all the time, especially with consumer devices with embedded code. Just using a mainstream language doesn't guarantee good results; there's a bit more to it than that.

--
Bartc
Reply to
BartC

It was before my time, but FIG-Forth was released in 1978 and it sounds like exactly what you wanted.

Reply to
Paul Rubin

Your point in its context is well taken. The FDA has rules that allow C (and other languages) to be safely used in these types of applications. As good as a provably correct compiler is, it does not guarantee that the application is also correct.

w..

Reply to
Walter Banks

Possibly with an extra advantage: it would cost litigants a lot of time/money first to learn the environment then to prove them wrong, whereas with C the problems are well documented!

Reply to
Tom Gardner

(Rats, hit button too soon)

But then wouldn't it be more reasonable to choose a "high integrity" language/compiler/environment than a bog-standard C one? Spark/Ada would seem a good starting point.

Reply to
Tom Gardner

Or perhaps PL/M style 123$457$134 :-)

Reply to
upsidedown

If my life or safety depended on some essential software, then I'd be happier if it was written in Ada than C. (So long as someone else does the actual programming; using Ada seems like wearing a straitjacket.)

But then, there does seem to be a version of Ada that is just a front-end to gcc, presumably inheriting the latter's complexity and bugs too.

--
Bartc
Reply to
BartC

That does seem like the worst of all worlds!

The entire environment is important, from requirements gathering, specification, code control, compiler, static analysis, etc etc.

If I was to investigate this area, I'd start with

formatting link
and then see if there was anything better.

The entire pedigree of the company (Praxis, not Altran) and personnel (esp Martyn Thomas and Bernard Carre) that created the toolset was oriented strongly towards high-integrity solutions.

Reply to
Tom Gardner

[snip]

Programming in C feels to me (a habitual Ada user) like crossing a deep, rocky ravine on a narrow, swaying plank. I guess it's subjective, and some people may enjoy it :-).

Just to clarify that: the most widely used Ada compiler, GNAT from AdaCore, is part of the GNU compiler collection. The front-end generates an intermediate representation (like the other gcc compilers) which goes into the gcc back-end for code generation. GNAT does not generate C code.

Then there is another Ada compiler (also now from AdaCore) which generates C source code which can be compiled with various C compilers. This is useful for targets which GNAT does not support.

But there are two different issues here for critical SW: firstly, is the programming language suitable; secondly, are the tools/compilers reliable (i.e. sufficiently correct).

On the first issue, using Ada, and perhaps even SPARK/Ada, inherently helps avoid many kinds of errors which are common in C code. Moreover, static analysis works better, because the language gives more information to the analyser, especially with the "contract" features added in Ada 2012 and used to good effect by SPARK 2014. This advantage does not depend on the compiler technology.
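
(As a rough illustration of that point, and only a sketch: in C or C++ the everyday substitute for a contract is a runtime assertion, which tells a static analyser far less than a declared Ada 2012 Pre/Post aspect that SPARK can reason about. The function below is invented for the example.)

  #include <cassert>
  #include <cstddef>

  /* Intended contract: buf is non-null and index < length;
     the result is buf[index].                               */
  int element_at(const int *buf, std::size_t length, std::size_t index)
  {
      assert(buf != nullptr);   /* in Ada 2012 this would be a Pre aspect     */
      assert(index < length);   /* in Ada, index bounds are part of the type  */
      return buf[index];        /* in C/C++ nothing stops an out-of-range call */
  }

  int main(void)
  {
      int data[3] = {10, 20, 30};
      return element_at(data, 3, 1) == 20 ? 0 : 1;   /* trivial usage */
  }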

On the second issue, if the compiler uses the gcc back-end or generates intermediate C code and then uses some C compiler, certainly this reuses the bugs in those tools, not much one can do about that.

I don't know of any empirical study of the reliability of the gcc back-end compared to the back-ends of other compilers, whether for Ada or C. Would be interesting.

--
Niklas Holsti 
Tidorum Ltd 
niklas holsti tidorum fi 
      .      @       .
Reply to
Niklas Holsti

Operational semantics defined in terms of a virtual machine. The compiler emitted code fragments that implemented instructions of that VM. Easy to formalize, but the generated output was rather inefficient. I remember that I proposed some "really safe" optimizations back then, but they were not impressed. Looking back, I think they made the right decision.

But that was in the late 80's/early 90's, so I do not remember much of the details. As far as I know, they never published anything about their approach.
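
(For readers who have not met that scheme, a hedged sketch of the idea, with all names invented and no claim that this is what that particular tool did: each VM instruction expands to a fixed, pre-verified target fragment, so code generation is essentially a table lookup. That is easy to formalise, and it also shows why the output tends to be inefficient.)

  #include <cstdio>

  enum class VmOp { Push, Add, Store };

  /* One fixed target fragment per VM instruction, shown as text for a
     made-up stack-oriented target; a real back-end would emit opcodes. */
  const char *fragment_for(VmOp op)
  {
      switch (op) {
      case VmOp::Push:  return "  push #imm\n";
      case VmOp::Add:   return "  pop r1\n  pop r0\n  add r0, r1\n  push r0\n";
      case VmOp::Store: return "  pop r0\n  st r0, [addr]\n";
      }
      return "";
  }

  int main(void)
  {
      /* "Compile" the VM program Push, Push, Add, Store by pasting fragments. */
      const VmOp program[] = { VmOp::Push, VmOp::Push, VmOp::Add, VmOp::Store };
      for (VmOp op : program)
          std::fputs(fragment_for(op), stdout);
      return 0;
  }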

Interesting! Even if they only handle a subset of C, this is still quite impressive. Machine-assisted proof has come a long way since the 90's.

Appreciated, thanks!

However, I would argue that TCC was not really simple in its day. Simpler than most other C compilers, but still an optimizing compiler.

I agree that compiler quality goes up, and formal verification, regression testing, a large user base, etc. have all contributed to the very high standards we have these days. However, for really critical tasks, I would still prefer something that is *easy* to understand, formalize and verify. Especially if I do not need the power of an optimizing compiler.

Plus, a simple homebrew compiler requires less manpower for maintenance because, in the ideal case, one person can understand both the internals of the compiler and its formal model.

--
Nils M Holm  < n m h @ t 3 x . o r g >  www.t3x.org
Reply to
Nils M Holm

Brackets? What brackets?

Just kidding, after a while you just don't see them any longer.

--
Nils M Holm  < n m h @ t 3 x . o r g >  www.t3x.org
Reply to
Nils M Holm

  • formatting link
  • formatting link
Reply to
Paul Rubin

Good point! While a compiler that is known to produce correct code is a necessity, it does not stop the developers from writing faulty code. So in an ideal world, we would prove the correctness of the application, too. However, not every application is equally suitable for formalization and verification. For a compiler, we have formal specs, like a grammar and formal or semi-formal semantics. Many, many other problems are underspecified, though (which is, IMHO, a bad idea) and hence not suitable for the above approach.

As you raise that point, what would be your approach?

--
Nils M Holm  < n m h @ t 3 x . o r g >  www.t3x.org
Reply to
Nils M Holm

I agree that there is plenty of use for small "informal" tools - the case I picked was intentionally an extreme example.

But is there a place for small, informal tools when there are large, well-tested and powerful mainstream tools available? Sometimes cost is an issue, but gcc and llvm have reduced that enormously. Yes, I think there are still situations where small tools are a better choice - but only for niche areas with particularly good reasons for choosing them.

Reply to
David Brown

Given that in general there is an "impedance mismatch" between the problem-being-solved and the language, under what conditions is it better to create:
- a small domain specific language tailored to the problem, or
- a library in a standard language?

Yes, I have my own opinions, but I'd like to hear other people's opinions.

Reply to
Tom Gardner

Tom Gardner wrote on 23-Mar-14 6:13 PM:

You should not regard that as an either-or question; there is a continuum, from solving a problem using standard ways (which require 'more' of them to solve the problem) to writing a dedicated language for it that solves this one problem trivially. The 'sweet spot' in this continuum varies. When a certain type of problem needs to be solved only once, the 'standard ways' are to be preferred, because they require less work and are easier to read. The more often a type of problem needs to be solved, the more a 'dedicated' solution is favoured: the amount of up-front work (both for the implementor and for the reader) can be averaged out over all applications.

I tend to prefer using an existing programming language that has a sufficiently high abstraction level (named parameters and overloading are a plus).

Wouter van Ooijen

Reply to
Wouter van Ooijen

Some languages are particularly amenable to the creation of embedded DSLs (EDSLs) inside a general-purpose host language. That lets you leverage the capabilities of the host language. Lisp, Forth, and Haskell are examples of host languages sometimes used this way. I just posted some links to Atom and ImProve, both of which are Haskell EDSLs used for real-time control stuff.
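
(The same pattern can be sketched in C++ to give non-Haskell readers a feel for it; this is only an invented toy, not how Atom or ImProve work. Overloaded operators build a small expression tree instead of computing a value, so the "program" written in the host language can later be analysed, optimised, or translated.)

  #include <cstdio>
  #include <string>

  struct Expr { std::string text; };   /* a trivial "IR": just the source text */

  Expr var(const std::string &name) { return Expr{name}; }
  Expr lit(int v)                   { return Expr{std::to_string(v)}; }

  /* Host-language operators assemble DSL terms rather than evaluating them. */
  Expr operator+(const Expr &a, const Expr &b) { return Expr{"(" + a.text + " + " + b.text + ")"}; }
  Expr operator*(const Expr &a, const Expr &b) { return Expr{"(" + a.text + " * " + b.text + ")"}; }

  int main(void)
  {
      Expr rule = var("sensor") * lit(3) + lit(7);   /* looks like ordinary code...      */
      std::puts(rule.text.c_str());                  /* ...but prints ((sensor * 3) + 7) */
      return 0;
  }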

Reply to
Paul Rubin

Well, yes. No argument there.

So what would you regard as useful indicators or preconditions that would bias the answer one way or the other?

OK, so why? :)

Reply to
Tom Gardner

OK, but under what conditions would/wouldn't you use them to solve an embedded problem via a DSLanguage or DSLibrary?

I must admit that, while LISP is clearly a sufficient base in which to create a DSLanguage, I'm not sure I would rush to use it in an embedded environment.

Haskell? I don't know enough about that to have a valid opinion, but "lazy evaluation" and "strictly functional" cause my eyebrows to go upwards in the embedded domain. (The former because of time/space considerations, the latter because an embedded system without side effects is more than a little pointless!)

Forth? I'll discuss my opinions at a later date, if appropriate.

Reply to
Tom Gardner
