Best Scripting Language For Embedded Work?

How do you get files on and off of the drive? If you say something about USB sticks, the next question is whether you have heard of Stuxnet.

Reply to
Paul Rubin

I used this method on computers that didn't have an internet connection and I used it before USB even existed. I didn't seem to have problems moving files then. And I certainly do NOT use an internet connection when working on a client disk. Nor do I use software I'm not certain about.

I guess I don't understand your concerns.

Jon

Reply to
Jon Kirwan

USB sticks didn't exist when I started this practice. I can't recall ever even so much as attaching a USB "disk" to a machine while a project disk was active. And I wouldn't.

Jon

Reply to
Jon Kirwan

Stuxnet was a virus that propagated by infecting USB sticks, so it can attack computers with no network connections if you transfer files by USB. Obviously the concern can be mitigated in various ways, but understanding it is a good idea.

Reply to
Paul Rubin

So how do you get files on and off of the customer drive?

Reply to
Paul Rubin

I am thinking "with a telegraph key."

--
Les Cargill
Reply to
Les Cargill

You can't imagine any method OTHER than a USB drive??

Seriously?

Jon

Reply to
Jon Kirwan

I can imagine all kinds of methods. I'm wondering what method you used.

Reply to
Paul Rubin

A monolithic executable gives you an easy way to verify that you have the correct version of the interpreter (i.e. the right scripting tool). (*) I don't claim that one can't check versions reliably on systems with .DLLs or multiple files, nor do I claim that the nature of a Windows system itself couldn't leave far larger loopholes for mistakes and accidents.

Most embedded compilers are very nearly monolithic executables. You can verify that you have the intended tool just by comparing a cryptographic hash on a few executable files (typically just compiler and linker). There are typically no .DLL dependencies, and no registry changes made as part of the installation.
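The hash check described above can be sketched in a few lines of Python. This is an illustration, not anyone's actual qualification procedure; the tool names and the `verify_tools` helper are invented, and the known-good digests would be the ones you recorded when the toolchain was qualified.

```python
import hashlib
import os

def sha256_of(path):
    """Return the SHA-256 hex digest of a file, read in 64 KiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_tools(tool_dir, expected):
    """Compare each tool's digest against a recorded known-good value.

    `expected` maps file name -> hex digest captured when the toolchain
    was qualified.  Returns the list of files that do not match.
    """
    mismatches = []
    for name, want in expected.items():
        if sha256_of(os.path.join(tool_dir, name)) != want:
            mismatches.append(name)
    return mismatches
```

With only a compiler and a linker to check, this is a one-step verification before every release build.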

"Monolithic executable" isn't a strict requirement, but it gives a one-step way to be sure that the scripting tool is correct. Please see my sentences beginning with (*).

Thanks and best regards, Dave A.

Reply to
David T. Ashley

Thanks for the clarification, and you are correct.

My statement was that I required a "monolithic executable" when what I really need is "guaranteed version with minimum variability of behavior based on other factors". In other words, I didn't state my requirement--I stated what I believe the best solution is and called that a requirement.

Historically, I don't believe I'm the first person in software history to propose a specific narrow-minded solution and call that a requirement. : )

Best regards, Dave A.

Reply to
David T. Ashley

The problem is that the source code may be in some cases an intermediate form, similar in some ways to how one thinks of an object file.

The source code may come from models or from other complex factors in the base of code and designs.

Simply controlling the source code output says nothing about the probability that it may contain subtle bugs ... which may not be apparent ... and controlling it as source code won't change those residual probabilities.

One has to control the tool that generates the source code, in the same sense as one qualifies a compiler.

Do you routinely examine object files looking for errors? Same question.

DTA

Reply to
David T. Ashley

OK, so you understand [and actually support] what I'm trying to achieve, but believe my approach is too Draconian and simple-minded.

I can live with that.

DTA

Reply to
David T. Ashley

There is one other point to make.

You may be inherently using the wrong probability model for Windows.

In addition to assessing variability, one has to assess the probability that bugs in Windows, arising from variability in configuration or updates, will actually affect an embedded build.

In most cases, for a given deployment of Windows, bugs that affect typical development tools (filesystem bugs, memory management bugs that affect processes and may corrupt memory, etc.) are very, very rare. I've not seen anything like that in my career. Not that Windows might not contain bugs, just that they are not of a type to affect an embedded build.

On the other hand, a different version of a script interpreter, where the script performs part of the build ... that is a much more probable source of trouble.

So saying that Windows is uncontrolled trash doesn't imply that one shouldn't control versions of development tools. The development tools have a higher probability of affecting the build results.

The key issue isn't the number of bugs in Windows or how different versions or patch levels might vary in which bugs they have ... the key issue is the probability that those specific bugs will affect an embedded build. I judge this probability to be quite small (but of course not zero).

DTA

Reply to
David T. Ashley

And how do you know that the libraries used as input to those tools are valid and have not been altered in a subtle way?

David is correct. You appear to be focusing on one part of the trust problem and not the overall picture.

Simon.

--
Simon Clubley, clubley@remove_me.eisner.decus.org-Earth.UFP 
Microsoft: Bringing you 1980s technology to a 21st century world
Reply to
Simon Clubley

Perhaps not - but you may be the first person in software history to have admitted it! I know I have been guilty of the same fault many times, but I don't tell anyone... :-)

Reply to
David Brown

A good version control system gives you equally easy ways to check the integrity of the programs (someone else suggested keeping the interpreter or other software directly in the version control system, which is a good idea here).

And as you say, the OS has bigger potential for problems here.

You have to take a step back and look at the larger picture - what are you actually trying to achieve? (The answer, I believe, is high confidence that the script-generated files are correct.)

There are basically two ways to ensure that something is built correctly: you can control the building process so carefully that you are sure the result will be correct, or you can verify the built object so carefully that you know it is correct. Usually, you aim for a mixture of these two to give a balance of practicality and confidence. But if you are able to test and verify the generated source code files sufficiently, you no longer need /any/ control over the build process. It's worth thinking about.

Anyway, once you have a clear idea of what you want in the build, you can break it down into parts and steps. Then you can do a failure analysis - figure out what can go wrong, what could cause such problems, what the chances of the problems are, what the consequences of these problems are, and what can be done to mitigate the problems. It is only then that you can find out what will /really/ make a difference.

Then, I think, you will see that the "requirement" of a monolithic executable (which would greatly hinder your choice of scripting language) is like putting extra deadbolts on a door made of plywood. It might look good at first, but it does not give any extra security in reality. It is certainly far outranked by choosing a language that lets you (or someone else) write the scripts in a clear and correct manner - bugs in the script are a much more likely cause of failure than corrupt or mismatched interpreters, and will have far greater consequences.

Most embedded compilers that I have used don't make many registry changes that are relevant to compilation (though they might put things like include file paths there). But I have never seen one that comes remotely close to being monolithic, and that's just the compiler. All the headers and library files are vital to compilation. A quick check of an installation of avr-gcc (just the compiler and library - no IDE, debugger, etc.) weighs in at 6832 files on my machine. That's not very monolithic.

It's one tiny, irrelevant step forward - but several steps backwards because it restricts your choices so much. But perhaps having put that "requirement" in your first post has brought some wider issues to light, and helped both you and others here think about these problems. I know it has encouraged me to think more than usual.

Reply to
David Brown

That's correct. But in this case, it is not the /interpreter/ that is generating the source code - it is the script file. The interpreter is a step back up the chain. The higher up that chain you go, the more you use standard, well-tested parts, and the more you have to trust them, and the more you are /able/ to trust them.

What is more likely to contain relevant critical errors - a script written by you (or someone else at your company) in a scripting language they don't know, or an interpreter used all over the world by millions of people?

I routinely examine the generated assembly files for my code, especially if it is a particularly critical application. But it is very much spot-checks - it is rare that I examine the entire code for a program.

Reply to
David Brown

That's a very good point - and one that I agree with. So why are you not applying the same logic to the interpreter? Windows is a poor choice of OS for an automated build system, and it is almost impossible to tie down if you need to replicate the build system exactly - and yet I fully agree that if your build system is put together correctly, variations or bugs in Windows are extremely unlikely to affect the final result. My point, however, is that the same thing applies to an interpreter like Python. If your scripts are written correctly (half-decent should be good enough), then the outputs will be identical on every version of Python from 2.4 to 3.4, running on 32-bit and 64-bit Windows, Linux, Macs, and anything else you try.

You are basing that claim on nothing but paranoia. Proof by assertion is not persuasive.

I counter you with proof by common sense. Generation of a text file (source code or otherwise) has been a major application of Python since its first conception. If there were bugs in such basic functionality in Python, they would have been found and fixed a decade ago. Python is a major part of the glue that holds the internet together, and is found all over the place - it is well tested. Of course, I don't claim any given version of Python is bug-free - but just like with Windows, it is free of any bugs of relevance here.
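The "written correctly" part deserves a concrete sketch. Here is a minimal, invented example of the kind of source generation under discussion: emitting a C header from a table of constants. The constant names and the `generate_header` function are illustrative only. Sorting the items and joining with explicit "\n" keeps the output byte-identical regardless of dictionary ordering, platform line endings, or Python version.

```python
def generate_header(constants, guard="CONFIG_H"):
    """Render a dict of NAME -> integer value as a C header, deterministically."""
    lines = ["#ifndef %s" % guard, "#define %s" % guard, ""]
    for name in sorted(constants):          # fixed order, not dict insertion order
        lines.append("#define %s %d" % (name, constants[name]))
    lines += ["", "#endif /* %s */" % guard]
    return "\n".join(lines) + "\n"          # explicit newlines, no platform variation

print(generate_header({"BAUD_RATE": 115200, "ADC_CHANNELS": 8}))
```

A script written in this style produces the same bytes on any interpreter you are likely to meet, which is exactly the property being argued about.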

Reply to
David Brown

To emphasize this (sometimes overlooked) point - From both the C99 and C11 standard draft documents, section 7.1.2, standard headers:

-----------------------------------

2  The standard headers are ...

3  If a file with the same name as one of the above < and > delimited sequences, not provided as part of the implementation, is placed in any of the standard places that are searched for included source files, the behavior is undefined.

-----------------------------------

(I understand the original comment applies to all header files, not just the standard ones.)

--
Roberto Waltman 

[ Please reply to the group, 
  return address is invalid ]
Reply to
Roberto Waltman

We are using PHP heavily for such tasks. I am not a user of Python so I can't comment on the differences. But we use PHP scripts to automate a lot of tasks other than building the code: e.g. renaming build binaries based on the software version in the source code, running other executables to generate support files, copying the required binaries into particular folders, even creating the folder structure, etc.
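Tasks like those would look much the same in any scripting language. For the sake of a single language in this thread, here is a hedged Python sketch of one of them: renaming a build binary after the version found in the source. The `FW_VERSION` macro name, the function names, and the layout are all invented for illustration.

```python
import os
import re
import shutil

def extract_version(source_path):
    """Pull a version string like '1.4.2' out of a source file.

    Assumes a line such as:  #define FW_VERSION "1.4.2"
    (the macro name is hypothetical).
    """
    with open(source_path) as f:
        text = f.read()
    m = re.search(r'#define\s+FW_VERSION\s+"([^"]+)"', text)
    if not m:
        raise ValueError("no FW_VERSION found in %s" % source_path)
    return m.group(1)

def publish_binary(binary, source_path, out_dir):
    """Copy the build output into out_dir, renamed with the version."""
    version = extract_version(source_path)
    os.makedirs(out_dir, exist_ok=True)     # create the folder structure
    base, ext = os.path.splitext(os.path.basename(binary))
    dest = os.path.join(out_dir, "%s-%s%s" % (base, version, ext))
    shutil.copy2(binary, dest)              # copy, preserving timestamps
    return dest
```

The PHP originals described in the post presumably do the equivalent with `preg_match`, `rename`, and `copy`.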

Sadly, it's not me who wrote the PHP scripts, so I can't give you details, but I will certainly vote for PHP!


Reply to
gopal_amlekar
