Embedded Basic interpreter recommendations?

Debugging becomes pivotal here. The line-number/edlin model can only get you so far; then you need to step through the code and watch variables.

Once you do that, the Host/Emulation/Simulation approach (including emulation via simulation - umbilical solutions) becomes much nicer: you can push the resource needs in the target DOWN, and use the host more. Umbilical emulation allows a minimal monitor core on the target, while the host runs a more capable local simulator. It also suits the flash/RAM splits seen in typical uCs.
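As a sketch of how small that monitor core can be, here is a toy version (everything here is hypothetical - command letters, hex framing, the RAM window - a real monitor would speak a compact binary protocol over UART/USB):

```python
# Hypothetical sketch of the minimal monitor core an umbilical setup leaves
# on the target: the host sends tiny peek/poke commands, and all the heavy
# debug machinery (symbols, source view, stepping logic) stays on the host.
memory = bytearray(256)  # stands in for a window of target RAM

def monitor(command: str) -> str:
    """Handle one host command: 'r <hexaddr>' or 'w <hexaddr> <hexbyte>'."""
    op, *args = command.split()
    if op == "r":                              # peek one byte
        return f"{memory[int(args[0], 16)]:02x}"
    if op == "w":                              # poke one byte
        memory[int(args[0], 16)] = int(args[1], 16)
        return "ok"
    return "err"
```

The point is that the target only needs peek, poke, and (in a real system) go/halt; everything else lives in the host-side simulator.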

Of course that places more caveats on the host.

Still, this pathway allows you to share a lot more of others' work, and also to choose a mature language/debug system for the host that you clone on the target. OR you decide to implement a Turbo Pascal model entirely in the target, with a compact editor/debugger/compiler - chips ARE now big enough to make this viable. A USB terminal could almost be a de facto standard, as most candidate devices for this would include USB for free.

Next step would be a flash-drive stub, where you include a better PC USB terminal in a SPI flash, and the target becomes the whole system... a Mobile-Compute-Debug model :)

-jg

Reply to
-jg

Non-programmers? Their level of approach is printing out variable values by inserting lines. Basically. The whole idea of watching variables would be a bit much -- not so much because they can't get the idea in general, but because of the learning curve for the menu options of adding, deleting and modifying the watches. It's really, really easy to use what they have already learned (adding statements). It's just unnecessary "stuff" -- loading them down with another user interface.

Of course, I really don't know what the OP is intending. (Wish we'd get some responses.) "It's meant to be a tool for interactive experimentation of embedded concepts by non-programmer types," seems to be the guide, though. I imagine this as "turn on a light" by a single statement that keeps a TRIAC fired and where the actual detailed code that may monitor zero-crossings, handle phase angle decisions, etc., is hidden inside the BASIC interpreter. Stuff at that kind of level, really. Or maybe not. Hard to tell. So maybe more is better than less. But the OP did clearly say something about line numbers as the approach -- and I have to count that for something here.

Okay. Well, it's about time for the OP to jump in. We are proceeding far beyond the original comment. I feel you may be taking this over the top a bit, but then you might be right on target for all I know.

(I agree about the USB interface -- RS232 is getting hard to find these days and USB ports are everywhere, almost. So that would seem to be the preferred hardware connection. I would expect it to "look like RS-232," though and be able to use Hyperterm, for example, or whatever serves the same under Linux.)

I suspect many folks have thought about BASIC as a segue for non-programmers to do useful things they want to achieve for themselves. BASIC is probably the closest thing to making that possible, but what is the right hardware widget and BASIC combination that appeals to a wide array of non-programmers' interests?

My own focus would be on high school education, providing a path for non-programmer students to have some fun while learning -- gathering and processing data and controlling things that move, make fire, smoke, and flash lights. For example, learning trigonometry in the context of moving a drill press around and drilling holes is a great way to provide tactile and sensory information about why sine and cosine mean something practical and real that can be seen right in front of them. If they make mistakes, it becomes immediately apparent and they can laugh a little at it and in the process deepen what those things mean to them.
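The drill-press idea can be made concrete in a few lines. Here is a small sketch (the function name and layout are my own invention, not from any curriculum) that computes where to drill holes evenly spaced on a circle -- exactly the spot where sine and cosine stop being abstract:

```python
# Compute (x, y) drill positions for n holes evenly spaced on a bolt circle.
# Each hole sits at angle 2*pi*k/n; cosine gives the x offset, sine the y.
import math

def bolt_circle(n_holes, radius, cx=0.0, cy=0.0):
    """Return a list of (x, y) coordinates for n_holes on a circle."""
    coords = []
    for k in range(n_holes):
        angle = 2 * math.pi * k / n_holes
        coords.append((cx + radius * math.cos(angle),
                       cy + radius * math.sin(angle)))
    return coords
```

A student who mixes up sine and cosine drills the holes rotated 90 degrees -- immediately visible, and memorable.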

But I'm going way afield of the OP, here.

Jon

Reply to
Jon Kirwan


Yes, but I was also partially answering Frank's posts here too... :)

- 'What is possible at low cost' has rather moved in the last few years, with large-flash uCs and USB almost 'there by default'. (It certainly is on the STM32 the OP mentioned.)

-jg

Reply to
-jg


It doesn't have to be an early Pascal variant - Oberon-07 satisfies the above:

formatting link

I did a fair bit of work extending PL/0 to make it into something usable - added REPEAT loops, SIZE, ORD and CHR, record and string assignments etc. etc. However, all of that went into cold storage when I changed direction and started the Armaide project with the Oberon-07 compiler:

formatting link

The Oberon-07 compiler in Armaide runs on Windows but I have used it to compile itself to ARM code. The object files occupy about 80k bytes - plenty small enough to fit in the Flash ROM of many ARM MCUs.

On a related track I've also been experimenting recently with an interpreter based on Wirth's Lilith M-Code Interpreter. The source code of the M-Code Modula-2 compiler and related documentation (including a listing of the interpreter) are at:

formatting link

-- Chris Burrows CFB Software Armaide: ARM Oberon-07 Development System for Windows

formatting link

Reply to
cfb

RS232 communication is so much easier to implement, and it is not a major problem even if you only have USB sockets on your computer. You can get cables with an RS232 plug at one end and a USB plug at the other, with the converter built into the plug, that work well and cost less than $20.

-- Chris Burrows CFB Software Armaide: ARM Oberon-07 Development System for Windows

formatting link

Reply to
cfb

Maybe you are right for people with no prior computer experience, but timesharing systems sound a bit dated. I think it is difficult today to find someone who has never used a computer before. And it is the same in all programs: it doesn't matter if you write in a textarea in an HTML form in an internet cafe, or a letter with a word processor on your PC or Mac: you can see all lines at once, move the cursor, edit as you like, insert lines, etc. I think many people would get angry if you tried to replace this with a C64-style line-edit interface or something like Edlin. Why should this change when someone writes programs?

But to support your opinion a bit: I've seen inexperienced computer users trying to write letters, and it really hurts to see how they insert empty or text lines by pressing spaces until the rest of the text is moved one line down. But most people learn the meaning of "Enter" and the ideas of the usual editing programs (cursor keys, insert, etc.) very fast. Still, it's maybe a good idea to give absolute computer beginners a short tutorial on how to use the editor in my IDE.

--
Frank Buss, fb@frank-buss.de
http://www.frank-buss.de, http://www.it4-systems.de
Reply to
Frank Buss

No question.

Well, there is that I grant you. I guess we don't disagree.

Jon

Reply to
Jon Kirwan

I'm older.

Even if people have had more opportunities for _contact_, it may not mean much more than that. Mass-produced books have been around a very long time now -- centuries -- and most everyone has had more opportunity to _see_ them around. But I still know a lot of families with no more than 5 or 6 books in their home, fewer still that they may have read. Computers have been in the atmosphere for far less time, too. My lifetime is very short and the transition has been taking place during it.

I'm just being argumentative above, though. You made a point.

On that, I'm not sure what to make of it. So I'd just repeat that although you might be suggesting (I can't tell, yet) something to the contrary of my earlier experiences, I have actually taught computer courses in a 4-yr university as recently as 10 years back. I didn't find anything in that experience to suggest I should change my mind.

What I did find different is the distribution of people interested in becoming programmers as a professional choice. The computer field did (then, at least) attract people from a much wider spread of talents and interests than when I was getting involved. And the advance of available software has meant that people with far less breadth of training and scientific inclinations were enabled to actually do things that produced useful results for others. So the base of the pyramid widened out a lot. That's both good and bad.

But non-programmers are still human beings and must take first steps like most anyone, even in my day. A toddler today is little different than a toddler 50 centuries ago -- they both struggle to learn how to walk mostly in similar ways.

I'm not sure what the OP is targeting, so it's difficult to get going on this.

Just to be difficult, though, there is a huge advantage in being able to type in "PIN5 = 1" and see an LED light up. Without having to go into an editor. This is one of the reasons why I said someone would bring up Forth. It's like that, too. Except that it really isn't right for beginners, in my opinion.

Look, I'm no teaching expert. I have some years of experience teaching a few languages to high school students with little or no experience; and in teaching assembly language, computer architecture, concurrent programming and operating systems as part of a 4 yr course at the 2nd and 3rd year arena. This does not mean I'm right, it just means that I am not completely without some experience. I've found BASIC easier at getting non-programmers excited about what they were doing. And I've tried other approaches.

I don't mean to say other ways can't be made to work well, too. Just that for who I am and what skills I bring to the teaching table and those I've been exposed to, it's been easier this way. Maybe it's just me or the groups I've had the luck of knowing.

Well, let's just leave it here. Really, the OP is dead silent about all this and I think it's not our turn, anymore.

As I said earlier, I like the idea of using microcontrollers together with tangible things that smoke, spark, move, explode, etc., because that is how you motivate young students into doing the hard work it takes to get past some of the learning involved. We humans are like many other animals -- we like shiny objects. "It's pretty!" At a young age, before one has had time to develop the intrinsic motivations to get one through tough, abstract problems and reasoning, there is a need for external stimulation as payback for the work ahead. Why did I bang my head against the wall, reading the same book on optics 5 times over, before enough of it gelled to be useful? Because I wanted to build my own telescope to look at the stars and we lived hand-to-mouth at the time and there was no other possible way for me. Why did I spend so much time in the University library (as a high school student at the time) reading about chemistry? Because I wanted to make fireworks and firecrackers on my own. I got paid for my effort. I don't need those things now. I've crossed some of the harder barriers and matured a bit and once you do that, pieces in the larger picture of life on earth and the universe begin to show you a picture all of its own and you begin to crave seeing more of that, so you don't need the smoke and flames anymore. But before that intrinsic motivation precipitates, it's action, fire, smoke, etc....

That's the part that interests me. Passing on an abiding interest in using microcontrollers as a segue (not an end in itself, or even just an occupation) to learning about the world. Like learning to use a keyboard isn't about learning keyboards as an occupation -- it's about facilitating writing for other purposes. At first, all you do is watch your fingers. Eventually, you get past that and stop thinking about where your fingers are, and you then can be fully engaged in what you write, not how you write it. Programming and microcontrollers are, largely now for me, a means to access the world around me -- an inexpensive tool to try some idea in optics, or chemistry, or physics -- maybe also to explore something in mathematics. It pays the bills at times, yes. But that is only something in my peripheral vision -- it's not my focus. It's like moving fingers to hit keys. I don't look at my fingers anymore and I don't look at the programming lines so much anymore, either. I already have learned the techniques for most things I care about and I don't find focusing on those details holding my interest that much, now. I'm focused on the application and what that application can teach me about the world around me.

Or so I sometimes like to imagine I do.

Jon

Reply to
Jon Kirwan

John Speth schrieb:

It's not Basic, but...

formatting link

--
Mit freundlichen Grüßen

Dipl.-Ing. Frank-Christian Krügel
Reply to
Frank-Christian Krügel

Or in Forth you type in "PIN3 PORTB ON" and see the LED light up.

If the OP can live with 16 or 8 bit processors, there are some ready working alternatives.

PIC18F and dsPIC30F : User Guide :

formatting link
Download :
formatting link

megaAVR:

formatting link

These are self contained Forth systems that only need a serial line for communication.

You can interpret commands and compile code directly into Flash. Very good for exploring those particular microcontrollers. The teacher can build up special, easy-to-use words for controlling whatever stuff you want to control.

The code editing is done in files on the PC. When you want to compile, just send the file as ASCII text over the serial line.

Mikael

Reply to
Mikael Nordman

I'm the OP.

My intent is this: Sometimes here at work we have a project idea. Then a circuit with a microcontroller comes along and the marketing types have little to no software requirements. I could spend all my time writing throw-away code, asking "Is this what you wanted?" at the end of each iteration. It would be simpler for me to just let my associates have their own way with the circuit. It would allow them to develop ideas and walk a mile in my shoes as a bonus. The language shouldn't be an impediment.

I learned BASIC on a PDP-11 in the '70s. I could execute "n=12" then "print n" interactively, or I could assign line numbers and run, as in "10 n=12" then "20 print n" then "run". That's what I want. Extensions for the micro like peek and poke would be necessary.
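That workflow -- immediate statements, numbered stored lines, RUN -- fits in a surprisingly small core. A toy sketch (integer variables and PRINT only; everything here is illustrative, not any particular BASIC):

```python
# Toy core of the PDP-11-style BASIC workflow: unnumbered statements execute
# immediately, numbered lines are stored, and "run" executes the stored
# program in line-number order.
program = {}    # line number -> statement text
variables = {}  # variable name -> integer value

def execute(stmt, out):
    """Run one statement: 'print <var>' or '<var>=<int>'."""
    stmt = stmt.strip()
    if stmt.lower().startswith("print "):
        out.append(variables[stmt[6:].strip()])
    elif "=" in stmt:
        name, expr = stmt.split("=", 1)
        variables[name.strip()] = int(expr)
    else:
        raise SyntaxError(stmt)

def interpret(line, out):
    """One REPL turn: store a numbered line, RUN the program, or execute now."""
    head, _, rest = line.strip().partition(" ")
    if head.isdigit():
        program[int(head)] = rest        # numbered: store for later
    elif line.strip().lower() == "run":
        for num in sorted(program):      # execute in line-number order
            execute(program[num], out)
    else:
        execute(line, out)               # immediate mode
```

Peek and poke would just be two more branches in `execute` that read or write a (memory-mapped) address.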

JJS

Reply to
John Speth

Editing source code and interactivity are orthogonal. Maybe you and Jon Kirwan are right, and for non-programmers with no previous word processing experience it is easier to learn this PDP-11 (and C64) interface.

A more modern approach would be a GUI, but with an additional command-line interface like a Lisp REPL, or with separate windows, one for input and one for output, as I've seen and used in Squeak (a Smalltalk implementation). In a Lisp environment, e.g. with LispWorks, you have an additional text editor and you can hit "compile" to compile the content (and other included things) into the runtime system, where you can call the compiled functions etc. from the REPL. In Smalltalk it is even more integrated: being object oriented, you have an object browser and you can inspect methods and open them in an editor. This would be the C64 concept, but in an advanced style. Instead of "LIST 30-50", move the cursor up, change the content of line 40 (or even worse: re-enter the whole line with your changes) and finally press Enter to recompile, it would be "inspect setLed" from the command line (or browsing a function list in the GUI), which opens an editor where you change the content by clicking the mouse on the text you want to change (which is another problem, if you use pure RS232), then hitting "save" to compile it into the runtime environment, where it finally can be tested interactively on the command line.

BTW: I think using Basic is a good idea for my system. Modula is cleaner, but more complicated. I learned Basic, too, as my first language and it was very easy. But when I write programs in Pascal-like languages, like VHDL, I still make syntax errors (do I need a semicolon at this place, or is this an error? OK, I think Modula has improved the syntax a bit, but some discipline is still required, like one place where you define variables, then "begin" etc.), so this would require really good editor support to avoid user frustration, which is easier with Basic.

--
Frank Buss, fb@frank-buss.de
http://www.frank-buss.de, http://www.it4-systems.de
Reply to
Frank Buss

That's just the compiler? What about the linker, library OBJ files, and an editor with built-in debug ....... (aka Turbo Pascal)?

I see this uses .NET, which means it would not make the cut, for Jon ;)

( Not sure I'd agree with the removal of LOOP either... and the change in RETURN - Still, I guess Size and Speed matter less these days... )

interesting.

-jg

Reply to
-jg

I still have and use 80486 computers, running on Win98SE. Have you ANY IDEA what a program depending on .NET does to one of them??? I'm talking about tiny programs here, that depend on .NET to run.

Long ago, I'd written my own MD5 program that runs in DOS under Win98SE (and later incarnations, yes.) I used it so that clients getting stuff from me could verify file versions, if there was any question about it. (Folks rename things, at times.) One day, I happened upon a Win32 program that allowed drag and drop and was very easy to use for some of my clients, and I decided to point them in that direction. It was about as fast as my own (I didn't make precise measurements, but it was close enough.) The program fired up very fast, was easy to use, and was free. So far, so good, yes? Then the author made a change to .NET for the same program. In fact, there were NO differences at all between the prior version and the new version, except that he said he'd decided to change over to .NET for this new version (perhaps looking forward to other features once that "port" was done and working.) I loaded it and tried it out. It took almost two minutes to just load up!!!

Now, once I'd paid that price and .NET was "up", it did admittedly come up a bit quicker. Maybe in less than a minute, but still very very much longer than the earlier version, which was present and ready in about one second or less. It just wasn't worth the hassle anymore.

Windows once ran "okay" in 8 Meg and pretty darned good in 16. Then it needed 32. Then 64. Now, a gig is considered pitiful. Widgets negotiate for positioning in toolbars or other widgets, toolbars dock now, there are interfaces just to enumerate the interfaces and then interfaces for those to enumerate the functions available, and 64 layers/interrupt levels for handling events, rings upon rings, layers upon layers, ....

And despite the fact that computer hardware has advanced at a pace that boggles the mind (front side transactions that operate in parallel with transaction, error, cache hit and data phases all operating on the same clock, levels of caching not only in the cpu(s) but in the chipset, read around writes, relaxed transactions rules for graphics [where the order of arrival isn't all that important], etc.), with multi-GHz internal cycle times, many stages of cache and buffering going out to slower memories, huge memories unthinkable not so long ago, and so on.... Microsoft STILL manages to find ways to slog such a machine down to a crawl.

Don't even think of getting me started. Yeah. .NET = .NOT.

;)

Jon

Reply to
Jon Kirwan

IMO, the BasicX BX24P stamp is a superior alternative.

formatting link
formatting link

--
Guy Macon
Reply to
Guy Macon

Sorry - I should have said Oberon-0 (the later incarnation of PL/0) from Wirth's compiler construction book:

formatting link

Linker: ~10k. Library OBJ files = zero K for a simple app like Blinker, ~6k for runtime error trapping and reporting a la TP 1.0 via RS232.

I haven't considered an editor yet. I'd do the same as somebody else suggested - do the primary editing on the PC with the terminal emulator. Have a simple editor on the target to fix simple compilation errors.

For the .NETphobes: the compiler, linker and loader are all written in Component Pascal, so a Win32 version of the command-line tools could easily be built using Oberon microsystems' BlackBox compiler.

True. Can't see that size or speed are too relevant though - it's all in the interest of reliability / security / maintainability / testability i.e. a return to the old idea of a rigorous single-entry / single-exit approach for all blocks of code. Yeah, I also groan when I'd really like to code a couple of exits from my loops or procedures. However, although it takes a bit more effort I've found that nine times out of ten the resulting code is far more digestible.
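For what it's worth, the trade-off reads like this in miniature -- a linear search written both ways (neither is from Oberon; just an illustration of the single-exit discipline):

```python
# Two equivalent searches. The first uses an early exit; the second is the
# single-entry/single-exit style Oberon-07 enforces, driven by a flag.
def find_break(items, target):
    """Early-exit version: return on the first match."""
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1

def find_single_exit(items, target):
    """Single-exit version: the loop condition carries the 'found' state."""
    i, found = 0, False
    while i < len(items) and not found:
        found = items[i] == target
        if not found:
            i += 1
    return i if found else -1
```

The flag version takes a little more effort to write, but every path out of the loop is visible in the loop condition, which is the digestibility argument above.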

-- Chris Burrows CFB Software Armaide: ARM Oberon-07 Development System for Windows

formatting link

Reply to
cfb


You still need to store the OBJ files, but that could be a SPI flash, and smaller executable blocks could also be loaded from SPI as needed....

That leaves the problem of the memory image needed for Compile, and I suspect RAM will be the stumbling point, more than Code space.

64K is large RAM on a uC, and very small RAM for a compiler.

I suppose only (very?) small programs would be targeted for 'self-compile'.

So what is .NET being used for ?

You don't sound that convinced either ;)

The strangest change I see is the new WHILE-else, called ELSIF .. DO?!

Was the paranoia about adding a new explicit and correct keyword so high that an existing one had to be press-ganged into service? ELSIF now means two different things and is entirely context-dependent, which will make reading large blocks of code risky.... (shudders)

-jg

Reply to
-jg

I know nothing about it, so you could be right. However, I have yet to see an implementation on micros that comes close -- and with far more flash to deal with. We included matrix mathematics along with all the usual transcendentals, conversion to and from very highly efficient tokenized code, the interpreter engine and a lot more all sitting inside 6k word (12k byte) of code space. I would very much like to see something similarly well done, so I will take a look on your assurances.

Jon

Reply to
Jon Kirwan

(Oh, and I forgot to mention line editing and error checking.)

Looks nice enough, having looked. Some things we didn't have that are appropriate differences considering time-sharing vs micro -- multiprocessing, for example. No matrix math, I see. Looks like a compiler, though, which wouldn't serve my purposes.

Jon

Reply to
Jon Kirwan

Correct.

Good question. Obviously not an issue when compiling on the PC. I'll do some measurements when I have a spare moment.

Small *modules* perhaps - that does not necessarily mean small *programs*.

All of the standard Windows GUI features and the rest of the IDE - the multi-window / multi-file / split-screen, Oberon-07 syntax aware editor; the procedure / imports navigator etc. etc.

You guessed it - I'm basically lazy and am as guilty as anybody else of risking long term pain for short term gains when I can get away with it. It's a heck of a lot easier to just write a LOOP and throw in a couple of exits than worry about trying to achieve a good flow of control. Not so good a few months later when you have to go back and untangle the spaghetti ;-)

I have to agree. Apart from the example quoted I have yet to find any use for it. Can you think of any real-world applications for it?

Maybe the length of an alternative keyword was a factor in the decision? The maximum length of any Oberon-07 keyword is only seven characters - presumably for efficiency reasons. Besides, you wouldn't really prefer ELSEWHILE or ELSWHILE would you? ;-)

If I am correct about the rarity of its use any confusion is unlikely to be a problem in practice.

As it is, the way it has been implemented only resulted in the addition of four lines of code to the compiler. Seeing as it was so easy to do I suspect Wirth may have just included it as a tribute to Dijkstra. Alternatively maybe he just wanted to give people like us something controversial to chew over ;-)

-- Chris Burrows CFB Software Armaide: ARM Oberon-07 Development System for Windows

formatting link

Reply to
cfb
