I'm looking at starting an angel-funded effort to build some interesting small sensor devices, for which I'm going to need to do some Cortex M3 programming for real. Since that sort of job needs speed more than money, I'd like to get a full-function development system with a decent JTAG-based debugger, real support, and the whole 012 yards.
All you folks with the several $k development systems: What are you using, how do you like it, and why?
Gnu tool chain, Eclipse IDE, $50 JTAG dongle from SparkFun. It works as well as many $10K development systems that I've used. Support is a bit harder to come by, but I've gotten some pretty dreadful support from the suppliers of those high-dollar tool chains, too.
My liberal friends think I'm a conservative kook.
My conservative friends think I'm a liberal kook.
Well, this may be totally wrong for your needs, but have you looked at the BeagleBoard? It's completely awesome: 3" square,
3 W max (more like barely 2), and you can plug in a keyboard, mouse, and DVI video screen, and add a USB Ethernet adapter. I run Linux on it. Uses a CF card for "disk". It has about 20 pins of available GPIO.
No development kit needed at all, as long as you don't crash the system. I have made several TCP server appliances for remote control of devices using it.
This depends on what your actual goal is. If you are expecting others to use the tools, then you probably want to find tools that are "affordable and available" to those folks. E.g., I try to use "free" tools for anything open-source that I develop (it doesn't seem fair to force the sorts of people in that camp to purchase "proprietary" tools). For hardware, I'm a bit less of an idealist (it's too constraining, for me).
You also have to consider how easily folks will be able to get effective *support*. If they each need a service contract to get answers to their questions/problems with "BogusWare 5000", then you've essentially made support unavailable.
I try to document problems with the tools that I'm using and work around those problems. Often, "fixes" just change the set of problems that you have to DISCOVER and work-around (I'd rather live with a known set of problems than a brand new batch of yet-to-be-discovered problems!)
Finally, you have to consider available skillsets and talent base. You could opt for the "ideal" tool -- only to discover that folks have been bottom feeding off "free tools" and don't have the skills, experience or resources to effectively utilize the "gift" you've saddled them with!
Sort of like giving a glass of fine cognac to someone who's spent a lifetime drinking 3.2 beer... :-/
I use the Keil (part of ARM) development system. I just forked out for the upgrade to the pro system, which gives you libraries for USB, Ethernet, Flash/SD card/FAT files, etc. They seem pricey compared with "free" stuff, but you get support, and every manufacturer of silicon (AFAIK) has Keil-friendly example stuff. I also suspect that the relationship with ARM means they know what's round the corner better than most. You get to use their scheduler (a small RTOS), which works quite well.
They do their own debugging tools and they seem to work OK.
I would much rather use the Keil IDE than any other I've tried.
Rowley CodeWorks here. It's gcc-based but with their own libs so LGPL issues are avoided. Includes their own RTOS (CrossWorks Tasking Library aka ctl). Supports a bazillion JTAG interfaces (well, lots) including the more-or-less standard Segger J-Link, the inexpensive Olimex USBs, and their own CrossConnect. It supports CM3 SWD with the CrossConnect, Olimex, and Amontec JTAGs. Has a reasonable IDE and debug environment.
30-day free evaluation period. It's been pretty bulletproof IME.
I hate the Visual Studio editor--any proper programmer's editor should be line-oriented, whereas VS's is just like Notepad. Its debugger is pretty nice, though--much nicer than any gdb frontend. (It isn't as good as the IBM VisualAge debugger by a long shot, but that's crying over spilt milk.)
I'm not a big fan of any of the gnu debuggers, and the Eclipse-based ones I'm most familiar with don't allow setting watchpoints, which IMO are pretty vital when doing choreography with multiple interrupt levels.
Is the Embedded Trace Macrocell support worth the tool price?
Do the proprietary tools have any significant advantage in code size? (Chip pricing seems more sensitive to flash size than to anything else.)
For cost reasons I'm probably going to use the ST ARMs, e.g. the STM32L151, which I can get for under $3 in onesies. That's assuming that their ADCs and DACs aren't too horrible. I can bandage them with a table lookup, provided that they don't have a lot of missing codes.
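The table-lookup bandage could be as simple as this sketch: a sparse calibration table with linear interpolation between entries. The table values here are invented for illustration (they are not STM32L151 characterization data), and a 12-bit converter is assumed.

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical calibration points: raw ADC code -> corrected code.
   Values are made up; a real table would come from bench measurement. */
#define CAL_POINTS 5

static const uint16_t cal_raw[CAL_POINTS]  = {0, 1024, 2048, 3072, 4095};
static const uint16_t cal_true[CAL_POINTS] = {0, 1030, 2041, 3080, 4095};

/* Correct a raw 12-bit reading by interpolating between the two
   nearest calibration points (integer math, rounded). */
static uint16_t adc_correct(uint16_t raw)
{
    for (int i = 1; i < CAL_POINTS; i++) {
        if (raw <= cal_raw[i]) {
            uint32_t span_r = cal_raw[i] - cal_raw[i - 1];
            uint32_t span_t = cal_true[i] - cal_true[i - 1];
            return (uint16_t)(cal_true[i - 1] +
                ((uint32_t)(raw - cal_raw[i - 1]) * span_t + span_r / 2) / span_r);
        }
    }
    return cal_true[CAL_POINTS - 1];
}
```

A handful of points and integer interpolation costs almost nothing in flash, which matters if chip price tracks flash size.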
This is going to be a fun one if it comes together.
My amp goes up to 11, so I needed an extra yard too. ;)
All true. In this case it's all proprietary anyway, at least until Won Hung Lo reverse-engineers it, so I'm looking for whatever will make my life the easiest. When you work by yourself, getting stuck on some stupid problem gets expensive, fast.
Did you ever use Eclipse? It works nicely as a gdb frontend.
I prefer a logic analyzer and I/O pins for checking that sort of real-time stuff. There really isn't much use for a debugger on embedded platforms. If you want to debug code (verify whether it works), it is much more comfortable to do this on a PC. The nice thing about Eclipse + gcc is that you can write code to run on a PC and use the same code on an embedded platform.
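The usual trick for that is to keep the logic in a module that touches no hardware and inject the I/O through a function pointer, so the same file compiles for the target and for a desktop unit test. A minimal sketch (the names blink_step, led_write_fn, etc. are illustrative, not from any real HAL):

```c
#include <assert.h>
#include <stdint.h>

typedef void (*led_write_fn)(int on);

typedef struct {
    uint32_t     period;  /* ticks between toggles */
    uint32_t     count;
    int          state;
    led_write_fn write;   /* real GPIO write on target, a stub on the PC */
} blinker_t;

/* Pure logic: call once per tick; toggles the output every 'period' ticks. */
static void blink_step(blinker_t *b)
{
    if (++b->count >= b->period) {
        b->count = 0;
        b->state = !b->state;
        b->write(b->state);
    }
}

/* PC-side stub so the logic can be exercised under desktop gcc. */
static int last_led = -1;
static void fake_led(int on) { last_led = on; }
```

On the target you'd pass a function that pokes the GPIO register instead of fake_led; the logic module never knows the difference.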
AFAIK the compiler made by ARM is the best for code density. GCC comes second. Most vendors sell you a GCC compiler these days. Codesourcery has a free lite version.
When I do the math I mostly end up with NXP's ARMs. I did a design once with an ARM from ST (STR700 series) and it's a complete joke compared to NXP devices. And don't be misled by the price. NXP devices can run full speed from flash. ST devices can't.
Failure does not prove something is impossible, failure simply
indicates you are not using the right tools...
But that can also be an *advantage* of self-employment (depending on time table, etc.): if you're "stuck", do something else (plant a tree, play with the dogs, etc.) until your mind has a chance to "reset" (In a 9-to-5, you just beat your head against the wall until closing time -- and are usually anxious because you'll have to face the same problem in another 15 hours...)
That's not often the case. You can get a good idea if the *algorithms* that you are using will work and iron out many of the bugs. But, since the code you will ultimately be generating (unless your target is a PC!) will have different characteristics (code/data sizes, endian-ness, timing, etc), you can easily be fooled into missing BIG problems (depending on your level of coding discipline).
"Crap! chars are *signed*, here!" "Whaddya mean, no MUL opcode?" "Yikes! Why is it stuck endlessly servicing IRQ's?" "Ouch! I had no idea stack penetration would be so deep!" "Ah, every time I tickle this output, this other thing happens..." etc.
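The chars-are-signed one is worth a concrete example, since it bites silently. Whether plain `char` is signed is implementation-defined, so a table lookup that works on one target corrupts memory on another (function names here are illustrative):

```c
#include <assert.h>
#include <stdint.h>

static int histogram[256];

/* Broken on signed-char targets: a byte of 0x80..0xFF converts to a
   negative index, and histogram[-1]++ is undefined behavior. */
static void count_byte_bad(char c)
{
    (void)c;  /* histogram[c]++;  <-- the latent bug */
}

/* Portable: uint8_t always indexes 0..255, on every target. */
static void count_byte(uint8_t c)
{
    histogram[c]++;
}
```

Code like this passes every test on a PC where char happens to be unsigned (or where the data never goes above 0x7F), then falls over on the target.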
I like embedding a standalone debugger *in* the device that lets me watch execution of a single thread, peek/poke random memory locations, etc. Debugging multithreaded applications on a PC is just not practical when you can't realistically simulate the I/O's (that are *driving* the various tasks).
[Note that such a debugger need not be feature rich. You aren't relying on it for *all* of your debugging needs!]
But, if I "kick" a device and it doesn't "scream", I can peek at the "kick detector" to see if it SAW that kick. Then, poke at the "scream emitter" to verify that it WILL scream when commanded. Is there a problem in one or the other? Is the problem in the communication subsystem? Is the OS not scheduling the proper tasks? Is a task waiting on some resource that I *thought* would be available? etc.
These are hard to examine in a desktop "emulation"/simulation.
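A monitor like that can be tiny. A hedged sketch of the command-parsing core: one-letter peek/poke commands over whatever link you have (everything here is illustrative; on a real target mon_base would be 0 so addresses are absolute, and the line would arrive over a UART):

```c
#include <assert.h>
#include <stdint.h>
#include <stdio.h>

static uint8_t *mon_base;   /* test scaffolding; 0 on a real target */

static uint8_t mon_peek(uintptr_t addr)            { return mon_base[addr]; }
static void    mon_poke(uintptr_t addr, uint8_t v) { mon_base[addr] = v; }

/* Parse "r <addr>" or "w <addr> <val>" (hex).  Returns the peeked
   value for a read, -1 for a write or an unrecognized line. */
static int mon_command(const char *line)
{
    char op;
    unsigned long addr, val;
    int n = sscanf(line, " %c %lx %lx", &op, &addr, &val);

    if (n >= 2 && op == 'r')
        return mon_peek(addr);
    if (n == 3 && op == 'w')
        mon_poke(addr, (uint8_t)val);
    return -1;
}
```

A couple dozen lines like this, plus a UART, and you can interrogate the "kick detector" and command the "scream emitter" on live hardware.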
I'd much prefer talking to a debugger than passively watching an instruction trace on a logic analyzer (especially when the trigger event could be in one thread and the "reply of interest" in another thread thousands of cycles away).
I guess if your coding style is to treat things in "big units", you can get more done on a desktop. I like to split things into very fine units for maximal concurrency/sharing. That tends to make my solutions more dependent on the final platform for proper execution (e.g., producer and consumer need to co-operate in order to see real functionality).
I've used Eclipse, I just think it's klunky compared with Visual Studio or VisualAge. I've begun experimenting with ZeroBugs on Linux, which I quite like so far. It doesn't use gdb at all.
Mainline code, sure. Interrupt-driven code is harder, and I'm interested in being able to find things like memory corruption bugs, which are hard to catch with a logic analyzer unless it's a fancy new one that can look inside the chip via ETM etc. That probably isn't in the budget.
That's good wisdom, thanks. Processor cost seems to be driven mostly by the size of the flash, so tighter code is worth real money.
Hmm, interesting. Thanks for the heads-up about the internal flash needing a wait state. It does have prefetch, which should help some.
What other things were inferior about it? It does look as though NXP is better integrated with the lower-priced tools such as Code Red.
The applications I'm looking at really benefit from the two separate DACs, so I can ping-pong them during data acquisition. I don't think NXP has that.
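The ping-pong itself is trivial to sketch: on each sample tick, load the idle DAC while the other one's output settles, then swap. Everything below (dac_write, the channel numbering) is an illustrative stand-in, not real STM32 register code:

```c
#include <assert.h>
#include <stdint.h>

/* Stand-in for a real per-channel DAC data-register write. */
static uint16_t dac_out[2];
static void dac_write(int ch, uint16_t v) { dac_out[ch] = v; }

typedef struct {
    int next;   /* which DAC channel gets the next sample */
} pingpong_t;

/* Load the currently idle DAC, swap for next time, and report
   which channel was just loaded. */
static int pingpong_step(pingpong_t *p, uint16_t sample)
{
    int ch = p->next;
    dac_write(ch, sample);
    p->next ^= 1;
    return ch;
}
```

The point is that each DAC gets a full sample period to settle before its output matters, which is where having two genuinely separate converters pays off.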
I'd like to stay with Code Red if possible, to be able to interoperate with some colleagues who use it, but their ST support seems pretty thin.