what uC do you prefer?

So with the plethora of micro-controllers out there, which do you use/prefer and why?

I use PICs pretty much exclusively, but I just had a look at Atmel's range - to be honest, I don't see any reason to change from what I know. Am I missing anything?

Do all uCs use Harvard architecture? Sometimes it would be nice to have data available as bytes without having to wrap it in code and form tables. Tonight I scared myself - I caught myself googling "microcontroller z80 core" eek!
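To make the "wrapping" concrete, here's a small AVR-flavoured sketch of the issue - assuming avr-libc's <avr/pgmspace.h>; putch() is a hypothetical UART send routine. On a Harvard part, constant data kept in the code space can't be read through an ordinary pointer; it needs an explicit program-memory access:

#include <avr/pgmspace.h>
#include <stdint.h>

/* Constant string stored in program (flash) memory, not in RAM. */
static const char greeting[] PROGMEM = "Hello\r\n";

/* Hypothetical routine that pushes one byte out of a UART. */
extern void putch(char c);

void send_greeting(void)
{
    for (uint8_t i = 0; ; i++) {
        char c = (char)pgm_read_byte(&greeting[i]);  /* explicit flash read */
        if (c == '\0')
            break;
        putch(c);
    }
}

On a von Neumann core the same table would just be a const array read through a normal pointer.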

Reply to
feebo

Stay with RISC particularly if you use assembler.

If you changed now to a CISC you *would* hate it - they are a silly remnant from the days when memory was expensive.

Robin

Reply to
Robin

I use Renesas's R8C chips (from their R8C/M16C/M32C family) in my furnace. Lots of peripherals and timers and such in a small package, and the assembly isn't as convoluted as some RISC chips can get (it's very i386-like). GCC is available for it (disclaimer: I'm the m32c maintainer for gcc). They also support 2.7 V to 5 V operation (the M32Cs support split supplies - like 3.3 V for the CPU bus side and 5 V for the peripheral side).

Digikey carries many of the chips in this family, and they're easily programmed with a serial port and some GPIOs. Mine are programmed from a gumstix (embedded xscale/linux board).

No, most don't. The R8Cs aren't, although they do have "near" and "far" data (only the m16c variant really cares, the r8c is usually too small to have flash "out there" and the m32c has 24 bit address pointers anyway).

Reply to
DJ Delorie

Many micros use flash memory for code (the transition continues) and include a small amount of RAM. Since these are naturally separate memory systems, often with differing needs for address lines and directional latches to access them, and since a processor often keeps a separate set of latches for the program counter and the current instruction under decode, it's almost like falling off a chair to make it Harvard. It may take a little extra effort to make it appear von Neumann. But there are a number of micros that take the trouble (the MSP430, for example.) Cores with an external address/data bus, like the 8051, can also often be easily adapted externally into a von Neumann-arranged memory space. (The basic 8051 chips, in fact, often are wired exactly that way, and they certainly can be arranged that way in their various incarnations by manufacturers including both flash and RAM on chip.)

PICs are fine. I like them for hobbyist use because they do well at supporting small-quantity users without a lot of hassle about it (like asking you to work through distribution.) I also like the Atmel AVRs and would use them, too. There is a very nice programming system for them that isn't too expensive and includes buttons, LEDs, and so on, so that you can program and test some ideas right away. However, with Atmel I am usually routed through local distribution and/or a local FAE (Eric Feign, in my area, has long been filling that role) for questions. For a hobbyist this can be a small problem, though not necessarily.

Unless you have some reason to change, though, I think PIC is a great place to focus. It offers a fairly wide range of options and a company that will support its development tools quite literally, it seems, forever. When you pony up the money to buy something from them, and especially if it is a professional-level tool, they will pretty much jump at your whim and ask what you want on the way up. I've had them replace entire units and modules at the mere suggestion that the On/Off switch might be a "little flaky." And this for an old tool they no longer even sell!

So I'd say stay there unless you think there is something worth having elsewhere. It's a good company from my experience.

Jon

Reply to
Jonathan Kirwan

8051 family.

A. Because I know it very well.
B. Because I have a PLM/51 compiler and I like to use PL/M.
C. Because there are as many variants of the 8051 with various inbuilt goodies as there are flavours of PICs (enough, at least, anyway).
D. Because it's not single-sourced like certain MCUs.
E. Because you can get ultra-fast versions should you need one.
F. Because they're almost as cheap as PICs.
G. And these days they're low power, low voltage, have flash memory and are fully static too.

Graham

Reply to
Eeyore

I like them, too. My very first microcontroller project, where I actually did all my own investigation of the unique interface needs and then designed all of the hardware for it, used an 80C31. I wire-wrapped that project and it worked the very first time, placing the newly burned EPROM into its socket. This included a serial-port for use by an IBM PC (the year I did this was 1984) and it interfaced directly into the reed-relays of an IBM Electronic typewriter to turn it into a printer, for me. At the time, the Electronic 85 was fairly new and there was no information on doing this, so I 'scoped out the signals on my own and developed a table. It was such a thrill to see it work, right off the bat.

The source code was written in assembly and used a table assembler, which had been originally developed by someone in Washington state for those interested in using 8051 cores to make products for the hearing disabled, if I recall. I may still have that tool floating about, though I don't use it anymore.

I have a box of some hundreds of 80C32s (with a small printed circuit card neatly soldered to the back of each one, which provides a built-in power-on reset and a software-accessible reset line.) Got them very cheap.

Jon

Reply to
Jonathan Kirwan

You can find the second 8051 application I designed here.... This is a late revision; it actually dates from about 1993.

formatting link

Graham

Reply to
Eeyore

Harvard is an anomaly. And a nasty anomaly. Decent processors have a unified code/data/IO address space, without specific I/O opcodes. So you can apply the same instructions to anything, and put data structures and code wherever they fit best. A table can contain mixed data and flags and routine addresses, for example.

I mostly use the MC68332, a very CISCy 32-bit machine. It's a pleasure to program in assembly, and has a beautiful orthogonal instruction set, including some handy 32 and 64 bit mul/div things. The later versions of this architecture are the Coldfire parts.

Like, we often use a single 8-bit-wide EPROM to save board space. But that makes instruction fetches slow. So for a subroutine that has to run fast, we just copy the code from EPROM to CPU internal RAM and run the copy there, blindingly fast. We can even reuse the RAM workspace, overlaying different blocks of code. It's easy, with a decent architecture.
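In C the trick is roughly the following - a sketch only, assuming the linker provides symbols for the routine's EPROM image and for a reserved RAM region (the names here are illustrative), and that the copied code is position-independent or linked to run at the RAM address:

#include <stdint.h>
#include <string.h>

/* Hypothetical linker-provided symbols: the routine's image in EPROM and
   the internal-RAM area reserved for the copy. */
extern const uint8_t fast_code_rom_start[];
extern const uint8_t fast_code_rom_end[];
extern uint8_t fast_code_ram[];

typedef void (*fast_fn_t)(void);

void run_fast_routine(void)
{
    size_t len = (size_t)(fast_code_rom_end - fast_code_rom_start);

    /* Pull the routine out of the slow 8-bit EPROM into internal RAM. */
    memcpy(fast_code_ram, fast_code_rom_start, len);

    /* Jump to the copy.  On a cacheless part like the 68332 there is no
       instruction cache to flush first. */
    ((fast_fn_t)(uintptr_t)fast_code_ram)();
}

Overlaying different blocks of code is just a matter of repeating the memcpy() with a different source before each call.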

The TI 16-bitter, the MSP430, is pretty nice too, and very fast. Like the 68332, it has a register-rich, symmetric, very PDP-11 looking architecture.

John

Reply to
John Larkin

For hobby purposes, if you can write in C, I'd go AVR because of avrgcc/avrlib. I've used commercial compilers for MSP430 and PIC and, IMHO, avrgcc/avrlib is excellent. I haven't used PICs for a while, but I believe the only 'free' C compiler worth using (HiTech PIC C Lite) is code-size limited to 2K... You can forget doing anything cool like TCP/IP with that! Also, on freertos.org the AVR seems to kill the PIC for speed. By all means use the PIC's simple instruction set to learn how to use microcontrollers, then learn C and go to AVR.

Reply to
Fox one! Fox one!

On Fri, 20 Jul 2007 17:54:39 -0700, John Larkin wrote:

I must say that the Harvard architecture has got in the way of what I want to do a *lot* more times than it has helped - in fact I can't think of a single time when having separate data/program structures has been anything better than neutral, i.e. I don't care about it. I am fully aware of the dangers of having buffers and data structures butting up to the edges of code (buffer over-run exploits - spesh of Billy boi) but that is hardly a problem for a lot of embedded stuff. I guess as we march to more advanced uCs and solutions it might have a place, but I have always considered buffer over-runs and the like to be s**te programming... I have never written a piece of code that didn't check limits on storage areas as it worked... oh, wait... I have... years ago with Z80 stuff... never really kept an eye on the stack pointer moving down through RAM, but never had a crash - the code was not sufficiently complex or recursive enough to use up the DMZ between last_byte and SP :o)

I do hanker for a contiguous RAM area where I can point to sections of code and modify them on the fly, place data in line, etc. PICs are cheap and pretty well spec'ed, but I am starting to get constrained: the constant bank switching on calls and jumps (or checks thereof), and the encoding method for tables - even a simple message to be output has to be wrapped in code and the table called with a progressing pointer - are a real pain. I just feel the core is limiting things, and all the ways of doing stuff, instead of being "works of creativity" by the programmer, are a series of work-arounds :o(

Don't get me wrong, I think PICs are great and dead easy to program, so long as you keep constantly aware of the wrinkles - and I only ever write assembler, so I don't have some HLL keeping a check on this for me, with its resultant increase in code size and reduction in speed.

That does sound a nice piece, but it's way beyond what we need here, which is mainly small controllers/converters etc. I really liked Motorola assembler on the 68K family - is it much different? (looks on shelf - sees "Programming the 68000" - sighs :o)

Yep - standard procedure - the preamble gets everything ready for fast code. The PIC is fast (in its place) so code execution is not a problem, but if you want to update the code down the phone or something and store it in a serial EEPROM, I think only now is Microchip producing a part that allows sections of the program to be "blown on the fly".
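The scheme I have in mind looks something like this - just a sketch, where the helper names (ee_read_byte, flash_erase_row, flash_write_row) and the sizes are hypothetical placeholders for whatever the self-programming part actually provides:

#include <stdint.h>

/* Hypothetical helpers: reading the staged image out of the serial EEPROM
   and erasing/writing one row of program flash are entirely part-specific. */
extern uint8_t ee_read_byte(uint32_t addr);
extern void    flash_erase_row(uint32_t addr);
extern void    flash_write_row(uint32_t addr, const uint8_t *buf);

#define ROW_SIZE   64u      /* bytes per self-programmable flash row (assumed) */
#define IMAGE_SIZE 8192u    /* size of the staged application image (assumed)  */
#define APP_BASE   0x0800u  /* flash address where the application starts      */

/* Copy a firmware image, previously received down the phone and staged in
   serial EEPROM, into program flash one row at a time. */
void apply_update(void)
{
    uint8_t row[ROW_SIZE];
    uint32_t off, i;

    for (off = 0; off < IMAGE_SIZE; off += ROW_SIZE) {
        for (i = 0; i < ROW_SIZE; i++)
            row[i] = ee_read_byte(off + i);

        flash_erase_row(APP_BASE + off);
        flash_write_row(APP_BASE + off, row);
    }
}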


Reply to
feebo

I did look at AVR stuff last night, but they didn't offer much over the PICs that we currently use heavily in our controller designs - did I miss something?

I only ever write assembler - control freak that I am :o)

Reply to
feebo

Take a look at the '430 instruction set; you'll get downright nostalgic.

John

Reply to
John Larkin

The PIC is descended from the PDP-8, as I recall. Nasty little things.

Me too. The code density is so low that there's lots of room for comments, even running dialogs. Most C programs are literally "code", in that they require massive amounts of decoding to figure out what's going on.

John

Reply to
John Larkin

Then take a look at my web page on the instruction set, comparing it with the PDP-11:

formatting link

Jon

Reply to
Jonathan Kirwan

I like the PDP-11 a lot, as well. I used to code assembly for both it and the PDP-8 (and rarely, the PDP-10) from DEC. (Also programmed in Bliss-32 and Macro-32 on the VAX, but that's another story.)

The MSP430 is very nice, hardware-wise. Its instruction set has been sacrificed on the altar of "lots of registers," though. In supporting 16 equivalent registers, they had to literally demolish the careful and well-hewn balance found in the PDP-11's instruction design. As a small trade-back, they've included the concept of a very modest constant generator - not enough to come close to making up for the other damage, but useful. But like I said, its hardware is really great. Timers can come with up to 7 capture/compares in a single unit, many peripherals enjoy the benefits of a DMA controller, etc. It breaks the flash memory up into nice-sized pieces, allowing it to be erased in small bits. It allows you to place code into RAM, where you can program the flash without waiting... or you can execute code from flash that modifies and erases other banks of flash (but where the code is automatically suspended for the duration.)
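A segment erase, for instance, is just a short unlock/erase/relock sequence - a minimal sketch, assuming the usual msp430.h register and bit names (FCTL1/FCTL3, FWKEY, ERASE, LOCK, BUSY), and that the flash timing generator (FCTL2), watchdog, and interrupts have already been dealt with elsewhere:

#include <msp430.h>
#include <stdint.h>

/* Erase the flash segment containing *seg.  The CPU is held off (or the
   code should be running from RAM) while the controller times the erase. */
void flash_erase_segment(volatile uint16_t *seg)
{
    while (FCTL3 & BUSY)        /* wait out any flash operation in progress */
        ;
    FCTL3 = FWKEY;              /* clear LOCK                               */
    FCTL1 = FWKEY + ERASE;      /* select segment erase                     */
    *seg = 0;                   /* dummy write to the segment starts it     */
    while (FCTL3 & BUSY)        /* the controller times the erase itself    */
        ;
    FCTL1 = FWKEY;              /* clear ERASE                              */
    FCTL3 = FWKEY + LOCK;       /* lock the flash again                     */
}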

Have a look at it.

Jon

Reply to
Jonathan Kirwan

Why would any intelligent person fret over an instruction set ?

Graham

Reply to
Eeyore

Because even if you don't code in assembly it makes a difference in code space requirements, execution time, memory bandwidth, .... and more.

Jon

Reply to
Jonathan Kirwan

And why fret over those either ? If you're so close to the bone that it's an issue, you're probably using the wrong device in the first place.

Besides, the PL/M compiler I use seems to produce compact fast code anyway.

Graham

Reply to
Eeyore

Well, there is that. But I cannot recall a single instance (of any application significance, I mean) where I wasn't mixing assembly with C or just writing in assembly. There are a number of application requirements that simply cannot possibly be met by C alone. It simply doesn't have the semantics for the application space I'm usually working in.
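Just as a trivial illustration of what I mean - a sketch assuming GCC-style inline assembly on a small micro with single-cycle NOPs - C simply has no way to express "spend exactly three machine cycles here," while assembly does:

/* Spend exactly three NOPs' worth of time - a fixed, known cycle count.
   Standard C has no semantics for this kind of requirement at all. */
static inline void delay_3_cycles(void)
{
    __asm__ __volatile__("nop\n\tnop\n\tnop");
}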

I wrote my own PL/M compiler, many years back. I wonder if I still have it around here, somewhere. Now I'm going to have to go look. ;)

Jon

Reply to
Jonathan Kirwan

That sounds interesting !

Graham

Reply to
Eeyore
