Advice needed: Atmel or Philips ARM

Perhaps ask these people when their BGA board will be ready.


Reply to
dmm

Amen, brother!

One wonders what the instruction set designers were smoking when they devised the basic scheme. Using four bits for conditional execution which is rarely anything but "always", using another three bits for a third operand which isn't often used, rotating immediate operands right instead of left shifting, ... It's easy to see how to pare the 32-bit instructions to 16 bits.

There are some clever things that can be done with the instructions as designed, but it must drive people developing compiler code generators to drink, trying to get the usage correct in anything other than a rudimentary form.

Reply to
Everett M. Greene

The edge connector is for the BDM pins. However, we have a fully usable toolset using only the serial port. The BDM can be used, but it is not required.

Our compiler is GCC-based. We also include an RTOS and a bunch of support libraries and code.

I love the ColdFire. The assembly makes sense! (I also like the AVR; I dislike ARM, PIC, and PPC assembly - ugh!)

Paul

Reply to
pbreed

Add msp430 to the list of sensible assembly.

PIC assembly has some of the least pronounceable and least obvious mnemonics I've ever seen - can you tell at a glance what "btfsc" (or is it "btscf"?) should do?

However, nothing can beat the PPC's "Enforce In-order Execution of Input/Output" for its sound - I suspect I saw the mnemonic first, and then figured out what it could stand for.

Reply to
David Brown

That's easy:

btfsc ; BiTFuSCate Accumulator ; Webster: Fuscation, n. a. darkening; obscurity. [Obs]

I.e.: some kind of unpredictable operation - the less said about it, the better ;-)

btscf ; BitfuSCate rotating Flags ; Will randomize Accumulator similar to operation above AND make ; the status flag register flicker in a random sequence

--
Best Regards,
Ulf Samuelsson
This is intended to be my personal opinion which may,
or may not be shared by my employer Atmel Nordic AB
Reply to
Ulf Samuelsson

It's funny you say that - ARM was originally designed to be easy to program in assembly language. In the first few years almost everything was 100% assembly language: the OS, modules, applications, etc. The rest used BASIC. Even around 1995, major applications like word processors were still written in assembler...

Although they could have made different choices, the result is pretty good (best codesize of 32-bit RISCs for example - by a large margin). Look at Thumb-2 and you'll see that most things are still relevant 20 years on.

Thumb-2 removed the complex addressing modes from LDM and the register-controlled shifts in ALU instructions. It has a smarter way of generating immediates without wasting 25% of the space, and a better way of conditional execution.

However the condition codes, 3rd operand and shifts are used frequently in both assembly code and compiled code, and this helps codesize and performance a lot. In fact this is almost exclusively what makes ARM so much faster than Thumb-1.

If I started from scratch I'd keep most of it, but use the bits more efficiently. Eg. use only 2 bits for conditional execution and reduce the number of shift+ALU options. Then use the extra bits on 32 registers and bigger immediates.

Actually, using the full ARM instruction set in a compiler is relatively straightforward, and good ARM compilers are beating most people writing assembler. I am one of those "unfortunates" who work on an ARM compiler :-)

Wilco

Reply to
Wilco Dijkstra

It's 0.90 vs 0.74 DMIPS/MHz, though it varies with the compiler version used, so there is about a 22% difference on Dhrystone. On real applications the difference is normally between 30 and 40% (Thumb-2 material mentions ARM is 38% faster), so Dhrystone understates the difference between ARM and Thumb. One reason the difference is smaller on Dhrystone is that a large amount of time is spent in the libraries, which are - guess what - written in ARM assembler...

Wilco

Reply to
Wilco Dijkstra

On the contrary, it's probably the easiest of the RISC processors. For instance, there are nearly no delay slot instructions (the only exception I know of is after a banked LDM and before the loaded registers can be used).

If you need an example of real RISC assembly coding, have a look at Sparc or MIPS.

--

Tauno Voipio
tauno voipio (at) iki fi
Reply to
Tauno Voipio

Or the PPC. Then you'll really understand that RISC parses as Reduced-Instruction Set Computer, rather than Reduced Instruction-Set Computer. Compared to the PPC, the ARM is easy. The ColdFire has probably the nicest 32-bit ISA I've used - arguably better than the original m68k since it cut out some of the more complex addressing modes.

Reply to
David Brown

In article , Wilco Dijkstra writes

Interesting - my recollection is that C started taking hold from around 1984/5 for many things, including embedded work. By 1995 "everyone" was using C.

In fact I have only used BASIC a couple of times, and never in a professional or embedded system. From 1985 it was ALL C. (OK, some Modula-2.)
--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills  Staffs  England     /\/\/\/\/
/\/\/ chris@phaedsys.org      www.phaedsys.org \/\/\
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Reply to
Chris Hills

What were they used in? They were designed as an IP core for use in ASICs in embedded systems from 1986 onward... The ARM MCUs arrived much later.

By 2001 over a billion ARM processors had been shipped.

As I was working in 1985, I can tell you that C was in widespread industrial use. BASIC was made popular by home users on the many home micros that sprang up at that time, though I used C on my Spectrum and Atari STs in the mid-80s. From memory it was Hi-Soft C on the Spectrum; I forget the one on the ST, but I know there were several available, and at that time we were using C at work in our modems.

Reply to
Chris Hills

ARMs were not used in embedded systems (at least, not significantly) in 1995. The word processor and BASIC programs were for Acorn Archimedes computers, which came out about 1989 (IIRC). Most of the system software was written in assembly, just as most of DOS and other x86 software was written in assembly at that time. The main HLL used in Acorn computers was BBC BASIC - a rather neat and powerful BASIC variant - and on the Archimedes it was fast enough to write real applications even though it was interpreted. I don't know much about the Archimedes after 1990, however (having left school by then).
Reply to
David Brown

No, ARM started inside a home computer company called Acorn in 1987. Most ARMs were used in desktop computers (Archimedes and RiscPC) until Apple became interested, and ARM Ltd was spun off to concentrate on designing and licensing ARM cores. ARM only really took off in the embedded world with the ARM7TDMI in 1995 - it is still the most popular ARM in licensing numbers and shipments.

ARM was never designed for the embedded market, but it turned out to have good code density due to the powerful instruction set. This translated into good performance, of course, and because it wasn't designed for performance at any cost like the other RISCs, its power consumption turned out to be great too. The rest is history...

Wilco

Reply to
Wilco Dijkstra

On the ARM, C compilers took a long time to mature; around 1993-1994 the quality became just about good enough to be useful. However, they have improved significantly since then, and today a good ARM compiler easily beats a mediocre assembler programmer. Just about all large embedded applications are indeed written in C nowadays, with assembler used only when there is no alternative. Today few assembler die-hards remain, but in 1995 it was very different...

Wilco

Reply to
Wilco Dijkstra

The original ARMs (from 1985 - 1986) were designed for the Archimedes computers, to succeed the 6502-based BBC micros. It was not until the ARM6 in 1991 (okay, earlier than 1995 - but later than 1986. We were both wrong) that they started targeting embedded systems with the Apple Newton. With the ARM7 core in 1993, it started taking off properly in the embedded market.

All the original simulators and other development software for the ARM were written in BASIC, incidentally.

(and wikipedia, and a few assorted Google'd links, for the details)

There's no argument about where ARM processors are today - I only disagree with your ideas of where they started.

C was never popular in home micros (I too had Hi-Soft C on my Spectrum, though I never used it much), either as a language for users (which was almost universally a flavour of BASIC, except for the Jupiter Ace) or for the built-in system software (which was invariably written in assembly to get the maximal efficiency and code density). Even third-party software was almost always written in assembly. This continued at least until machines like the Amiga and the Atari ST - and I'm not sure whether their systems were written in assembly or not.

The choice of C or assembly (or any other language) depends on a balance between programmer efficiency (including factors like portability), code efficiency (speed and size), and the target (including volumes, and time to market). For most modern micros (embedded or otherwise), where compilers can produce similar code efficiencies to expert assembly programmers, C (or another HLL) is the sensible choice for all but the tiniest high-volume targets. But in 1985 it was a very different situation, and even in 1995 there was significant assembly code being written and used in "big" systems. Remember, one of the biggest issues when the Power Mac came out (1994?) was the emulator for the 68k machine code for the operating system (never mind for third-party apps).

I'm not saying that C was not in widespread use in 1985 - just that it was also common to use assembly, even on "big" cpus like the ARM, x86, or 68k.

There was less serious use of BASIC, although there were a few niches. One was the Archimedes, where BASIC was a natural choice for historic reasons (Acorn's BASIC was one of the best available, and far more suited to "real" programming than the BASICs on most home computers), and because it ran so fast on the ARM even when interpreted. It was also the primary programming language for the Pick operating system on minicomputers (from 1973). And, of course, it was popular for writing smaller apps in DOS and other systems. But it has almost never been used in embedded systems, except on PICs (where *anything* is better than having to write in assembly...).

Reply to
David Brown

Most of the Amiga things were done in C. The core of the OS was in BCPL (Before C Programming Language?) but later converted to C.

Reply to
Everett M. Greene
