C cross-compiler for 6800 (yes, you read correctly)

I used their 68HC11 tool chain over 20 years ago. I remember it as being a fine product that got the job done superbly.

--

Tim Wescott 
Wescott Design Services 
http://www.wescottdesign.com
Reply to
Tim Wescott

Yes, see for example

formatting link
which was maintained until recently. I wonder if it would be feasible to retarget it to the 6800.

From

formatting link
:

cc65 has C and runtime library support for many of the old 6502 machines, including

- the following Commodore machines: VIC20, C16/C116 and Plus/4, C64, C128, CBM 510 (aka P500), the 600/700 family, newer PET machines (not 2001)
- the Apple ][+ and successors
- the Atari 8-bit machines
- GEOS for the C64 and C128
- the Nintendo Entertainment System (NES)
- the Supervision console
- the Oric Atmos
- the Lynx console

The libraries are fairly portable, so creating a version for other 6502s shouldn't be too much work.
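
For anyone who wants to try it, a cc65 build is a one-liner with the cl65 driver. A minimal sketch - the file name and target are just examples:

/* hello.c - build with the cc65 compile-and-link driver, e.g.:
 *   cl65 -t c64 -o hello.prg hello.c
 * (swap -t c64 for any other supported target)
 */
#include <stdio.h>

int main(void)
{
    printf("Hello from cc65\n");
    return 0;
}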

Reply to
Paul Rubin

The 6502 is still more crippled than the 6800, with its 8-bit stack pointer etc.
--

-TV
Reply to
Tauno Voipio

Although the zero-page indirect addressing could make up for a lot of that, and using a zero-page pointer as the C stack pointer would have been easier than using the 6800's SP or X.

Reply to
Robert Wessel

In the 1970s/80s the only usable high-level languages were PL/M-80 for the Intel 8080/85 processors and, marginally, some Pascal implementations for the 6809.

In those days, the C-language was just a fringe issue in the microprocessor world, but of course a big issue in the PDP-xx minicomputer world.

Reply to
upsidedown

Herve Tireford (I think it was him) had written the BASICM compiler for the 6809; it was a lot more usable than the Pascal compiler I had under MDOS09. In fact it was quite good - one could not wish for much more from a high-level language targeted at embedded applications. The limitation was the 16-bit address space it had to live in, and that was a severe limitation, but apart from that it was really good (as was the guy who wrote it, of course).

Dimiter

------------------------------------------------------ Dimiter Popoff, TGI

formatting link

------------------------------------------------------

formatting link

Reply to
dp

I used PL9, a nice one-pass compiler...

Philippe


Reply to
Centexbel

I had heard something about that, but I have never used the newer versions of the Imagecraft compiler; I moved to gcc as soon as it was solid for the AVR. You can't fault the excellent support and ease of use that you get from Imagecraft, but gcc is a much more powerful compiler.

Reply to
David Brown

The named address spaces are quite new (they come from the embedded C extensions rather than standard C). You can use them as alternatives to the "progmem" macros and attributes - when you specify exactly which flash bank you are using, they are approximately as efficient (depending on the exact optimisation details). You can also use a more general pointer that covers all of flash and ram - but you pay for that flexibility in code size and execution time.

Although I haven't used them myself (I haven't used the AVR for a new project for quite a while, and I don't change tools for existing projects), I think they are a nice idea. They are about as good a solution as it is possible to get when using an 8-bit architecture to address up to 512 KB flash.
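
For concreteness, a minimal sketch of what the named address spaces look like in avr-gcc (the __flash and __memx qualifiers; the table and its contents are made-up examples):

/* Needs avr-gcc 4.7 or later, compiled as C (not C++).  __flash places
 * the object in program memory and makes accesses through it use LPM
 * automatically, with no pgm_read_* macros needed. */
#include <stdint.h>

static const __flash uint8_t sine_table[4] = { 0, 49, 90, 117 };

uint8_t lookup(uint8_t i)
{
    return sine_table[i & 3];   /* compiler emits LPM for this access */
}

/* __memx is the general 24-bit pointer that can point into either flash
 * or ram - flexible, but bigger and slower than a plain pointer. */
uint8_t read_any(const __memx uint8_t *p)
{
    return *p;
}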

gcc (in cooperation with the linker and linker scripts) will put /some/ types of consts into flash, depending on the architecture. But in C it is valid to write code like this:

extern size_t strlen(const char* s);

size_t foo(void)
{
    char a[] = "Hello, world!";
    return strlen(a);
}

I.e., a pointer to non-const data converts to a pointer-to-const. This means that the implementation of "strlen" here cannot assume anything special about the placement of "s" - it could be in flash, and it could be in ram. For most architectures, the assembly instructions for pointer-to-ram and pointer-to-flash operations are the same, so there is no problem. But on the AVR (and PIC, 8051, COP8, and many other small micros), the operations are different. To be consistent, the compiler/library implementer must choose either "fat" pointers with their extra overheads, or assume that all normal pointers are to ram. Thus on the AVR, "const" data must be allocated to ram - there is no other way to be efficient and consistent.

The compiler will then give you some other method of declaring that a particular piece of data is explicitly put in flash - such as "progmem" and friends, or "__flash" named address spaces. And you use similar compiler extensions to access said data. This is usually worth the effort for strings, tables, etc., but not for small constants.
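
For comparison, the older "progmem" style looks roughly like this, using avr-libc's <avr/pgmspace.h> (the string is just an example):

#include <avr/pgmspace.h>
#include <stddef.h>
#include <stdint.h>

/* PROGMEM puts the data in flash; a plain pointer to it must not be
 * dereferenced directly - you go through the pgm_read_* macros or the
 * _P variants of the string/memory functions instead. */
static const char greeting[] PROGMEM = "Hello, world!";

uint8_t first_char(void)
{
    return pgm_read_byte(&greeting[0]);
}

size_t greeting_len(void)
{
    return strlen_P(greeting);  /* flash-aware strlen from avr-libc */
}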

For targets that have a single unified address space (arm, msp430, most 32-bit processors), it is usually easy for the compiler to allocate const data to read-only memory.

You should learn to love it. Values that don't change should be made const - it lets the compiler do more checking for errors, and sometimes also leads to better optimisation. Where possible, use "static const". In particular, prefer "static const int maxValue = 100;" over "#define maxValue 100" - as long as your compiler is optimising, it will generate as good code and will have better static error checking. (You can't replace /all/ macros this way - but do it when you can.) And any function that takes a pointer to data but does not change that data should const-qualify the pointer - it helps spot mistakes and can lead to better object code.
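
To illustrate - the names here are just placeholders:

/* Prefer a typed, scoped constant over a macro where you can.
 * With optimisation enabled this folds to the literal 100 anyway. */
static const int maxValue = 100;
/* #define maxValue 100 */      /* the macro alternative - no type, no scope */

/* Const-qualifying a pointer parameter documents (and enforces) that the
 * function only reads through it. */
static int count_nonzero(const int *data, int n)
{
    int count = 0;
    for (int i = 0; i < n; i++) {
        if (data[i] != 0) {
            count++;
        }
    }
    return count;
}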

The msp430 has had a usable gcc port for many years, but it was arguably not "proper" - it was outside the main gcc tree, and for a while it was a long way behind mainline gcc. But thanks mainly to the efforts of Peter Bigot, it has been solid - I have used it for many years. And now TI has paid Redhat to make msp430 gcc a full member of the mainline gcc, and to provide a complete toolchain and library package. This new version is now available in beta, with integration in CCS.

Still, it must be said that the msp430 is only really C-friendly with the devices up to 64K total memory (flash + ram + peripherals) - data access beyond 64K means using 20-bit registers, and that means messy compiler-specific stuff. (You can access normal code over more than 64K - it is only when you use pointers that addresses above 64K get ugly.)
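
As a rough illustration of the kind of compiler-specific stuff involved - this is a sketch assuming the Red Hat/TI msp430-gcc port with its -mlarge memory model and its __int20 extension; big_lookup_table is a hypothetical symbol that the linker script would place above 64K:

#include <stddef.h>

/* Hypothetical table placed above the 64K boundary by the linker script. */
extern const unsigned char big_lookup_table[];

/* With -mlarge, pointers are 20 bits wide and live in the CPU's 20-bit
 * registers, so this dereference can reach addresses above 0xFFFF - but
 * every stored pointer now takes extra space and the arithmetic is slower. */
unsigned char read_entry(__int20 index)
{
    return big_lookup_table[index];
}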

Reply to
David Brown

The 6502 was a different beast entirely. It was also used for the Acorn BBC Micros, which were immensely popular in schools and education in the UK, and there were a fair number of development tools made for it. I certainly know C and Pascal compilers were available, as well as Forth (but I think that was interpreted rather than compiled).

Reply to
David Brown

[...]

Oh I do use it for that. Or for ints I often use enum.

That is the bit I don't do yet.

Yes that was my recollection, no slight intended to the author.

Yes, I noticed that when I checked my previous post. Better late than never I suppose.

Well that is OK *now* since I would likely only use something like that MSP430 on tiny projects. Thanks for making me look at it again.

--

John Devereux
Reply to
John Devereux

The best choice depends on the context, of course. But remember to make your consts "static" in such cases - then the compiler doesn't have to generate storage for them in most cases. (You probably already know this, John, but there may others reading this who don't.)

It doesn't often make a big difference, but I think it helps make function declarations a bit clearer.

We use the msp430 on a number of projects, but we are seeing a steady move away from the msp430 and AVR towards Kinetis Cortex-M4 devices. One area we find the msp430 useful is for higher temperature systems - there is a nice 150 °C msp430 chip, which is a lot cheaper, smaller, lower power and easier to use than most alternatives at that temperature.

Reply to
David Brown

While "unofficial" it was solid and worked great for many of us who used it without problems for years.

I've never tried to use the 20-bit versions of the MSP430, but I always felt that if 16-bit pointers and 64KB of address space wasn't enough it was time to look at small ARM parts...

--
Grant Edwards               grant.b.edwards        Yow! ... I don't like FRANK 
                                  at               SINATRA or his CHILDREN. 
                              gmail.com
Reply to
Grant Edwards
[snip]

Its niche is made smaller by the restricted power supply (voltage) limits: it doesn't quite work with a rechargeable lithium battery supply. If I have to put in a regulator, one of the ARM variants offers more capabilities. At the low end the AtMegas (and other 8-bitters) can handle the voltages.

Reply to
Frank Miles

There are a bunch of different models, including some that go as low as 0.9 volts and use extremely little power. There are also some with non-volatile (FRAM) memory, which I think is still close to unique. I do think the architecture has gotten uncomfortably squeezed between ARM and 8-bitters.
Reply to
Paul Rubin

Back in the day I used Manx Aztec C - which was commercial - on Apple and Commodore. It supported code overlays and had both native and p-code compilers. You could link native code and p-code functions together to balance executable size vs speed.

The 6502 native compiler produced pretty good code given the limitations of the CPU and the p-code interpreter was heavily optimized so p-code wasn't particularly slow either.

The only real caveat was that compiling on floppies was painful. I was very thankful to finally get hard disks.

Manx produced a whole family of 6502, 8080, Z80 and 8086 compilers. They are no longer supported, but many of them are still available here:

formatting link

George

Reply to
George Neuner

And ZP pseudo registers.

George

Reply to
George Neuner

and 68K.

Reply to
George Neuner

And that was after looking at the code GCC generated for the MSP430 for several years. Then I started working with AVR parts, and looking at the code the C compilers generated for them gave me nightmares. I still shudder when I think about AVR ISRs that use pointers. An ISR that was 3-5 instructions on a '430 would be 20-30 instructions on an AVR...
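
For anyone who hasn't seen it, the kind of ISR being described looks innocent enough in C. A sketch using avr-libc's ISR() macro - the vector and register names are for an ATmega-class part and the buffer is just an example; the pain is in the prologue/epilogue the compiler emits to save and restore registers around the pointer access:

#include <avr/io.h>
#include <avr/interrupt.h>
#include <stdint.h>

#define BUF_SIZE 32

static volatile uint8_t rx_buf[BUF_SIZE];
static volatile uint8_t * volatile rx_ptr = rx_buf;

ISR(USART_RX_vect)
{
    /* Loading the 2-byte pointer from SRAM, dereferencing it, bumping it
     * and storing it back - plus saving/restoring SREG and every register
     * touched - is where the 20-30 instructions come from on the AVR. */
    *rx_ptr++ = UDR0;                    /* store the received byte */
    if (rx_ptr == rx_buf + BUF_SIZE)
        rx_ptr = rx_buf;                 /* wrap around */
}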

--
Grant Edwards               grant.b.edwards        Yow! I want to read my new 
                                  at               poem about pork brains and 
                              gmail.com            outer space ...
Reply to
Grant Edwards

Hah, you might have been somewhat relieved back then if you had known there were worse situations than yours. I never used an Apple, but I had friends here who did. There was a Bulgarian clone of the Apple II - which was a fair equivalent - and the floppy drives were also cloned, and I have been told they were even worse than the original Apple ones :D . They had a plastic disk with an engraved spiral to do head positioning; I never saw these things work twice in a row without some major error :D :D :D . In comparison, even the hugely unreliable 8" clones of some Shugart model I had were rock stable :D .

[These taught me to always have at least 3 disks with the latest version; it could (and did) happen that while backing up one floppy to another, the original would become unreadable with the second one still mostly blank... I had learned how and where to press the head during retries to read a problematic sector, and sewing files together from physical dumps of 128-byte sectors was a trivial exercise :D . Before too long I moved things to 5" and later to 3.5" floppies and things got much, much better.]

Dimiter

------------------------------------------------------ Dimiter Popoff, TGI

formatting link

------------------------------------------------------

formatting link

Reply to
dp
