Pi software development on a PC

Anyone used a PC to develop software for the Raspberry Pi?

I am looking for a cross-compiler and assembler (preferably to run under Linux) so that I can develop Pi software in C and assembly. I'm not worried about compiler speed. The more convenient it is the better. Unix command line tools would be ideal as I could hook them in to a makefile.

Any suggestions?

James

Reply to
James Harris

Out of curiosity, why don't you want to develop on the RPi? Especially since the standard Raspbian distro includes all the usual GNU C compiler tools.

I run my Pi headless via an SSH session from my Linux boxen. One of my first actions was to set up cvs on it, point it at my source repositories and port my favourite editor, MicroEmacs, over. The only thing missing was libtermcap, which is not part of Raspbian, so I got it from gnu.org, compiled and installed it with no problems, and MicroEmacs ran straight off too. No code changes needed.
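For anyone wanting to repeat that, it was just the usual configure-and-make dance on the Pi itself; something along these lines should do it (version number and download path are from memory, so check the GNU ftp directory first):

  wget https://ftp.gnu.org/gnu/termcap/termcap-1.3.1.tar.gz
  tar xzf termcap-1.3.1.tar.gz
  cd termcap-1.3.1
  ./configure
  make
  sudo make install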

--
martin@   | Martin Gregorie 
gregorie. | Essex, UK 
org       |
Reply to
Martin Gregorie

I think that is the simpler solution, but I am sure an ARM target option exists for GCC.

I'm not sure how the whole linker thing would work.

I'd guess you set up a different library path in the makefile.

formatting link

has some detail
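FWIW the Debian-style cross compilers carry their own sysroot, so you can inspect the default search paths and point the linker at extra target-library directories explicitly; a sketch, assuming the usual arm-linux-gnueabihf prefix and a made-up library path and name:

  arm-linux-gnueabihf-gcc -print-search-dirs             # show default library/include paths
  arm-linux-gnueabihf-gcc -o prog prog.c -L$HOME/pi/libs -lmylib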

--
Ineptocracy 

(in-ep-toc'-ra-cy): a system of government where the least capable to lead are elected by the least capable of producing, and where the members of society least likely to sustain themselves or succeed are rewarded with goods and services paid for by the confiscated wealth of a diminishing number of producers.
Reply to
The Natural Philosopher

Perhaps the simplest reason is that some of the source has to compile both for the Pi and for a PC. The existing build jobs for that source already run on Linux. It would make development easier if I could run the Pi builds from the same makefile.
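For instance, if the makefile consistently uses $(CC), one set of rules can drive both builds; a sketch, where the cross-compiler name is the Debian-style one and OBJDIR is a hypothetical variable the makefile would need to honour so the two sets of objects stay separate:

  make all                                             # native build for the PC
  make all CC=arm-linux-gnueabihf-gcc OBJDIR=obj/rpi   # same makefile, Pi target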

Sounds good. I know I could use NFS to mount the drives from the Pi. That would be only slightly less convenient than running the builds on the PC, but it would give me other problems with testing the code. I may come back to those in another thread.

James

Reply to
James Harris

You could easily end up with at least a different makefile for each target. In fact microEmacs shows something like that because its source repository is structured to be built on a wide variety of systems. It has this general structure:

me
me/cmds              # runtime scripts used by microEmacs
me/doc               # installable docs
me/src               # system-independent source code

me/linux             # Linux-specific files
me/linux/bin         # The linked binary is placed here
me/linux/obj         # Compiler output for linking
me/linux/estruct.h   # Linux-specific #defines etc
me/linux/makefile    # Linux-specific makefile recipes

me/rpi ...           # RPi-specific structure matching that for Linux

...

Porting it to a new system consists of:

- copying one of the existing system-specific substructures to form the basis of the new system's substructure.

- editing estruct.h to fit the new environment

- changing makefile to suit the new compiler tool chain

- making sure that the bin and obj directories are empty

- running the local equivalent of "make all" from the root directory of the system-specific substructure, i.e.

cd linux; make all; sudo make install

I've just realised that this type of repository structure, with its clear segregation of system-specific files (both source and generated binary objects), is probably ideal for cross-compiling development.
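Indeed, with that layout a cross build is just a matter of pointing the rpi subtree's makefile at the cross compiler; a sketch, assuming the recipes use $(CC) and the Debian-style toolchain name:

  cd rpi
  make all CC=arm-linux-gnueabihf-gcc   # objects land in rpi/obj, the binary in rpi/bin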

--
martin@   | Martin Gregorie 
gregorie. | Essex, UK 
org       |
Reply to
Martin Gregorie

Fair enough.

Judging by my experience, it's quite probable that the Linux branch will build immediately on the RPi once you've added any standard libraries that aren't in the default install: they're probably in the main Debian repositories, so apt-get can drop them into place.

I take it you don't use any of the source version control systems (cvs, svn, git?) because if you did, you'd have already set the RPi up as a client system. So, your easiest long-term approach might well be to start using one of them and simply include the RPi as a client machine along with the other development boxes on your LAN. I used cvs extensively at work and so I did exactly that as soon as I started running Linux at home. Now I can't imagine doing any development without it.

CVS is so old it has grey hair, but it works well and is very easy to troubleshoot and fix using standard CLI utilities if you manage to screw it up: that's why I still use it rather than git.

No need: cvs uses SSH to keep local source modules in sync with the central repository and to check files in and out. Since all the other version control systems do the same (allow local editing of source files while keeping them up to date with changes committed elsewhere, and let you check in tested and stable versions of new and altered files), it follows that they too use SSH rather than NFS for data transfer - especially as they all work across the Internet: think SourceForge.
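In concrete terms the whole client-side setup is one environment variable and a checkout; the hostname and repository path below are made up:

  export CVS_RSH=ssh
  cvs -d :ext:james@devbox:/var/lib/cvsroot checkout myproject
  cd myproject
  cvs update                        # pull in changes committed elsewhere
  cvs commit -m "tested on RPi"     # check stable changes back in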

I'd be interested to know about those. IME it still boils down to regression-testing changes on all platforms that run that code, and not committing source changes or putting code live until it's working correctly on all of them.

--
martin@   | Martin Gregorie 
gregorie. | Essex, UK 
org       |
Reply to
Martin Gregorie

Hey! What's wrong with grey hair? :-)

--
Robert Riches 
spamtrap42@jacob21819.net 
(Yes, that is one of my email addresses.)
Reply to
Robert Riches

Free Pascal and Lazarus offer advice on cross-compiling to ARM:

formatting link

if you don't want to restrict yourself to C and its attendant complications....

--
Cheers, 
David 
Web: http://www.satsignal.eu
Reply to
David Taylor

Yeah. I was running Raspbmc and just downloaded the developer-provided toolchain and had at it. A few simple things only, as I recall. One thing is I tried to build a patched slrn so I could customize something, but slrn isn't set up for cross-compilation (its configure script craps out).
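For packages whose configure script does support it, cross building is normally just the standard --build/--host pair; a sketch with the usual Debian triplets:

  ./configure --build=x86_64-pc-linux-gnu --host=arm-linux-gnueabihf
  make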

A couple of years ago I remember using crosstool-ng to quickly build a toolchain for cross compilation with BeagleBoard as the target. Easy as Pi...
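The crosstool-ng workflow is pleasantly short; roughly this, though the exact sample name for the Pi should be checked against the list:

  ct-ng list-samples              # show the canned configurations
  ct-ng armv6-rpi-linux-gnueabi   # select a sample (name from memory)
  ct-ng build                     # builds binutils, gcc and libc into ~/x-tools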

Reply to
Anssi Saari

I believe the point about NFS is that you can easily run your new binary on the RPi after it was built on a PC if you mount the PC's drive over NFS.
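i.e. something like this on the Pi, with made-up host and path names:

  sudo mount -t nfs pc.local:/home/james/build /mnt/build   # PC exports its build tree
  /mnt/build/rpi/bin/myprog                                 # run the cross-compiled binary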

Reply to
Anssi Saari

Yes, this is "easy". The GCC source that you use on the PC (under Linux) is the same as the source that runs on the Pi! The architecture it generates code for is just an option chosen when the compiler itself is built. You can build a compiler on the PC that generates Pi code, and when you use that to compile your programs, they can then be run on the Pi.

In fact, such a cross-development environment is a standard item on Linux, and one of the reasons why it has become so popular for development for embedded systems.

When you google for "setting up gcc for cross-development" or similar, you will find the info you need.

There are complicated makefiles that build such a cross-development environment from scratch. They download all the required packages (sources), compile them, and install them in a local directory. I used such a beast to build a development environment for my DVB receiver (Dreambox). Maybe one is available for Pi development as well.
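The core of what those scripted environments automate looks roughly like this; a heavily simplified sketch with placeholder versions and prefix (a real build also needs kernel headers and a target C library, which is most of the complication):

  # binutils first, then a stage-1 compiler; run configure from a separate build dir
  ../binutils-2.24/configure --target=arm-linux-gnueabihf --prefix=$HOME/cross
  make && make install

  ../gcc-4.9.2/configure --target=arm-linux-gnueabihf --prefix=$HOME/cross \
      --enable-languages=c --without-headers --with-newlib
  make all-gcc && make install-gcc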

Reply to
Rob

formatting link
takes some of the trouble out of building cross-toolchains.
formatting link
includes settings for targeting the rpi.

formatting link
is said to be good but I've not tried it yet.
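Whichever toolchain you pick, a quick sanity check once it's on your PATH looks like this, given a trivial hello.c (toolchain prefix assumed):

  arm-linux-gnueabihf-gcc -o hello hello.c
  file hello    # should report an ARM executable, not x86-64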

Because it's deathly slow...

--
http://www.greenend.org.uk/rjk/
Reply to
Richard Kettlewell

Last weekend I compiled a program that I developed in the nineties, and I can assure you that it compiles a lot faster on the Pi than it did in those days on systems like the Atari ST and a "Turbo PC" (an XT running at 8 MHz, or an early AT).

And even those were fast when compared to the multi-user Unix development system we had at work.

Reply to
Rob

Yes, it's fast compared to the 30-year-old computers in my cupboard; indeed I'd be disappointed if it couldn't emulate them at >=100% speed. The point is that it's very slow compared to even a relatively modest PC, so I'd rather use the latter for CPU-intensive jobs such as compilation.

--
http://www.greenend.org.uk/rjk/
Reply to
Richard Kettlewell

I'm of the generation that would rather spend a bit more time thinking about the algorithm and looking at the code than try to get things working by iterations of compile, tweak and try.

Remember, with testing you can never prove that a program is correct; you can only show that it is wrong.

Reply to
Rob

I found this a good place to start with cross-compiling for rpi:-

formatting link

Reply to
Chris Hills

Sure. I was describing an alternative solution which doesn't require NFS to be installed and configured. If you use your (pre-existing) centralised version control package, IMO a near-essential for serious software development, and develop on a headless RPi, then the only network protocol you need is SSH, thus getting the same functionality without needing to set up NFS.
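And for test runs the binary can follow the same route; the hostname below is assumed to be the stock Raspbian one:

  scp rpi/bin/myprog pi@raspberrypi:/tmp/
  ssh pi@raspberrypi /tmp/myprog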

--
martin@   | Martin Gregorie 
gregorie. | Essex, UK 
org       |
Reply to
Martin Gregorie

When a half-hour build on an rpi takes 10 seconds on a PC, it doesn't take many iterations before putting some effort into getting cross-building going is a substantial win.

--
http://www.greenend.org.uk/rjk/
Reply to
Richard Kettlewell

Why stop there, real men compile the code in their head and input the binary direct to memory on toggle switches, etc etc.

:p

Reply to
Guesser

My experience is that you achieve better quality by studying the matter for half an hour than by frantically compiling, tweaking and trying the code for the same amount of time.

Reply to
Rob
