Easy way to backup from SD card to USB or network drive, via command line?

G'day.

In Raspbian lite (so command line only), what is the easiest set-and-forget way to back up one's SD card to a connected USB drive? Or better yet via LAN to a desktop computer's HD.

Preferably to a compressed image (so I'm not wasting space, as I frequently use SD cards with large amounts of reserve storage) that I can just quickly copy to another SD card to get going again. And with multiple copies, so I don't back up corrupted data and lose the good version before noticing. Bonus points if I can set the number of copies/space used before old versions get deleted.

I ask, since I use some cases and/or locations that make manually getting at the SD card a trial. Plus any backup system that isn't fully automated is just asking to be ignored or forgotten.

TIA :-)

--
If you're not part of the solution, you're part of the precipitate.
Reply to
Jamie Kahn Genet

dd piped into gzip/bzip2/7zip, or dump similarly piped (you have to use restore for recovery, but it supports incremental backups, which saves a *lot* of space).
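As a rough sketch of the dd-into-gzip approach (device and path names here are examples; /dev/mmcblk0 is the usual SD card device on a Pi):

```shell
# Image the whole SD card into a compressed file on a mounted USB drive.
# Run with the system as quiet as possible to reduce corruption risk.
dd if=/dev/mmcblk0 bs=4M status=progress | gzip -c > /mnt/usb/pi-backup.img.gz

# To restore, reverse the pipe onto a fresh card (double-check the device!):
# gunzip -c /mnt/usb/pi-backup.img.gz | dd of=/dev/mmcblk0 bs=4M
```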

Both (along with anything else that backs up live filesystems) suffer the risk of backing up something that is changing and potentially being badly corrupted.

One option for minimising the risk of corrupted images is to use rsync to clone the filesystem to a partition on the USB stick, then unmount it and take a compressed copy (dd or dump) of the guaranteed quiet copy.

Whichever way you go, write a simple script to do the job into a file with the date embedded in the name, and run it from cron.

Use a cron job to delete old backups - with a little care in the script you can have a sophisticated retention policy keeping the last few days and then a few weekly copies.
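A hypothetical cron-driven script along those lines — every name, path, and count here is an example to adapt:

```shell
#!/bin/sh
# Nightly SD-card image with a date-stamped name and simple retention.
set -e
DEST=/mnt/usb/backups
KEEP=7                              # how many images to retain
mkdir -p "$DEST"
dd if=/dev/mmcblk0 bs=4M | gzip -c > "$DEST/pi-$(date +%Y-%m-%d).img.gz"
# The YYYY-MM-DD names sort chronologically, so delete all but the newest
# $KEEP (GNU head's negative count drops everything except the last N lines).
ls -1 "$DEST"/pi-*.img.gz | head -n -"$KEEP" | xargs -r rm --
```

Run it from cron with something like `30 3 * * * /usr/local/bin/sd-backup.sh`.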

There are many many CLI backup utilities around, but personally I find it easier to write some simple scripts and use cron for simple backup requirements.

One more thing - do test your backups periodically. If you go the route with a cloned filesystem on the USB stick you can even automate this as part of the backup script (with dd images, mount them and do a diff before compressing).
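One way to automate that check for a raw (uncompressed) dd image is to loop-mount it read-only and diff a directory tree — a sketch, with illustrative names (`losetup -P`, from util-linux, maps each partition; p2 is the root partition on a stock Raspbian card):

```shell
# Attach the image, exposing its partitions as /dev/loop0p1, /dev/loop0p2...
losetup -P /dev/loop0 /mnt/usb/pi-backup.img
mount -o ro /dev/loop0p2 /mnt/verify
# Spot-check a directory tree against the live system.
diff -r /etc /mnt/verify/etc || echo "backup differs"
umount /mnt/verify
losetup -d /dev/loop0
```

For a compressed image you would gunzip to a temporary copy first, or verify before compressing.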

--
Steve O'Hara-Smith                          |   Directable Mirror Arrays 
C:>WIN                                      | A better way to focus the sun 
The computer obeys and wins.                |    licences available see 
You lose and Bill collects.                 |    http://www.sohara.org/
Reply to
Ahem A Rivet's Shot

rsnapshot provides a Time Machine-like backup using rsync and hard links on the destination, which means you only burn space for files that change. It runs from cron and makes a tree of hourly/daily/weekly/monthly backups, and you can set how many backups to keep at each level (it's just a shell script, so it's easy to tweak anyway). Because it uses hard links it's a bit harder to work out how much space backups take up (it provides 'rsnapshot du', but that can be slow).
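As a sketch, the relevant retention section of rsnapshot.conf might look like this (paths and counts are illustrative; note that rsnapshot requires TAB-separated fields):

```
# /etc/rsnapshot.conf excerpt - fields must be TAB-separated
snapshot_root	/mnt/usb/snapshots/
retain	hourly	6
retain	daily	7
retain	weekly	4
backup	/	pi/

# crontab entries to drive it, e.g.:
# 0 */4 * * *	/usr/bin/rsnapshot hourly
# 30 3 * * *	/usr/bin/rsnapshot daily
```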

It won't back up the disc partition structure, so you'd have to recreate that (probably by writing a fresh image to an SD card, deleting all the files, then copying them from the backup - or by writing a script to drive fdisk to make the partitions for you).

Theo

Reply to
Theo

A similar but lighter-weight approach is my own:

formatting link

Reply to
Roger Bell_West

This is my preferred approach and has been for the last 10 years or so. The only difference is that I back up directly to a cycle of two USB hard drives that are kept offline in a fire safe and only mounted to make a backup. By having two, I always have one in the fire safe and reasonable protection against disk failures.

This should go without saying since it makes backups so much easier.

Not needed with my approach because rsync updates the last but one backup to bring it into sync with my online filing system. This also saves time since rsync only does the minimum work needed to sync the backup to the online filing system.

If I understand rsync correctly, merely using it causes the existing backup to be checked for readability, since rsync has to compare every backed-up file with the filing system it's backing up in order to decide whether a new file should be added to the backup, or whether a backed-up one should be deleted or rewritten because the master copy has been deleted or changed since the backup was made.

My guess is that corruption of the backup will cause rsync to decide that the backed up data doesn't match the files being backed up, so it will attempt to delete, add, or replace the backup contents. If the backup has become unreadable, rsync will report this and quit, which tells you that the backup is no longer usable. Does this sound sensible?

--
martin@   | Martin Gregorie 
gregorie. | Essex, UK 
org       |
Reply to
Martin Gregorie

By default rsync will skip a file if the size and modification time of the backup match the original, unless you use the -c option, so it may not read all the data.

Reply to
Ahem A Rivet's Shot


Fair point, though it still needs to read all the directories in order to get each backup's date and time, so they'll get read-checked.

Do you know how much impact using the -c option has on backup time?

Reply to
Martin Gregorie
Reply to
Martin Gregorie

That depends primarily on the average file size - extreme example: one 10 GB file will check date and time in next to no time but take ages to checksum; 10 million 1 KB files, OTOH, will take ages no matter what you do, but probably not much worse with -c than without.

Reply to
Ahem A Rivet's Shot
Reply to
Ahem A Rivet's Shot


Thanks: a clear case of 'suck it and see', so I'll try adding the -c option to see what the impact is. It may not be too bad, since my biggest files (Postgres database) are always backed up.

Reply to
Martin Gregorie
Reply to
Martin Gregorie
