Hello everyone!
I have a small OrangePi running a few small services (some with Docker, some without).
I’d love to know how you back up your single-board computers.
Do you just rsync the system to a storage server? Do you plug in a USB drive and rsync to it? Do you save only the important data, or the whole system?
For now my SBC is not backed up and I’d like to get a good backup solution up and running quickly! (I don’t trust SD cards to last long…)
I have access to USB drives and disks, and also another big server with 20TB of storage that I can back up to if needed!
Thanks for your help!
I wrote a bash script a while back that runs every day: it uses sshfs to mount an SSH server onto the filesystem, uses dd to write /dev/mmcblk0 to it as hostname-date.img, and finally unmounts the SSH server.
I run that on each of my RPis (just one right now, but there have been as many as 4 going).
Any time I have an issue, be that my fault or not, I can just pull the sd card and write the last .img to it directly.
There’s some extra stuff in there too: it checks for the sshfs dependency and installs it if missing (for deploying to a new system without reconfiguring), cleans up backups older than x days, logs what it does, and can write just the log file as a test instead of the whole filesystem.
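In case it helps, here’s a minimal sketch of that kind of script; the remote (backupuser@storage:/backups), mount point, device and retention are placeholders, not the poster’s actual values:

```bash
#!/bin/bash
# Daily SD-card image backup over sshfs (sketch, adjust to your setup).
set -euo pipefail

REMOTE="backupuser@storage:/backups"   # assumption: your storage server and path
MOUNTPOINT="/mnt/backup"
KEEP_DAYS=7

# Install sshfs if it's missing (Debian-based systems)
command -v sshfs >/dev/null || apt-get install -y sshfs

mkdir -p "$MOUNTPOINT"
sshfs "$REMOTE" "$MOUNTPOINT"

# Image the whole SD card to hostname-date.img
dd if=/dev/mmcblk0 of="$MOUNTPOINT/$(hostname)-$(date +%F).img" bs=4M status=progress

# Drop images older than KEEP_DAYS days, then unmount
find "$MOUNTPOINT" -name "$(hostname)-*.img" -mtime +"$KEEP_DAYS" -delete
fusermount -u "$MOUNTPOINT"
```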
Sorry, but do you have a setup where you don’t need to worry about the atomicity of that operation? It sounds simple and effective, so I’d like to do it, but I’m concerned I may get something halfway through a write.
I suppose the odds are you’d have at worst a bad log file, whereas config files and binaries are read-only the majority of the time.
I’ve run it on every pi I’ve used for several years now, though they are typically pretty quiet systems. Usually something like pihole or a reverse proxy. Not much writing going on. I’ve restored about a dozen of those images and never had an issue.
I also tend to keep 3-6 backups at a time. If the most recent is messed up for some reason, there’s others to try. (though I’ve never actually had to try more than one)
Like I do any other system: rsnapshot nightly to a backup server.
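If you go that route, the setup is roughly a config file plus a cron entry on the backup server; the snapshot root, retention counts and source paths below are only placeholders (note that rsnapshot.conf fields must be separated by tabs):

```
# /etc/rsnapshot.conf excerpt (tab-separated fields)
snapshot_root	/srv/backups/sbc/
retain	daily	7
retain	weekly	4
backup	root@orangepi:/etc/	orangepi/
backup	root@orangepi:/home/	orangepi/
```

Then cron drives the rotation, e.g. `30 3 * * * /usr/bin/rsnapshot daily` every night and `30 4 * * 1 /usr/bin/rsnapshot weekly` once a week.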
It might not be applicable to you, but in many cases single-board computers are used where there are minimal changes to files on a day-to-day basis, for example when they’re used for displaying stuff. For such cases, it is useful to know that after installing all the required stuff, the SD card can be turned into read-only mode. This dramatically prolongs its life. Temporary files can still be generated in RAM and, if needed, you can push them to external storage/FTP through a cron job or something. I have built a digital display with weather/photos/news where, beyond the initial install, everything is pulled from the internet. I’m working towards implementing what I’ve suggested above.
I love that idea, and I’d love to implement it. But I honestly can never figure out how people work with services that let the user change settings (for example, setting their location to get local weather) while still maintaining a read-only system.
I am still figuring it out since it is my hobby and I’m unable to devote much time to it. But I think it will be something like the Ubuntu live disks that let you try Ubuntu by running it from a DVD. You could run anything like a web server, save files, settings etc. Only they would not persist after a reboot, since everything was saved in RAM. Only here it’ll be a write-locked SD card instead of a DVD.
I’m also sure there must be a name for it and a step-by-step tutorial somewhere. If only Google weren’t so bad these days…
You keep the user-changeable files on a separate filesystem, whether that’s just a separate partition or an external disk. Keep the system itself read-only, and put write-heavy directories like logs and caches in RAM.
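A rough /etc/fstab sketch of that layout, assuming the root partition is /dev/mmcblk0p2 and a second partition /dev/mmcblk0p3 holds the writable data (device names and sizes are only illustrative):

```
# root filesystem mounted read-only
/dev/mmcblk0p2  /      ext4   defaults,noatime,ro  0  1
# separate writable partition for user-changeable settings and data
/dev/mmcblk0p3  /data  ext4   defaults,noatime     0  2
# write-heavy directories kept in RAM so the SD card is never touched
tmpfs  /var/log  tmpfs  defaults,noatime,size=50m  0  0
tmpfs  /tmp      tmpfs  defaults,noatime,size=50m  0  0
```

Depending on the distro you may also need to disable swap and point a few services at the tmpfs paths, so treat this as a starting point rather than a complete recipe.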
That would not be ideal, as I want to keep most logs of the system, and I don’t have a syslog server; even if I had one I wouldn’t be able to get everything I need… But it is quite a good idea for other use cases, and I might do that with future projects that don’t need a rw filesystem!
The fact that it’s a “single board” computer, specifically, is mildly irrelevant, imo; just follow standard backup practices. The only way the type of computer really comes into question is whether or not it has adequate resources to run whatever backup solution you choose. For my use case, Borg works great, but choose whatever solution fits your requirements. The “simplest” and lightest solution is probably rsync, but that may leave a lot to be desired.
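For reference, a Borg setup on an SBC is only a couple of commands; the repository path, source directories and retention below are placeholders, not a recommendation for your exact layout:

```bash
# One-time: create an encrypted repository on the storage server (reachable over SSH)
borg init --encryption=repokey backupuser@storage:/backups/orangepi

# Nightly (e.g. from cron): archive the directories you care about, then prune old archives
borg create --stats --compression zstd \
    backupuser@storage:/backups/orangepi::'{hostname}-{now:%Y-%m-%d}' \
    /etc /home /var/lib/docker/volumes
borg prune --keep-daily 7 --keep-weekly 4 backupuser@storage:/backups/orangepi
```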
SBCs often run on SD cards or eMMC modules so there are other possibilities than a standard desktop computer.
[…] so there are other possibilities than a standard desktop computer.
Would you mind elaborating? I’m curious to know what you’re referring to.
Copying the entire drive into a bootable backup using tools like dd is more feasible when your whole fs is only 8-16GB.
Larger systems often require more selectivity or more sophisticated methods to reduce output size.
You can also pull the card occasionally and back it up from another system more easily. Some people like this route.
Re:
The fact that it’s a “single board” computer, specifically, is mildly irrelevant
Did you forget to complete your reply, or did it perhaps glitch out?
Haha sorry mate I must have assumed you had your mind reading hat on. Basically what darkassasin said above.
I only have the OS on the sd card and I pop that out and dd a copy to my backup drive every 6 months or so. For that reason I like to use small sd cards, like 8gb size. All other drives on the machine are external or network drives and those have their own backup routine with rsync.
Do you use only the sd card, or what kind of storage system do you have on your sbc?
The SBC is only running with an SD card and nothing else plugged in. But I suppose my best bet is to run a script that saves what I need using rsync over SSH to my storage server.
You don’t need to pop it out to dd the SD card, you can do it while it’s running. I like to pipe dd through gzip to get a compressed image as the output so I’m not sitting on a 16GB file for 3GB worth of files.
That sounds really good but is that safe to do? I thought you shouldn’t dd a disk if there was some activity going on on it.
So is the output image saved to the SD card or do you save it to an external drive?
Been working fine for me for several years.
You can have it written to an external drive, or you can use tools like sshfs and ftpfs to mount remote servers as local drives then write to those. I use the sshfs route.
This will create an .img that you can just write directly to an sd card and boot from.
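A sketch of that compressed-image approach, assuming the card is /dev/mmcblk0 and the sshfs mount point is /mnt/backup (both placeholders):

```bash
# Mount the remote backup location over SSH
sshfs backupuser@storage:/backups /mnt/backup

# Image the running SD card, compressing on the fly
dd if=/dev/mmcblk0 bs=4M status=progress | gzip > /mnt/backup/$(hostname)-$(date +%F).img.gz

fusermount -u /mnt/backup

# Restore later from another machine, with the card in a USB reader showing up as /dev/sdX
gunzip -c hostname-date.img.gz | sudo dd of=/dev/sdX bs=4M status=progress
```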
Very cool, thank you.
one of the benefits of things like docker is creating a very lightweight configuration, and keeping it separate from your data.
ive setup things so i only need to rsync my data and configs. everything else can be rebuilt. i would classify this as ‘disaster recovery’.
some people reeeeally want that old school, bare-metal restore. which i have to admit, i stopped attempting years ago. i dont need the ‘high availability’ of entire system imaging for my personal shit.
rsync is your friend. its ubiquitous.
Do you have tips for saving multiple locations? I also have non-Docker configs to back up in /etc and /home. How do you do it? Just multiple rsync commands in a sh script that cron executes periodically, or is there a way to back up multiple folders with one command?
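For what it’s worth, a single rsync call can take several source directories at once, so a tiny cron-driven script covers configs and data together; the paths and destination server here are just placeholders:

```bash
#!/bin/bash
# Nightly config + data backup over SSH -- adjust paths to whatever you actually need.
rsync -a --delete \
    /etc \
    /home \
    /srv/docker \
    backupuser@storage:/backups/orangepi/

# crontab entry to run it at 02:00 every night:
# 0 2 * * * /usr/local/bin/backup.sh
```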
There are options good and bad.
I’d back up just the dockers. The OS can be rebuilt easily enough. Yes, rsync the data.
Or: shut it down, pull the SD card, put it in another computer and rip it as an image. Full bootable backup.
I would do the first not the second.
The second isn’t a bad idea if it’s in combination with the first. Then you have an image you can restore with most of your config and you can just restore the rest from the normal backups.