I have a home server that I use for hosting files. I’m worried about it breaking and losing access to the files. So what method do you use to back up everything?
How difficult was setting up PBS on ARM? Do you just add their repos and install it on top of Debian?
I wish. I spent days trying to compile it for ARM (several updates since then, including better cross-compile docs, maybe give it a try first :D), no success. In the end, I gave up and extracted the binaries from the docker image.
Ah, sounds like a pain to maintain. Perhaps I’ll find some old x86 parts or get a cheap-ish Intel N-series single-board computer instead.
I use proxmox server and proxmox backup server (in a VM 🫣) to do encrypted backups.
A Raspberry Pi has SSH access to PBS and rsyncs all the files, then uploads them to Backblaze using rclone.
https://2.5admins.com/ recommended “pull” backups, so if someone hacks your server they don’t have access to your backups. If the Pi is hacked it can mess with everything, but the idea is that it has a smaller attack surface (just SSH).
PS. If you rclone a lot of files to backblaze use https://rclone.org/docs/#fast-list , or else it will get expensive
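On the Pi, that pull-then-upload flow could look roughly like this — a sketch only; the hostnames, paths, and bucket name are placeholders, not the actual setup:

```shell
#!/bin/sh
# Pull from the PBS box: the Pi initiates the connection, so a
# compromised server holds no credentials pointing at the backup store.
rsync -a --delete pbs-host:/backup/store/ /mnt/backups/pbs/

# Push to Backblaze B2. --fast-list builds the listing in memory,
# using fewer (billable) list API transactions on buckets with many files.
rclone sync /mnt/backups/pbs/ b2:my-backup-bucket/pbs --fast-list
```

The trade-off of `--fast-list` is memory usage in exchange for far fewer API calls, per the rclone docs linked above.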
Don’t overthink it… servers/workstations rsync to a NAS, then sync that NAS to another NAS off-site.
My server runs Plex and has almost 50 TB of video on it. After looking at all the commercial backup options I gave up on backing up that part of the data. :-(
I do back up my personal data, which is less than a terabyte at this point. I worked out an arrangement with a friend who also runs a server. We each have a drive in the other’s server that we use for backup. Every night cron runs a simple rsync script to do an incremental backup of everything new to the other machine.
This approach cost nothing beyond getting the drives. And we will still have our data even if one of the servers is physically destroyed and unrecoverable.
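A sketch of what that nightly job might look like — the user, host, and paths here are invented for illustration, not the actual arrangement:

```shell
# /etc/cron.d/mutual-backup — hypothetical crontab entry, runs at 02:30.
# rsync -a transfers only what changed since last night (the
# "incremental" part); files already on the friend's drive are reused.
30 2 * * * backupuser rsync -a /srv/data/ friend-server:/mnt/backup-drive/mine/
```

Leaving out `--delete` means files removed locally still survive on the backup side, which is usually what you want for a last-resort copy.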
Oh, that idea with the friend’s server is a good one. Mutual benefit at little extra cost.
It’s the only “no cost” option I know of that provides an off-site backup. And once it occurred to me, it was really easy to set up.
I also have a decent amount of video data for Plex (not nearly 50 TB, but more than I want to pay to back up). I figure if worst comes to worst I can rip my DVDs/Blu-rays again (though I’d rather not), so I only back up the file storage on my NAS that my laptops and desktop back up to. It’s just not worth the cost to back up data that’s fairly easy to replace.
Yeah, that was where I finally came out too. I still own the discs. My only worry is that some of my collection is beginning to age. I’ve had a few DVDs that were no longer readable.
3-2-1

- Three copies. The data on your server.
- Buy a giant external drive and back up to that.
- Off-site. Backblaze is very nice.

How to get your data around? FreeFileSync is nice.

Veeam community edition may help you too.
I’m not sure how you understand the 3-2-1 rule given how you explained it, even though you’re stating the right stuff (I’m confused by your numbered list…), so just for reference for people reading this, it means that you need:

- 3 copies
- 2 different media
- 1 off-site location
Huh. I always heard 3 copies, 2 locations, 1 of the locations offsite. Yours makes sense though.
If you are using Kubernetes, you can use Longhorn to provision PVCs. It offers easy S3 backup along with snapshots. It has saved me a few times.
Autorestic, nice wrapper for restic.
Data goes from one server to a second server, and vice versa (different provider, different geolocation). And to Backblaze B2, which as far as I know is the cheapest S3-like storage.
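For anyone wanting to try the restic + B2 combination, the basic shape is roughly this (bucket, path, and key names are placeholders; Autorestic wraps the same steps in a YAML config):

```shell
# Credentials for a B2 application key, plus the repo passphrase.
export B2_ACCOUNT_ID='<keyID>'
export B2_ACCOUNT_KEY='<applicationKey>'
export RESTIC_PASSWORD='<passphrase>'

# One-time: create the encrypted repository in the bucket.
restic -r b2:my-bucket:server1 init

# Per run: back up, then apply a retention policy and reclaim space.
restic -r b2:my-bucket:server1 backup /srv/data
restic -r b2:my-bucket:server1 forget --keep-daily 7 --keep-weekly 4 --prune
```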
Wasabi might also be worth mentioning; a while back I compared S3-compatible storage providers and found them to be cheaper for volumes over 1 TB. They now seem to be slightly more expensive ($5.99 vs. $5), but they don’t charge for download traffic.
All my backups are in /home/Ryan/Documents. Please don’t break my Minecraft server.
Borgbase to borgbase
Rock solid for years
cronjobs with rsync to a Synology NAS and then to Synology’s cloud backup.
I run everything in containers, so I rsync my entire docker directory to my NAS, which in turn backs it up to the cloud.
ITT: lots of the usual paranoid overkill. If you do `rsync` with the `--backup` switch to a remote box or a VPS, that will cover all bases in the real world. The probability of losing anything is close to 0. The more serious risk is discovering that something broke 3 weeks ago and the backups were not happening. So you need to make sure you are getting some kind of notification when the script completes successfully.
While I don’t agree that using something like restic is overkill, you are very right that backup process monitoring is very overlooked. And so is practicing recovery with the backup system of your choice.
I let my Jenkins instance run the backup jobs, since I have it running anyway for development tasks. When a job fails it notifies me immediately via email, and I can also manually check in the web UI how the backup went.
I use Duplicacy to back up to my local NAS and to Storj.io. In case of a fire I’m always able to restore my files. Storj.io is cheap, easy to access from any location, and your files are stored and duplicated across multiple locations.
I have used duplicity before but restoring from a new installation takes a while, as duplicity has to reanalyze the storage.
+1 for Duplicacy. Been using it solidly for nearly 6 years - with local storage, sftp, and cloud. Rclone for chonky media. Veeam Agent for local PC backups as a secondary method.
I have everything in its own VM, and Proxmox has a pretty awesome built-in backup feature. Three different backups (one night is to my NAS, next night to an on-site external, next night to an external that’s swapped out with one at work - weekly). I don’t back up the Proxmox host because reinstalling it, should it die completely, is not a big deal. The VMs are the important part.
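Proxmox’s built-in backup jobs are driven by `vzdump`, which can also be run by hand for a one-off copy before risky changes (the VM ID and storage name below are examples, not this setup):

```shell
# One-off snapshot-mode backup of VM 101 to a storage named
# "nas-backup"; the guest keeps running while the dump is taken.
vzdump 101 --storage nas-backup --mode snapshot --compress zstd
```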
I have a mini PC I use to spot-check VM backups once a month (full restore on its own network, check it’s working, delete the VM after).
My Plex NAS only backs up the movies I really care about (everything else I can “re-rip from my DVD collection”).
Running a Duplicacy container backing up to Google Drive for some stuff and Backblaze for mostly everything else. Been using it for a couple of years with no issues. The GUI and scheduling are really nice too.