I have a home server that I’m using and hosting files on. I’m worried about it breaking and losing access to the files. So what method do you use to back up everything?
Almost all the services I host run in Docker containers (or userland systemd services). What I back up are the SQLite databases containing the config or plain data. Every day, my NAS rsyncs the DBs from my server onto its local storage, and I have Hyper Backup back up the backups into an encrypted S3 bucket. HB keeps the last n versions and manages their lifecycle. It’s all pretty handy!
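For anyone wanting to replicate something like this, here’s a minimal sketch of the daily pull plus a keep-last-n prune. The host, paths, and N are placeholders I made up, not the actual setup:

```shell
#!/bin/sh
# Sketch only: pull the SQLite DBs into a date-stamped folder,
# then keep just the newest N daily copies. Host and paths are
# placeholders, not the commenter's real values.
set -eu
N=7
DEST="/tmp/server-db-backups"   # on a real NAS: a path on local storage

today="$DEST/$(date +%F)"
mkdir -p "$today"
# On the real NAS this line would pull the databases, e.g.:
#   rsync -a user@homeserver:/srv/appdata/*.sqlite "$today/"

# Lifecycle: delete everything except the newest N dated folders
ls -1d "$DEST"/????-??-?? | sort | head -n -"$N" |
  while read -r old; do rm -rf "$old"; done
```

One caveat: rsyncing a live SQLite file can catch it mid-write. Running `sqlite3 app.db ".backup '/path/copy.db'"` first (or briefly stopping the container) gives you a consistent copy to pull.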
Backblaze on a B2 account. $0.005 per GB. You pay for the storage you use, and you pay when you need to download your backup.
On my TrueNAS server, it’s easy as pie to set up and easy as 🥧 to restore a backup when needed.
Maybe I’m stupid, but what is B2? A Backblaze product?
Yes it’s their cloud storage.
I didn’t realize they did anything other than that!
They’ve always had a personal backup product for Windows. I use it to back up almost 2 TB of files on my desktop PC for a flat rate. It’s pretty convenient because it’s almost fully set & forget.
I think they had some form of cloud computing at some point but they now focus on B2 and some backup tools that utilize B2.
I also recommend B2, it’s an S3 compatible service so any backup software/scripts/plugins that work with S3 should work with Backblaze.
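As an illustration of the S3 compatibility, even the stock AWS CLI can talk to B2. The key, bucket, and endpoint region below are placeholders; the real S3 endpoint for your bucket is shown in the B2 console:

```shell
# Placeholders throughout: substitute your own B2 keyID/applicationKey,
# bucket name, and the S3 endpoint shown for your bucket's region.
export AWS_ACCESS_KEY_ID='<b2-keyID>'
export AWS_SECRET_ACCESS_KEY='<b2-applicationKey>'
aws s3 ls s3://my-backup-bucket --endpoint-url https://s3.us-west-004.backblazeb2.com
```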
B2 is awesome. I have Duplicati set up on OpenMediaVault to back up my OS nightly to B2 (as well as a local copy to the HDD).
I’ll add to this that restic works amazingly with Backblaze.
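For reference, restic has a native B2 backend, so the whole flow is a handful of commands. The bucket name, paths, and retention policy below are placeholders, not anyone’s real setup:

```shell
# Placeholders: bucket, keys, and paths are examples only.
export B2_ACCOUNT_ID='<b2-keyID>'
export B2_ACCOUNT_KEY='<b2-applicationKey>'
export RESTIC_REPOSITORY='b2:my-backup-bucket:server'
export RESTIC_PASSWORD='<encryption-passphrase>'

restic init                        # once, to create the encrypted repo
restic backup /srv/important       # incremental, deduplicated snapshot
restic forget --keep-daily 7 --keep-weekly 4 --keep-monthly 6 --prune
```

Everything is encrypted client-side before it hits B2, and `forget --prune` handles the retention lifecycle for you.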
Borgbackup
Restic to multiple repositories, local and remote.
Using ESXi as a hypervisor, so I rely on Veeam. I have copy jobs to take it from local to an external drive, plus a copy up to the cloud.
rsnapshot
Bash scripting and rclone personally, here is a video that helps https://youtu.be/wUXSLmGAtgQ
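In the same spirit, a bare-bones rclone version. The remote name, paths, and schedule are assumptions; `b2remote:` would be whatever you named the remote in `rclone config`:

```shell
# Assumes a remote named "b2remote" was created with `rclone config`.
# Paths and schedule are placeholders.
rclone sync /srv/important b2remote:server-backup \
  --transfers 8 --fast-list --log-file /var/log/rclone-backup.log

# crontab entry to run it nightly at 03:00
# 0 3 * * * rclone sync /srv/important b2remote:server-backup --fast-list
```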
Various different ways for various different types of files.
Anything important is shared between my desktop PCs, servers and my phone through Syncthing. Those Syncthing folders are all also shared with two separate servers (in two separate locations) with hourly, daily, weekly, monthly volume snapshotting. Think your financial administration, work files, anything you produce, write, your main music collection, etc. It’s also a great way to keep your music in sync between your desktop PC and your phone.
Servers have their configuration files, /etc, /var/log, /root, etc. rsynced every 15 minutes to the same two backup servers, also to snapshotted volumes. That way, should any one server burn down, I can rebuild it in a trivial amount of time. This also goes for user profiles, document directories, ProgramData, and anything non-synced on Windows PCs.
Specific data sets, like database backups, repositories and such, are also generally rsynced regularly, some to snapshotted volumes, some to regular ones, depending on the size and volatility of the data.
Bigger file shares, like movies, TV shows, etc., I don’t back up, but they’re stored on a distributed GlusterFS, so if any one server goes down, that doesn’t lose me everything just yet.
Hardware will fail, sooner or later. You should see any one device as essentially disposable, and have anything of worth synced and archived automatically.
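The 15-minute rsync described above boils down to a cron entry along these lines (hostnames and paths are illustrative, and the receiving side is assumed to handle the snapshotting):

```shell
# /etc/cron.d/config-backup — hostnames and paths are made up.
# Pushes config and logs to a backup server every 15 minutes;
# the backup server snapshots the receiving volume on its own schedule.
*/15 * * * * root rsync -a --delete /etc /var/log /root backup1:/backups/web01/
```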
A simple script using duplicity to FTP data to my private website with infinite storage. I can’t say if it’s good or not; it’s my first time doing it.
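For anyone curious what that looks like, here’s a hedged sketch. The host, paths, and retention are placeholders; duplicity reads the FTP password and the encryption passphrase from the environment:

```shell
# Placeholders throughout — not the commenter's actual script.
export FTP_PASSWORD='<ftp-password>'
export PASSPHRASE='<gpg-encryption-passphrase>'

# Encrypted incremental backup over FTP
duplicity /srv/data ftp://user@example.com/backups/server

# Drop backup chains older than roughly a month
duplicity remove-older-than 30D --force ftp://user@example.com/backups/server
```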
How do you have infinite storage? Gsuite?
I confirm that the terms and conditions discourage using it as a private cloud backup and say it should only host stuff related to the website. So far I’ve had no complaints, as I’ve been paying and kept the traffic to a minimum. I guess I’ll have to switch to something more cloud-oriented if I keep expanding. But it’s worked so far!
deleted by creator
So what method do you use to backup everything?
Depends on what OS that server is running. Windows, Unraid, Linux, NAS (like Synology or QNAP), etc.
There are a bazillion different ways to back up your data but it almost always starts with “how is your data being hosted/served?”
deleted by creator
How difficult was setting up PBS on ARM? Do you just add their repos and install it on top of debian?
I wish. I spent days trying to compile it for ARM (several updates since then, including better cross-compile docs, maybe give it a try first :D), no success. In the end, I gave up and extracted the binaries from the docker image.
Ah, sounds like a pain to maintain. Perhaps I’ll find some old x86 parts or get a cheap-ish Intel N-series single-board computer instead.
My home server’s a Windows box, so I use Backblaze, which has unlimited storage for a reasonable fixed price. I have around 11 TB backed up. I pay the extra few dollars for the extended 12-month retention of deleted files, which has saved me a few times when I needed to restore a file I couldn’t find.
Locally I run StableBit DrivePool, and content is mirrored and pooled using that, which covers me for drive failures.
Proxmox Backup Server. It’s life-changing. I back up every night and I can’t tell you the number of times I’ve completely messed something up, only to revert it in a matter of minutes to the nightly backup. You need a separate machine running it (something that kept me from doing it for the longest time), but it is 100% worth it.
I back that up to Backblaze B2 (using Duplicati currently, but I’m going to switch to Kopia), but thankfully I haven’t had to use that, yet.
PBS backs up the host as well, right? Shame Veeam won’t add Proxmox support. I really only back up my VMs and some basic configs
Veeam has been pretty good for my Hyper-V VMs, but I do wish I could find something a bit better. I’ve been hearing a lot about Proxmox lately. I wonder if it’s worth switching to. I’m a MS guy myself so I just used what I know.
Veeam can’t back up Proxmox at the hypervisor level, only Hyper-V and VMware
PBS only backs up the VMs and containers, not the host. That being said, the Proxmox host is super-easy to install and the VMs and containers all carry over, even if you, for example, botch an upgrade (ask me how I know…)
Then what’s the advantage over just setting up the built-in snapshot backup tool, which, unlike PBS, can natively back up onto an SMB network share?
I’m not super familiar with how snapshots work, but that seems like a good solution. As I remember, what pushed me to PBS was the ability to make incremental backups to keep them from eating up storage space, which I’m not sure is possible with just the snapshots in Proxmox. I could be wrong, though.
You are right about the snapshots, yeah. The built-in backup doesn’t seem to do incremental backups.
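For comparison, the built-in tool is `vzdump`, and each run produces a full archive rather than an incremental one, which is exactly the gap PBS fills. A hedged example, where the VM ID, storage name, and retention are placeholders (`--prune-backups` applies to recent Proxmox versions; older releases used `--maxfiles`):

```shell
# Placeholders: VMID 101 and storage "smb-backups" are examples.
# Full (non-incremental) snapshot-mode backup, keeping the last 7.
vzdump 101 --storage smb-backups --mode snapshot --compress zstd \
  --prune-backups keep-last=7
```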
I’ve recently begun using Duplicati to back up the data from my Docker containers, and VMware snapshots for the guest VM itself. I’m still figuring out how to automate the snapshots, so for now I do them manually.