They usually use tools like `rsync`, `tar`, or cloud services to keep backups of their Linux desktops.
I've been considering switching my primary PC to Linux. I've used Ubuntu for a while now, since it's straightforward and resilient; I've even set up a custom bootloader to get it running on my 2015 Mac laptop. My goal is robust disk image backups that let me restore the system quickly after a full wipe. Ideally the process would run on a schedule and store the images on a NAS. That said, it might be excessive for most users; regular file backups should suffice. Curious to see what others rely on!
First off, imaging a live system isn't really feasible, since it requires booting into another environment, and a full image seems excessive anyway. I mainly document software installations in scripts, so a fresh install can reinstall everything automatically on first boot. Most data lives in your /home folder, which makes regular backups easy to schedule. I use a script that runs after I log out and syncs my home directory to the NAS:

`rsync -aAXvxz --sparse --delete --delete-before --delete-excluded --progress --exclude-from 'backup-exclude.txt' /home/alexatkin/ [email protected]:/mnt/backup/alexatkin`

It looks complex on the surface, but it's worth understanding what each option does before trying it. The exclude list in `backup-exclude.txt` tells rsync which files and directories to skip.
I've tried rsync before; once was enough. Full backups aren't necessary, right? I prefer the quick "one-click" option. Taking notes on installed software is a really smart approach, especially with package managers being so user-friendly. I'm going to start using that method!
My NAS setup goes further by including its /etc folder in the backup, in case I ever need to reinstall and have lost track of configuration tweaks. When I do the main backup onto USB drives, I often forget to disable PrivateTmp on Apache first. If you store backups elsewhere, ideally with systemd running the job automatically after login, I could easily include /etc too. You can also back it up over SSH or rsync, but I prefer an NFS mount for this task. Backing up directly to USB is significantly quicker than over the network, though less flexible.
I rely on UrBackup for file backups, covering home folders and /etc on Linux clients. It works well on Windows but doesn't support image backups on Linux. For images I prefer Macrium Reflect from bootable media; it beats Clonezilla for me, and it compresses the images, saving space while speeding up the process.
In Linux everything is a file, so I safeguard my essential system files and scripts. I keep only the necessary custom items in /etc and related folders, confident I can revert to a clean state. Critical data like personal documents lives on a Git mirror server with automatic commits and pushes every 12 hours, so I can regain access quickly if my primary machine fails, with secure web access when needed. Since space for media backups is limited, I use LVM to spread data across multiple drives and monitor them closely. Replacing a drive is straightforward as long as I have a spare port.
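The auto-commit half of that setup can be sketched in a few lines. This is my reading of it, not the poster's actual script: a repo path, commit message, and remote are all assumptions, and a cron entry like `0 */12 * * *` (or a systemd timer) would invoke it every 12 hours.

```shell
set -eu

# Temp repo stands in for the real documents directory.
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email auto@localhost
git config user.name "auto-backup"

echo "draft" > notes.txt

# Stage everything, commit only if something actually changed,
# then push to the mirror (push commented out for the demo).
git add -A
git diff --cached --quiet || git commit -qm "auto-backup $(date -u +%FT%TZ)"
# git push origin main

git log --oneline
```

The `git diff --cached --quiet ||` guard keeps the history clean: runs where nothing changed produce no empty commits.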
I enjoy restic. It's straightforward to get started with and flexible about where backups go. @Alex Atkin UK's rsync option is also appealing. The key element of any backup strategy is the restoration plan: what kind of recovery do you envision from your backup? Knowing how you plan to restore lets you build a more effective backup.
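For a feel of how little it takes to get going, here's a minimal restic round trip against a local repository (temp paths and a throwaway password, purely for illustration; in practice you'd point `-r` at your NAS or an sftp/S3 backend and keep the password in a secret store).

```shell
set -eu
command -v restic >/dev/null 2>&1 || { echo "restic not installed; skipping"; exit 0; }

# Throwaway credentials and paths for the demo only.
export RESTIC_PASSWORD='demo-only'
repo=$(mktemp -d)/repo
data=$(mktemp -d)
echo "important" > "$data/notes.txt"

restic -r "$repo" init                         # create the encrypted repository
restic -r "$repo" backup "$data"               # first snapshot
restic -r "$repo" snapshots                    # list what's stored

# Restore the latest snapshot; files land under $restore with their
# original absolute paths preserved.
restore=$(mktemp -d)
restic -r "$repo" restore latest --target "$restore"
```

Running the restore regularly, not just the backup, is exactly the "plan the recovery first" point above.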