Securely transfer data from your RPi to a NAS for automatic backup.
I can't find a straightforward way to automate backups from my Raspberry Pi to my NAS; everyone's setup is a little different, and searching turns up mixed advice. The Pi runs the standard Raspberry Pi OS in CLI mode, so anything that requires a GUI would be inconvenient. The NAS runs TrueNAS with an SMB share already set up. I'm happy to script something myself, e.g. with rsync, if that's the cleanest approach.
On my Linux machines I use a script to copy and update folders on the NAS. I run the scripts manually whenever I feel like it, but you could set up a cron job to run yours automatically; see the sketch below for the general shape.
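A minimal sketch of what I mean, assuming the SMB share is already mounted on the Pi; the paths and the /mnt/nas mount point are placeholders, not my actual layout:

    #!/bin/sh
    # backup.sh: mirror a few local folders to the mounted NAS share
    set -eu
    rsync -avh --delete /home/pi/documents/ /mnt/nas/backups/documents/
    rsync -avh --delete /home/pi/projects/  /mnt/nas/backups/projects/

To automate it, an entry in crontab -e like 0 3 * * * /home/pi/backup.sh would run it nightly at 3am.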
I run a daily task on my unraid server that copies my AppData folder (minus a few auto-generated directories stuffed with thousands of tiny files) to a folder on the NAS called LIVE. On the NAS side, a weekly job tars and compresses all of those files into a LaBrea folder, then finds anything older than 30 days and deletes it.
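The daily copy is roughly this shape; the exclude patterns, the appdata path, and the NAS mount point are illustrative guesses rather than my exact config:

    #!/bin/sh
    # daily: push AppData to the NAS LIVE folder, skipping the
    # auto-generated directories full of tiny files
    set -eu
    rsync -avh --delete \
        --exclude='*/cache/' \
        --exclude='*/logs/' \
        /mnt/user/appdata/ /mnt/remotes/nas/LIVE/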
If SSH is enabled on the TrueNAS machine, you can use rsync without mounting the SMB share on the RPi at all. rsync replicates directories locally or remotely, and it's what I use to duplicate my server's SMB folder. You can set it up as a push or a pull depending on which side you want to drive the transfer. Mine is a push from a cron job:

0 0 * * * rsync -avhP /mnt/smb/ 10.0.0.8:/mnt/onsite-backup/

The main prerequisite is passwordless public/private key authentication, so the cron user can log in over SSH automatically: run ssh-keygen, then ssh-copy-id with the server's IP. The options here are -a for archive mode, -v for verbose output, -h for human-readable sizes, and -P for progress reporting plus resuming partial transfers. If you're transferring over the internet, add -z to compress the data in transit; it slows the process down but cuts bandwidth, which matters on metered connections like cellular. This is also more reliable than pointing rsync at a mounted SMB share, because each run opens a fresh connection and fails cleanly if the server is unreachable, rather than assuming the mount is present and healthy.
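Concretely, the one-time key setup on the RPi looks like this; the admin username is a placeholder, and the IP matches the example above:

    # generate a key pair, copy the public key to the TrueNAS box,
    # then confirm passwordless login works
    ssh-keygen -t ed25519
    ssh-copy-id admin@10.0.0.8
    ssh admin@10.0.0.8 'echo ok'

After that, install the job with crontab -e using the rsync line above.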
I back up all the persistent data from my Docker containers. Occasionally I break something badly, so the dailies let me roll back a day at a time. Some problems don't surface for a week or two, which is why I also keep weeklies. After a month, though, any issue should have shown up, and at roughly 110GB per backup I don't want to hold more than about half a TB of duplicate data, hence the 30-day cutoff. As for the joke, I take it you're not familiar with the La Brea Tar Pits in Cali. It runs a level deeper, because the script doing the compressing is called mammoth.sh, since its whole job is to stuff the folder into a tar file. The tar pits are famous for well-preserved prehistoric specimens, especially woolly mammoths.
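For the curious, a rough sketch of what mammoth.sh could look like; only the script name, the tar step, the LaBrea folder, and the 30-day prune come from the posts above, while the dataset paths and archive naming are guesses:

    #!/bin/sh
    # mammoth.sh: weekly job on the NAS. Tars the LIVE copies into the
    # LaBrea "tar pit", then deletes archives older than 30 days.
    set -eu
    SRC=/mnt/tank/LIVE        # assumed dataset holding the daily copies
    DEST=/mnt/tank/LaBrea     # assumed archive folder
    tar -czf "$DEST/appdata-$(date +%F).tar.gz" -C "$SRC" .
    find "$DEST" -name '*.tar.gz' -mtime +30 -delete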