Securely transfer your RPi to a NAS for automatic backup.

rebelsurfer (Junior Member, 32 posts)
06-14-2020, 08:16 PM · #1
I'm struggling to find a straightforward way to automate backups from my Raspberry Pi to my NAS; every guide assumes a different setup, and the advice out there is mixed. The Pi runs standard Raspberry Pi OS in CLI mode, and I'd rather not switch to a GUI just for this. The NAS runs TrueNAS with an SMB share, which may need some specific setup on the Pi side. I'm open to scripting or tools like rsync if that's the efficient route.
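For reference, mounting a TrueNAS SMB share on the Pi usually looks something like the sketch below — the NAS address `192.168.1.50`, share name `backups`, and the credentials are placeholders for your own values:

```shell
# Install the CIFS utilities (Raspberry Pi OS / Debian)
sudo apt install cifs-utils

# One-off mount of the TrueNAS share
sudo mkdir -p /mnt/nas
sudo mount -t cifs //192.168.1.50/backups /mnt/nas \
    -o username=pi,password=secret,vers=3.0

# Or make it permanent via /etc/fstab, with credentials kept in a
# root-only file instead of inline:
# //192.168.1.50/backups  /mnt/nas  cifs  credentials=/root/.smbcred,vers=3.0  0  0
```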

NinjaFrog9 (Junior Member, 5 posts)
06-15-2020, 09:04 PM · #2
On my Linux machines I use a script to transfer/update folders to the NAS. I run the scripts manually as needed, but you could set up a cron job to run them automatically.

JopperMan (Member, 121 posts)
06-15-2020, 10:34 PM · #3
The simplest method is the `smbclient` tool with your SMB server details: run it from the Pi's terminal with the server address and credentials. It's straightforward once you get the basics.
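Pushing a folder up to the share might look something like this sketch — the server address, share name, user, and directory names are all placeholders:

```shell
# Connect to the TrueNAS share as user "pi" and upload a directory's
# contents (prompt OFF suppresses per-file confirmation, recurse ON
# descends into subdirectories)
smbclient //192.168.1.50/backups -U pi \
    -c 'prompt OFF; recurse ON; cd piBackup; lcd /home/pi/data; mput *'
```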

mminchich (Member, 149 posts)
06-16-2020, 03:28 AM · #4
I just run an rsync cron task on the Pi: `rsync -a / /path/to/NAS/share/backups/piBackup`
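Scheduled with cron, that might look like the fragment below (edit with `crontab -e`; the 03:00 time and paths are placeholders). One caution when copying `/` to a share mounted under it: `--one-file-system` keeps rsync from descending into the mounted share itself.

```shell
# m h dom mon dow  command
0 3 * * * rsync -a --one-file-system / /path/to/NAS/share/backups/piBackup
```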

Bartekdwarf (Posting Freak, 791 posts)
06-16-2020, 05:13 AM · #5
Have a look at a guide on setting up SMB shares on Ubuntu 18.04.

ligitassasin (Junior Member, 18 posts)
06-16-2020, 05:44 AM · #6
Thank you, I'll give it a shot.

Athame_ (Senior Member, 734 posts)
06-17-2020, 09:27 AM · #7
On my unraid server I run a daily task that copies my AppData folder (excluding certain auto-generated directories packed with tiny files) to a NAS folder called LIVE. On the NAS side, a weekly job compresses and archives those files into a LaBrea folder, then removes any archives older than 30 days.
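That weekly NAS-side job could be sketched like this — the folder names follow the post, but the archive naming scheme is a placeholder:

```shell
#!/bin/sh
# Weekly archive-and-prune job (sketch).
# LIVE holds the latest copy; LaBrea holds dated archives.
archive_and_prune() {
    live="$1"
    pit="$2"
    mkdir -p "$pit"
    # Compress the current LIVE copy into a dated tarball
    tar -czf "$pit/appdata-$(date +%Y%m%d).tar.gz" -C "$live" .
    # Drop archives older than 30 days
    find "$pit" -name '*.tar.gz' -mtime +30 -delete
}

# Example: archive_and_prune /mnt/nas/LIVE /mnt/nas/LaBrea
```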

iskela99 (Member, 247 posts)
06-17-2020, 02:37 PM · #8
I'm wondering which files are safe to remove after 30 days — probably just configuration data? Also, I get the feeling the LaBrea name is a joke that's going over my head!

Caio_JS (Member, 53 posts)
06-17-2020, 07:25 PM · #9
If SSH is enabled on the TrueNAS machine, you can use rsync without mounting the SMB share on the RPi at all. rsync replicates directories locally or remotely, and it's what I use to duplicate my server's SMB folder. You can configure it as a push or a pull depending on where you want the job to live. Mine is a cron job:

`0 0 * * * rsync -avhP /mnt/smb/ 10.0.0.8:/mnt/onsite-backup/`

The main prerequisite is passwordless public/private key authentication, so the cron user can log in over SSH without a prompt: run `ssh-keygen`, then `ssh-copy-id` with the server's IP.

The options here are: `-a` for archive mode, `-v` for verbose output, `-h` for human-readable sizes, `-P` for progress and resumable partial transfers. If you're transferring over the internet you can add `-z` to compress the data, which costs CPU time but cuts bandwidth — worthwhile on limited connections like cellular networks.

This is more reliable than running rsync against a mounted SMB share, because rsync verifies the server is reachable on every connection instead of assuming the mount is ready.
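The key setup mentioned above amounts to two commands. In this sketch the `10.0.0.8` address comes from the post, while the `backup` user and key type are assumptions:

```shell
# Generate a key pair (empty passphrase so cron can use it unattended)
ssh-keygen -t ed25519 -N '' -f ~/.ssh/id_ed25519

# Copy the public key to the TrueNAS box so logins need no password
ssh-copy-id backup@10.0.0.8

# Then install the job with: crontab -e
# 0 0 * * * rsync -avhP /mnt/smb/ 10.0.0.8:/mnt/onsite-backup/
```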

Schocko1 (Junior Member, 46 posts)
06-17-2020, 10:00 PM · #10
I regularly back up all persistent data from my Docker containers. Occasionally I make mistakes that cause major issues, so daily backups let me roll back quickly. Some problems don't surface for a week or two, which is why I also keep weekly backups. After a month, though, any issue should have become apparent, and each backup is about 110GB, so pruning at 30 days keeps me around half a TB of duplicate data, which is my limit. As for the joke: I guess you're not familiar with the La Brea Tar Pits in California. It runs deeper because my script, mammoth.sh, compresses the folder into a tar file, and the tar pits are famous for well-preserved prehistoric specimens — especially woolly mammoths.
