Restore rsnapshot Backup
If an rsync or rsnapshot backup has to be restored, it usually has to happen as quickly as possible. We have therefore created the scripts below to restore the data as fast as possible. Both scripts can be executed in parallel to restore the web and email data at the same time. Each script spawns several parallel rsync processes to speed up the restore. The prerequisite is that the backup server can access the target server via an SSH key.
Requirements:
– rsnapshot backup
– SSH key stored at the target server
– Plesk server
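One way to set up the required SSH key is sketched below. The key path matches the scripts' default; the `ssh-copy-id` step and hostname are examples that must be adapted to your target server:

```shell
# Generate a dedicated restore key (example path; the scripts expect /root/.ssh/backupkey)
keyfile="$(mktemp -d)/backupkey"
ssh-keygen -t ed25519 -f "$keyfile" -N "" -C "rsnapshot-restore" >/dev/null

# Install the public key on the target server, e.g.:
#   ssh-copy-id -i "$keyfile.pub" root@sr00.firestorm.ch
# or append the printed line to /root/.ssh/authorized_keys manually:
cat "$keyfile.pub"
```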
The script starts the specified number of parallel rsync processes. It can be customized for any path: for example, "realpath" can be changed to "/var/lib/mysql" to restore each database individually from the file backup. The "sshkey" variable lets you swap in a different SSH backup key.
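Such a customization only touches the settings block at the top of the script. The values below are illustrative (the alternative key path is hypothetical); the rest of the script stays unchanged:

```shell
# Example customization: restore MySQL data directories instead of vhosts
realpath="/var/lib/mysql"           # each database directory is restored individually
sshkey="/root/.ssh/otherbackupkey"  # hypothetical alternative SSH backup key

echo "Restoring $realpath using key $sshkey"
```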
Restore web files (webrestore.sh):
#!/bin/bash

# Settings
process_limit=120                # Max number of parallel rsync processes
server="sr00"                    # Server according to backup path /volume1/backup/$server
daily="0"                        # Daily number according to backup path, starting at 0, e.g. daily.0: /volume1/backup/$server/daily.$daily
remote="sr00.firestorm.ch"       # Hostname or IP of the target server
sshkey="/root/.ssh/backupkey"    # rsync SSH key
backuppath="/volume1/backup/$server/daily.$daily/$server"  # Source backup path from rsnapshot (no trailing slash)
realpath="/var/www/vhosts"       # Target server location (no trailing slash)

# Array of running rsync PIDs
processes=()

function check_processes {
    # For each rsync process
    for pid in "${processes[@]}"; do
        # If the process has died / ended
        if ! ps -p $pid > /dev/null; then
            # For each index in the process array
            for i in "${!processes[@]}"; do
                # If the indexed process equals the dead process
                if [[ ${processes[i]} = $pid ]]; then
                    # Remove the process from the array
                    unset 'processes[i]'
                fi
            done
        fi
    done
}

# For each website in the backup
for folder in $backuppath$realpath/*; do
    domain="$(basename $folder)"
    /bin/echo "##############################"
    # Wait while more than $process_limit processes are running
    while [ ${#processes[@]} -gt $process_limit ]; do
        check_processes
        /bin/sleep 5s
    done
    /bin/echo "/bin/rsync --stats --progress -avz --delete $folder/ root@$remote:$realpath/$domain/ (${#processes[@]} / $process_limit)"
    /bin/rsync --stats --progress -avz --delete -e "ssh -p 22 -i $sshkey" $folder/ root@$remote:$realpath/$domain/ &
    processes+=($!)
done

# Wait for the remaining rsync processes to finish
wait
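Before running the script against production data, it can help to print a dry-run command for a single vhost and execute it manually to validate paths and SSH access. This is a sketch using the script's variables; the domain is a hypothetical example:

```shell
# Build and print a dry-run rsync command for one vhost (nothing is executed here)
server="sr00"
daily="0"
remote="sr00.firestorm.ch"
sshkey="/root/.ssh/backupkey"
backuppath="/volume1/backup/$server/daily.$daily/$server"
realpath="/var/www/vhosts"
domain="example.com"   # hypothetical vhost used for the test run

cmd="/bin/rsync --dry-run --stats -avz --delete -e \"ssh -p 22 -i $sshkey\" $backuppath$realpath/$domain/ root@$remote:$realpath/$domain/"
echo "$cmd"
```

Running the printed command with `--dry-run` shows what would be transferred and deleted without changing any files on the target.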
Restore email data (mailrestore.sh):
#!/bin/bash

# Settings
process_limit=120                # Max number of parallel rsync processes
server="sr00"                    # Server according to backup path /volume1/backup/$server
daily="0"                        # Daily number according to backup path, starting at 0, e.g. daily.0: /volume1/backup/$server/daily.$daily
remote="sr00.firestorm.ch"       # Hostname or IP of the target server
sshkey="/root/.ssh/backupkey"    # rsync SSH key
backuppath="/volume1/backup/$server/daily.$daily/$server"  # Source backup path from rsnapshot (no trailing slash)
realpath="/var/qmail/mailnames"  # Target server location (no trailing slash)

# Array of running rsync PIDs
processes=()

function check_processes {
    # For each rsync process
    for pid in "${processes[@]}"; do
        # If the process has died / ended
        if ! ps -p $pid > /dev/null; then
            # For each index in the process array
            for i in "${!processes[@]}"; do
                # If the indexed process equals the dead process
                if [[ ${processes[i]} = $pid ]]; then
                    # Remove the process from the array
                    unset 'processes[i]'
                fi
            done
        fi
    done
}

# For each mailname in the backup
for folder in $backuppath$realpath/*; do
    domain="$(basename $folder)"
    /bin/echo "##############################"
    # Wait while more than $process_limit processes are running
    while [ ${#processes[@]} -gt $process_limit ]; do
        check_processes
        /bin/sleep 5s
    done
    /bin/echo "/bin/rsync --stats --progress -avz --delete $folder/ root@$remote:$realpath/$domain/ (${#processes[@]} / $process_limit)"
    /bin/rsync --stats --progress -avz --delete -e "ssh -p 22 -i $sshkey" $folder/ root@$remote:$realpath/$domain/ &
    processes+=($!)
done

# Wait for the remaining rsync processes to finish
wait
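The throttling logic shared by both scripts can be demonstrated in isolation. This minimal sketch uses short `sleep` jobs as stand-ins for the long-running rsync transfers and a small limit so the behavior is visible quickly:

```shell
# Minimal sketch of the throttling pattern: keep at most $process_limit
# background jobs running, pruning finished PIDs from the array.
process_limit=3
processes=()

check_processes() {
    for pid in "${processes[@]}"; do
        # kill -0 only checks whether the process still exists
        if ! kill -0 "$pid" 2>/dev/null; then
            for i in "${!processes[@]}"; do
                [ "${processes[i]}" = "$pid" ] && unset 'processes[i]'
            done
        fi
    done
}

for n in 1 2 3 4 5 6; do
    # Block while the limit is reached, re-checking for finished jobs
    while [ "${#processes[@]}" -ge "$process_limit" ]; do
        check_processes
        sleep 0.2
    done
    sleep 0.5 &        # stand-in for a long-running rsync transfer
    processes+=($!)
done
wait
echo "all jobs finished"
```

The same pattern scales to the scripts' limit of 120 rsync processes; only the job command and the sleep interval differ.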