Hello,
Could someone help me polish this script, please?
At the moment I back up my web directories using the following:
Code:
#!/bin/sh
# Back up all website directories and files
# Set general variables
time_stamp=$(date +%d-%m-%Y)
umask 0077
# Backup and Compress all Website directories
cd /usr/local/www/webs/
tar -jcvf domain1.fr.tar.bz2 domain1.fr
tar -jcvf domain2.com.tar.bz2 domain2.com
tar -jcvf domain3.co.uk.tar.bz2 domain3.co.uk
tar -jcvf domain4.me.tar.bz2 domain4.me
# Move backups to locations
mv domain1.fr.tar.bz2 /backups/webfiles/domain1.fr/${time_stamp}_domain1.fr.tar.bz2
mv domain2.com.tar.bz2 /backups/webfiles/domain2.com/${time_stamp}_domain2.com.tar.bz2
mv domain3.co.uk.tar.bz2 /backups/webfiles/domain3.co.uk/${time_stamp}_domain3.co.uk.tar.bz2
mv domain4.me.tar.bz2 /backups/webfiles/domain4.me/${time_stamp}_domain4.me.tar.bz2
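Incidentally, each tar-plus-mv pair can be collapsed into a single tar call that writes straight into the destination; tar's -C flag changes directory for you, so the cd isn't needed either. A minimal runnable sketch using the domain1.fr name from the script above, but built on a throwaway temp tree so it can run anywhere (swap in the real /usr/local/www/webs and /backups/webfiles paths):

```shell
#!/bin/sh
# Sketch: archive a site directly into its backup location, no mv step.
# Uses a temporary directory tree so the example is self-contained.
set -e
root=$(mktemp -d)
mkdir -p "$root/webs/domain1.fr" "$root/backups/webfiles/domain1.fr"
echo '<html></html>' > "$root/webs/domain1.fr/index.html"
time_stamp=$(date +%d-%m-%Y)
# -C switches into the webroot before archiving, so the archive
# contains "domain1.fr/..." paths, same as the cd + tar version
tar -jcf "$root/backups/webfiles/domain1.fr/${time_stamp}_domain1.fr.tar.bz2" \
    -C "$root/webs" domain1.fr
```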
What type of loop can I use so that I don't need to manually add new web directories?
Below is what I've got so far.
Code:
#!/bin/sh
# Backup folder
BACKUPFOLDER=/backups/webfiles
# Website root directories
webroot='/usr/local/www/webs'
time_stamp=$(date +%d-%m-%Y)
umask 0077
# Loop through each directory under the webroot and compress it;
# "$webroot"/*/ matches only directories, so new sites are picked up automatically
for dir in "$webroot"/*/
do
    site=$(basename "$dir")
    mkdir -p "$BACKUPFOLDER/$site"
    tar -jcf "$BACKUPFOLDER/$site/${time_stamp}_${site}.tar.bz2" -C "$webroot" "$site"
done
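One way to finish that loop is to glob the webroot for directories instead of iterating over the single $webroot string. Here is a self-contained sketch of that glob-based loop, run against a temporary layout with made-up site names so it can be tested anywhere:

```shell
#!/bin/sh
# Demo of the glob loop: builds a throwaway webroot, backs up every
# site directory it finds, then lists the results.
# domainA.example / domainB.example are invented names for the demo.
set -e
root=$(mktemp -d)
webroot="$root/webs"
BACKUPFOLDER="$root/backups/webfiles"
mkdir -p "$webroot/domainA.example" "$webroot/domainB.example"
time_stamp=$(date +%d-%m-%Y)
umask 0077
# "$webroot"/*/ expands to every directory under the webroot,
# so adding a new site directory is enough to get it backed up
for dir in "$webroot"/*/
do
    site=$(basename "$dir")
    mkdir -p "$BACKUPFOLDER/$site"
    tar -jcf "$BACKUPFOLDER/$site/${time_stamp}_${site}.tar.bz2" -C "$webroot" "$site"
done
ls "$BACKUPFOLDER"
```

With the real paths from the script above, only the webroot and BACKUPFOLDER assignments need to change.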