[Solved] Little script to back up my directory

Hello,

At the moment I back up my web directory using the following script:
Code:
#!/bin/sh
# Back up all website directories and files

# Set general variables
time_stamp=$(date +%d-%m-%Y)

umask 0077

# Backup and Compress all Website directories
cd /usr/local/www/webs/
tar -jcvf domain1.fr.tar.bz2 domain1.fr
tar -jcvf domain2.com.tar.bz2 domain2.com
tar -jcvf domain3.co.uk.tar.bz2 domain3.co.uk
tar -jcvf domain4.me.tar.bz2 domain4.me

#  Move backups to locations
mv domain1.fr.tar.bz2 /backups/webfiles/domain1.fr/${time_stamp}_domain1.fr.tar.bz2
mv domain2.com.tar.bz2 /backups/webfiles/domain2.com/${time_stamp}_domain2.com.tar.bz2
mv domain3.co.uk.tar.bz2 /backups/webfiles/domain3.co.uk/${time_stamp}_domain3.co.uk.tar.bz2
mv domain4.me.tar.bz2 /backups/webfiles/domain4.me/${time_stamp}_domain4.me.tar.bz2
Could someone help me polish this script, please?

What type of loop can I use so that I don't need to manually add new web directories?
Below is what I got so far.

Code:
#!/bin/sh
#Backup folder
BACKUPFOLDER=/backups/webfiles

# Website root directories
webroot='/usr/local/www/webs'

# loop through the webroot directory and do compression

for i in $webroot
do
???
 
Code:
for dir in $webroot/*; do
  echo "directory: $dir"
done

As always, sh(1) will fail if there are spaces in any of those directory names. Quoting can mostly work around that, with the shell fighting at every step.
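For example, quoting the glob expansion keeps each name in one piece; this is a sketch using the webroot from the original script (the directory names are assumptions):

```shell
#!/bin/sh
# Sketch: a quoted glob loop that survives spaces in directory names.
webroot='/usr/local/www/webs'

for dir in "$webroot"/*; do
  [ -d "$dir" ] || continue     # skip plain files (and a non-matching glob)
  echo "directory: $dir"        # "$dir" stays whole even if it contains spaces
done
```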
 
When I have to loop through files or directories, I always use this method:
Code:
# $my_listing_command is usually something like 'ls' or 'find'
$my_listing_command /my/path | while read f;
do
  do_something_with "$f"
done
This way I'm sure that spaces are never a problem (although a filename containing a newline would still break it).
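A concrete instance of that pattern, with find as the listing command and echo standing in for the real work (the path is the webroot from the earlier posts):

```shell
#!/bin/sh
# find emits one path per line; read takes each whole line into $f,
# so embedded spaces survive as long as "$f" stays quoted.
find /usr/local/www/webs -maxdepth 1 -type d | while read f; do
  echo "processing: $f"
done
```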
 
Hi guys,

Taking your advice on board, I came up with the following:
Code:
#!/bin/sh

# Website root directories
webroot='/home/sysadmin/test'

# loop through the webroot directory and do compression
rootdir= find $webroot -maxdepth 1 -type d ! -iname ".*"
for dir in $rootdir; do
  echo "directory: $dir"
done
This results in the following output:
Code:
/home/sysadmin/test
/home/sysadmin/test/databaseExport
/home/sysadmin/test/dotfiles
/home/sysadmin/test/bugPatch
/home/sysadmin/test/404_error_template
/home/sysadmin/test/myScripts
/home/sysadmin/test/migrationFiles
Now, how can I replace the echo with the real thing?
In my original script I do tar -jcvf domain1.fr.tar.bz2 domain1.fr
If I do
Code:
for dir in $rootdir; do
    tar -jcvf $dir.tar.bz2 $dir
done
nothing happens; none of the directories are compressed.

I'd appreciate any feedback.

Thank you
Fred
 
Try this:
Code:
WEBROOT=/usr/local/www/webs
BACKUP=/backups/webfiles

cd "$WEBROOT" || exit 1
find . -maxdepth 1 -type d ! -iname ".*" | while read D; do
   # I hate spaces in file names
   NOSPACE=`basename "$D" | tr ' ' '_'`
   mkdir -p "$BACKUP/$NOSPACE"

   TIMESTAMP=$(date +%d-%m-%Y)
   TARFILE="$BACKUP/$NOSPACE/${TIMESTAMP}_${NOSPACE}.tar.bz2"

   tar -jcvf "$TARFILE" "$D"
done
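For what it's worth, the version that printed nothing likely failed because `rootdir= find ...` runs find with an empty rootdir variable in its environment instead of capturing find's output. Command substitution does the capturing; a minimal sketch with the test path from your post:

```shell
#!/bin/sh
webroot='/home/sysadmin/test'

# $(...) captures find's output into the variable; without it, the bare
# "rootdir=" is just an empty environment entry passed to the find command
rootdir=$(find "$webroot" -maxdepth 1 -type d ! -iname ".*")
for dir in $rootdir; do
  echo "directory: $dir"
done
```

Note that the unquoted `$dir` in the for loop still splits on spaces, which is why the while-read version above is the safer pattern.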
 
Thank you very much Dies_Irae
Could you please explain how this code works? I looked at basename, but I'm not sure what it does here:
Code:
NOSPACE=`basename "$D" | tr ' ' '_'`
 
basename returns the last part of a path:
Code:
$ basename /usr/local/bin/firefox
firefox
The command basename "$D" | tr ' ' '_' takes the last component of the path "$D" and replaces every space with an underscore ('_').
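For example, with a hypothetical path whose last component contains a space:

```shell
# "my site" is a made-up directory name to show the transformation:
# basename keeps only the last component, tr swaps the space for '_'
basename "/usr/local/www/webs/my site" | tr ' ' '_'
# prints: my_site
```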
 