[Solved] Script doesn't run when executed from cron

fred974

Daemon


Hi all,

I have a backup script that works as expected when run manually, but when I try to execute it via a cron job, nothing happens.
crontab -l
Code:
SHELL=/bin/sh
PATH=/bin:/sbin:/usr/bin:/usr/sbin
HOME=/var/log
#
#minute (0-59)
#|   hour (0-23)
#|   |    day of the month (1-31)
#|   |    |   month of the year (1-12 or Jan-Dec)
#|   |    |   |   day of the week (0-6 with 0=Sun or Sun-Sat)
#|   |    |   |   |   commands
#|   |    |   |   |   |
#### run database backup script every day at 4am
00   4    *   *   *   /root/dbatools/backupDatabases.sh >> /var/log/dbbackup.log 2>&1
Could someone please tell me if you see anything obviously wrong with my cron?

Thank you

Fred
 

SirDice

Administrator

Check /var/log/cron to see if it actually runs. Also look at your /var/log/dbbackup.log file; the most common mistake is not realizing cron jobs have a limited PATH.
 
fred974 (OP)

Daemon

Thank you for your input. From the output below I can see that the script is called, but I don't have any more info...
cat /var/log/cron
Code:
Nov  9 04:00:00 holy newsyslog[6882]: logfile turned over due to size>100K
Nov  9 04:00:03 holy /usr/sbin/cron[6888]: (root) CMD (/root/dbatools/backupDatabases.sh >> /var/log/dbbackup.log 2>&1)
Nov  9 04:00:10 holy /usr/sbin/cron[6886]: (root) CMD (/usr/libexec/atrun)
Nov  9 04:05:12 holy /usr/sbin/cron[6947]: (root) CMD (/usr/libexec/atrun)
Nov  9 04:10:04 holy /usr/sbin/cron[6962]: (root) CMD (/usr/libexec/atrun)
Nov  9 04:11:00 holy /usr/sbin/cron[6972]: (operator) CMD (/usr/libexec/save-entropy)
cat /var/log/dbbackup.log
Code:
/root/dbatools/backupDatabases.sh: mysql: not found

mv: rename /backup/mysql/inprogress/*.gz to /backup/mysql/20151030/*.gz: No such file or directory
/root/dbatools/backupDatabases.sh: mysql: not found
mv: rename /backup/mysql/inprogress/*.gz to /backup/mysql/20151031/*.gz: No such file or directory
/root/dbatools/backupDatabases.sh: mysql: not found
mv: rename /backup/mysql/inprogress/*.gz to /backup/mysql/20151101/*.gz: No such file or directory
/root/dbatools/backupDatabases.sh: mysql: not found
mv: rename /backup/mysql/inprogress/*.gz to /backup/mysql/20151102/*.gz: No such file or directory
/root/dbatools/backupDatabases.sh: mysql: not found
mv: rename /backup/mysql/inprogress/*.gz to /backup/mysql/20151103/*.gz: No such file or directory
/root/dbatools/backupDatabases.sh: mysql: not found
mv: rename /backup/mysql/inprogress/*.gz to /backup/mysql/20151104/*.gz: No such file or directory
/root/dbatools/backupDatabases.sh: mysql: not found
mv: rename /backup/mysql/inprogress/*.gz to /backup/mysql/20151105/*.gz: No such file or directory
/root/dbatools/backupDatabases.sh: mysql: not found
mv: rename /backup/mysql/inprogress/*.gz to /backup/mysql/20151106/*.gz: No such file or directory
/root/dbatools/backupDatabases.sh: mysql: not found
mv: rename /backup/mysql/inprogress/*.gz to /backup/mysql/20151107/*.gz: No such file or directory
/root/dbatools/backupDatabases.sh: mysql: not found

mv: rename /backup/mysql/inprogress/*.gz to /backup/mysql/20151108/*.gz: No such file or directory
/root/dbatools/backupDatabases.sh: mysql: not found

mv: rename /backup/mysql/inprogress/*.gz to /backup/mysql/20151109/*.gz: No such file or directory
/root/dbatools/backupDatabases.sh: mysql: not found

mv: rename /backup/mysql/inprogress/*.gz to /backup/mysql/20151110/*.gz: No such file or directory
/root/dbatools/backupDatabases.sh: mysql: not found

mv: rename /backup/mysql/inprogress/*.gz to /backup/mysql/20151111/*.gz: No such file or directory
/root/dbatools/backupDatabases.sh: mysql: not found

mv: rename /backup/mysql/inprogress/*.gz to /backup/mysql/20151112/*.gz: No such file or directory
vi backupDatabases.sh
Code:
#!/bin/sh
#
# Script for backup MySQL databases in crontab
# Tested on ubuntu +10.04 LTS
#
# Created by Jesper Grann Laursen, powerlauer AT gmail DOT com
# https://github.com/lauer/scripts/blob/master/mysql/backupDatabases.sh
#
# configfile
config=/etc/backupDatabases.conf

### Example of configfile
###
## backuppath
# backupdir='/backup/mysql'
## database informations
# DBuser='backup'
# DBpass='backup'
###
### End of configfile

# Errorvalue
error=0

# day of week
date=$(date +%Y%m%d)
datetime="$date.$(date +%H%M%S)"

# load configfile
if [ -f $config ]; then
        . $config
else
        echo ""
        echo "Error: Need config file: $config"
        echo ""
        exit 1
fi

# Setupcheck
if [ -z "$backupdir" -o -z "$DBuser" -o -z "$DBpass" -o ! -d "$backupdir" ]; then
    echo "Error: Remember to setup the username, password and path to backup"
    exit 1
fi

DBlogin="--user=$DBuser --password=$DBpass"
DBoptions="--opt --hex-blob --force"
dblist=`echo show databases\; | mysql $DBlogin | /usr/bin/tail -n +2 | grep -v information_schema`
logfile=$backupdir/backup.$datetime.log

# clean from old stopped backup
rm -f $backupdir/inprogress/*
rm -f $backupdir/backup.*.log

echo -n "Backup started: " > $logfile
date >> $logfile
echo "" >> $logfile

# clean old backups (more than 7 days old)
oldbackuplist=`find $backupdir/* -type d -mtime +7`
for olddir in $oldbackuplist
do
        echo ""
        echo "Deleting: $olddir" >> $logfile
        rm -f $olddir/*
        rmdir $olddir 2>> $logfile
done

mkdir -p $backupdir/inprogress
for dbname in $dblist
do
    echo "Backing up $dbname " >> $logfile
    echo " $(date +%H:%M:%S) - Dump cycle" >> $logfile
    mysqldump --single-transaction $DBoptions $DBlogin $dbname > $backupdir/inprogress/${dbname}.$datetime.sql 2>> $logfile
    if [ $? -eq 0 ]; then
        echo " $(date +%H:%M:%S) - Compression Cycle" >> $logfile
        gzip $backupdir/inprogress/${dbname}.$datetime.sql >/dev/null 2>&1
        echo " $(date +%H:%M:%S) - $dbname finished!" >> $logfile
    else
      echo " $(date +%H:%M:%S) - Failed to make dump! ($dbname)" >> $logfile
      error=1
    fi
    echo "" >> $logfile
done

echo "Moving compressed files into $date" >> $logfile
echo "" >> $logfile
mkdir -p $backupdir/$date
mv $backupdir/inprogress/*.gz $backupdir/$date

echo -n "Backup ended: " >> $logfile
date >> $logfile
mv $logfile $backupdir/$date
rmdir $backupdir/inprogress

if [ $error -eq 1 ]; then
                echo "Error: Some databases were not completed!"
                echo "See logfile: $backupdir/$date/$(basename $logfile)"
    exit 1
fi
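The repeated "mysql: not found" lines in dbbackup.log point at a PATH problem rather than at the script's logic. A minimal sketch of how a restricted PATH hides a command (the mydb name and the temporary directory are stand-ins, not the real MySQL client):

```shell
#!/bin/sh
# Create a throwaway directory that plays the role of /usr/local/bin,
# with a fake client binary in it (purely illustrative).
fakebin=$(mktemp -d)
printf '#!/bin/sh\necho ok\n' > "$fakebin/mydb"
chmod +x "$fakebin/mydb"

# An interactive shell has the directory on its PATH, so the command runs.
PATH="/bin:/usr/bin:$fakebin" mydb

# Cron's PATH (as set in the crontab above) omits it, so the lookup fails,
# mirroring the "mysql: not found" entries in the log.
PATH="/bin:/sbin:/usr/bin:/usr/sbin" mydb 2>/dev/null || echo "mydb: not found"

rm -rf "$fakebin"
```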
 
fred974 (OP)

Daemon

Hi wblock@
I cannot open the link.
Could you please show me an example of what you mean by the script not having paths?
The script runs fine when executed manually; I only have a problem when running it via a cron job.
 

Oko

Daemon


fred974 said:
Hi wblock@
I cannot open the link.
Could you please show me an example of what you mean by the script not having paths?
The script runs fine when executed manually; I only have a problem when running it via a cron job.
This is the PATH environment variable
Code:
PATH=/bin:/sbin:/usr/bin:/usr/sbin
which is what the script called by cron sees. Now use which(1) to check the full path of the mysql command. You will notice that it lives somewhere like

/usr/local/bin

so cron has no clue where to find mysql. When you call the command from the command line, your shell reads your .cshrc, which sets the following path
Code:
set path = (/sbin /bin /usr/sbin /usr/bin /usr/games /usr/local/sbin /usr/local/bin $HOME/bin)
As you can see, /usr/local/bin is in that path.

The moral of the story is that you have to fix the PATH in your crontab, or better yet, use absolute paths for every command in the script.
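Concretely, a fixed crontab header might look like this (a sketch; /usr/local/bin is where the mysql client typically lands on FreeBSD, adjust to wherever which mysql points on your system):

```shell
SHELL=/bin/sh
# Extend cron's PATH so ports/packages binaries such as mysql are found:
PATH=/bin:/sbin:/usr/bin:/usr/sbin:/usr/local/bin:/usr/local/sbin
HOME=/var/log
```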
 

wblock@

Administrator
Developer

Could you please show me an example of what you mean by the script not having paths?
Consider this line from the script:
Code:
dblist=`echo show databases\; | mysql $DBlogin | /usr/bin/tail -n +2 | grep -v information_schema`
That runs mysql. But it runs it without an explicit path, assuming that the shell will know where to find it. That often fails with cron(8), which has a limited path. Assuming the mysql binary is in /usr/local/bin/, that should be:
Code:
dblist=`echo show databases\; | /usr/local/bin/mysql $DBlogin | /usr/bin/tail -n +2 | grep -v information_schema`
It's interesting that there is a full path for tail, which does not need one even under cron's limited PATH.

Years back, I just decided to declare the full path for every external command at the start of a script, then use those variables later. For example, I would do this for the code above:
Code:
GREP="/usr/bin/grep"
MYSQL="/usr/local/bin/mysql"
TAIL="/usr/bin/tail"
...
dblist=`echo show databases\; | ${MYSQL} $DBlogin | ${TAIL} -n +2 | ${GREP} -v information_schema`
I write sh(1) in this very strict way with the forlorn hope that it will make sh(1) less awful. It doesn't, but at least it's technically correct.
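Extending that idea, a small guard can fail fast when one of those full paths points at a missing binary, so a PATH or install problem surfaces immediately instead of as a half-finished backup. A sketch, not from the original script; the paths are assumptions:

```shell
#!/bin/sh
# Verify each required tool exists and is executable before doing any work.
require() {
    for tool in "$@"; do
        if [ ! -x "$tool" ]; then
            echo "Error: required command not found: $tool" >&2
            return 1
        fi
    done
}

# In the real script you would exit on failure, e.g.:
#   require "$MYSQL" "$MYSQLDUMP" "$GZIP" || exit 1
# Demonstration with a path that certainly exists and one that does not:
require /bin/sh && echo "sh: ok"
require /nonexistent/mysql 2>/dev/null || echo "missing tool detected"
```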
 
fred974 (OP)

Daemon

Thank you guys,
You made it really clear and I know what to do now :)
 

allen.konstanz

New Member



For me, the following works:

First, store your password and connection details in an encrypted login file with the command below:
mysql_config_editor set --login-path=DBDescription --host=DB#.pair.com --user=DBUser --port=3306 --password

Check that it worked:
mysql_config_editor print --login-path=DBDescription

The output will be something like this:
Code:
[DBDescription]
user = root
password = *****
host = localhost
port = 3306
Now create a shell script:

Bash:
#!/bin/sh
#Set paths
MYDUMP=/usr/local/bin/mysqldump
BACKUPDIR=/usr/local/www/apache24/data
MYGZIP=/usr/bin/gzip
MYFIND=/usr/bin/find

#Dump
$MYDUMP --login-path=DBDescription DBNAME > $BACKUPDIR/mysql-DBNAME.`date '+%d-%B-%Y--%Hh'`.sql

#GZIP the file
$MYGZIP $BACKUPDIR/mysql-DBNAME.`date '+%d-%B-%Y--%Hh'`.sql
And finally, create a cron job. Here I edit the system crontab:
# nano /etc/crontab
and add at the end of the file:
Bash:
00 04 * * * root /usr/local/www/apache24/data/mybackup.sh
Based on the information on this page: link
 