tar zcf has problems with very large files

Hi,
I have written a backup script which creates a tar.gz of a directory.
The directory contains various file types, most of them around 10 MB; altogether we have about 60 GiB of data.

Now tar should pack this together and compress it with gzip. The archive gets the date as its name and is then kept for about 30 days in a separate backup directory.
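A minimal sketch of a script like the one described (the paths and variable names here are throwaway examples so the sketch is self-contained, not the poster's actual configuration):

```shell
#!/bin/sh
# Sketch of the described backup: dated tar.gz, ~30-day retention.
# All paths below are made-up examples, not the original script's.
SOURCE=./srcdata                    # directory to back up
BACKUPDIR=./backup                  # where dated archives are parked
DATUM=$(date +%d-%m-%Y)             # archive name, e.g. 20-07-2011

mkdir -p "${SOURCE}" "${BACKUPDIR}"
echo "sample data" > "${SOURCE}/file.txt"

# Pack and compress the source into a dated .tar.gz
tar -zcf "${BACKUPDIR}/${DATUM}.tar.gz" "${SOURCE}"

# Drop archives older than ~30 days
find "${BACKUPDIR}" -name '*.tar.gz' -mtime +30 -delete
```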

So much for the theory. The script works properly on GNU/Linux, but under FreeBSD there are many issues.

I have now found that the issue is with tar; the rest of the script works fine, and the script even shows up in the cron logs.


Until now there has been only one run where the script worked properly, and since then I haven't been able to locate any new backups.
When I execute the script manually I can see tar working, but at the end something makes tar fail every time.
I know this because the script ends with:
Code:
tar zcvf ${BACKUPDIR}/${DATUM}.tar.gz ${SOURCE} && halt -p
And the server is always still on when I come back...

Sorry for the awful English. Regards
 
This is the problem:
I can't post an error directly; I can only say what doesn't work.
The server has to be available from 8:00 to 24:00. When I start the backup via SSH overnight, I come back in the morning to an SSH session that lost its connection after a few hours.
Because I lose the SSH connection, I am never able to see what's going on.

Regards
 
Yes, but you might want to redirect STDERR too, not just STDOUT.
 
@SirDice
Ah OK, this was new to me with regard to (ba)sh.
So if I have understood this correctly, I should use
Code:
tar cvzf ${BACKUPDIR} ${SOURCE} 2>&1 output

@wblock
ok, changed

Regards
 
bsus said:
So if I have understood this correctly, I should use
Code:
tar cvzf ${BACKUPDIR} ${SOURCE} 2>&1 output
This will produce an error.

[cmd=]tar -zcvf ${BACKUPDIR} ${SOURCE} > output 2>&1[/cmd]

Keep in mind this only works with (ba)sh; C shells use a different syntax.
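For reference, a minimal sketch contrasting the two redirection styles (the commands and file names are made up for illustration; the csh form is shown only as a comment since this sketch runs under /bin/sh):

```shell
#!/bin/sh
# Bourne-style shells (sh, bash): first redirect stdout to a file,
# then duplicate stderr onto stdout. Order matters: '2>&1' must
# come after '> output'.
sh -c 'echo to-stdout; echo to-stderr 1>&2' > output 2>&1

# C shells (csh, tcsh) use '>&' to send stdout and stderr together:
#   tar -zcvf backup.tar.gz srcdir >& output

cat output   # both lines end up in the file
```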
 
bsus said:
This is the problem:
I can't post an error directly; I can only say what doesn't work.
The server has to be available from 8:00 to 24:00. When I start the backup via SSH overnight, I come back in the morning to an SSH session that lost its connection after a few hours.
Because I lose the SSH connection, I am never able to see what's going on.

You just described the main reason for the existence of terminal multiplexers like sysutils/screen and sysutils/tmux.

Here's what you do:
  • install sysutils/tmux on the remote server
  • connect to remote server via ssh
  • start tmux
  • start the backup process

Now, if you lose the SSH connection, it doesn't matter, the backups will still be running in the tmux session. Just reconnect to the server via SSH, and reconnect to the tmux session:
$ tmux attach

It will be like you never lost the connection. :) You can even scroll through the tmux screen buffer using CTRL+B, [ (that's the left square bracket) then the cursor keys/page up/down keys (hit ESC to break out of scroll mode).
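The steps above, as a terminal session (the host name and script path are made up for illustration; how you install tmux depends on your FreeBSD version and whether you use packages or ports):

```
$ ssh me@backupserver.example      # connect to the remote server
$ tmux                             # start a new tmux session
$ /root/bin/backup.sh              # start the backup inside tmux
  ... connection drops overnight ...
$ ssh me@backupserver.example      # reconnect
$ tmux attach                      # reattach; the backup kept running
```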
 
@phoenix
Thanks for the good tip ;)

So, I ran the backup script all the way through.
First off: the backup should take up about 30 GB of space.
Code:
ls -l /media/backup   
total 30371384
-rw-r--r--  1 root  wheel   3095080960 Jul 14 23:43 14-07-2011.tar.gz
-rw-r--r--  1 root  wheel  27982143697 Jul 20 21:12 20-07-2011.tar.gz
Is this realistic?


The script itself ended with
Code:
tar: output: Cannot stat: No such file or directory
tar: Error exit delayed from previous errors.
 
bsus said:
The script itself ended with
Code:
tar: output: Cannot stat: No such file or directory
tar: Error exit delayed from previous errors.

You missed the redirect (>).
tar tried to archive a file named output, which doesn't exist.
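To see why, here is a minimal reproduction (the directory and file names are made up for illustration; assumes a Bourne-style shell and bsdtar or GNU tar):

```shell
#!/bin/sh
mkdir -p srcdir && echo hello > srcdir/file.txt

# Missing '>': the word 'output' is passed to tar as one more
# path to archive; tar fails because no such file exists.
tar -zcvf broken.tar.gz srcdir 2>&1 output || echo "tar failed: no file named 'output'"

# Corrected: '>' redirects tar's listing into the file 'output'
# instead of handing it to tar as an argument.
tar -zcvf ok.tar.gz srcdir > output 2>&1
```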
 