
What is the best compression method for backups?

T. Braun

New Member

Thanks: 3
Messages: 4

#26
Another possibility, if a dedicated backup tool might be an option: I'd like to recommend the restic project. I've been using it for more than a year now on multiple FreeBSD servers and I'm really happy with it. Backups are encrypted, files are split into chunks and compressed, chunk deduplication is built in, it's very reliable, and (after the initial backup) incremental backups are incredibly fast. Restores also work flawlessly, as I've had to test on multiple occasions.
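For reference, a minimal restic workflow looks roughly like this (the repository path and directories are just placeholders, not my actual setup; restic will prompt for a repository password on init):

restic init --repo /path/to/repo
restic -r /path/to/repo backup /usr/home /etc
restic -r /path/to/repo snapshots
restic -r /path/to/repo restore latest --target /tmp/restore

The first run uploads everything; later runs only send chunks the repository hasn't seen yet, which is why the incremental backups are so fast.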
 

herrbischoff

Active Member

Thanks: 69
Messages: 165

#27
Have you considered borg?
I can attest to the reliability of Borg. It's the main backup tool in place on all of my clients' servers and has proven rock solid thus far. Several backups are multiple terabytes in size and every restore so far has been successful. Also, due to the (optional) built-in AES-CTR-256 encryption, offsite storage of the backups is one less headache.

https://borgbackup.readthedocs.io/en/stable/
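For anyone curious, the basic Borg workflow is roughly this (paths and the archive name are placeholders, adapt them to your own layout):

borg init --encryption=repokey /path/to/repo
borg create --stats /path/to/repo::'{hostname}-{now}' /etc /usr/home
borg extract /path/to/repo::<archive name>

With repokey the encrypted key lives inside the repository itself, which keeps offsite copies self-contained as long as the passphrase is strong.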
 

Chris_H

Aspiring Daemon

Thanks: 129
Messages: 914

#28
tar -cPv --bzip2 -f <tar filename> --exclude <exclude filename> <target dir for archiving>

The above command does not work... However, the following command does:

tar -cPvjf <tar filename> --exclude <exclude filename> <target dir for archiving>
I've noticed that too. But then again, why would I not want to use
tar -cPvjf <tar filename> --exclude <exclude filename> <target dir for archiving>
in the first place? Fewer characters to enter, and easier to understand and remember...
In the end, I do remember struggling with something like that in the past, and as memory serves, I think I finally determined that it was the dangling -f that tripped it up.

But for my money
tar cvf - <some-data/some-place> | xz -9e > <some-place/some-filename>
works really well, for all my needs.
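Extraction is just the reverse pipe, assuming the same placeholder paths:

xz -dc <some-place/some-filename> | tar -xvf -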

--Chris
 

phoenix

Administrator
Staff member
Administrator
Moderator

Thanks: 1,124
Messages: 3,944

#29
tar -cPv --bzip2 -f <tar filename> --exclude <exclude filename> <target dir for archiving>

The above command does not work... However, the following command does:

tar -cPvjf <tar filename> --exclude <exclude filename> <target dir for archiving>
tar -cPv --use-compress-program /path/to/program -f <tar filename> --exclude <exclude filename> <target dir for archiving>

Then you can use any compression program you want.

Or, do it the old-fashioned way where you tar everything up into a single giant tarball, and then run the compression program against that file afterward.
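A sketch of that second approach, using the same placeholders as the commands above (any compressor works; xz is just an example):

tar -cPvf <tar filename> --exclude <exclude filename> <target dir for archiving>
xz -9 <tar filename>

When xz finishes, it replaces <tar filename> with <tar filename>.xz.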
 

Eric A. Borisch

Well-Known Member

Thanks: 204
Messages: 313

#30
Or the Unix way:

tar -vcPf - --exclude <excl> <target dir for archiving> | lbzip2 > file.tbz

This makes tar write the archive to stdout (-f -), compresses that stream (stdin to stdout) with lbzip2 (replace it with the tool of your choice), and redirects the result into file.tbz.
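As an illustration of swapping the tool, the same pipeline with zstd would look like this (the output name is just a placeholder):

tar -vcPf - --exclude <excl> <target dir for archiving> | zstd -19 -T0 > file.tar.zst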
 

poorandunlucky

Well-Known Member

Thanks: 26
Messages: 359

#31
I haven't read the whole thread, but I just want to say that for something like this you should definitely run benchmarks to see what works best with your hardware and your load... Prepare sample data that is representative of what's on your system, maybe a full image if you have enough time/space, and compare the methods/algorithms...

Backups are recurrent, and both time- and resource-consuming... Because of that, small differences can become big differences over weeks, months, years, decades...

Also, if you're in a mission-critical situation, you may want to benchmark and factor in your decompression time...
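As an example of what I mean, a throwaway loop like this against a representative tarball (the file names and the compressor list are only examples) already tells you a lot:

for prog in gzip bzip2 xz zstd; do
    echo "== $prog =="
    /usr/bin/time -h $prog -c sample.tar > sample.tar.$prog
    ls -l sample.tar.$prog
done

Do the same with -d on the results to get the decompression side, and compare both the times and the sizes.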
 