today's sense of tar.xz/tar.gz

Hello,

Since this is a bit off-topic:

Is there any sense in tar'ing archives anymore? Many people are not using tapes any longer, and in most cases compression ratios don't really matter, given fast internet connections and cheap storage.

But some programs, like Matlab, need over 10 GiB of space. Why isn't the UNIX world using 7z, which has a better compression ratio than xz or gz (at default settings)? It is open source, and most decompressors can handle multiple formats, so backwards compatibility (with old repositories/packages) should be easy to maintain.

Greetings
 
Maybe you have never seen anything other than a fast Internet connection in your life.
My first internet login was through a 14K4 modem; it took forever to download a single JPG. Today's kids have been spoiled by 100 Mbit+ download speeds.
 
I myself am not too interested in compression since I don't transfer the archive over the network. The tar 'utility' as it is in FreeBSD works perfectly for my needs.
 
Note that tar(1) does not do compression. Compression is done after the tarball has been created. Yes, FreeBSD's tar(1) can do compression, but this is a two-stage process: first the tarball is created, then it's compressed. Which means you can use any compression you like, just not directly with the tar(1) tool itself, as that only supports a couple of compression algorithms.
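Concretely, the two stages look like this (all file and directory names here are just for illustration):

```shell
# Throwaway directory to archive:
src=$(mktemp -d)
echo "hello" > "$src/file.txt"

# Stage 1: create a plain, uncompressed tarball...
tar -cf archive.tar -C "$src" file.txt
# Stage 2: ...then compress it with whatever tool you prefer.
gzip -9 archive.tar                      # produces archive.tar.gz

# Or as one pipeline, so the intermediate tarball never touches disk
# (substitute xz, bzip2, zstd, ... for gzip as you like):
tar -cf - -C "$src" file.txt | gzip -9 > archive2.tar.gz
```

The pipeline form is why tar and the compressor stay separate tools: tar only needs to emit a byte stream, and anything downstream can consume it.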
 
I still remember 56K modems. And it was bad. But why do you still tar the files? There are archive formats which both group files and compress them, instead of needing two utilities.
 
Because it's pretty much universally available on every UNIX and UNIX-like system.
 
Because it's pretty much universally available on every UNIX and UNIX-like system.
But doesn't that cause trouble if you just want to extract one file, or replace one file with another, without extracting and recompressing everything?
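For what it's worth, tar can pull a single member out of a compressed archive, though the whole stream still has to be decompressed to find it; a quick sketch (file names made up):

```shell
# Build a small compressed tarball to play with:
mkdir -p demo
echo "one" > demo/a.txt
echo "two" > demo/b.txt
tar -czf demo.tar.gz demo
rm -r demo

# Extracting a single member works, but tar must still decompress and
# scan the gzip stream from the start to locate that entry:
tar -xzf demo.tar.gz demo/b.txt

# Replacing a member is the painful part: append mode (-r) only works
# on an UNcompressed tarball, so updating demo.tar.gz means
# decompressing, modifying, and recompressing the whole archive.
```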
 
There's definitely sense in all this: standardization. I'm a very devoted RAR user myself, been using it ever since the program first came out (as the good ole DOS archiver, with ditto Norton Commander-like interface) and registered it multiple times. This was super exciting because the first time I did so was for my BBS and guess what? The author himself sent me the registration key by direct netmail. Good times! (Read: he sent me a netmail (which predates e-mail, used on FidoNet), and instead of routing it over the network he mailed it to me directly.)

Nowadays I keep that BBS license in my archives and registered it again (as WinRAR this time), now for my de facto domain name. And I also use this on most of my FreeBSD servers.

Having said that: here I am, also still heavily using tar. Either with GZip (-z), BZip2 (-j) or xz (-J).

My main concern is safety. When something goes wrong for whatever reason then all I need is my trusty FreeBSD rescue CD and I'll know up front that I can access these archives. Either using the provided tar or /rescue/tar. That last part is a big thing for me because that means it can be used as your last line of defense.

Another concern is speed. Higher compression ratios often translate to longer compression times, and that's not always desirable. And with ZFS around, where all disk space is shared among all filesystems, there really isn't as much concern about space for me right now.

And what the others said: if this isn't good enough (to each their own), then all it takes is a shell script which can take care of the extra compression for you.
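Such a script can be tiny. A sketch, assuming the p7zip port provides a `7z` binary (the function name is made up):

```shell
# tar7z: pack a directory with tar, then compress the stream with 7z.
# Sketch only -- assumes p7zip's `7z` command is installed.
tar7z() {
    dir=$1
    out=${2:-"$dir.tar.7z"}
    # -si tells 7z to read the archive data from standard input
    tar -cf - "$dir" | 7z a -si "$out"
}

# usage: tar7z myproject          # writes myproject.tar.7z
```

You keep tar's universal, rescue-CD-friendly container, and only the outer compression layer depends on the extra tool.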
 
Also, you may not need any compression at all. E.g. if you have a bunch of JPEG files and try to combine them by adding them to a zip or rar archive, you'll get a larger size (with default options). So you can simply tar them.
By the way, that is how e.g. HP-UX used to write floppies: just a plain tar, without any pre-formatting.
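The effect is easy to reproduce: already-compressed data doesn't shrink further (random bytes stand in for JPEG payloads here), while a plain tar only adds a little per-file header overhead:

```shell
# 100 kB of random bytes as a stand-in for an already-compressed JPEG:
head -c 100000 /dev/urandom > fake.jpg

# gzip cannot shrink it; the output is usually slightly LARGER:
gzip -c fake.jpg > fake.jpg.gz
wc -c fake.jpg fake.jpg.gz

# A plain tar just adds a 512-byte header plus block padding per file:
tar -cf photos.tar fake.jpg
wc -c photos.tar
```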
 