Hi,
I don't know what the right way to do this is. On a server, I run 5 jails, each with various ports installed (they are not service jails). They are used as virtual machines by different users, each jail having its own administrator.
As some jails use the same packages (e.g. apache together with either mysql or postgresql), it would be nice to find a way to reduce memory and disk usage by sharing data...
The "world" on which jails run is a read-only filesystem mounted through unionfs (modifications handled in a more or less "cow" way).
The benefits I see are:
- reduced memory footprint both in RAM and on disk (shared binaries are loaded only once into virtual memory, which should also improve CPU cache usage [am I wrong?]).
- easier tracking of vulnerabilities and software updates, except...
That is where the problem lies: I can't force someone to update the packages in his jail. In the absence of ZFS dedup, files installed multiple times have to be tracked manually (which seems preferable in this case anyway).
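Just to see how much really is duplicated, I imagine something like this quick sketch (assuming the jail roots all live under /jails/ and that the interesting files are under /usr/local; both are assumptions for illustration):

    #!/bin/sh
    # List files whose content (same sha256) shows up in more than one
    # place under the jails' /usr/local trees. Paths are illustrative only.
    for j in /jails/*; do
        find "$j/usr/local" -type f -exec sha256 -r {} +
    done | sort | awk '
        $1 == prev { if (!shown) print prevline; print; shown = 1; next }
        { prev = $1; prevline = $0; shown = 0 }'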
The question is how to find a clean way to do that without corrupting the pkg database.
My current idea is to provide the files of the most common packages, kept up to date, on a read-only unionfs layer, and then use those files instead of the freshly installed ones when a user installs a package (either by running a command after pkg_add, or by playing with its MASTER/SLAVE mode to filter which files get installed).
Then... play with file links to replace the new files with the shared ones? Or use unionfs whiteouts to reveal or hide the shared files?
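For the "command after pkg_add" variant, I picture something along these lines (purely a sketch; /shared is an assumed mount point for the shared read-only copies, and blindly symlinking like this is exactly the kind of thing I worry would confuse the pkg db):

    #!/bin/sh
    # Sketch only: after "pkg_add <pkg>", replace installed files that are
    # byte-identical to the shared read-only copy with symlinks into it.
    # /shared is an assumed mount point, nothing that exists today.
    SHARED=/shared
    pkg_info -qL "$1" | while read -r f; do
        [ -f "$f" ] && [ -f "$SHARED$f" ] || continue
        if cmp -s "$f" "$SHARED$f"; then
            ln -sf "$SHARED$f" "$f"
        fi
    done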
I am not a specialist, and this is just for a custom "home-made" server.
Thanks,