Best way to keep two machines in sync with FreeBSD

Imagine laptops A and B.
Laptop A is the workhorse and is used more frequently.
Laptop B is used every once in a while.
The two laptops are not the same make/brand.

What's the best way to keep the two laptops in sync, so that whatever was done on the most recently used one is replicated/synced to the other laptop?
Preferably, such a solution should also cover installed libraries etc., so that the result is an up-to-date copy of the most recently used laptop.

Should one be using rsync, ZFS (the current filesystem), or something else?
 
Do you want the mirrored install to be the one the second laptop boots from?

It is hard to sync a booting filesystem like that.

It would be easier to have the mirror on a second hard drive and, in case of emergency, boot off that.
 
Do you want the mirrored install to be the one the second laptop boots from?
Not really, unless I'm misunderstanding what you mean - I'd want both of them to have their own libraries/drivers according to their hardware, since they're different makes. But the rest (software installs etc.) should be the same. Does that make sense?
It is hard to sync a booting filesystem like that.
Yes I guess. Was thinking if there was a simpler way to do something like this.
It would be easier to have the mirror on a second hard drive and, in case of emergency, boot off that.
Do you mean something like a dd clone? That wouldn't be compatible with drivers/boot loader files etc. ..... or maybe some other options?
 
dd'ing a live read-write filesystem is not supported (although it often works). It is also hard on SSD wear to do a full dd every time. And you would be screwed if your primary computer gives up in the middle of a dd.

The problem with syncing-to-root is best explained with /etc/rc.conf. Obviously you don't want to sync that onto a booting filesystem.
 
Looking closer at it, it ain't trivial, but it's for sure solvable.
I (almost) solved the problem for my situation: I have one workstation, the workhorse as you say, and my laptop, which I use only when travelling, and it shall provide the same working environment as if I were at my workstation. When I return, my workstation shall be updated automatically with the changes I made while abroad.
And, as you already figured out too, the hardware differs, so there can't be exact 1:1 copies of each on both.

I distinguish four kinds of files:
1 the system itself, including userland (ALWAYS separate system from data!)
2 individual config files for each machine
3 the files I have in my home
4 files which are under version control

1 I do not do automatically. I update and upgrade by hand. But you have to take care to keep both systems at the same versions, otherwise for example my Firefox will strike. My ~/.mozilla/ is also synced with rsync. Maybe not the best choice, more kind of 'brute force', and one may consider using the sync service Firefox provides instead, if one uses Firefox at all, or look for other, better options.
However, if I get a .mozilla/ into my home that differs from the Firefox version installed, my browser acts like it was freshly installed.
You may observe other uncomfortable things if both machines do not run the same versions.
So system and userland shall be kept at the same version.

2 The system-wide config files (e.g. /etc/) are backed up for each machine individually, and are not synced.

3 Almost everything I keep in my /home/ I synchronize with rsync, while each machine has an individual exclude file. That excludes, for example, my window manager's config file: my workstation has 4 monitors, my laptop of course only one, with a different resolution. So I need individual config files for such things, which shall not be kept identical and are therefore excluded from the rsync process (see the sketch after point 4).

4 Also excluded from the rsync process are all directories that contain stuff under version control (shell scripts, programming, LaTeX, whathaveyou).
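A minimal sketch of what points 3 and 4 could look like in practice (the exclude file name, its entries and the sync target are assumptions, adapt them to your own setup):

# example per-machine exclude file, e.g. ~/rsync-exclude.laptop:
#   .config/i3/       <- window manager config, machine specific
#   versioncontrol/   <- the directory holding everything under version control
#   .cache/
#
# sync the home directory, honouring this machine's exclude file:
rsync -au --delete --exclude-from="$HOME/rsync-exclude.laptop" "$HOME/" user@otherbox:backups/home/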

Additionally, I don't do the syncing directly between both machines; I have a little server that keeps a copy of my home (backups), so actually it looks like this:
Workstation <-> Server <-> Laptop

There are for sure other ways, but that's mine; the point is just to consider distinguishing the several kinds of files and how to deal with each of them. It also helps a lot to sort the different kinds into their own directories. I have a dedicated directory under /home/ that contains everything version controlled. This makes the exclude file a lot easier than when those directories are spread everywhere - but how you organize your home is very personal, and for this too there are solutions to automate it.
 
This gets me wondering about your underlying objectives. Keeping something like /home and /usr/local in sync with rsync should be doable, and you can do your system/package updates on both laptops at the same time while using the same repositories. You won't actually be synchronizing every single byte between the machines, but is that truly necessary? Especially since the two machines are not identical, you're probably looking at a few configuration differences between them in /etc and maybe even in /usr/local/etc. What's the use case?
 
The first /etc difference is the hostname in /etc/rc.conf. You can't have two live booting filesystems with identical ones. (I mean it would run, but...).
 
dd'ing a live read-write filesystem is not supported (although it often works). It is also hard on SSD wear to do a full dd every time. And you would be screwed if your primary computer gives up in the middle of a dd.
Yea makes sense - I assumed it wasn't a live system you were recommending to dd - never tried it with a live one 😰

So the only options I can think of are probably rsync, zfs?

Maturin that's great advice - and makes sense - did you come up with it, or did you read about it somewhere, since you mentioned there are other ways it could be done? Would love to see what other ways are possible.

but is that truly necessary?
Should be functionally equivalent - basically not doing double work when trying to use two machines as and when I wish.
What's the use case?
Idk - convenience and flexibility to use whichever machine I want?
 
Well, you have your HOME directory, particularly the $HOME/.config directory; then the /var/db/* structure, where all the system databases for ports & packages reside; then /usr/local/* for all non-system software; and of course the /etc/* hierarchy of config files for name resolution, time sync and system startup.
You have the /var/spool/* and /var/cron/* hierarchies for email and crontabs... I have probably forgotten a few more. All these paths need to be copied from the main system to the secondary one to keep them somewhat in step.
 
Maturin that's great advice - and makes sense - did you come up with it or did you read about it somewhere since you mentioned there are other ways it could be done.
I learned about rsync in these forums. (Thanks, guys!)
Once I got warm with it (rsync -au --delete sourcedirectory/ targetdirectory/ (a-rchive, u-pdate, --delete) is one example and a good start with rsync to keep two directories in sync), it became a standard routine I use a lot.
The rest I came up with myself (and that's what the scripts look like 😂 - but it works for me.)

The simplest approach was: every machine has its own system (equal versions), and you only synchronize your /home/ with rsync, excluding individual configs, version-controlled files, and whatever makes trouble.
The scripts also check for differences in installed packages:
pkg leaf > workstation
pkg leaf > laptop
diff workstation laptop > difference
which is also not the most elegant, because there already are differences here: my laptop has an AMD GPU, my workstation runs NVidia, my laptop has a webcam and WLAN, my workstation does not, etc.
But it's a start, an idea you may build your own solution upon (of course together with everything [even better] you will receive from others here.)
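A possible refinement of that check, where the package names in the grep pattern are just stand-ins for whatever hardware-specific packages your machines actually have:

# list manually installed (non-automatic) packages on each machine
pkg query -e '%a = 0' '%n' | sort > packages.$(hostname -s)
# after copying the other machine's list over, show what differs,
# ignoring known hardware-specific packages
comm -3 packages.workstation packages.laptop | grep -vE 'nvidia|drm|webcamd'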
 
I'm pretty happy myself with my own Nextcloud instance. My work machines may not be identical, but the bulk of my files lives in there and is readily available with native clients for everything I use. Pictures I take using my phone go there almost instantly, great for snapping diagrams off whiteboards and getting them on my desktop.
 
Another option might be to treat the two laptops as X displays, and have a third machine which is your target single system. On each of the laptops you would run an entire X desktop over ssh from the remote third machine, so the two laptops work as X terminals. Your user experience at the two laptops is identical because you are running the same desktop remotely from the server (the third machine). This has the advantage that you only have a single definition of the common data, and the laptops are merely a terminal onto it. The laptops themselves only need a minimal install of FreeBSD, an ssh client and Xorg.

Of course this is cheating, because you need a third machine which is the box you are really working on. But then you have no need to rsync data between different boxes, and all syncing problems and the associated race conditions go away, since there is no syncing to do. I've done this kind of thing in the past and it can be very useful; it's one of the reasons X11 being a network protocol is so useful. Of course it only works if you have network connectivity between the machines when you need to use them.

Someone may correct me, but I think it's not possible to do this with Wayland, although I'm not certain about that; X11, however, was specifically designed as a network protocol to perform exactly this type of operation. Someone once said "the network is the computer". :)

There is a thread here that discusses using X to forward an entire desktop in this kind of approach

Your two laptops in this configuration essentially work as X-terminals.

Going back in time... this is really the equivalent of having two RS232 terminals in different parts of the building talking to your department's DEC Unix machine, but with graphical desktops connected over the LAN (or over the internet - that is the power of using ssh to do this) instead of serial VT102s.
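If you want to get a feel for this with plain ssh first, a rough sketch (host name, user and the startxfce4 command are placeholders; use whatever desktop the third machine has installed, and key-based ssh authentication is assumed):

# quick test from inside your normal desktop session on the laptop
# (the remote desktop may fight with your local window manager):
ssh -Y user@third-machine startxfce4

# closer to the real "X terminal" setup: boot the laptop to a console and
# start a bare local X server whose only client is the remote desktop:
startx /usr/bin/ssh -Y user@third-machine /usr/local/bin/startxfce4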
 
Here is what I do (and would do):

As far as system installation and maintenance is concerned, start from the same point: install the same version of FreeBSD on both. Then do all system administration on both machines simultaneously, or copy your actions. For example, open two shell windows and do "freebsd-update fetch/install" and "pkg upgrade" on both. Or, if you have to do trial-and-error, do it on one machine, record exactly what you have done, and then replay that on the second machine. As an example, I have my main server at home and a cloud-hosted machine, and I religiously write down what sysadmin work I do on the main machine and then regularly replay it on the cloud machine.
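If you want to script the "do it on both at once" part, a rough sketch (the host name laptop-b and root ssh access are assumptions):

#!/bin/sh
# apply the same maintenance steps locally and on the second laptop
freebsd-update fetch install
pkg upgrade -y
ssh root@laptop-b 'freebsd-update fetch install && pkg upgrade -y'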

One divergence is that because of different uses, different packages might be installed on the two. That's a minor complication, I find that easy to deal with.

For home files (user file systems) there are lots of options. One already mentioned is to use rsync (or zfs send/receive). The problem with that is that you have to be very careful: either declare one machine to be the master and the other the follower, and then only do work on the master (so the follower is a cold standby), or think through in which direction you want to copy modified files. So how about another option: if the master machine is always on and the network between the two machines is always available, why not have the follower machine remote-mount the master's home/user file system, for example using NFS? Or set up a third small server machine to be a file server for both?
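For the NFS variant, a rough sketch of the moving parts (host names and the choice to export all of /home are assumptions):

# on the master, export /home and enable the NFS server:
echo '/home -alldirs laptop-b' >> /etc/exports
sysrc nfs_server_enable=YES mountd_enable=YES rpcbind_enable=YES
service rpcbind start && service mountd start && service nfsd start

# on the follower, mount it (or add the equivalent line to /etc/fstab):
mount -t nfs laptop-a:/home /home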
 
Yeah, if you have a fleet of same make/model machines, like a whole lab of them, then it makes sense to automate the maintenance. But when you have just two machines, and they are both different makes, and have rather different hardware, you're better off sharing some files from /home, but that's it. Just use scp(1) to copy the files you need from one machine to the other.
 
There's also the sshfs utility (built on top of SSH/SFTP), which is a nice way to mount a remote directory with your shared data (i.e., equivalent to what used to be done with NFS). I think on FreeBSD the package you need is fusefs-sshfs, IIRC.
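For completeness, a small example of that route (host, user and mount point are placeholders):

pkg install fusefs-sshfs
kldload fusefs                  # FUSE kernel module (called fuse on older releases)
mkdir -p /mnt/laptop-a
sshfs user@laptop-a:/home/user /mnt/laptop-a
# ... work with the files ...
umount /mnt/laptop-a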
 
If you want to try out the X11 approach... I just remembered FreeNX, which is an optimised (compressed) version of the X11 protocol that adds higher performance and session control. That means that, like GNU screen or tmux, your X desktop session is persistent. You log in from one laptop, do some work, log out, then later log in from a different laptop, and your desktop session retains its state from the previous login, which is very useful. In fact NX was a commercial product, I don't know if they are still in business, and 'freenx' was the open source version; it may not have all the features of commercial NX. The chief advantages were higher speed over slow internet links and the session management, which is clearly very useful. Someone has written a nice write-up about using freenx on FreeBSD here: https://marc-abramowitz.com/archives/2006/02/16/freenx-on-freebsd/ . Note that NX is NOT like VNC - it is the full native X11 protocol, not a screen scraper.

But if you want to experiment with that approach, I suggest get it working over ssh first, then maybe investigate freenx as a later stage; you may well find doing things over ssh is good enough if you are working in a LAN environment or across fast broadband internet links.
 
I read the thread...
I just wanted to share my solution.
I also have a work laptop (Linux/Debian) and a home laptop (FreeBSD).
For work, I use software that is available on all operating systems.
And I sync my work documents, code, and configuration files via SFTP to my "private cloud".
I also use sshfs.
So, I have two different, independent operating systems and only sync my work files.

I think it's difficult to completely synchronize laptops A and B at the file system level.
Maybe, instead of synchronization, you should use Ansible to manage the configuration and make it the same on each laptop.
 
Imagine laptops A and B.
Laptop A is the workhorse and is used more frequently.
Laptop B is used every once in a while.
The two laptops are not the same make/brand.

What's the best way to keep the two laptops in sync, so that whatever was done on the most recently used one is replicated/synced to the other laptop?

Don't do master-master replication. Strictly speaking, it is impossible to get it right without an outside source of truth. If you rely on a device to assess its own state, a hardware or configuration issue on that device might result in the wrong choice of sync direction.

I would first look into whether all user data is portable, e.g. nothing relies on any hardware ID in your $HOME. Then I'd identify software that has complex file system usage, like IDEs and browsers, which may have caches and transients, but also important user data, conflated in the same tree. For each of those there should already be a .gitignore somewhere on the web.

So: git for the home dir, and Ansible or your own scripting, if you wish, to keep the systems provisioned to the same config.

P.S. git is not trivial; you need some intelligent and advisably step-by-step script where you can review the work about to be done, using rsync to sync the actual contents of $HOME against a git clone kept elsewhere.
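A rough sketch of that idea, assuming the clone lives at ~/.home-repo and the exclude list at ~/.sync-exclude (both names are made up):

REPO="$HOME/.home-repo"
# dry run (-n) first, so you can review the work about to be done
rsync -aun --delete --exclude=/.home-repo --exclude-from="$HOME/.sync-exclude" "$HOME/" "$REPO/"
# looks sane? run it for real and commit
rsync -au --delete --exclude=/.home-repo --exclude-from="$HOME/.sync-exclude" "$HOME/" "$REPO/"
cd "$REPO" && git add -A && git commit -m "sync from $(hostname)"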
 
Are these 2 machines used to "develop" stuff or "run the developed stuff"?
Do they actually need to be in perfect sync at the OS level, or is it just the data?
Example:
Doing sw development for work, assume git is the medium of "choice". So you git clone from a repo @ ${WORK} on machine A, do a bunch of work on a branch, and eventually you push that branch to the servers at ${WORK}.
Machine B: boot up, do a git pull.
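In commands (the repo URL and branch name are placeholders):

# machine A
git clone ssh://git@work.example.com/project.git
cd project
git checkout -b feature-x
# ...work, commit...
git push -u origin feature-x

# machine B (after its own initial clone)
cd project
git fetch
git checkout feature-x    # creates a local branch tracking the pushed one
git pull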

Basically what needs syncing, data or everything.
 