I need help understanding pkg upgrade

Backup is a concept, not a piece of software. There's nothing more needed than disk space on another machine. Same for a virtual machine to play around with those updates (and maybe step back and try another way again). Otherwise you're playing Russian roulette with a live system - which may cost much more than some time and disk space.

I've seen far too many companies - with turnovers well into the double-digit millions of euros - lose complete servers for want of backups and a development machine. They said "hey, we have RAID", and then the technician pulled the wrong one of the two remaining disks, after noticing that all the other disks had died a long time ago (and the last two were also sounding … weird). Or crashing RAID controllers - those can be fun, too. And I never get the "why". No backup, no mercy.

You (or your company) have decided to go high risk. So you're going high risk on updates, too.
It's not a company. It's a small group of hobbyists, all volunteer, that subsists on donations from its participants. The $477 quarterly that they pay for hosting and DNS services is just about all they can afford. I'm trying to convince them to go to a dedicated solution, but that ups their costs by one-third. In case anyone is interested, the website is https://www.stovebolt.com/.

I worked professionally in IT (security) for 20 years. I know all about robust backup systems and the cost of not backing up. I've seen professors lose twenty years of data because they didn't have backups. I'm doing the best that I can with an extremely limited budget.

In the fifteen years that I've been doing this, I've had to restore from backup once. We lost maybe half a day's worth of traffic on the forum. The domain owners are well aware of the limitations of their system and quite prepared to have it all go away with no recourse. When I have fretted about being down for a couple of days, they had said, Don't worry about it. Take all the time you need.

IF I lost a server, they would have to run a fundraiser to pay for it. Then I would have to install a fresh FreeBSD and all the apps, and then restore the data from the backups I have. It wouldn't be pretty, and it would take a lot of my time, but it's doable in my current setup. The only things we would lose are whatever changed since the last backup.

I get what you're saying. It doesn't apply to me.
 
Could a small drive be added?

A 16 GB USB flash drive might be ample, although in your volunteer scenario you might find someone to donate an old spare hard drive (small HDDs are throwaway items, these days).

You could:
  1. use gpart(8) or whatever to create a single ZFS partition e.g. /dev/da0p1
  2. create a ZFS pool e.g. zpool create portsdrive da0p1
  3. install ports-mgmt/poudriere-devel
  4. change three lines in a preconfigured poudriere.conf
  5. poudriere ports -c
  6. poudriere jail -c -j thirteen -v 13.0-RELEASE
  7. poudriere ports -u
  8. poudriere bulk -v -j thirteen mail/courier-imap
That's mail/courier-imap as one example; build as few or as many ports as you want. The speed with which poudriere can build things into a repository will be a breath of fresh air to you.

Steps (1)–(6) are one-off and need not be repeated. Run step (7) whenever you want to update the ports tree.
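For steps (1) and (2), a minimal sketch, assuming the new drive appears as da0 and is empty - double-check with gpart show before destroying anything:

Code:
# gpart create -s gpt da0
# gpart add -t freebsd-zfs -l portsdrive da0
# zpool create portsdrive da0p1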

For step (4), these three lines in your /usr/local/etc/poudriere.conf:

Code:
ZPOOL=portsdrive
DISTFILES_CACHE=/usr/ports/distfiles
PACKAGE_FETCH_BRANCH=latest

Your /usr/local/etc/pkg/repos/poudriere.conf:

JSON:
{
    "poudriere": {
        "url": "file:///usr/local/poudriere/data/packages/thirteen-default",
        "enabled": true,
    }
}
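Once the bulk build completes, pointing pkg at the new repository should be a matter of something like this (untested here; adjust the repository name if you change it):

Code:
# pkg update -r poudriere
# pkg upgrade -r poudriere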

As a side note, that's ZFS at its simplest; no learning curve. (Just be careful about specifying the partition to give to ZFS. da0p1 above is just an example.)

True. Compare, for example, the latest and quarterly columns at <https://www.freshports.org/lang/gcc/#packages>.

Whilst things such as pkg-install(8) are reasonably good at working with dependencies, mixing latest with quarterly will make things unnecessarily difficult. If you're accustomed to using any utility to build and install from ports – from latest – then (on the same computer) constraining `/etc/pkg/FreeBSD.conf` to quarterly is unlikely to add value; change it to latest.
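One way to make that change without touching the base file is a small override in /usr/local/etc/pkg/repos/FreeBSD.conf, for example:

Code:
FreeBSD: {
  url: "pkg+http://pkg.FreeBSD.org/${ABI}/latest"
}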
Thanks for this, Graham. You've given me an idea. I could buy a USB flash drive twice the size of the mail server's hard drive for less than $25 and use that to do what you're suggesting. If it craps out, I just buy another drive. Hmmmm...
 
It's not a company. It's a small group of hobbyists, all volunteer, that subsists on donations from its participants. The $477 quarterly that they pay for hosting and DNS services is just about all they can afford. I'm trying to convince them to go to a dedicated solution, but that ups their costs by one-third. In case anyone is interested, the website is https://www.stovebolt.com/.

I worked professionally in IT (security) for 20 years. I know all about robust backup systems and the cost of not backing up. I've seen professors lose twenty years of data because they didn't have backups. I'm doing the best that I can with an extremely limited budget.

In the fifteen years that I've been doing this, I've had to restore from backup once. We lost maybe half a day's worth of traffic on the forum. The domain owners are well aware of the limitations of their system and quite prepared to have it all go away with no recourse. When I have fretted about being down for a couple of days, they had said, Don't worry about it. Take all the time you need.

IF I lost a server, they would have to run a fundraiser to pay for it. Then I would have to install a fresh FreeBSD and all the apps, and then restore the data from the backups I have. It wouldn't be pretty, and it would take a lot of my time, but it's doable in my current setup. The only things we would lose are whatever changed since the last backup.

I get what you're saying. It doesn't apply to me.
Nice site. Nice forum. May I suggest you start a topic on that forum, asking for one of the forum members to donate an old inexpensive used laptop, or other i386 compatible box, just for forum maintenance purposes? Then you might be able to postpone your upgrade just long enough to take your time in replicating the software configuration(s) you need on this "new" machine. You might even get lucky and wind up getting more than one such machine; lots of people might have old hardware lying around that they've replaced and aren't using anymore. Your forum members seem like very resourceful people.

It wouldn't need to be a server-grade piece of hardware, but rather just good enough to serve as a software configuration model, and possibly (but not necessarily) good enough that it might serve as a temporary replacement in a pinch, in case one of your live server upgrades fails.

In such a case, or any case of live server hardware or software failure, you could restore your most recent backup on the model machine and go live with it, then take your time duplicating the software configuration you've just prepared and documented on the model machine back onto the primary server hardware. It would alleviate a lot of the stress and uncertainty you're dealing with now.

In future, once you have the software config on the model machine synch'ed up with the config on the live machine, you'd be able to test future software upgrades on the model machine without risking the stability of the live server or servers. And, if the hard drive on the model machine is large enough, you might even be able to use it as a multi-boot system, to model all of your server configurations. It seems like a lot of work, I know, but it also seems like you're already working overtime dealing with this situation right now, plus all the stress and uncertainty that goes with it. Just my humble opinion, take it for whatever you think it's worth.
 
Thanks for this,

You're welcome. NB there were some typos in my original post; now corrected …

… hopefully nothing else wrong there <https://forums.freebsd.org/goto/post?id=527130> but corrections are welcome.

PS there's an assumption of FreeBSD 13.0-RELEASE.

Also, I didn't mention port options. For a simple setup with a single poudriere jail, as far as I know you can put most of what's required in
/usr/local/etc/poudriere.d/make.conf – here's mine:

Code:
# <https://github.com/freebsd/poudriere/issues/867>
MAKE_JOBS_NUMBER=3

# <https://bugs.freebsd.org/bugzilla/show_bug.cgi?id=252099#c5>
# <https://bugs.freebsd.org/bugzilla/show_bug.cgi?id=252099#c18>
WITH_DEBUG_PORTS += multimedia/webcamd sysutils/bsdisks

ICA_CERTS=QuoVadisEuropeEVSSLCAG1.crt

# <https://forums.FreeBSD.org/threads/71438/post-517873>
LICENSES_ACCEPTED += commercial

WITHOUT_LLVM_TARGET_ALL=

(Probably not an entirely sane example … I don't know where I got the notion of a debug build of sysutils/bsdisks because <https://www.freshports.org/sysutils/bsdisks/#config> there's no such option.)
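If you do need to change options for an individual port, poudriere has an options subcommand for that; as far as I know, something like this (jail name from the earlier example):

Code:
# poudriere options -j thirteen mail/courier-imap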
 
I feel your pain. At work we have a build server and our own package repository, and builds can take a very long time even on a system with 16 CPUs!

I also know that the ports system is broken. How to fix it, I do not know, it's above my station and frankly I don't care.

Let me give you an example: I recently wanted to build doxygen and so issued make config-recursive. After about 20 minutes of answering questions for over 50 ports, I abandoned it when it said it needed to build gcc!! Seriously, it wanted to build gcc! This is by no means an isolated or unique incident.

I then downloaded doxygen, fixed two things in the CMakeLists.txt files to get it to build, and built it in 15 minutes - none of the 50+ dependencies, just flex & bison & maybe one other. (This was on an ARMv7 device as well.)

So, I'm convinced on this: ports is broken. Use pkg only. Always!
I build gcc early on. And doxygen - only after graphviz and ruby. Learned my lesson on how to do it right, even if it takes time. Well, I do have the hardware, too - the Ryzen 5 1400 is no slouch.
 
mark_j If you are building from ports then, if gcc is a dependency, don't be surprised when it needs to be built. It doesn't mean ports is broken. I also question your need to answer questions when you were using config-recursive.
Gcc is not a dependency, that's the problem. At least not directly. With a completely new system, all that's required is cmake, a c++ compiler, flex and bison and, sigh, python. So if I built python or flex or bison or cmake via ports it would eventually lead to needing to build gcc10, possibly just to build one module used by one of the aforementioned applications.

It's broken.

Use pkg.
 
Gcc is not a dependency, that's the problem. At least not directly. With a completely new system, all that's required is cmake, a c++ compiler, flex and bison and, sigh, python. So if I built python or flex or bison or cmake via ports it would eventually lead to needing to build gcc10, possibly just to build one module used by one of the aforementioned applications.

It's broken.

Use pkg.
You only end up with gcc if you build webcamd, which has a dependency on some Linux libs. Linux libs depend on being built by gcc... That's why I build GCC as early as practical. No, ports are not broken, but they do have the risk of getting into circular deps if you specify too many deps in your makefiles. Took me a few tries to figure out how to break out of that vicious circle - and it worked, because I was paying attention. :p
 
It matters not. The fact you have to build GCC anyway is just ridiculous.
Unless you're compiling the linux kernel which uses some hacky-shit only provided by GCC, there's no reason for it to be even used. Clang does it all. If it doesn't then hack the code to fix the hack and stop being a lazy port maintainer. If GCC is providing some interface to Fortran or something then yes it's a valid requirement. In this case, as I said, use pkg.
That's what I am trying to reinforce to the OP. Dump ports, it's a slow grind into boredom. Recursively building tool after tool with potential for failure at any point - why bother? It's a waste of your time.

I cannot recall what it was a few weeks ago, but experimenting with a port that suddenly wanted Clang10 on an installed system with Clang11. Get out!

Use pkg for everything and only where you cannot use the binary package, use the port and customise. This is an absolute on smaller computing power machines and those who cannot be bothered answering 1000 prompts.

Do what you want if you have a build server; that's not the OPs issue.
 
Do what you want if you have a build server; that's not the OPs issue.
Mark, if I'm understanding you correctly, you seem to be saying use pkg for everything you can, and build the custom ports yourself. Is that right?

Here's a couple of problems that I ran into over the past couple of days getting the mail server updated.

1) I had to install and configure Roundcube because Squirrelmail is EOL. So, I ran pkg install roundcube. Later, I was running portmaster, and it wanted to update roundcube. So, clearly, pkg is "behind" ports (for roundcube at least.)
2) I locked several important apps (roughly the commands sketched below) and then ran pkg upgrade. It upgraded/installed 121 packages. I then ran portmaster. It wanted to update a bunch more. (I have 349 ports installed on that machine. I don't even know why 2/3rds of them are needed.) So, there is obviously a difference between pkg and building ports with portmaster.
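For reference, the locking was roughly this (package names are just examples from memory):

Code:
pkg lock php74 courier-imap    # whatever must not be touched
pkg upgrade
pkg unlock -a                  # or unlock them individually afterwards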

How do you reconcile that? I require certain php extensions for some of the stuff we're running. But pkg removes them because they're not on its list of options. How do I resolve that using pkg?

I'm open to new ideas. I just need to understand how it works and what the risks are.

For those who have expressed concern about backups, here's what I'm doing. I wrote a script that creates .tgz files from the bits that need to be backed up (including a mmddyyyy.all.sql file that backs up the mysql dbs), then writes them to /var/backup/. The filenames use the pattern mmddyyyy.filename.tgz. Then they are uploaded to my Dropbox folder. Each day, when the script runs, it deletes the previous day's file from /var/backup/ and the previous 7th-day file from Dropbox. This keeps the /var partition from filling up while keeping the most recent backup handy on the hard drive and keeps the previous 7 days' backups on Dropbox in case I have a disaster that requires an older backup.
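In outline, the script does something like this (paths, database credentials and the Dropbox upload step here are placeholders, not the real thing):

Code:
#!/bin/sh
DATE=$(date +%m%d%Y)
BACKUPDIR=/var/backup

# dump all MySQL databases
mysqldump --all-databases > "${BACKUPDIR}/${DATE}.all.sql"

# tar up the bits that need backing up (example paths)
tar -czf "${BACKUPDIR}/${DATE}.www.tgz" /usr/local/www
tar -czf "${BACKUPDIR}/${DATE}.mail.tgz" /var/mail

# upload today's files to Dropbox here (whatever uploader you use),
# then delete local files older than a day so /var doesn't fill up
find "${BACKUPDIR}" -type f -mtime +1 -delete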

Kludgy, I know, but it's the best I can do with no money for backup software. If you need me to restore something from last month or last year, you're out of luck.
 
assumption of FreeBSD 13.0-RELEASE.

Was that a false assumption?

pschmehl which version of FreeBSD do you use?

… pkg is "behind" ports (for roundcube at least.) …

Apparently not behind for roundcube-php74 on FreeBSD:13:amd64:


<https://www.freshports.org/mail/roundcube/#packages>

Code:
% grep -A 2 1.4.11 /usr/ports/mail/roundcube/Makefile
DISTVERSION=    1.4.11
PORTREVISION=   1
PORTEPOCH=      1
%
 
OP: For backup software, you can always try something from ports: sysutils/bacula11-server. This is enterprise-grade stuff, available for free. Or, you can put together a patchwork of simple file copy utilities available in the base install, write a script to back everything up, and put it into cron so that your backup is on a schedule. All you need is enough disk space, and a willingness to study how those systems work. Just take a look at what FreeBSD even has to offer! Even uploading to DropBox if you want!
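A single line in /etc/crontab is enough to put it on a schedule; for example (the script path is hypothetical):

Code:
# minute hour mday month wday who  command
30       3    *    *     *    root /root/bin/backup.sh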

Remarks from mark_j are exactly the reason to NOT mix packages and ports. If you picked packages at the start, stay with them. If you picked ports, stay with them. The best way to handle upgrading this one port (that you needed compiled with your own custom options) is to have a separate build server that will build your package. Then, AFTER the package is built, you install it on the main machine. If that solution doesn't work, then I'm afraid you're stuck.
 
grahamperrin, I'm running 11.4-RELEASE on both servers. I had a bad experience upgrading to 13.x-RELEASE on another server, so I've opted to stay with 11.4-RELEASE for now.
 
I'm tired of running portmaster -ad and taking two or three days to build all the ports, resolving so many issues that pop up along the way.
I've been using ports-mgmt/portmaster for years but never updated any programs unless portmaster pulled them in during a program build.

That came back to haunt me recently, and I had a genuine mess consisting of over 120 ports to clean up that required the use of pkg and about 3 days for me to resolve.

I'm old too, not quite as old as you, but neither of us too old to learn from mistakes. Yesterday I used portmaster -a only a couple weeks after last using it. This time there were only 53 ports that needed updating. I started in the morning and was done before the day had passed.

The only program that caused a stop was updating multimedia/vlc, and that was near the very end. When it stopped I ran portsnap fetch update. Then I ran portmaster -a again to get a good look at what I was going to have to do to set things right.

During the time that had passed since morning, the problem-child file that caused portmaster to balk had been updated, and portmaster listed it in the plan of what it was going to install. I scrolled up to restart the build with the same command initially used and was finished an hour or so later with all programs up to date.

I don't run a server, only laptops, so downtime only concerns me, and I see this time as wisely used. If I was running a server I might update more frequently to keep downtime blocks to a minimum. But that's me.
 
Trihexagonal, I try not to disrupt service operations any more than I have to, so I update infrequently. But there are some ports that take forever to update. llvm80, llvm90, and cmake are three that make me groan every time they show up on the list (which is often.) There are some others, but those three always take a very long time to build. I don't even know what those ports are needed for.
 
pschmehl said:
I don't even know what those ports are needed for.

From pkg-descr: "The LLVM Project is a collection of modular and reusable compiler and toolchain technologies.".

These are compilers. They are used for building ports, not for running ports. If you have to keep operations going while updating ports you might want to install these once and keep 'em. See pkg help set to mark these so that pkg autoremove does not remove these compilers.
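For example (a sketch - use whatever package names are actually installed on your system):

Code:
pkg set -A 0 llvm90 cmake        # mark as "not automatic", so pkg autoremove keeps them
pkg query "%a %n" llvm90 cmake   # 0 = treated as manually installed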
 
Trihexagonal, I try not to disrupt service operations any more than I have to, so I update infrequently. But there are some ports that take forever to update. llvm80, llvm90, and cmake are three that make me groan every time they show up on the list (which is often.) There are some others, but those three always take a very long time to build. I don't even know what those ports are needed for.
devel/cmake has a truckload of dependencies, but once those are satisfied, it takes less than 10 minutes to compile (Well, I do have a Ryzen 5 1400, and it's surprisingly capable). I'd suggest NOT cleaning out those deps, either. Otherwise, ports will pull them right back in when it's time to upgrade something.
 
But there are some ports that take forever to update. llvm80, llvm90, and cmake are three that make me groan every time they show up on the list (which is often.) There are some others, but those three always take a very long time to build.
Oh, yes, cmake is a beast (and there have been a good few updates and new versions in the last few months). And llvm versions are x2 or x3 worse! If you are pulling in llvm versions then generally it's worth taking the time to figure out what's pulling them in - there will be some build config option that you've ticked, and if you don't need that option then you can save yourself a lot of build time. But figuring that out can be time-consuming (though worth it in the long run if you can avoid building llvm). For a time on 13.0, if you used MySQL it would pull in llvm90, but that wouldn't be the case on 11.4.

The longer you leave updates, the bigger the mess when you do get around to doing them, so I tend to try and run updates at most monthly (this is running servers, so there's not so much in terms of desktop applications and all those dependencies). Then if there are any glitches, it's obvious which updates caused them and a bit easier to work around. But then we come back to the volunteer nature of the work etc., so it's not so clear-cut for you.
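A couple of commands can help track down what's pulling llvm in (the port origin below is just an example):

Code:
pkg info -r llvm90    # installed packages that require llvm90
make -C /usr/ports/databases/mysql57-server all-depends-list | grep llvm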
 
pschmehl, I ran portmaster -a after updating my ports tree today to see what it reported, 2 days after updating all my programs, and it showed graphics/libepoxy and graphics/mesa-libs as being the only ports with an update. I may start running that on a daily basis to keep the time factor down to a minimum. richardtoohey2 talks about it in his post preceding mine.

Yes, some of them can take several hours to compile depending on your machine specs. If you're updating your server it probably doesn't have a number of ports my laptops do.

There's no doubt pkg is quicker, and there have been several times I've had to mix pkg and ports to get past a fail point. I haven't run into a problem from mixing them, but I have dealt with ports long enough that I can deal with them.

That's where people who do not yet have that experience become frustrated and think about giving up on FreeBSD. And why I recommend people new to FreeBSD don't mix pkg and ports. But that's where experience comes from, solving problems.

So you may be damned if you do, and damned if you don't. Running a server adds more heat to that fire for you. It's up to you to decide how much you can take.

If I wanted to switch a machine I'd been using ports on to pkg I'd start doing it without rebuilding the machine. This is one I've mixed them on, am still using ports on a regular basis and it's running smooth as can be.
 
TL;DR

If you have a powerful machine available somehow, use it to build the packages with ports-mgmt/poudriere. Poudriere will create a repository with your custom packages, and then you just need to configure the server to use that repository as its pkg repository. If the eventual powerful machine isn't running FreeBSD, you can even do it in VirtualBox or something.
 
I wonder if ports-mgmt/synth is a good solution here? I used to use it on a few machines in a previous company, but IIRC you can build packages that require specific options, but for other things it will just use the package. Saved me a lot of time building, but did require some reading and a little trial and error.
 
Mark, if I'm understanding you correctly, you seem to be saying use pkg for everything you can, and build the custom ports yourself. Is that right?
Absolutely. BUT, I am only talking within this perspective:

1. You don't want to bother answering 100 questions to get some port to build, and/or
2. You have a limited capacity machine on which to run ports.

Nowadays, the complexity is enormous. Every port seems to have its own build system, where it needs to pull in python, perl, ruby, go, java and (God help me!) Rust to run a test that outputs "hello world". It's just gone mad. (I use cmake a lot nowadays, but, seriously, autotools did the job. Now there are more build tools than I can count: ninja, meson, automake, cmake, gmake, scons, sbuild, etc, etc, etc.)

Why torture yourself when a build server has already done the hard yards and built a package for you?


Here's a couple of problems that I ran into over the past couple of days getting the mail server updated.

1) I had to install and configure Roundcube because Squirrelmail is EOL. So, I ran pkg install roundcube. Later, I was running portmaster, and it wanted to update roundcube. So, clearly, pkg is "behind" ports (for roundcube at least.)

Why? Didn't you just install roundcube? What makes you think updating via ports will make it better? I don't use portmaster (never have) but can't it be used to pull in packages in lieu of building from source to satisfy dependencies? Just what I'm suggesting.

Are packages behind ports? Well it depends on the package repository. Is it pointing to latest or quarterly? Compared to ports? Ports are, with hesitation, always ahead of packages.
If you're going to mix and match, you best get those two synchronised before you go updating.
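To see which branch your binary packages come from, something like this works (and note that a portsnap-fetched ports tree follows head/latest, as far as I know):

Code:
pkg -vv | grep url    # e.g. ".../quarterly" or ".../latest"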

2) I locked several important apps and then ran pkg upgrade. It upgraded/installed 121 packages. I then ran portmaster. It wanted to update a bunch more. (I have 349 ports installed on that machine. I don't even know why 2/3rds of them are needed.) So, there is obviously a difference between pkg and building ports with portmaster.

This sounds like I'm picking on you, but I am not. However, you raised the spectre so I will address it.
Why are you upgrading? What are you hoping to achieve? Is some software you're using broken? Does it have security issues?
Install the software, configure it and leave it. You'll thank me for that sage advice. :)

How do you reconcile that? I require certain php extensions for some of the stuff we're running. But pkg removes them because they're not on its list of options. How do I resolve that using pkg?

Build the port. Off hand, I would suspect some package that has hard-coded requirements for php extensions is probably bad anyway. Regardless, build from ports. However, you need to ensure your port tree is synced as close to the quarterly as possible. If you're using latest, then you're in for a world of hurt.
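One way to do that is to track the quarterly branch of the ports tree with git, e.g. (the branch name changes each quarter):

Code:
git clone -b 2021Q3 https://git.FreeBSD.org/ports.git /usr/ports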

I know the consensus is don't mix/match ports and packages, and yes, it's probably good advice. I don't know your level of programming knowledge so this also might be bad advice from me. All I can say is for many years I have been doing this and NOTHING has ever happened I can't fix quite quickly (usually versions of libraries).

YMMV.

I'm open to new ideas. I just need to understand how it works and what the risks are.

Don't update unnecessarily. If you want the latest/greatest/flashiest feature, then stick to the latest branch of ports and update every day. I hope you like wasting time. :(

If you stick to quarterly a lot more packages are stable. Security fixes and that's about all. No need to update, just install your package and relax.


For those who have expressed concern about backups, here's what I'm doing. I wrote a script that creates .tgz files from the bits that need to be backed up (including a mmddyyyy.all.sql file that backs up the mysql dbs), then writes them to /var/backup/. The filenames use the pattern mmddyyyy.filename.tgz. Then they are uploaded to my Dropbox folder. Each day, when the script runs, it deletes the previous day's file from /var/backup/ and the previous 7th-day file from Dropbox. This keeps the /var partition from filling up while keeping the most recent backup handy on the hard drive and keeps the previous 7 days' backups on Dropbox in case I have a disaster that requires an older backup.

Kludgy, I know, but it's the best I can do with no money for backup software. If you need me to restore something from last month or last year, you're out of luck.

How big are the backups? I'd buy two 128 GB+ USB flash drives and use them in rotation for, say, a month each, in addition to your off-site copies.

In summary. If you're scared that mixing ports and packages will make your system unusable, that you don't feel capable to deal with an odd situation should it arise and you're worried about the impact on others, then you should probably stick to packages (where customisation is out) or ports (where customisation is in but is tedious) and never mix the two.
 
I wonder if ports-mgmt/synth is a good solution here? I used to use it on a few machines in a previous company, but IIRC you can build packages that require specific options, but for other things it will just use the package. Saved me a lot of time building, but did require some reading and a little trial and error.
Thanks for that. I'll take a look.
 
mark_j, thanks for your insight. I'll try to answer your questions frankly.
  1. Yes to both - don't want to answer 100 questions and have limited capacity (but sufficient) to build ports. Honestly? I'm getting older, and tired of wrestling with ports.
  2. I'm upgrading because I'm a big believer in keeping applications current, for security reasons. One never knows when an app might open a hole in your system
  3. I don't want flashy, latest/greatest. I do want secure. I don't run any non-ssl/tls stuff on my servers except for port 25 for mail. All websites are ssl, mail is imaps, no ports are open that aren't being currently used, and mysql only listens on localhost
  4. I'm not scared of running into problems. I'm just tired of dealing with them. I haven't had a problem yet that I wasn't able to fix, but there's always a first time. I'm not a programmer, but I have some basic knowledge - enough to read and understand code and figure out the cryptic instructions that programmers typically provide for configuring their apps. I used to be a port maintainer for FreeBSD but gave it up when I retired.
  5. I kind of like the idea of using USB sticks. The only downside is then I have to drive down to the colo to insert them. Not a major pain, but still....
 