Backdoor in upstream xz/liblzma leading to SSH server compromise

A bit paranoid, but I'm intentionally quoting the commit message verbatim so as not to confuse future visitors. This is NOT because FreeBSD is affected.

author Xin LI <delphij@FreeBSD.org> 2024-04-05 06:39:23 +0000
committer Xin LI <delphij@FreeBSD.org> 2024-04-05 06:39:23 +0000
commit 2f9cd13d6c1824633251fb4267c9752d3b044a45 (patch)
tree 92e731e6757c448fc93afaa5fd1fda1601a14847
parent fcace5ab088edfc5b74e0cd9e731639bf07a9437 (diff)
download src-2f9cd13d6c1824633251fb4267c9752d3b044a45.tar.gz
src-2f9cd13d6c1824633251fb4267c9752d3b044a45.zip

Revert "MFV: xz 5.6.0"
This commit reverts 8db56defa766eacdbaf89a37f25b11a57fd9787a,
rolling back the vendor import of xz 5.6.0 and restoring the
package to version 5.4.5.

The revert was not directly due to the attack (CVE-2024-3094):
our import process have removed the test cases and build scripts
that would have enabled the attack. However, reverting would
help to reduce potential confusion and false positives from
security scanners that assess risk based solely on version
numbers.

Another commit will follow to restore binary compatibility with
the liblzma 5.6.0 library by making the previously private
symbol (lzma_mt_block_size) public.

PR: 278127
MFC after: 3 days
Note that this MFC target only applies to the stable branches (stable/14 and stable/13), as xz 5.6.0 has not yet been MFS'ed (Merged From Stable) into any releng branches, and thus does not affect Releases or Patch Releases (for example, something like 14.0-p1).
 
Competent, trustworthy, cheap. Pick two, you can't have all three.

Also, I noticed in that linked Reddit commentary that sometimes, a good defense means rebuilding the entire system to NOT depend on a component (xz/LZMA in this case). Now that's pretty painful - not everybody can afford a Threadripper or a Xeon to do that, and it can be plenty complicated to hunt down the broken dependencies and find acceptable alternatives.

Well, at least I learned one more reason to avoid Linux :p 🤣
 
I like and use Debian stable, never had any major problems with it, including this xz scandal.

But I think that FreeBSD should not "rest on laurels", so to speak. Because something like this can happen even to it or BSDs in general.

It's a complex operating system with many lines of code added by many contributors, and I can imagine a malicious contributor subtly adding nefarious code into FreeBSD - and I'm talking about FreeBSD itself, not external packages/ports.

I think that contributions to open-source projects should be more carefully scrutinized before being accepted. It's harder that way, but unfortunately necessary, because some people don't behave nicely.
 
But I think that FreeBSD should not "rest on laurels", so to speak. Because something like this can happen even to it or BSDs in general.
I wasn't the only one in this thread stating this. Several people (correctly) analyzed that it was "luck", with being a less attractive target than the major Linux distros probably playing a role.

I think that contributions to open-source projects should be more carefully scrutinized before being accepted. It's harder that way, but unfortunately necessary, because some people don't behave nicely.
Sounds simple, but who will do it? Like in this example, the single exhausted maintainer (and original author)? Just impossible.
Or the distributors? Practically impossible, given the sheer amount of software they build and package.

This isn't an easy issue to solve.
 
I did not say that this is an easy issue to solve. But something must be done if the open source initiative is to be preserved.

I wonder how many backdoors are in the closed source projects, since by definition their making is enveloped in shadows and we don't know what happens.
 
I did not say that this is an easy issue to solve. But something must be done if the open source initiative is to be preserved.

I wonder how many backdoors are in the closed source projects, since by definition their making is enveloped in shadows and we don't know what happens.
Well, consider why some projects are closed source - too many cooks spoil the broth. Sometimes, people want to work in peace, undisturbed. Both 'closed source' and Open Source projects have boundaries that frankly do need to be respected by others if you want to come out with a quality product. Lots of stuff needs to be balanced and fine-tuned. Sometimes, people can't handle that very well, and then the situation blows up in everybody's face.

Frankly, there's too few people asking "Why did this happen?" and trying to get the story from everyone involved.

  • Bosses have an interest in coordinating people's efforts, and making sure rules of the game are followed, even if participants don't exactly agree with the rules. There's a goal that the team needs to be guided to, like a quality product. Boss is supposed to be able to see the 'big picture' and the individual parts to make it happen. If you have a competent boss, that does make a difference.
  • Participants have an interest in being appreciated for being actually useful. Yeah, that means dealing with expectations like pulling their own weight, cooperating with other participants efficiently, and accepting rules even if they don't quite make sense. Competent participants make a difference, true. But, as I mentioned earlier, being reliable is also important, as is compensation for your efforts.
  • Customers: Why do they want your product, rather than someone else's? That kind of counts for something in the project... 🤷‍♂️ If customer's expectation of something changes, that can blow up in people's faces if not managed right.
My point being, it is important to make an effort to ask "Why?" of as many participants as practical. Yeah, that is an effort - I have just explained how come. I personally think that making that effort is better than off-the-cuff shooting after seeing a couple quick comments/commits. Otherwise, it's just Wild West where nobody gives a rat's ass about you or anyone else, then just take your lumps and roll with the punches. 😩
 
I wonder how many backdoors are in the closed source projects, since by definition their making is enveloped in shadows and we don't know what happens.
The bulk of closed source software is very boring. It is things like accounting software in banks, huge quantities of business logic and web applications for all sorts of industries, and, even more importantly, embedded software. I used to work (about 25 years ago) in a company that made hugely complex, high-tech industrial equipment (semiconductor wafer inspection machines). We had the best optical microscopes in the world, and the fastest image processors that weren't in classified projects; yet over half of our engineers were software engineers.

Another example: I think the Boeing 767 (released in the late 70s or early 80s) was the first airplane that was not capable of lifting one copy of its own software documentation when printed on paper. In the late 90s or early 2000s, there was a Stanford study on what the largest software companies in the US were (in terms of how many software engineers they employed): place 1 was shared by GE and GM, with Boeing in place 3. The only "computer" company that made the top 10 was IBM - not because of its systems software, but because of its army of hundreds of thousands of programmers working on contract for banks, insurance, and all the rest. In the last 25 years, software has become even more pervasive, and nearly all of it is closed source.

So: the bulk of all closed source projects have no backdoors. That's because nobody is interested in hacking into my microwave or dishwasher. Nor do they have interfaces that are hackable. Now today lots of IoT devices are network connected ... but the bulk of them have minimal interfaces, and most use very well tested and designed software kits. The average home thermostat or traffic light controller does not have an open SMTP port, and is not reachable by telnet or ssh. (Yes, sometimes IoT devices are hackable, but the bulk of them are boringly not so).

I think in reality you are asking: of the closed source projects that the average COMPUTER USER (not the average human) comes in contact with, how many have back doors? You might be talking about something like Microsoft Word, Google Search, Adobe Photoshop. Or maybe about the phone book app on your cell phone, or the cute video player for cat movies. My educated guess: relatively few back doors, for two reasons. First, those are developed by engineers who are employees and have gone through relatively extensive background checking (formally, and much more so informally). Second, being able to hack my phone book app or my Photoshop is just not very interesting to either a hacker trying to make money or a nation state trying to acquire intelligence or achieve world domination. Yes, there are exceptions (look up the hack of the Iranian centrifuges used for nuclear material enrichment), but that's the exception, not the rule.

Now, you might be asking about general-purpose software, such as operating systems; after all Solaris and AIX still exist (in a state of suspended animation). Good question, but I think the same arguments from above apply: Written by paid and trusted professionals. Also, these pieces of software are just not commonly used any more. What's the point of hacking Ultrix or Primos? Nobody is running them in production. Today, Linux has a market share of over 90% among servers (which is where the valuable data is), that's the prime target for attack.
 
ralphbsz, thanks for your detailed answer.

I think I get what you're saying, that in closed source projects the developers are paid employees who are verified and additionally they would not want to risk legal repercussions.

But I have read that many companies are not willing to extensively test the software they create, because it costs money to have a QA team. And that seems very risky to me, because I think that in this case the temptation to introduce a backdoor is greater.

Or the company itself does this - I have read about the rootkit introduced by Sony, for example.

So, I still think that open source development is preferable, because while it is easier to introduce bad code, it is also easier to discover it, at least in theory.
In a closed project such code is removed only if its existence is publicly revealed/discovered, otherwise it would remain undetected for years.
 
Yes, sometimes IoT devices are hackable, but the bulk of them are boringly not so
IoT consumer devices are quite common and cheap precisely because people started using Open Source firmware with them... Software has become a bit like an agricultural commodity. High quality stuff is tightly controlled and expensive, even for simple things. Low quality stuff that is hackable isn't subject to the same tight controls, but it's free. Yeah, you can find something that is actually high quality among the free stuff, but then you discover it takes up a LOT of your own time to process it and make it useful.
 
something must be done, if the open source initiative is to be preserved.

But I doubt this will mean help and resources for open source developers; probably lots of words and policies. But we will see.
 
But I doubt this will mean help and resources for open source developers; probably lots of words and policies. But we will see.
Perhaps Blender, PHP, OpenSSL, Apache and Python will "rewrite their software in Rust" and lead the way.

I promise, once they complete that task, I will do the same ;)
 

But I doubt this will mean help and resources for open source developers; probably lots of words and policies. But we will see.
Sounds more like another set of shackles for FOSS, or a stick for the big corps to hit small developers with. It's a move towards stigmatising "uncertified" software, i.e. software that has not been audited. We should all know who gains from that by now...

In my view it's more of this typical "security theatre", which will result in security policies and yet more standards, but not in better written software.
 
An interesting writeup of the particulars of the backdoor:

My initial understanding after a quick read of that article is that a Linux x86_64 object file was hidden in a test case. The backdoor reassembles and decrypts that file, and tries to inject it into the Linux linker. So not only is it Linux-specific, it also won't work on Linux/ARM.
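Since the backdoor shipped only in upstream releases 5.6.0 and 5.6.1, the crudest detection is a version check - the same version-number scanning the reverted FreeBSD commit was trying to appease. A minimal sketch in Python, assuming the standard `xz (XZ Utils) X.Y.Z` banner that `xz --version` prints:

```python
# Hedged sketch: flag the two xz releases known to ship the backdoor
# (5.6.0 and 5.6.1, CVE-2024-3094) by parsing `xz --version` output.
# Remember this is exactly the kind of check that produces false
# positives on FreeBSD, whose import stripped the malicious pieces.
import re
from typing import Optional

BAD_VERSIONS = {"5.6.0", "5.6.1"}

def xz_version(output: str) -> Optional[str]:
    """Pull the version number out of `xz --version` output."""
    m = re.search(r"xz \(XZ Utils\) (\d+\.\d+\.\d+)", output)
    return m.group(1) if m else None

def is_affected(output: str) -> bool:
    """True if the reported xz version is one of the backdoored releases."""
    return xz_version(output) in BAD_VERSIONS
```

On a live system you would feed it the stdout of `subprocess.run(["xz", "--version"], capture_output=True, text=True)`; keep in mind a version match proves nothing about whether the malicious object code is actually present.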

I want to emphasize that this is no particular reason for comfort for the platforms that were not affected. They simply weren't targeted.

Some see this as a failure of FOSS. I see it as a demonstration of its strength. There are thousands and thousands of people much smarter than me using the same software I depend on, and they have full access to all of its source. Had this been installed into some obscure piece of proprietary software, it likely never would have been found. I wonder how much proprietary software actually has back doors and no one notices. Sunlight is the best disinfectant.
 
That it was a special file for a special OS on a special CPU is not so important. Since the linker-relevant sections of the ELF format are Turing-complete (I may search for the reference when I have the time), there is no need to load that file at all: you could trick the linker into assembling the stubs itself. Then only the OS is an issue, as our LD should refuse to link Linux objects without complaining.
 
Some see this as a failure of FOSS. I see it as a demonstration of its strength. There are thousands and thousands of people much smarter than me using the same software I depend on, and they have full access to all of its source.
I think it also points out a "pain point" of FOSS: the eyes need to actually look; there needs to be "trust but verify", or harder/stronger vetting of potential committers.
It's a lot of effort to verify every commit and every pull request, but someone has to do it. From all the analysis, one would think "a rigorous code review should have raised flags", but real life makes people lax sometimes.

Now the OpenBSD team and some of their practices make more sense and maybe serve as a model for other projects.
 
I think it also points out a "pain point" of FOSS: the eyes need to actually look; there needs to be "trust but verify", or harder/stronger vetting of potential committers.
It's a lot of effort to verify every commit and every pull request, but someone has to do it. From all the analysis, one would think "a rigorous code review should have raised flags", but real life makes people lax sometimes.

Now the OpenBSD team and some of their practices make more sense and maybe serve as a model for other projects.

Not having enough (or any) reviewers would pretty much stop me from being able to contribute.
 
Not having enough (or any) reviewers would pretty much stop me from being able to contribute.
The quality of the reviewers also matters. At ${WORK} I've run into people more concerned with how the code is formatted than with "does the change do what it is supposed to do". Form over function. Granted, consistency in code is good, especially if lots of people are working on it. But if a change doesn't do what it's supposed to do, then why bother?
 
Some see this as a failure of FOSS. I see it as a demonstration of its strength.
I fully agree, but I think that's the intent in many of the plethora of articles on the subject. I would suggest it's clear to most of us here that this individual (or individuals) could just as easily have infiltrated a proprietary project, and it would have been much harder to find their "payload".

Those seeming to want to misrepresent the root cause as one of "security" are the same people who have a corporate agenda, and their legions of willing mouthpieces. Things get lost in translation, and to those reading mainstream tech press articles, it will simply appear that xz-utils was pretty much a hobby project - and that it took no less than a Microsoft employee to find the backdoor - and they will wonder why something apparently so vital was not in the hands of a "reputable company".

They will of course entirely miss the point: that the kludge providing the vector for this was implemented by developers working on the top handful of Linux distributions, all of which are corporate backed and controlled - all of which utilise systemd (itself a corporate-backed project overseen by a Microsoft employee). I.e., the corporate world has been running all things Linux for many years, and this is what they came up with. They will not see that this is a side effect of dependency bloat - via a 3rd-party patch to sshd to allow interprocess communication between it and systemd - and that liblzma was in fact loaded by part of systemd by design, not by sshd.
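That dependency chain (sshd → libsystemd → liblzma) is easy to observe on an affected system: `ldd /usr/sbin/sshd` lists liblzma on distros that carry the systemd-notification patch, even though stock OpenSSH never asked for it. A hedged sketch of that check, written to parse captured `ldd` output rather than run it, so it can be tested anywhere:

```python
# Hedged sketch: given the output of `ldd /usr/sbin/sshd`, report whether
# liblzma is pulled in. On stock OpenSSH it should not be; on distros that
# patch sshd to talk to systemd, libsystemd drags liblzma in transitively.
def linked_libraries(ldd_output: str) -> list:
    """Extract library names (the part before ' => ') from ldd output."""
    libs = []
    for line in ldd_output.splitlines():
        name = line.strip().split(" => ")[0].split(" ")[0]
        if name:
            libs.append(name)
    return libs

def pulls_in_liblzma(ldd_output: str) -> bool:
    return any(lib.startswith("liblzma") for lib in linked_libraries(ldd_output))
```

The one-liner equivalent on a shell is simply `ldd /usr/sbin/sshd | grep liblzma` - an empty result is what you want to see.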

Security-wise, whatever you do and whatever policies or measures you introduce, you cannot prevent a determined infiltrator who is skilled and patient enough to wait it out for a number of years in order to carry out their objectives.

In this case they built up confidence with the developer of a FOSS project. If it were a proprietary project they would have sought employment and done much the same thing - and who's to say they haven't...

You could say that thorough code audits, such as those talked about by the OpenBSD project, have the answer to this, but there's no worthwhile comparison here - it would be like Theo retiring and handing over to someone else. So all of the checks and audits would be useless if the person signing everything off is the one you can't trust.

There is no answer except that, in the spirit of FOSS, this was discovered due to the "many eyeballs" philosophy. It doesn't work every time, but it worked this time - or so we're led to believe.
 
The quality of the reviewers also matters. At ${WORK} I've run into people more concerned with how the code is formatted than with "does the change do what it is supposed to do". Form over function. Granted, consistency in code is good, especially if lots of people are working on it. But if a change doesn't do what it's supposed to do, then why bother?
With your point of view you may actually like my projects because the formatting is an absolute disgrace but it functions perfectly.
 
With your point of view you may actually like my projects because the formatting is an absolute disgrace but it functions perfectly.
That's why they make emacs init files and "modes". Select all, reformat. With multiple people on a project, everyone uses the same configuration. I've had issues trying to get consistent configuration when people are using different editors/IDEs.
Consistently disgraceful formatting I can actually adjust to, because it forces me to concentrate on what the code is trying to do.
I abhor it when a change includes gratuitous whitespace changes.
 
Those seeming to want to misrepresent the root cause as one of "security" are the same people who have a corporate agenda, and their legions of willing mouthpieces. Things get lost in translation, and to those reading mainstream tech press articles, it will simply appear that xz-utils was pretty much a hobby project - and that it took no less than a Microsoft employee to find the backdoor - and they will wonder why something apparently so vital was not in the hands of a "reputable company".

They will of course entirely miss the point: that the kludge providing the vector for this was implemented by developers working on the top handful of Linux distributions, all of which are corporate backed and controlled - all of which utilise systemd (itself a corporate-backed project overseen by a Microsoft employee). I.e., the corporate world has been running all things Linux for many years, and this is what they came up with. They will not see that this is a side effect of dependency bloat - via a 3rd-party patch to sshd to allow interprocess communication between it and systemd - and that liblzma was in fact loaded by part of systemd by design, not by sshd.
Yeah, this is the kind of situation that prompted the FSF to take a rather extremist position of only accepting libre software under its umbrella... there's even a Linux distro out there that was cobbled together using only components that were developed as libre projects, iirc. Searching for that kind of stuff on Google turned up individual projects that have 'libre' as part of the name, but not a decent article explaining what that movement is about.

To be honest, I don't really care if an Open Source project is corporate-backed. Hell, even FreeBSD has corporate backing. Exactly what does that mean, though? For some people, the very term 'corporate backing' is a trigger to back into a corner and throw a tantrum about corporations being evil, and about only libre software being worthy of use. In reality, the term 'corporate backing' is very loosely defined. For example, HP released quite a few printer drivers to CUPS, an Open Source project... even the Free Software Foundation has a list of commercially produced hardware that has drivers with palatable licenses... Not a very long list, but it's still corporate-backed stuff... In the case of FreeBSD, the Foundation maintains grants that major corporations contribute to, probably as tax-deductible donations. My point being: not many people really know what 'corporate backing' even means.

Maybe we should stop looking for scapegoats, and focus on getting the damn software to work right?
 

I liked that one - speaking as a Valgrind developer. It's almost always the same thing - blame it on a false positive. Usually it's just bugs, and losers who 'know' their code is right. This time it was to hide some skulduggery.

Thankfully I don't use Linux much and still have Fedora 39 - only Rawhide and possibly 40 are affected.
Thoughts on the infamous Debian Openssl patch?
 
Thoughts on the infamous Debian Openssl patch?
I thought this was a key point:
To do that, you have to understand the code involved and the details of the bug; those require understanding a little bit about entropy and random number generators.

Compilers generate warnings about things; blanket policies of "fix all compiler warnings" are often noble, but the warnings need to be understood so you don't break things by "fixing" them. I've been at places where "fix the warning by typecast" was the norm.
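The Debian OpenSSL bug (CVE-2008-0166) shows why blindly silencing a warning is dangerous: the lines Valgrind flagged were the ones mixing buffer contents into the entropy pool, and with them removed the process ID was essentially the only varying input - and Linux PIDs topped out at 32768, so every possible key became enumerable. A toy Python model of that failure mode (the `mix` function is invented for illustration and is not OpenSSL's code):

```python
# Toy model of CVE-2008-0166. `mix` stands in for the entropy-pool update;
# the real code mixed caller-supplied buffers (the lines Valgrind flagged
# as using uninitialized memory) plus the PID. The "patched" variant drops
# the buffer mixing, leaving only the PID to vary between runs.
def mix(pool: int, data: bytes) -> int:
    for b in data:
        pool = (pool * 31 + b) % 2**64
    return pool

def seed_original(pid: int, buf: bytes) -> int:
    # before the patch: buffer contents and PID both feed the pool
    return mix(mix(0, buf), pid.to_bytes(2, "big"))

def seed_patched(pid: int) -> int:
    # after the patch: buffer mixing removed to silence the warning
    return mix(0, pid.to_bytes(2, "big"))

# With only the PID left, the whole "keyspace" is enumerable:
patched_seeds = {seed_patched(pid) for pid in range(1, 32768)}
```

The punchline is the last line: at most 32767 distinct seeds exist after the patch, which is exactly why pre-generated blacklists of all possible weak Debian keys were feasible.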
 