Slouching towards a cryptography monoculture

I'm saddened and disappointed by the current trend of jettisoning support for LibreSSL.

I'm puzzled by the failure to learn from the Heartbleed fiasco, which happened just seven years ago. I'm naturally paranoid, but that is not always a flaw. I'm not the only one who suspects new OpenSSL "features" are designed to break compatibility with LibreSSL and restore OpenSSL's market dominance. I think PHK's pointed criticism still applies:
View: https://www.youtube.com/watch?v=fwcl17Q0bpk


The OpenSSL Software Foundation is still a for-profit corporation offering commercial support and FIPS compliance. I also find it interesting that the vulnerability comparison section of the Wikipedia page referenced in this Python library has now disappeared. I think it probably looked something like this:

Even if you think that my tinfoil hat is too tight and has cut off circulation to my brain, maybe you'll agree that monocultures are inherently fragile:
 
And then, you can just look at it from the pragmatic side. I have submitted quite a few LibreSSL patches for FreeBSD ports. At one point I was asked by one of the people maintaining Qt to please test their branch on GitHub with LibreSSL before they committed it to the official ports. Of course I found something that needed to be changed.

Let's face it: LibreSSL promised to be a drop-in replacement for OpenSSL, but it never was in practice. And I'm not even talking about binary compatibility here ... plain API stuff. Combine that with the fact that there is no standardized API for a crypto library (so the OpenSSL API became the "de-facto standard" everyone uses in their projects). Now combine that with the fact that OpenSSL also learned lessons from the earlier fiascos.

Maintaining compatibility with LibreSSL is a major PITA. I still use it, but I might drop it as well once I hit a real roadblock. You just can't expect distributors to sacrifice countless hours or days of work just so all the software they package works fine with LibreSSL.
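To give an idea of what those patches usually look like: OpenSSL 1.1 made many structs opaque and renamed some allocation functions, while LibreSSL only picked up the new names later (around 2.7). Here's a minimal sketch of the kind of compatibility shim many ports carry; the version cutoffs and the EVP_MD_CTX functions are just the usual illustrative example, not an exact patch from any specific port.

Code:
#include <openssl/opensslv.h>
#include <openssl/evp.h>

/*
 * OpenSSL 1.1.0 renamed EVP_MD_CTX_create()/EVP_MD_CTX_destroy() to
 * EVP_MD_CTX_new()/EVP_MD_CTX_free().  LibreSSL kept the old names and
 * only added the new ones around 2.7.0, so portable code often maps the
 * modern names onto the legacy ones when building against an older library.
 */
#if (defined(LIBRESSL_VERSION_NUMBER) && LIBRESSL_VERSION_NUMBER < 0x2070000fL) || \
    (!defined(LIBRESSL_VERSION_NUMBER) && OPENSSL_VERSION_NUMBER < 0x10100000L)
static inline EVP_MD_CTX *EVP_MD_CTX_new(void)
{
    return EVP_MD_CTX_create();
}
static inline void EVP_MD_CTX_free(EVP_MD_CTX *ctx)
{
    EVP_MD_CTX_destroy(ctx);
}
#endif

Multiply that by every accessor a given port happens to use, and you get an idea of the ongoing cost.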
 
I imagine it's a lot of work switching to a fork of a library that many applications and services depend on. I doubt it's worth the trouble compared to just fixing OpenSSL collectively. Honestly, I think there should be a from-scratch rewrite with some OpenZFS-style governance. The OpenBSD folks are a rigid bunch, I'd say.
 
It's not surprising that distros are dropping LibreSSL. As some of the coverage noted, that fork started off fixing things OpenSSL wouldn't even consider fixing in the first place. It didn't help when OpenSSL decided to make a major, breaking API redesign and refused to support a transition layer so other projects could catch up. Even later, LibreSSL just could not keep up with maintaining compatibility with OpenSSL. Even on Gentoo, LibreSSL ended up being a second-class citizen that you constantly had to fight to keep the system semi-stable.
 
Not sure what I'm supposed to take away from this, other than that the project is not doing "careful work" in the eyes of some people?
 
Not sure what I'm supposed to take away from this, other than that the project is not doing "careful work" in the eyes of some people?
The problem, for me, with Rust is everything other than the language itself. And in that project you linked to, I'm not sure I'd trust it any more than something written in modern memory-safe C++ (e.g. looking at the places they use "unsafe").

But the npm-style approach that Rust projects seem to take with respect to dependencies makes it harder for me to trust Rust projects, for example.
Also, it's not entirely clear to me exactly _how_ much safer this is. rustls uses ring, which is derived from BoringSSL, which is of course ultimately OpenSSL plus maintenance work. So at least some of the time the actual code running is (or could be?) literally identical. This contrasts with some projects that weren't so performance-critical and were able to replace C with Rust like-for-like. Is it worth it anyway? Maybe.
No one in the fancy-languages camp wants to deal with backward compatibility, obviously.
 
At the risk of being rude, replying in quotes is just whataboutism. I'll just expand on my original post by saying that, in the spirit of the thread being concerned about monoculture, a second implementation in a memory-safety-oriented language like Rust (which would have prevented something like Heartbleed) would be "of interest" to people (as demonstrated by the mere existence of what you linked: MesaLink), but of course (as contained in the links) implementations would need to be careful, which can mean anything from dependency curation, build tooling, and how closely algorithm implementations are just copied wholesale, to not using the unsafe escape hatch, to being obstinate and/or lazy in implementing to a standardized specification.
 
Maybe the 'Heartbeat' incident was the minor problem.
The major problem might rather have been the intervention of the radicals around de Raadt, which changed the SSL world.
This was a loss-of-control accident, a kind of Chernobyl.

Now this undermining strategy is interesting.
So many "donations" of dubious origins pouring into this disguised "nuclear" company, and out of it.
Bribing free software developers, what a disgusting trend 🤢

I guess the trustworthiness of a particular software team can soon be measured by looking at their attitude toward LibreSSL.
In cases like HardenedBSD I guess it is not a lack of good will, but of resources.
But with big donation takers like the Mozilla Foundation, it might become interesting to see who pushes hardest for abandoning LibreSSL.
 
Excuse me please, but I think there is GnuTLS, too? Please correct me if I'm wrong.
 
I'll say this for LibreSSL: I have fewer problems building it across multiple platforms. Too many dependencies in the tool chain expected by OpenSSL.

Unfortunately the API has a large surface area that will take years to cover.

When LibreSSL was born, they should have named it OpenTLS.
 
Well, I can't say I'm surprised to see this turn of events. In fact, I predicted this as soon as I learned about LibreSSL. There's nothing here to feed a good conspiracy theory; it all boils down to old-fashioned common sense. As Zirias also rightly mentioned, that "heartbleed comment" goes both ways: why rely on a nobody, with all the risks of a possible new security disaster attached? You can say about OpenSSL what you want, but it has built up quite a reputation with regard to reliability.

Instead of trying to become an "OpenSSL killer" they would have been better off trying to become "a better crypto engine". I mean... if GnuTLS can (somewhat) pull this off, then why not a new clean engine?

There have been so many projects which tried to become "a better xx project", only to crash and burn. It's one thing to make golden promises, it's another to actually live up to those claims.
 
The OpenSSL Software Foundation is still a for-profit corporation offering commercial support and FIPS compliance.
There's nothing wrong with that; it's not bad per se. Programmers have to pay their bills, too. What's important is that their product is open source, and it is.
I also find it interesting that the vulnerability comparison section of the Wikipedia page referenced in this Python library has now disappeared. I think it probably looked something like this:
Yes, IIRC it was like that. Let's peek into archive.org: Wikipedia:LibreSSL@Feb. 17th, 2018
Of course, behaving like a monopolist & changing public information available in a wiki is what we don't want, and it's good to keep an eye on this, especially when it comes to vital building blocks of network security. This does not mean it was them (or someone acting on their behalf) who changed that Wikipedia page, though.
Even if you think that my tinfoil hat is too tight and has cut off circulation to my brain,
No one thinks that, since your statements are backed up by facts.

On BearSSL, IIUC its goal is to provide a minimal set of features, to be used in embedded environments? Whereas LibreSSL aims at being a replacement for OpenSSL, which has, due to its age, some old baggage. I don't know who decides on all these new security protocols & encryption standards, nor can I judge whether they're reasonable or just marketing blabla. Ultimately it's up to the programmers/SW engineers; if they did not use that new stuff, it would be easier to switch to LibreSSL or any other alternative TLS library. This domain (cryptography & crypto applied to network security) is evolving rapidly, so maybe it's simply a lack of qualified manpower that LibreSSL suffers from.
 
At the risk of being rude, replying in quotes is just whataboutism.
No, it's using facts that can be easily verified.
I'll just expand on my original post by saying that, in the spirit of the thread being concerned about monoculture, a second implementation in a memory-safety-oriented language like Rust (which would have prevented something like Heartbleed) would be "of interest" to people (as demonstrated by the mere existence of what you linked: MesaLink), but of course (as contained in the links) implementations would need to be careful, which can mean anything from dependency curation, build tooling, and how closely algorithm implementations are just copied wholesale, to not using the unsafe escape hatch, to being obstinate and/or lazy in implementing to a standardized specification.
Right, and Rust makes things like dependency curation difficult. This approach has known problems.

It's hard to take seriously the security claims of any language that uses this dependency distribution model.
 
Context-free facts aren't an argument. I don't know what you're saying about these things. Are they good, bad, irrelevant? Are you making the exact same argument as these people, or something more nuanced?

Also, NPM and PIP aren't crates.io, so I'm not so sure the "approach" is a 1:1 comparison, considering how the "exploits" worked. You're also confusing memory safety with build "security." They're two different issues.
 
Also, NPM and PIP
I am glad you brought this up (outside repositories). When I first used PIP I wondered about its security implications.
Here is a tool that installs applications which are not visible to pkg, so they could be risky because pkg won't update them.
Not to mention repositories separate from the official FreeBSD ones.
I will admit that when porting a Linux-based program it can be useful for chasing dependencies.
Even for building/porting applications which do not exist in the FreeBSD ports system.

So should PIP be available in pkg form, where unknowing Python users can really shoot themselves in the foot badly?
When I hear users mention PIP I tend to cringe. You could really dig yourself a shallow grave there.
 
From what I've seen on Gentoo, there isn't a good solution with pip. Part of it is that there are tons of packages on pip that you'd have to provide, whereas the other side is, like you said, that people will use it, shoot themselves in the foot, and complain that stuff is broken. It didn't help that on Gentoo the package manager itself depended on Python, which pip regularly messed with.
 
Fresh high-severity vulnerabilities in OpenSSL:

Lovely spin at the end of TFA
OpenSSL has come a long way in terms of security since the disclosure of the Heartbleed vulnerability back in 2014.

Why do they feel like they have to market OpenSSL?

I don't see equivalent problems in LibreSSL:

But hey, let's keep on keepin' on. What could go wrong?
 
I don't see equivalent problems in LibreSSL:
LibreSSL: Releases
Oh sure, and right now I feel pretty relaxed about using LibreSSL; no need to rush any upgrades :)

But the problem persists: there's no standard for a crypto API, so the OpenSSL API is the "de-facto standard" and LibreSSL is just following it, which unfortunately has gone wrong many times, requiring patches to software written for OpenSSL…
 
Jose, can you judge the raison d'être (reason for existence/right to exist) of all the new encryption standards that force us to rely on OpenSSL & hinder us from "simply" switching to an alternate TLS library like LibreSSL? I cannot, since I'm a crypto noob. IIUC many are meant to be an answer to newly found weaknesses & really do enhance security, like e.g. PFS (Perfect Forward Secrecy), Multi-Factor Authentication & DANE. As long as LibreSSL does not support the most wanted of these new standards, we have no choice.
 
Please keep in mind as you read what follows that a little knowledge is a dangerous thing, and I only have a little knowledge in this subject.

From what I can tell, the deprecation of LibreSSL is not being driven by a lack of new features, but rather by missing support for new APIs introduced by the OpenSSL project.

Indeed, the initial revision of the LibreSSL fork actually had more features than the OpenSSL release it aimed to replace. LibreSSL has also introduced a new API, libtls, which aims to be "a new TLS library, designed to make it easier to write foolproof applications."
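For context, here is a rough sketch of what libtls looks like in practice. This is an illustrative, untested client based on the documented tls_* calls; the host name is a placeholder, and the TLS_WANT_POLLIN/TLS_WANT_POLLOUT retry handling is omitted for brevity.

Code:
#include <stdio.h>
#include <string.h>
#include <err.h>
#include <tls.h>

int
main(void)
{
    struct tls_config *cfg;
    struct tls *ctx;
    const char *req = "GET / HTTP/1.0\r\nHost: www.example.com\r\n\r\n";
    char buf[1024];
    ssize_t n;

    /* tls_init() is a no-op on recent libtls, but older versions require it. */
    if (tls_init() == -1)
        errx(1, "tls_init failed");
    if ((cfg = tls_config_new()) == NULL)
        errx(1, "tls_config_new failed");
    if ((ctx = tls_client()) == NULL)
        errx(1, "tls_client failed");
    if (tls_configure(ctx, cfg) == -1)
        errx(1, "tls_configure: %s", tls_error(ctx));

    /* libtls resolves the name, connects and verifies the peer certificate. */
    if (tls_connect(ctx, "www.example.com", "443") == -1)
        errx(1, "tls_connect: %s", tls_error(ctx));

    if (tls_write(ctx, req, strlen(req)) == -1)
        errx(1, "tls_write: %s", tls_error(ctx));
    while ((n = tls_read(ctx, buf, sizeof(buf))) > 0)
        fwrite(buf, 1, (size_t)n, stdout);

    tls_close(ctx);
    tls_free(ctx);
    tls_config_free(cfg);
    return 0;
}

Build it with something like cc client.c -ltls. Compared with the amount of libssl boilerplate needed for the same job (contexts, BIOs, verification flags), the "foolproof" goal makes sense.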

My understanding is that the API plumbing is separated from the actual crypto implementations in both LibreSSL and OpenSSL. The crypto code is written and maintained by cryptography experts. The API code is pretty standard systems code.

The OpenSSL project has introduced many API changes since the Heartbleed fiasco, and has deprecated the 1.0.2 API that was the compatibility basis for the LibreSSL fork. A cynical person would surmise that these changes and deprecations are at least partially driven by a desire to regain the de-facto monopoly status they had before the fork.
 
The OpenSSL project has introduced many API changes since the Heartbleed fiasco, and has deprecated the 1.0.2 API that was the compatibility basis for the LibreSSL fork. A cynical person would surmise that these changes and deprecations are at least partially driven by a desire to regain the de-facto monopoly status they had before the fork.
That's not impossible, but unlikely. A "legacy" API often isn't suitable when you move forward, which is the reason for:
LibreSSL has also introduced a new API, libtls, which aims to be "a new TLS library, designed to make it easier to write foolproof applications."

Now, a new API by LibreSSL won't stand a chance when almost every OSS project that needs cryptography uses OpenSSL. So, without some open standard for a crypto API, having more than one implementation always means that one sets the "de-facto standard" and all the others have to follow. This isn't a good situation.
 