Fadlangs in base.

So the most straightforward way to add new languages to base would be to choose from the languages (other than the already-picked clang and clang++) that the LLVM project officially ships in its releases. If none of the languages wanted is there, then it's simply too early to consider, unless FreeBSD switches (again) to a different toolchain project.

If Rust gets incorporated there, it could be a candidate.

That doesn't work since Rust would want a specific LLVM version, and would want frequent updates, while the C compiler does not and shouldn't be messed with needlessly.
 
That doesn't work since Rust would want a specific LLVM version, and would want frequent updates, while the C compiler does not and shouldn't be messed with needlessly.
Exactly. What I meant is that if Rust is ever to land in base, it would be better to do so AFTER the LLVM project itself starts providing a Rust frontend. Not now.
And ideally Rust would be standardized by ISO/IEC before then.
 
While Rust had a growing following, you never heard too much about it until the US government recommended its usage for security purposes. Then, all of a sudden, there were articles everywhere stating everything had to be written or re-written in Rust immediately.

Not sure in which context you say we never heard too much about it, but Rustacean Station and Rust in Production are two podcasts that show there's a lot of real stuff being developed in Rust. This Developer Voices episode talks about why it was critical for InfluxDB to move from Go to Rust. Developer Voices is an incredible podcast btw.

So... I'm not saying anything about Rust in freebsd-src at all. But I think to dismiss it as a fadlang where nobody's doing anything serious is just ignorant.
 
Exactly. What I meant is that if Rust is ever to land in base, it would be better to do so AFTER the LLVM project itself starts providing a Rust frontend. Not now.

I was talking about that state. Even then, you can be sure that the LLVM version required for the Rust part of FreeBSD will be different from the one the C folks certified.

And ideally Rust would be standardized by ISO/IEC before then.

That doesn't stop the language from evolving; just look at C++, which moves forward every three years.

Even if the Rust spec was static that doesn't mean that the LLVM version could be static. Newer LLVMs might have bugfixes that Rust in FreeBSD wants.
 
Yeah, what I've written is completely meaningless unless the LLVM project implements a Rust frontend themselves, regardless of whether it's a fork of the original Rust or not. But it would require standardization at some point. Not necessarily the same standard as the one by the Rust project, but compliant with a (not yet existing) ISO/IEC xxxxx:yyyy.
 
And unfortunately, if any of the drivers that we need but that are only available via LinuxKPI start using Rust, we cannot avoid Rust for at least ports kmods. And that could mandate a partial Rust implementation within LinuxKPI itself.
I hope LinuxKPI isn't forced to incorporate Rust code, and that Rust stays confined to the ports kmod parts only.
 
The Linux kernel people seem to have to put a lot more effort into interfaces to the C API than they thought. They have been at it for years.
Indeed. They call them "abstractions" but really it's just thin+fat bindings hell. Generators like bindgen/SWIG can't hope to solve this issue (an LLM might be viable in the future once the monetization novelty wears off and actual engineers start exploring the tech).

RetroComputingCollector, specific "modern" languages aside, I strongly believe it is keeping the codebase homogeneous which is the key thing leading to decent software. RedoxOS is ironically a good example of this; the fact that they are using it as an attempt to "prove" Rust is "relevant" may (unknown to them) keep the codebase clean and focused simply by not mashing a load of eclectic noise into the codebase and build systems. And Rust is crap to bind against, so it also helps eliminate Python, Perl and all that clutter too. (Ironically, if we didn't care about binding other languages, C++ would be way more prevalent too. The Rust community completely overlooks this responsibility as a systems language. This convenient oversight probably helps their viral crusade, to be fair.)

That said, I don't think bringing Rust into the FreeBSD base would even satiate the Rust guys. Our LLVM stack will always be a couple of steps behind the "exhilaration" and wild west they crave. But of course PkgBase *will* fsck(8) that up, so I suppose they might find a good home here.
 
patmaddox You can find a podcast about anything, so that's no indication of anything. The last I heard of Rust was when Mozilla introduced it long ago, and you'd hear about it here and there but, compared to every other popular language, not much. Like AI, now it's in hearts and minds everywhere, overnight, and that's just too fast.
 
if any of drivers that we need but only available via LinuxKPI start using Rust, we cannot avoid Rust for at least ports kmods. And it could mandate LinuxKPI having partial implementation by Rust.
That's a good point. We already have a lot of drivers, sufficient for many purposes. If that were to happen, FreeBSD would still be fine. Only newer drivers from Linux, such as those for newer graphics cards, wifi, scanners, and Bluetooth, would be affected. It may stop FreeBSD from using newer drivers written in Rust.

The upside is that such Rust code could be rewritten in C under a more permissive or file-based license. That takes keeping patrons and having lots of developers.

Referring to Redox & Rust:
codebase clean and focused simply by not mashing a load of eclectic noise into the codebase and build systems. And Rust is crap to bind against, so it also helps eliminate Python, Perl and all that clutter too.
Python is a necessary scripting language for math and science. If you mean it's currently used where it's not needed, or that it's not needed for many typical programs, then OK. I'm not sure if you mean doing away with Python for most purposes, including math and science, or just as a typical dependency in common programs. As for Perl, it's time for it to be replaced by Ruby.

While Rust being difficult to bind against has an advantage, there could also be disadvantages. Python and other scripting languages often depend on non-scripting programming languages. Also, a Rust program may need more advanced math functions from Python.

You can find a podcast about anything so that's no indication of anything. The last I heard of Rust was when Mozilla introduced it long ago and you'd hear about it here and there
Rust is here to stay, and will have a major following for at least 15 years from now. After that, it will become niche level, but still relevant. Something that learns its lesson from Rust and other modern languages will replace or complement it, similarly to how C++ complements C. D, C# and other languages are still around, and will be as long as C is mainstream, even if they're not prevalent. Starting now though, D, C# and other older languages based on C will start declining, because modern languages already are being developed to do what those didn't solve. That momentum won't pick up, until Zig or something that learns lessons from it is at a stable point.

Redox is the next mainstream OS. Until a better language comes along, and a comparable OS is built with that, Rust will be mainstream. What will keep Rust from gaining as much as it can are difficulty, and the way Cargo is redundant to any OS ports tree. Difficulty of Rust is a trade off in this situation for security, so that's acceptable for the purpose. Cargo is going to hold Rust back.

The language that becomes prevalent in the future, even in the market of security will use a package manager which installs libraries only, not entire programs. The language might even be based on Rust, somewhat backwards compatible with it.
 
... I strongly believe it is keeping the codebase homogenous which is the key thing leading to decent software. ...
Counter-example: Digital's VMS has a whole lot of programming languages available, including Bliss (CMU's systems programming language, which was used for what we would call the kernel), assembly, COBOL, FORTRAN, PL/1, RPG-2, and eventually C. To make sure the libraries for ALL languages were kept in good condition and always available, the development team decided that each supported language was going to be used in some utilities that were part of the base (non-extra-cost) installation. The craziest example is that the monitor utility (sort of what ps and top do on a Unix machine) was written in RPG-2, which is usually purely a report-generator language. It must have been difficult, but they did it. Why? Because they wanted to deliver a high-quality product, so the OS development team needed to keep their skills up and keep software maintained in all these languages.

Shipping decent software stems from a quality mindset. Not from using just one language.

Now, whether the FreeBSD core organization has the manpower required to maintain more than one language in base and make sure it is a high-quality product is an interesting question. Or whether it has the manpower to maintain other things that are installed with the base installer (see the lengthy discussion about KDE in the installer). Or whether it even has the manpower to maintain the core OS itself. I keep getting back to my standard observation: The WiFi subsystem had little maintenance done to it for the last 10 years, and lost the ability to be a general-purpose AP about 10 years ago (I spent weeks debugging that). Now finally the foundation is funding an overhaul, but nowhere have I seen that the AP functionality is getting fixed; the work is support for higher speeds and new devices, for the laptop crowd. In the meantime, under Linux even a lowly RPi makes a fine access point. The FreeBSD community is abandoning server and embedded functionality in the quixotic quest for desktop usage.
 
Why? Because they wanted to deliver a high-quality product, so the OS development team needed to keep their skills up and keep software maintained in all these languages.

ralphbsz, that's mad. It kinda feels like the old GNOME 2 note-taking app "Tomboy", shoehorned into GNOME just to help Mono penetrate the ecosystem.

With VMS, was there any evidence/postmortem/debrief that ended up suggesting that this was a good idea and successful in hindsight?

I guess David Cutler didn't take this idea forward as inspiration for NT? That said, I suppose the Windows platform had enough languages in there at this point.
 
No, it's not mad. It's development management (software engineering management) making sure that all the languages that are shipped with the OS are well supported, by forcing themselves to use them. In Silicon Valley that's called "eating your own dog food". It was also supposedly to make sure product management didn't cut any languages from support, or make the language-specific libraries non-free in future versions. The idea was that customers pay for compilers, but once they have compiled into an executable, that executable can run on all machines.

Given how Digital failed (with a whimper, not with a bang), how VMS was "supported" forever even after sales of systems dwindled to near zero, and how much staff left after the sale to Compaq -> HP -> VSI, I doubt that there was ever any postmortem. Digital didn't fail due to bad technology; it failed due to gargantuan management blunders.
 
No, it's not mad. It's development management (software engineering management) making sure that all the languages that are shipped with the OS are well supported, by forcing themselves to use them. In Silicon Valley that's called "eating your own dog food".
Dogfooding is an absolute must. But I would suggest that the norm today is to *reduce* the number of languages shipped with the OS, so there is less to support. Simple cost savings (similar principle for free open-source projects too in terms of time).
It was also supposedly to make sure product management didn't cut any languages from support, or make the language-specific libraries non-free in future versions. The idea was that customers pay for compilers, but once they have compiled into an executable, that executable can run on all machines.
OK, if the business model is to push as many languages onto customers as possible because you are the compiler seller, then this is quite a different thing. That said, I don't believe this business model can work anymore, now that most of the industry has normalized on C (for better or for worse).
 
and lost the ability to be a general-purpose AP about 10 years ago (I spent weeks debugging that). Now finally the foundation is funding an overhaul, but nowhere have I seen that the AP functionality is getting fixed
I was actually using a TP-Link Archer T2U Plus in hostap mode (to connect my iPhone, since the poor thing doesn't have an Ethernet port) for the last 2 or 3 years. I even got 30 Mbit/s out of it, which is probably the record for FreeBSD 13/14 (this issue is supposed to be patched in 15).
 
Much software has been written in the C language that has successfully stood the test of time (Linux, *BSD, even the Windows kernel!, etc.)

So it is possible to write correct, robust and what-not code. The success story of C proves that the language does something right.

What is the exact problem? Is it that being successful with C is difficult? I.e. that much experience and training is needed?

Why are training, time-tested idioms and good coding standards not enough?

Switching from C to anything else will come with a cost. What is it that could outweigh this cost?

I have never been a friend of programmer's convenience.
 
What is the exact problem? Is it that being successful with C is difficult? I.e. that much experience and training is needed?

Why are training, time-tested idioms and good coding standards not enough?

C is arguably underfeatured. You can't even have hashtables that work for multiple data types without casting things around. Error handling is also highly deficient. No integer overflow checking, not even optionally. No array bounds checking, not even optionally. The preprocessor macros suck; even M4 is better.

Those problems don't go away with conventions.
 
[…] No integer overflow checking, not even optional. […]

Those problems don't go away with conventions.
Thanks! Valid points, indeed. I was shocked when I learned (rather late in my life) that signed integer overflow is undefined behavior in C. It is easily caught in assembly, but not in C, where you need some compiler-specific builtin like __builtin_add_overflow.
 