Rust in the FreeBSD kernel

In the final analysis you have a microprocessor executing machine code. Any rules that the production process of the target machine's software attempted to enforce by using a particular high-level language translator, whether Rust or any other high-level language, can always be subverted if the attacker can deliver his exploit code to the target machine and then kick the CPU into running that exploit at a sufficiently high privilege level. Processor designers have attempted to provide various ways of locking down the hardware (e.g. Intel SGX, Intel Hardware Shield, the AMD/ARM equivalents, and so on), which in turn were rapidly broken or are in the process of being cracked by security researchers and hackers.

It's a never-ending arms race. Whether software written in Rust is really any more secure than software written in C is open to question. I suspect that as one category of exploits is closed off, others will be opened. The programming techniques that mitigate well-known exploits in C code, such as buffer overflows, have been known for many years; they only need to be taught and actually used. Will considerably more complex translators like Rust really make it impossible for the programmers (fresh graduates, H1Bs or offshore) that companies typically want to hire to write code that can be exploited? I doubt it. So I remain skeptical of the claims of the people selling "safe" high-level language translators; it sounds like snake oil to me. It very much remains to be seen whether real software written in Rust is actually any "safer" than software written in C.
 
Unfortunately, no.

It would simply cause a lack of drivers for new hardware, unless hardware manufacturers are mandated to disclose ALL the information about their hardware that is needed to write device drivers, free of charge and under permissive licenses like the BSD 3- or 2-clause license.

Or hardware manufacturers could provide device drivers (whether open source or proprietary) for FreeBSD, as nvidia currently does. With nvidia GPUs, the same GPUs that run on Linux also run on FreeBSD, on both X11 and Wayland.

Another possibility (and it would be the best one for every OS other than Windows and macOS) would be making UEFI a complete VM host and mandating that all devices be driven through UEFI runtime services, which are well abstracted.

Aren’t AMD’s/Intel’s ISA docs open to the public? What’s to stop us from reimplementing their drivers or porting drivers over to our own driver model? I think most of our network drivers are in-house. Surely we can do the same with GPUs? My understanding is limited, of course, but I’m optimistic.
 
That's either not the driver writer's bug
Design bug or not, it needs to be handled correctly without crashing. Going back to the Android OpenGL context example, the responsibility is on the app developer to handle lost GL contexts. There are, for example, discussions like this.

and also would happen to any other language, so what's your point?
That is my point, really. Rust doesn't solve these lower-level memory issues any better than any other language (even C).

or an additional contract that the driver writer has to attend to (which also would happen to any other language).
Yep. But in an ideal world we want to be able to verify these contracts rather than just guess at them. It's silly to maintain these gigantic Rust bindings, only to *still* have to guess at the lifetime contracts anyway.
 
We already have 2 languages: C and asm.
So adding another means a 3rd.
Not sure I buy that. My limited understanding of compilers is: high-level language -> intermediate machine-agnostic representation -> machine-specific assembler code. Assembly is always there.
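
For anyone who wants to see those stages concretely, rustc's --emit flags expose them directly. This is just a minimal sketch; the file name stages.rs is arbitrary, and clang has equivalent flags for C (-emit-llvm -S and -S).

```rust
// Inspecting each stage of the pipeline (file assumed to be saved as stages.rs):
//
//   rustc --emit=llvm-ir stages.rs   # machine-agnostic intermediate representation (stages.ll)
//   rustc --emit=asm     stages.rs   # machine-specific assembly (stages.s)
//   rustc                stages.rs   # object code linked into the usual binary
//
// The function below only exists to give the compiler something to lower.
pub fn add(a: u32, b: u32) -> u32 {
    a.wrapping_add(b)
}

fn main() {
    println!("{}", add(40, 2));
}
```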
 
Aren’t AMD’s/Intel’s ISA docs open to the public? What’s to stop us from reimplementing their drivers or porting drivers over to our own driver model? I think most of our network drivers are in-house. Surely we can do the same with GPUs? My understanding is limited, of course, but I’m optimistic.

Most of the more complex drivers for Intel chips in the Linux kernel are written by Intel. It is simply unknown whether the documentation is sufficient for writing drivers without internal help.
 
In reply to Jose...

Well, strictly, machine code, i.e. object code, is always there. Assembly is a human-readable language that translates directly to machine code. So... C and asm do count as two separate languages, although C is the only one of the two that is an abstracted 'high level' language. So I think that, strictly speaking, he's right.

But yeah, I know where you're coming from...
 
Yes and no. Do you have the constructs in a language to stop the code from using these pointers behind your back while you are still figuring out whether they are valid?
Do you have them in C? Then Rust will use them? 🤷‍♀️

Design bug or not, it needs to be handled correctly without crashing. Going back to the Android OpenGL context example, the responsibility is on the app developer to handle lost GL contexts. There are, for example, discussions like this.
That's a pointer that the consumer of the driver holds, not the driver. So it's not the driver's issue (note it was closed with no commits or changes...), but perhaps an OpenGL specification issue: the spec needs a way to signal that all those pointers are invalid. If there's some kernel-level thing that requires this, then... just do it?

That is my point, really. Rust doesn't solve these lower-level memory issues any better than any other language (even C).
It's currently solving a different problem, namely that the driver produces no memory problems within its own domain; but in the future that problem could be solved with Rust too, if the drivers are also Rust. 🤷‍♀️ People are not stupid; they see what class of problems Rust could solve in the immediate timeframe and in the future.

Yep. But in an ideal world we want to be able to verify these contracts rather than just guess at them. It's silly to maintain these gigantic Rust bindings, only to *still* have to guess at the lifetime contracts anyway.
See above.
 
Could someone please explain to those of us who haven't taken a CS class since the early 90s: what is Rust? My own limited coding for the past few decades is in Matlab (not a real language, just a handy math environment with a "compiler" you may choose to use (most don't)). Seems like Rust is triggering folks, and I'd like to understand why. Thx...
 
Could someone please explain to those of us who haven't taken a CS class since the early 90s: what is Rust? My own limited coding for the past few decades is in Matlab (not a real language, just a handy math environment with a "compiler" you may choose to use (most don't)). Seems like Rust is triggering folks, and I'd like to understand why. Thx...

Rust is a high-performance, Algol-like language in the vein of C and C++, with zero- or low-cost abstractions. It uses manual memory management (not a garbage collector) like C and C++, but it is specifically designed to flag problems with your manual memory management at compile time; C and C++ catch these at runtime or not at all. Since you can't interface with a traditional OS or hardware in a memory-safe manner, it also allows specific segments of your code to be marked unsafe.

A lot of security bugs are the result of manual memory management going wrong, so Rust is recommended as a better language in this regard. The idea is that you have to carefully review less code (just the unsafe sections).
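
To make that concrete, here is a minimal, self-contained sketch of both points. The "device register" is just a local variable standing in for real MMIO, and the commented-out line shows the kind of thing the compiler refuses outright.

```rust
fn main() {
    // Compile-time memory checking: `view` borrows `buf`, so freeing `buf`
    // while `view` is still in use is rejected by the compiler instead of
    // becoming a runtime use-after-free.
    let buf = vec![1u8, 2, 3];
    let view = &buf[..];
    // drop(buf);  // uncommenting this line is a compile error, not a crash
    println!("first byte: {}", view[0]);

    // Anything the compiler cannot prove safe (raw pointers, MMIO, calls
    // into C) must sit inside an `unsafe` block; those blocks are the code
    // reviewers have to audit by hand.
    let mut word: u32 = 0;
    let reg: *mut u32 = &mut word; // stand-in for a memory-mapped register
    unsafe {
        reg.write_volatile(0xDEAD_BEEF);
        println!("register reads back {:#x}", reg.read_volatile());
    }
}
```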

When considering languages for OS work, keep in mind that you usually use C, not C++. C's expressive power is very low, so using Rust instead gains you many features you might want compared to C. Compared to Rust, I'd say C++26 is ahead, though.

The problem is that there are many passionate Rust fans who want to insert it into C projects or want to rewrite existing code in Rust.
 
Seems like Rust is triggering folks, and I'd like to understand why. Thx...
Because 22 years ago Perl was ejected from base, for good reasons, and it took a lot of work to correct that mistake.
Please learn from the past.
A minimal base system is best.
I am not a programmer; this is just a user's viewpoint.

 
Seems like Rust is triggering folks, and I'd like to understand why. Thx...
The problem is that there are many passionate Rust fans who want to insert it into C projects or want to rewrite existing code in Rust.
I’m not a programmer (I can do only some sh, bash and awk), but following various events and dramas online, it looks to me that Rust doesn’t have a “Rust Language” problem, it has a “Rust bros” problem, and that’s what some folks are allergic to.
Just my 2¢ 🤷‍♂️
 
That's a pointer that the consumer of the driver holds, not the driver. So it's not the driver's issue (note it was closed with no commits or changes...), but perhaps an OpenGL specification issue: the spec needs a way to signal that all those pointers are invalid. If there's some kernel-level thing that requires this, then... just do it?
I am hesitant to assume this is an OpenGL spec issue because it is specific to Android (i.e. desktop GL on Linux/Windows/Web never strips the context). So it is definitely within the OS layer (perhaps not the driver layer). Don't get me wrong, the Android platform design *and* the OpenGL design are pretty horrible and full of issues.

It's currently solving a different problem, namely that the driver produces no memory problems within its own domain; but in the future that problem could be solved with Rust too, if the drivers are also Rust.
I don't think we disagree here. This is why Rust in all layers (e.g. Redox OS) is so interesting compared to Rust being layered upon C. It means that full coverage of ownership is more likely, and dumb designs like the above are harder to end up with. Having, say, Rust -> C -> Rust just undermines so much of the lifetime verification.
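
A small sketch of what gets lost at that boundary. To keep it self-contained, the "C side" below is written in Rust with a C ABI; picture it as a .c file in a real driver stack.

```rust
use std::ffi::CString;
use std::os::raw::c_char;

// Stand-in for the C half of the boundary. It only receives an untyped
// pointer; the lifetime the Rust side knew about does not cross over.
extern "C" fn c_side(p: *const c_char) {
    // Real C code could stash `p` in a global and use it after the Rust
    // caller has freed the buffer; nothing on this side can prevent that.
    let _ = p;
}

fn main() {
    let msg = CString::new("probe ok").unwrap();
    let p = msg.as_ptr(); // the borrow checker's knowledge stops at this point
    c_side(p);            // across the boundary it is just an address again
    // Whether `msg` outlives every use of `p` inside the C layer is back to
    // being a contract checked by humans, not by the compiler.
}
```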

Coupled with hardware verification like CHERI, this can even propagate deeper into the hardware layer (admittedly that will be runtime verification rather than compile time, so C++ with effort could also achieve something similar).
 
I’m not a programmer (I can do only some sh, bash and awk), but following various events and dramas online, it looks to me that Rust doesn’t have a “Rust Language” problem, it has a “Rust bros” problem, and that’s what some folks are allergic to.
Just my 2¢ 🤷‍♂️
I think the language itself has some problems. There is an old rule of thumb that states that the practical, engineering usefulness of a programming language (whatever its theoretical merits) is inversely proportional to the number of pages in the book you have to read to learn it. My copy of K&R is just over 200 pages, and that's the revised ANSI version, longer than the original. Wirth's "Pascal User Manual and Report" was about 250 pages, and I believe the original pre-ANSI version was under 200 pages.

Whereas "The Rust Programming Language", 3rd ed. by Nichols et al is already over 600 pages. For comparison Strouestrup's 2013 C++ book was 1300 pages. The Rust language is complex and has a reputation of being difficult and time-consuming to learn (for humans, anyway).

Furthermore, the language is unstable; new and revised features are still being added. The language is not standardized; there is no ANSI standard, and there appears to be only one compiler available at present. It will not surprise me if Rust continues to grow and follows a similar trajectory to C++, with that Rust book reaching 1000 pages some time fairly soon.

For a 'dip-your-toe-in-the-water' flavour of what Rust is like to work with, have a look through this site about implementing linked lists in Rust, https://rust-unofficial.github.io/too-many-lists, which is actually quite a good exploration, although the author mentions that it was written using the 2018 flavour of Rust, so the language itself may have moved on since then. I can follow the gist of it, but I don't pretend to be a Rust programmer myself. Maybe I'll work through it when I get a bit of time; he's made a good effort in writing it.

And then you can have a read of what Brian Kernighan thought of it, although I don't think he gave it a very thorough try-out. https://biggo.com/news/202509020213_Kernighan_on_Rust_I_Found_it_a_Pain

Of course if future software is going to be machine generated by AI and stitched together by cut'n'paste(tm) "vibe coders" who don't understand anything about the code themselves, maybe these considerations are less of an issue.
 
I think the language itself has some problems. [...] For a 'dip-your-toe-in-the-water' flavour of what Rust is like to work with, have a look through this site about implementing linked lists in Rust: https://rust-unofficial.github.io/too-many-lists [...] And then you can have a read of what Brian Kernighan thought of it: https://biggo.com/news/202509020213_Kernighan_on_Rust_I_Found_it_a_Pain
Thanks for the links 🙏 I had one C++ book somewhere in the mid-’90s; I can’t remember the number of pages, but it was the thickest book on the shelf (and that was next to the “Microsoft Windows 95 Resource Kit”). I really tried, I did, but stupid me never managed to grasp OO. The furthest I went was K&R C (ANSI), and that only for very simple routines.

Nevertheless, I love reading about programming, and building (downloaded src) for various platforms/OSes is my hobby.

I know about Kernighan’s opinion on the subject, but from what I’ve read online from many other reputable programmers, it’s not that the language is bad per se, it’s the compiler that is problematic.

It reminds me of the Itanium catch-22 – the CPU is revolutionary, it just lacks a compiler that can produce code for it.

Looks like it’s a similar situation with Rust – the language is great, it just lacks a decent compiler 🤭
 
Furthermore the language is unstable; new and revised features are still being added.
Simple additions would be acceptable, as long as they are logical enough.

But modifying or deleting things in the language spec itself is NOT acceptable.
That makes prior study meaningless. Learn once, earn for 50 years!

With a simple addition there is something new to learn, but everything already learned/studied is still completely usable. That's a significant difference.

And a new language for new components only is fine.
So Rust for ports, except for kmods, is "basically" fine.

But if the language wants to replace existing code, IT SHALL ASSURE SANE LINKING WITH EXISTING, NOT-YET-REWRITTEN CODE, WITHOUT A SINGLE BIT OF FIXES.
Otherwise, it SHALL provide a translator for the existing (targeted-to-be-replaced) languages that works 100% flawlessly, as documented, for the existing language. But that's not realistic, unless the new language itself is designed to translate into the existing language (for the FreeBSD base: C, partially asm, and C++ in llvm?).
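
For what that kind of interop currently looks like from the Rust side, here is a hedged sketch: it calls libc's existing strlen() unchanged and exports a plain C-ABI symbol (the name freebsd_example_len is made up purely for illustration) that not-yet-rewritten C code could link against.

```rust
use std::ffi::CString;
use std::os::raw::c_char;

extern "C" {
    // Existing, unmodified C code: libc's strlen, linked as-is.
    fn strlen(s: *const c_char) -> usize;
}

// An unmangled C-ABI symbol that existing C code could call like any other
// C function, without edits on the C side.
#[no_mangle]
pub extern "C" fn freebsd_example_len(s: *const c_char) -> usize {
    if s.is_null() { 0 } else { unsafe { strlen(s) } }
}

fn main() {
    let s = CString::new("kldload if_example").unwrap();
    println!("{} bytes", freebsd_example_len(s.as_ptr()));
}
```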
 
The compiler as such is fine.

It is the ecosystem, with its library distribution, that draws criticism from some/many.
AFAIK, maybe ‘nightly’ is fine, but more than 40% of their pkgs use RUF (Rust unstable features), so the compiler/language have miles and miles to go before they can be standardized, which IMHO should be a requirement for being accepted and used in production.
Please correct me if I'm wrong; I would like to know more about it (honestly), if possible explained in five-year-old terms, as I said I'm not a dev.
 
There is an old rule of thumb that states that the practical, engineering usefulness of a programming language (whatever it's theoretical merits) is inversely proportional to the number of pages in the book you have to read to learn it...
The first edition of The Java Programming Language was 333 pages. The fourth, and likely last, is 928. Talk about foreshadowing what would happen to the language 😞
 
Well, let's not forget that these "<xyz> the language" books contain ever-growing libraries.

The Common Lisp standard is large, although the set of language semantics is smaller than in most other languages. But there's an extensive library which you might or might not use.
 
Perhaps. I omitted the classic 'Programming Perl' book because half the book was functions that are basically C-library or system-call wrappers. But K&R doesn't have a voluminous and ever-growing library, and the C library itself has not suffered from that kind of huge growth. K&R isn't much thicker now than it was 30 years ago. And I can compile C code I wrote 30 years ago too, perhaps with a few small modifications. And if you dig out P. J. Plauger's book on the standard C library... that has also only changed a little. Having a standards body control the language is a major benefit for maintaining backwards compatibility, and hence software longevity.
 
Seems like Rust is triggering folks, and I'd like to understand
It legitimately poses a threat in the form of:
  • "New stuff to master, this affects my status"
  • "I don't understand, kill with fire"
  • "That's how Dad did it and it's worked out pretty well so far. Change is bad"
Also, it threatens us with a good time (tm) by eliminating classes of memory errors.

If it didn't pose a threat, people would just ignore it.
 