The Case for Rust (in the base system)

Getting back to the original post about Rust in the base system: after projects are built there's no Rust nor C anymore, it's all binaries.
Let's say that Rust produces a better binary with better process memory management, no leaks, no backdoors, etc. I would say that can be achieved with C as well.
It's just a matter of being very cautious when writing the code.
So it's up to the maintainers to decide what to use to generate those binaries; it's none of my business.
My concern is more that, in case those Rust maintainers "leave" the group for some reason, who will continue their work?
Is Rust a fashion language? It's only 9 years old, right?
Do they teach Rust at school?
If not, then there's a big chance that it will get dropped some day.
 
Getting back to the original post about Rust in the base system: after projects are built there's no Rust nor C anymore, it's all binaries.
Let's say that Rust produces a better binary with better process memory management, no leaks, no backdoors, etc. I would say that can be achieved with C as well.
It's just a matter of being very cautious when writing the code.

It's a matter of being cautious when writing the code, being cautious when reviewing the code, being cautious when calling or interfacing with the code somewhere else, being cautious when maintaining and refactoring...

My concern is more that, in case those Rust maintainers "leave" the group for some reason, who will continue their work?
Is Rust a fashion language? It's only 9 years old, right?
Do they teach Rust at school?
If not, then there's a big chance that it will get dropped some day.

Rust demonstrates that memory safety through compiler-checked ownership is feasible and useful in a broad range of applications. Whether you see Rust as a language that will join the club of mature programming languages, or as a mere proof-of-concept fashion language, the actual concept of compiler-checked ownership will not go away. It has its merits and will influence the design of future programming languages.
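To make "compiler-checked ownership" concrete, here's a minimal sketch (the names are made up for the example, not taken from anything in this thread): once a value is moved into a function, the compiler refuses any further use of it at the call site.
Code:
// Ownership of the vector moves into `consume`; `consume` frees it when it returns.
fn consume(data: Vec<u8>) -> usize {
    data.len()
}

fn main() {
    let buffer = vec![1u8, 2, 3];
    let n = consume(buffer); // ownership moves here
    println!("{n} bytes");
    // println!("{:?}", buffer); // error[E0382]: borrow of moved value: `buffer`
}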

Safe bet that Rust is taught at schools and universities; there's currently no other option for introducing its principles of memory safety. Understanding these principles is valuable even if you use a different programming language.
 
I see the major difference in using libraries that have to hand memory over to you, and maybe only temporarily so. It is very easy for the library writer and the library user to misunderstand each other.

Rust's built-in mechanisms help here.
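For instance (a hedged sketch with a hypothetical Library type, not any real API), the difference between handing memory over for good and lending it only temporarily shows up right in the function signatures, and the compiler enforces it at every call site:
Code:
struct Library {
    cache: String,
}

impl Library {
    // Hands ownership to the caller: the caller keeps the String as long
    // as it wants and is responsible for it.
    fn take_copy(&self) -> String {
        self.cache.clone()
    }

    // Lends the memory only temporarily: the returned reference cannot
    // outlive `self`.
    fn peek(&self) -> &str {
        &self.cache
    }
}

fn main() {
    let lib = Library { cache: "hello".to_string() };
    let owned = lib.take_copy();   // caller owns this copy
    let borrowed = lib.peek();     // valid only while `lib` is alive
    println!("{owned} / {borrowed}");
    drop(lib);
    println!("{owned}");           // still fine: it was handed over
    // println!("{borrowed}");     // error: `lib` does not live long enough
}
The point is that "hand over" vs. "borrow temporarily" is part of the type, so a misunderstanding between library writer and library user becomes a compile error instead of a use-after-free.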
 
The entire point is that we tried that for fifty years and it didn't work.
Agree, but with a small correction: Programming in C or C++ and having memory safety and absence of race conditions / deadlocks can be done. But it is very difficult and tedious. You need very good people, budgeting time for detailed design, reviews, and testing (by intelligent human testers), and a central authority for architecture decisions. You can only use external libraries that are vetted to the same high standards.

I worked for about 10 years in a group that produces a large and complex piece of system-level software (a cluster / distributed file system for HPC), and the bugs that got out to customers were not the simple C++ stuff like using freed memory, clobbering memory, race conditions, or mistakes in parallelism. They were instead bad design choices, overlooked edge cases, or simply typing the code wrong. One of my favorite bugs (which caused large data loss and had foreign policy consequences) was roughly this bit of code, correctly commented:
Code:
B();
// Do A() before B(), because if multiple nodes perform B() in
// parallel, at least one has to have done A() before. This only
// matters in the rare race condition that one node crashes
// between A() and B(), while another node is restarting during
// that moment.
A();
The comment explained exactly what needed to be done and why, and was absolutely correct; the code was literally two lines (one function call each), and blatantly wrong. This code was written by someone with a PhD in theoretical computer science, one of the greatest practitioners in distributed computing, and it was reviewed by about half a dozen senior engineers and PhDs, and nobody noticed that A and B were out of order. It was tested by a group of about 20 very evil testers, who read source code and tried to engineer specific tests for edge cases, and they never hit the bug. Yet it was shipped to a large government customer and crashed there. No amount of converting this code from C++ to Rust would have helped here, and it shows that humans can occasionally make mistakes.

I don't like rust. Spec too big.
The C++ spec is much bigger.

My current Rust gripe (after less than 1K lines of code): The standard library is far too small. For many relatively simple functions, you have to use open source crates. There are often multiple crates that sort of do similar things, and it is hard to pick which one to use. Many of the crates that contain basic functionality (like tree walking, argument parsing, queues for implementing parallelism) feel like they were not thoroughly designed and tested; they feel more like someone's weekend hack. But don't take this as gospel; before I pass judgement, I need to write another few thousand lines.
 
Safe bet that Rust is taught at schools and universities; there's currently no other option for introducing its principles of memory safety. Understanding these principles is valuable even if you use a different programming language.
How will students learn about memory if they don't even need to deal with it?
But I agree with everything else you've mentioned.
Personally I've tried Rust and did not like it.
I don't like dealing with multiple libraries and their conflicts; some of them are developed for the same purpose but under different names, and I have to figure out which is the best one to work with.
It reminds me of Python.
 
You need very good people, budgeting time for detailed design, reviews, and testing (by intelligent human testers), and a central authority for architecture decisions. You can only use external libraries that are vetted to the same high standards.
Who vets the Rust compiler and how?
 
One of my favorite bugs (which caused large data loss and had foreign policy consequences) was roughly this bit of code, correctly commented:
Code:
B();
// Do A() before B(), because if multiple nodes perform B() in
// parallel, at least one has to have done A() before. This only
// matters in the rare race condition that one node crashes
// between A() and B(), while another node is restarting during
// that moment.
A();

Gotta love this one. I'm pedantic about comments; if I had spotted that in a review, I would probably have demanded a correction of the "obviously wrong / outdated" comment... ;)

Rust lacks a formal language specification. It only has a "reference". How is that acceptable when adding it to the OS?

As long as it's documented, this is mostly an issue if you have multiple implementations of the compiler toolchain. As in: should I file a bug with gcc or with clang if they differ in behavior? Also, language specs are moving targets if they have some real-world relevance, and are usually extended by non-standard features, compiler-specific attributes, etc.

What matters is the amount of undocumented behavior and the pace of backward-incompatible changes.

How will students learn about memory if they don't even need to deal with it?

Rust is actually quite close to dealing with memory, compared to something like Java or Python. But obviously you need to learn more languages to cover different paradigms, like explicit memory handling, OO inheritance, or pure functional programming. All I'm saying is that it's currently difficult to avoid Rust if you want to demonstrate the paradigm of compiler-checked memory safety.
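For example (a purely illustrative sketch, nothing from this thread), allocation and deallocation points stay explicit and deterministic in Rust, which is closer to C than to a garbage-collected language:
Code:
struct Node {
    value: i32,
}

impl Drop for Node {
    fn drop(&mut self) {
        // Runs at a statically known point, not whenever a GC decides to.
        println!("freeing node {}", self.value);
    }
}

fn main() {
    let a = Box::new(Node { value: 1 }); // explicit heap allocation
    {
        let b = Box::new(Node { value: 2 });
        println!("b alive: {}", b.value);
    } // `b` is freed exactly here, at the end of its scope
    println!("a alive: {}", a.value);
} // `a` is freed here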
 
One of my favorite bugs (which caused large data loss and had foreign policy consequences)
Foreign policy consequences... 🤣 That's like saying Uncle Sam sold buggy snake oil (a.k.a. Internet) to Venezuela as payment for real oil (the kind that gets refined into gas/petrol)... So now Venezuela has crappy Internet and no oil to sell to countries other than Uncle Sam... All because of a bug in the IPv4 stack???

My point here is that sometimes such connections are rather far-fetched, and there are a lot of other factors along the chain of reasoning that have a rather outsized influence on what happens along the way.

If a plane needs to keep flying (and not fall out of the sky due to a software bug), that is one thing. But foreign policy - that's a different game.
 
astyle, it is easy to think up scenarios where data loss can influence policy. But most of them don't involve honorable conduct.
 
astyle, it is easy to think up scenarios where data loss can influence policy. But most of them don't involve honorable conduct.
yeah, WAY too easy to think up - and a LOT of effort to prove correct or incorrect. This is why I'm trying to caution against making far-fetched connections; this is how nonsense is born on the Internet, and nobody even stops to think how much sense it makes. This is not reddit or 4chan; we gotta be able to police ourselves like level-headed adults who are above the silly game of one-upmanship.

I frankly think this thread merits closing - it's no longer about technical merits of rust, and the conversation has been going on for longer than it should have.
 
The comment explained exactly what needed to be done and why, and was absolutely correct; the code was literally two lines (one function call each), and blatantly wrong. This code was written by someone with a PhD in theoretical computer science, one of the greatest practitioners in distributed computing, and it was reviewed by about half a dozen senior engineers and PhDs, and nobody noticed that A and B were out of order. It was tested by a group of about 20 very evil testers, who read source code and tried to engineer specific tests for edge cases, and they never hit the bug. Yet it was shipped to a large government customer and crashed there. No amount of converting this code from C++ to Rust would have helped here, and it shows that humans can occasionally make mistakes.
There have been things like Path Expressions (Nico Habermann and Roy Campbell, 1975/76 paper/thesis) that could help in such situations. The idea is to specify concurrency constraints in a way that a compiler can generate code for. For example, "path {read}, write end" says any number of reads can overlap, but a write cannot overlap with reads or other writes. A compiler can generate code using mutexes under the hood to ensure this path constraint is followed. It would've been much more helpful if such a feature had been implemented in a language and used by a competent programmer than writing a big fat comment that most people don't quite read accurately!
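For what it's worth, the specific "path {read}, write end" constraint maps onto a reader-writer lock in today's standard libraries; below is a hedged sketch using Rust's std::sync::RwLock. What has no direct language support is the richer class of path expressions, e.g. ordering constraints like "do A() before B()".
Code:
use std::sync::{Arc, RwLock};
use std::thread;

fn main() {
    let data = Arc::new(RwLock::new(0u64));

    // Any number of reads may overlap.
    let readers: Vec<_> = (0..4)
        .map(|i| {
            let data = Arc::clone(&data);
            thread::spawn(move || {
                let v = data.read().unwrap();
                println!("reader {i} sees {}", *v);
            })
        })
        .collect();

    {
        // A write excludes all readers and other writers.
        let mut v = data.write().unwrap();
        *v += 1;
    }

    for r in readers {
        r.join().unwrap();
    }
}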

Unfortunately, even "modern" languages like Rust eschew such concurrency features. Even Go shies away from providing sufficiently rich support for concurrency. In a way, it seems not much has changed since the seventies.

Path Expressions paper by Habermann
Path Expressions thesis by Roy Campbell
 