The Layman's Rust Thread

In order not to pollute the technical conversation going on around Rust in the Case for Rust in Base thread, I'm starting this thread for broader, ignoramus discussion.

A few days ago the famous annual Linus Torvalds interview was released, and in it he said two things in separate contexts which, considered together, are quite interesting. I don't think it's some crypto-message; I just think they are two things that he probably doesn't consciously realize are related.

The first thing is that he is worried about what I may call "developer deflation": a general, overall decrease in the number of trained programmers working professionally, and a specific decrease in those who have an active interest in developing Linux. In this context, he mentioned that, realistically, Linux will have to depend more and more on autogenerated code for long-term maintenance and development.

The second thing he said was also in the context of "specific Linux developer deflation." It is that, even though Rust in the Linux kernel hasn't caught on as he had hoped, he still thinks it is important, for the simple reason that it may combat the deflation.

What I don't think he quite goes on to connect is that Rust will probably help solve Linux's long-term maintenance problems, not because Rust has somehow caught on like wildfire among programmers, but because Rust is probably a far friendlier language for autogenerated code than C is. Not only in real terms, but most likely also in the perception of the autogenerated-software companies.

Of course, the whole idea of autogenerated software is that a coding language is not needed; binaries can be produced directly. But if the goal is to transition maintenance and development of software that is very large, complex and sprawling, that already has a codebase, and that cannot yet be wrested entirely from traditional human-to-code maintenance, then at least a stopgap is needed. How does a program reason about a kernel? How do you measure outcomes such that you can adjust the originating binary on the basis of them? Computer code at least presents an atomizable outcome that computers can measure.

The underlying thesis here is that the main selling point of Rust, memory management with low-level performance, is of debatable transcendence for human coders, but is game changing for an autogenerated program that will have a much easier time with specific limits than with indeterminate loose ends.
 
What is Rust actually intended to do?

My son works extensively in ACH with web and SQL on the back end.

He mentioned Rust but we didn’t have enough time before his plane departed.
 
To answer that, you first want to answer what C is good at, and why. C is good at systems programming, a term that refers to low-level software with few layers between itself and the hardware. The C programmer deals with abstractions that are not far removed from that hardware, and so he produces software that doesn't have to do much extra work behind the scenes to achieve a given thing; programs written in other languages would have many hidden processes translating their abstractions down to the machine level. Additionally, the way C itself "compiles" (goes from human code to machine binary) is also a short, direct translation, while other languages, again, have to go through several layers of processing to get down to binary.

So far, C has been the only serious contender that provides suitable speed and precise resource usage for systems programming. It is also, by the way, excellent (on account of its basic paradigm) for all sorts of other types of programming.


Some years ago, Mozilla launched an initiative, which other actors later became involved with, to create a programming language that could compete with C in terms of closeness to machine binary logic. The reasoning was that C, in its simplicity, places a burden on the programmer that they felt could be removed: the programmer has to worry about how hardware resources ("memory") are allocated to the different abstractions (variables) as the program processes them. This is tricky, and oversights can cause data to be kept around longer than needed (memory leaks), or freed too early, or written outside the space reserved for it, which corrupts the data or exposes it to other programs that should not have access to it.

Rust does something that other languages have done before: it decides on the programmer's behalf where and when data is stored in memory, and for how long. Unlike those other languages, which do this with a garbage collector running alongside the program, Rust works it all out while the code is being compiled, so it can do this without sacrificing processing speed.
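
To make that concrete, here is a minimal, hypothetical sketch (the variable names are mine, but the behaviour is plain Rust): the compiler tracks which variable owns each piece of data, frees it the moment its owner goes out of scope, and refuses to compile code that touches data after ownership has moved.

    // A minimal sketch of Rust's ownership model: the compiler decides at
    // compile time when each value is freed; no garbage collector runs
    // alongside the program.
    fn main() {
        let greeting = String::from("hello"); // heap allocation; `greeting` owns it
        let borrowed = &greeting;             // a borrow: no copy, no new allocation
        println!("{} is {} bytes", borrowed, greeting.len());

        let moved = greeting;                 // ownership moves to `moved`
        // println!("{}", greeting);          // error[E0382]: borrow of moved value
        println!("{}", moved);
    } // `moved` goes out of scope here, so the String is freed automatically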

That, in a nutshell, is what Rust is intended to do.
 
By the way, this would also explain why there is so much obsession with taking over existing kernels instead of developing Rust-native kernels like Redox. There is no sense in autogenerating Rust code for a brand-new kernel, when you would be better off simply generating straight binaries for it. That would be far more effective, as the autogenerated program would be optimizing from the hardware up. It is almost impossible that this isn't going on already behind the closed doors of autogenerated-software companies.

The interesting thing is to place existing infrastructure in the hands of autogenerated software paradigms. Not only would Rust's suitability for these programs help that, but its alienness to human programmers would have the effect of either pushing them out or incentivizing them to work only with the help of autogenerated code.

Why would you bother? Because Linux, for example, is already everywhere. Torvalds did all the PR work and the sales for you. Market penetration is an expensive business. There is also the fact that you would want humanpersons to do much of the initial Rust coding, as that would give the autogenerated software something to feed on. Sometimes, the most expensive refinement steps are the first ones. This is why, for example, chess programs still use openings that were designed by humans 150 years ago. Stabbing around in the dark is far more expensive for computers than for humans. Computers are better at correcting an existing attempt.

This would also suggest that maybe Rust's incompatibility with C is a feature, not a bug. You want people and programs to retrace the steps of the C programmer, using an existing design but rewriting from scratch, because that is how the machine will get refined.
 
Some interesting thoughts, but I think something is missing.

I was once part of a team working on adaptive hardware (not sure how far I can go into details) some 20 years ago. We had speedups of x1000 going from standard C code to FPGA. And that went down the drain due to "shareholder value," because one party could not tell its investors they would invest in this black magic instead of paying the hedge funds.

We cannot approach this from a single vector, with software engineering alone. Other interested parties will also p*ss in the porridge when and how they see fit.
 
The underlying thesis here is that the main selling point of Rust, memory management with low-level performance, is of debatable transcendence for human coders, but is game changing for an autogenerated program that will have a much easier time with specific limits than with indeterminate loose ends.
Maybe. Though there are two things that I feel impact this.
  • A "Rust from the ground-up" approach like RedoxOS is really needed to avoid impossible autogeneration scenarios (i.e. handling lifetimes) around C libraries; see the sketch after this list. Linux is already compromised (in unsafe C and GNU "ideas" ;))

  • If we are generating code, then perhaps sticking to a simple, portable language (exactly like C) is likely. The generation tools (probably LLMs/AI) really don't need anything higher-level. Higher-level languages are for us humans. If we were out of the picture, then we would probably even go back to assembly if it were portable.
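
To illustrate the first point, here is a small, hypothetical sketch (the function is a stand-in I made up, not a real C binding): the moment a Rust reference is turned into the raw pointer that every C-style signature demands, the compiler stops tracking its lifetime, and correctness becomes the caller's problem again.

    // A stand-in for a C-library boundary: raw pointers carry no lifetime
    // or ownership information, so the borrow checker's guarantees stop at
    // the `unsafe` block.
    fn c_style_peek(p: *const i32) -> i32 {
        // A real binding would call into the C library here; this stand-in
        // just dereferences the pointer, which already requires `unsafe`.
        unsafe { *p }
    }

    fn main() {
        let value = 42;
        let raw: *const i32 = &value;      // lifetime information is erased here
        println!("{}", c_style_peek(raw)); // fine, but only because *we* know it is
    }
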
What Rust is really good for is smacking developers in the ears if they use poor memory architecture. And this is needed. Unlike Java, which just lets them do weird stuff (even things that look like returning stack variables from functions, thanks to escape analysis), if they can learn via Rust, then when they do need performance with C, they understand why they can't do dumb stuff with memory. I can see it being used to teach in schools. That might actually be the only way it hits the industry, with the kids graduating and bringing it with them (aka the Apple approach).
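
As a minimal sketch of that "smack in the ears" (the function names are made up for illustration): returning a reference to a local variable is simply rejected at compile time, because the local is gone once the function returns, and the accepted fix is to hand the value itself back.

    // Rejected: the return type borrows from a local that will be freed
    // when the function returns.
    //
    // fn dangling() -> &i32 {
    //     let on_stack = 7;
    //     &on_stack // error[E0106]: missing lifetime specifier
    // }
    //
    // Accepted: give ownership of the value back to the caller instead.
    fn not_dangling() -> i32 {
        let on_stack = 7;
        on_stack
    }

    fn main() {
        println!("{}", not_dangling());
    }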
 
If we were out of the picture, then we would probably even go back to assembly if it were portable.

If you were out of the picture, not even assembly would be needed. Autogenerated software could just refine itself on pure binary.

The point is: if you have to use a human-readable code, because you are accepting the necessity of interfacing with a team that includes human developers, what is the optimum strategy?

The reasons you cite for using Rust on peoplepersons are exactly the reasons I am saying it would be selected for machines. The contention is that the cost of learning memory safety is actually far lower for a person than for autogenerated programs. A human can sort of keep in the back of his mind that a loose end exists and needs to be closed. A computer is far more likely to go "aaaah no limit found errorerror." I am not trying to claim this in an absolute sense. I am just saying that the relative cost of keeping track of unenforced memory standards is much higher for a computer than for a person.

A "Rust from the ground-up" approach like RedoxOS is really needed to avoid impossible autogeneration scenarios (i.e. handling lifetimes) around C libraries. Linux is already compromised (in unsafe C and GNU "ideas" ;))

The reason I claim autogenerated software will write Rust for Linux is not that it is the optimal way to write a kernel, but that there are serious incentives to take on inefficiency costs in order to get at the end product: Linux. Using Linux here would not be a design decision; it would be the goal of the design. Given that goal, do you accept the permanent (hypothetical) cost of memory tracking imposed by C, or do you take an upfront hit by rewriting in Rust to get long-term cost savings, along with the other possible benefits mentioned above?

I can see it being used to teach in schools. That might actually be the only way it hits the industry, with the kids graduating and bringing it with them (aka the Apple approach).

I am pretty sure that a large part of the strategy for getting Rust market penetration was teaching it to university students. Established and self-taught programmers are far less likely to see enough payoff in learning a whole new paradigm to do something they can already do only marginally less efficiently in C (and that only with respect to memory safety, ignoring every other difficulty that might exist). Also, university students are far more impressionable and tend to gravitate toward wanting to feel part of something, to which I also attribute the heavy doses of snark in Rust apologetics around the internet.

Some interesting thoughts, but I think something is missing.

I was once part of a team working on adaptive hardware (not sure how far I can go into details) some 20 years ago. We had speedups of x1000 going from standard C code to FPGA. And that went down the drain due to "shareholder value," because one party could not tell its investors they would invest in this black magic instead of paying the hedge funds.

We cannot approach this from a single vector, with software engineering alone. Other interested parties will also p*ss in the porridge when and how they see fit.

While I can appreciate that your reasons for being opaque are real, they are just opaque enough that I can't tell with certainty what you are aiming at. Without lifting the opacity altogether, maybe you can offer some additional remarks.
 
How, and by whom, does the validation of autogenerated code get done? AI tools? Who programs the tools with the parameters of system behavior?

I'm not against tools and autogenerated stuff, but at some point it needs to be validated as correct.
What Rust is really good for is smacking developers in the ears if they use poor memory architecture.
Learning good habits regardless of the language is a good thing. If a language like Rust helps enforce the proper mindset, then an individual will keep that regardless of language.
 
I don't like anthropomorphizing computer programs, but when I say autogenerated software, you can assume I mean that which has been marketed as artificial intelligence.

I'm not against tools and autogenerated stuff, but at some point it needs to be validated as correct.

The question here is not what you think. The question is what people with their fingers on trillion-dollar buttons think.
 
I don't like anthropomorphizing computer programs, but when I say autogenerated software, you can assume I mean that which has been marketed as artificial intelligence.



The question here is not what you think. The question is what people with their fingers on trillion-dollar buttons think.
I think the question is what value gets added for the end user. There are plenty of examples where the corporate trillion-dollar fingers flop because no one actually buys it.

Create a shit product, you may get a few people to buy it, but your market shrinks when they realize the product is shit.
 
Is Linus talking about LLM-generated code here? Or about human-written scripts that write C code, as is currently done in the AMD video drivers?

Well, in strict terms, he is talking about code review by LLMs. But that's just Linus's self-bullshitting because he already took such a hard stance on the subject. "Just the tip."


I think the question is what value gets added for the end user. There are plenty of examples where the corporate trillion-dollar fingers flop because no one actually buys it.

Create a shit product, you may get a few people to buy it, but your market shrinks when they realize the product is shit.

There are too many variables here beyond "what will be the quality of the code the machine will write tomorrow for X kernel interface." If you are responsible for a trillion dollars, you are looking at multiple variables, and thinking in terms of decades at least.
 
He does go into detail about "vibe coding," whatever that is supposed to be. He talks about using autogenerated software to write code.

The video is here. For reasons, it is not simple for me to access YouTube and get the proper link; just add this to the URL, or search for it on Google: tWx769t1JKg
 
Point being, it's not really important what Linus thinks about it. It's definitely notable that he recognizes it to some degree, and it's what sparked off my thoughts here.

But using autogenerated-software-generated code (I know, a mouthful, but precision matters) in the Linux kernel is a reality, already common practice. It will only grow, for reasons Linus also notices. Among others, people who might be turned off by programming and its rigour might still jump in if they can get their phone to write them the code. It's not really something that can be stopped at this point. Neither is the fact that the number of real programmers will keep decreasing, and the number of jobs for them possibly at an even faster rate.
 
That's a very two-dimensional interpretation of the various things he says about autogenerated code.

---

The most notable part of the video is the fact that Torvalds, Linus Torvalds, head of Linux, is saying that the future of Linux development is uncertain. The influx of developers is just not where it should be.

How does that problem get solved?

In my opinion, the answer is self-evident. The real question is: is it better solved with Rust?
 
Then maybe rewatch the video. I'm sure anybody else who does, with any amount of good faith, will see how I arrive at that summary.
 
The most notable part of the video is the fact that Torvalds, Linus Torvalds, head of Linux, is saying that the future of Linux development is uncertain. The influx of developers is just not where it should be.

How does that problem get solved?

In my opinion, the answer is self-evident. The real question is: is it better solved with Rust?

I don't think Rust helps with an influx of new developers, as it is a difficult, low-fun language that is bad for prototyping.

A lot of people might be scared off from a career in software engineering because of all the layoffs and the job situation, while the suits think their existing developers are now twice as effective because they use LLMs.
 
I don't think Rust helps with an influx of new developers, as it is a difficult, low-fun language that is bad for prototyping.

My writing style is often drawn out and boring, so I can understand why you might have skipped a few paragraphs. But my entire thesis is that Rust is the way here, not because it is people-friendly, but because it is machine-friendly while still being human-readable.

That it is human-unfriendly will only speed up the adoption of autogenerated-software code, because people will just want to skip to the end. But they will still review it. All of this is what autogenerated software requires to refine itself, and it also shifts the burden incrementally away from people (who don't like Rust) and towards the machines (cost-effective).
 
My writing style is often drawn out and boring, so I can understand why you might have skipped a few paragraphs. But my entire thesis is that Rust is the way here, not because it is people-friendly, but because it is machine-friendly while still being human-readable.

Yeah, but to attract people you have to give them something fun to do. Rust is apparently less fun than C. I don't see how increasing the amount of Rust code in a project brings in more volunteer developers.
 