Solved: Is Rust now killing C++, and eventually C with it?

Kind of a silly topic, perhaps, but it is still interesting to see how far Rust has come.
I am a big C fan, and C++ was my go-to language for 9 years, so I would not be happy if Rust just overtook C++'s position.
As I searched the web today, I found that Bjarne Stroustrup has called for assistance/help in making the language more memory-safe.
I do not really know how to interpret this, but if C++ really dies, does C also die?
The American government, cybersecurity experts, etc. claim that C/C++ is the source of many memory corruptions, exploits, etc. due to unsafe memory handling, and that everyone should move to memory-safe alternatives like Python, C#, Rust, etc.
Sure, there can be memory errors, but isn't the source 99% of the time the programmer?

What does everyone think about it?
I mean, if everyone ditches C/C++, no one will make new standards, upgrades, etc., right?
 
C/C++ isn't going anywhere. For a long time.

The American government, cybersecurity experts, etc. claim that C/C++ is the source of many memory corruptions, exploits, etc. due to unsafe memory handling, and that everyone should move to memory-safe alternatives like Python, C#, Rust, etc.

Sh**ty developers and rush-to-market practices are the source of these issues. You can blame the banking industry for that.
 
C/C++ isn't going anywhere. For a long time.
Right?
I also think the same...
The problem is in front of the screen, not the language. 😅

Sh**ty developers and rush-to-market practices are the source of these issues. You can blame the banking industry for that.
Actually, I blame our self-proclaimed public experts, who just act like experts but, on closer inspection, are just a shadow of what they want to be.
Strange that the inventor of C++ himself is ringing the alarm bells.

EDIT:
I mean, C/C++ is the solid foundation for many operating systems, WINE, and whatnot.
Why do people without knowledge always want to prove their stance?
It does not make any sense.
Even I write my C/C++ programs checking that allocated memory is released, that NULL/nullptr pointers are handled, etc.
Dolphin-Emu, PCSX2, RPCS3 are all written in C++ with proper memory management.
And our self-proclaimed Dr. XYZ claims to know better, and everyone should follow them?
I am very interested in cybersecurity, and yes, I am more or less involved in that field, but experts calling to ditch crucial languages like C/C++ is something I cannot grasp.
 
I think the domain matters. There is a lot of existing code in C/C++ (pretty much the entire FreeBSD base and userland, and the Linux kernel); until 60% of that is rewritten in Rust and actually deployed, I'd posit that Rust is not "better" (my opinion).

Now, applications to me are different. Are there applications currently written in C/C++ that would benefit (easier to maintain, inherently more secure) from being rewritten in Rust? Probably.
But I think one has to at least pay lip service to "history".
Can unsafe code be written in C/C++? Sure, but a lot of the common problems have been identified and tools (lint?) help identify them so the problems can be fixed.
Rust may be more secure by design, but it doesn't have the CPU cycles behind it yet to prove it's better.

What I have seen is that new languages claiming to be "better/safer than C/C++" typically inspire new methods/paradigms of programming in C/C++ that create the same safety.
Think of original C pointers, which have morphed into C++ smart pointers. Programmers used to have to keep track; now the compiler keeps track.
 
The major impetus for adoption is coming from the US government. And probably from other western governments.



And probably corporate sponsors sitting behind the government. Who came first isn't very clear.


Perhaps somewhere along the way this dovetails with the industry's desire to replace human programmers with AI, although it's not so clear to me how that fits into the picture. But perhaps AI makes it more feasible for unskilled humans to produce software written in a complex language like Rust.
 
The major impetus for adoption is coming from the US government. And probably from other western governments.
I remember when the US Government was pushing Ada.

New languages/technologies are not necessarily bad per se, but sometimes "length of time in service" is needed to actually find all the corner cases that lead to security holes.
 
Perhaps somewhere along the way this dovetails with the industry's desire to replace human programmers with AI, although it's not so clear to me how that fits into the picture. But perhaps AI makes it more feasible for unskilled humans to produce software written in Rust.
Look at the Zed editor for Windows, Linux, and maybe macOS.
I tried AI for programming, but I just cannot, and do not want to, let the AI decide, based on so-called "best programming practices", what should and should not go into my code.
I prefer code-completion suggestions for when I do not remember one of the quadrillion commands.
But every day I see news about AI here and there, and about software developers losing their jobs to AI.
It sounds like nonsense, but maybe it is reality...
 
Yes, I remember the Ada push. I think it did get used extensively in some areas like aerospace; I believe Airbus used it for their fly-by-wire avionics. I don't know if their current planes still fly on software written in Ada, though.
 
They call it 'vibe coding'. You don't actually write any code. You blindly trust what the AI writes, and you cut and paste it into your project without understanding it. The 'skill' is learning how to describe what you want to the AI. The boss loves it because you get to the deadline faster, so he gets his bonus. And he only needs to hire a fresh graduate, not a guy with 10 years of programming experience. It probably helps if you're doing the ayahuasca microdosing.

Sometimes it goes haywire, which results in funny stories like this.

 
I remember when the US Government was pushing Ada.

New languages/technologies are not necessarily bad per se, but sometimes "length of time in service" is needed to actually find all the corner cases that lead to security holes.
I would call it the latest trend, and laziness...
I mean, you get a new blink-blink, whatever, and people are head over heels, but this is not good at all.
 
They call it 'vibe coding'. You don't write anything. You trust what the AI writes, and you cut and paste it into your project without understanding it. The 'skill' is learning how to describe what you want to the AI. The boss loves it because you get to the deadline faster, so he gets his bonus.
You know, these are the discussions I hate until I puke.
You said it; the reality is like that.
And then the problems begin, called debugging.
Where does one person start, if the AI wrote the code for him?
In the long term, the boss will lose his money.
The AI can write fancy gibberish, and then what?
And that is the point where I think people need to understand what they do and understand the code they write, because they are the ones who intend for that code to do something.
AI is destructive in that case.
 
I don't think so. It's a hack over a hack.

 
LillieNB, I completely agree with you, from the engineering standpoint.

But from the employer's perspective, they want that code written as cheaply as possible, as fast as possible, using the cheapest possible labour, sorry, 'resource', and they can debug it into existence later, either during test (if there is any test) or in response to customer bug reports after they have shipped it. Their number-one objective is to make their numbers to please Wall Street, so the executive gets his fat bonus before he jumps ship in 6 months' time. Remember, it's a company that makes money, which just happens to produce software of some kind as a by-product. Think of a certain North American aircraft manufacturer, for example.
 
I completely agree with you, from the engineering standpoint.

But from the employer's perspective, they want that code written as cheaply as possible, as fast as possible, using the cheapest possible labour, and they can debug it into existence later, either during test (if there is any test) or in response to customer bug reports after they have shipped it. Their number-one objective is to make their numbers to please Wall Street, so the executive gets his fat bonus. Remember, it's a company that makes money, which just happens to produce software of some kind as a by-product. Think of a certain North American aircraft manufacturer, for example.
That is why I am asking: how are you going to debug AI code without experience in that language?
The employees cannot tell the AI, "please fix errors X, Y, Z."
This is worrying me.
What is more terrifying is that you dismiss experienced employees who can debug code and fix errors, in exchange for newbies who just tell the AI, "I want X, Y, Z to happen."
This could be the future of IT, sure, but it is also clearly marking the end.
 
I also don't think C/C++ is going anywhere. It will only go when people drop it and no longer use it. It will not be outlawed or any such thing.
The major impetus for adoption is coming from the US government.
I feel the urge to quote Sir Winston Churchill here...
 
I also don't think C/C++ is going anywhere. It will only go when people drop it and no longer use it. It will not be outlawed or any such thing.
Which generation are we talking about?
Generation Z?
I am already at the beginning of my 30s, and I hope to die before C/C++ gets, well, replaced by nonsense because X, Y, Z is the current trend.

I also don't think C/C++ is going anywhere. It will only go when people drop it and no longer use it. It will not be outlawed or any such thing.

I feel the urge to quote Sir Winston Churchill here...
We can only hope that relevance, instead of blink-blink, continues to prevail.
My IT teacher said: don't fix it if it ain't broken.
In this case it is trend + easy money...
 
I for one want to see C and in particular C++ go away as soon as possible. I see a niche for C in system development (and I use the word system here in the narrow sense: OSes and their ilk, like the storage or networking stack). I think of it as a high-level assembler, very close to the hardware, for tasks where the utmost in efficiency and control is required. But for development of large artifacts, C++ is just unsuitable, being way too complex and not giving the programmer many tools to deal with safety and complexity. Stroustrup is right: without serious redesign and application of good taste, C++ is doomed. Alas, the committee design process and the long history of C++ tell me that his warning will be ignored.

In the early to mid 90s, C++ was a fine solution. The moment Java came out, it was over.

I am already at the beginning of my 30s, and I hope to die before C/C++ gets well, replaced by nonsense, because X,Y,Z is the current trend.

I was getting paid for programming about 10 or 15 years before you were born. And no, I don't wish to go back to programming in Fortran, RPG-II or COBOL. These were all excellent languages for their time, and they still have features and advantages that modern languages have not really matched. But overall, productivity is much higher today, and programming is more pleasant with better tools. And many of the languages that were the "current trend" for a while ended up being useful and productive. I know people in the defense industry who still swear by Ada: yes, it is verbose, niche, and badly supported, but for a large and very highly engineered project it works well. You just have to have some discipline. Similarly, the dBase language of the 80s and 90s allowed lots of people to get projects done ... and it is something we have nearly forgotten. Some of the languages that came out of academic research as teaching tools (Pascal, Modula-2) were good as teaching tools but not successful in production; that shouldn't besmirch their memory, though.

I've started programming some in Rust (so far I'm up to hundreds of lines), and the more I use it, the less I like it. And I have not yet reached the point of having fun and getting things done efficiently. As a matter of fact, I'm thinking of dumping it and switching to Go, because a few friends think highly of it. But more likely, for my hobby stuff I'm going to use something that intentionally tries NOT to be compatible with existing C/C++ libraries (even with wrappers), because that allows the language more freedom to do the right thing. We'll see.

My IT teacher said, don't fix it if it ain't broken.
Today's C++ is very very broken.

I feel the urge to quote Sir Winston Churchill here...
"You're ugly."
"And you're drunk."
"Yes, but I'll be sober tomorrow."

That's probably not the quote you were thinking of.
 
Y'know, Mesa is migrating to Rust, and even core Linux kernel devs who know a LOT more than rank-and-file users (Like Greg Kroah-Hartman) are acknowledging that maybe Rust has some technical merits that make it a good choice over C/C++. GKH did say that there are corner cases in the Linux kernel sources that are really problematic to solve in C code, and would be buggy - but problems/bugs would disappear if Rust were used in such situations instead. I don't have the skill to take a look at the code and make sense of what GKH is talking about, but given that he's smart enough to manage a lot of Linux kernel development (AND sustain demonstrable success for a long time), I'd think GKH can be taken seriously when he says stuff like that.

We all can complain about how Rust has shortcomings - but is there a good academic article that actually compares the two languages side by side and offers some solid benchmarks?

Otherwise, it sounds like pro-Rust and anti-Rust camps are full of people who are rather uninformed about the other camp, and draw conclusions based on very superficial/second-hand info, without considering how correct it really is.
 
The CHERI Project doesn't get enough mention around here or in the greater open-source world. They're doing a lot of really game-changing stuff there. Since they're addressing the problem at the lowest level, it really makes this whole debate regarding memory safety and Rust moot, IMO.

Brought to you by us FreeBSD folk too, I might add.
 
CHERI == Capability Hardware Enhanced RISC Instructions. It is very, very unlikely that such an enhanced hardware architecture will see widespread adoption. Even on such a hardware platform, its benefits are more about isolation and controlled access to external (to a process) resources. That won't fix the kind of memory corruption/access problems one can get with C/C++. There is still a place for memory-safe languages, which allow compiling programs that are by design much less likely to be buggy. I am not a fan of Rust, though. There are other options.

And I tend to think that complex compilers are, in the long run, a liability. How would you ensure that they don't contain any built-in vulnerabilities? Just as one should be very careful using hardware designed by an unfriendly country.
 
The CHERI Project doesn't get enough mention around here or in the greater open-source world. They're doing a lot of really game-changing stuff there. Since they're addressing the problem at the lowest level, it really makes this whole debate regarding memory safety and Rust moot, IMO.

Brought to you by us FreeBSD folk too, I might add.
Maybe pay attention to (and quote) at least one credible source before making unsubstantiated claims like 'Brought to you by FreeBSD folks'?

Doing less than 5 minutes of Googling: according to Wikipedia, Capability Hardware Enhanced RISC Instructions were NOT a new thing by the time there was a first attempt (in 2014) to run FreeBSD on the CHERI architecture. Capability-based architectures were a thing in the 1970s and 80s. Yeah, the RISC-V ISA did come out of UC Berkeley. There's kind of a reason why UC Berkeley is taking the credit for that and assumed institutional stewardship of the standard, rather than an individual researcher/programmer.

It kind of takes knowing what is so special about capability-based addressing, how long it has been in existence, where, and how it is different from mainstream commercial offerings.

Yeah, CHERI is an open standard, in the sense that information is easily available on the Internet if one knows how to research that stuff. Yeah, you can probably buy commercially available hardware with CHERI capabilities if you know what to look for.

But total control of the hardware - hmmm. It kind of requires a special kind of crazy to achieve that. Like having in-depth knowledge, ability to connect the dots, ability to do research, maintain focus and motivation for a long enough time to see results, ability to ignore outside life so that you can focus on that pursuit, ability to ignore the fact that everything around you is falling apart from neglect and lack of attention... that kind of crazy.

bakul: Because CHERI is an open standard, it actually does stand a chance - on the Loong processors. There's also a variant of AArch64 that supports CHERI: Arm Morello, which came out in 2022. But yeah, with Intel and AMD dominating the global commercial market, I don't see CHERI making many inroads toward widespread adoption. Well, that can change - Qualcomm and Samsung did make inroads. Samsung with its Exynos chips (mobile, I know, but think of the improved battery life as a major selling point!), and Qualcomm with Snapdragon (yeah, nearly every laptop maker offers laptops with Qualcomm Snapdragon chips, where battery life is a major selling point). And Snapdragon is AArch64, a Tier 1 supported arch for FreeBSD.
 
but is there a good, academic article that actually compares the two languages side-by-side, and have some solid benchmarks?

But what would one benchmark? I would start with: developer productivity, absence of latent bugs, long-term maintainability, a stable development environment and stable language definition, ability to attract good developers, and predictability of the development process.

Oh, you wanted to benchmark CPU and memory usage? Those are easy (or at least easier) to measure, but usually of minor importance today. Just because they are measurable and we have several generations of experience benchmarking them doesn't mean that they are actually important metrics. There is the old "streetlight effect" and "McNamara fallacy".

Otherwise, it sounds like pro-Rust and anti-Rust camps are ...
I'm solidly in neither camp. I don't know (yet?) whether Rust will be a good solution to replace C/C++ in the application or system development space. But I do know that C/C++ is awful, and has to go. And yes, I've used it professionally for several decades (starting in earnest about 1993), have written hundreds of thousands of lines, and quite a few of them are in systems that are still shipping and in use today.

And I also know that my current favorite language (Python) is not an alternative for wholesale adoption. It's great for many things, but I don't think I would build something like the Apache server or Apple/Google Maps in it.
 
Maybe pay attention to (and quote) at least one credible source before making unsubstantiated claims like 'Brought to you by FreeBSD folks' ?

You have an incredibly bad habit of attributing to people statements they never actually said or meant, or of being a habitual contrarian (and often a hypocritical one) for the mere sake of it, making assumptions. Maybe you should stop that? Given the amount of posts you have and the time you spend on here, you should probably make better use of your time instead of starting pointless arguments about semantics. I haven't seen you once actually help a user try to solve a particular technical problem.

I'm talking about CHERI as a project, based on the Capsicum framework, both of which were invented (or at least co-invented) by a renowned FreeBSD developer and researcher, Robert N.M. Watson. I never once stated that the design concept of capability-based security came from FreeBSD, nor that it was a new thing conceived by the FreeBSD community. Re-read my post. So yes, those projects were brought to you by FreeBSD folk.

CHERI == Capability Hardware Enhanced RISC Instructions. It is very very unlikely such an enhanced h/w arch. will see a widespread adoption. Even on such h/w platform, its benefits are more isolation and controlled access to external (to a process) resources. That won't fix the kind of memory corruption/access problems one can get with C/C++. There is still a place for memory safe languages which can allow compiling programs that are by design much less likely to be buggy. I am not a fan of Rust though. There are other options.

While I agree market adoption could be a hindrance, I still think it's a much more practical approach to the problem, IMO. The Capsicum framework (and subsequently CHERI) is in C, made for existing C/C++ programs, and requires little modification to a codebase. I'd wager using that would be more economically feasible than rewriting entire swaths of applications and low-level code in an entirely different language, "just for security". In fact, one FreeBSD developer was able to get a fully functional KDE desktop working with only a small fraction of the codebase being touched. There are also ports for huge projects like WebKit and Chromium. But hey, I digress.

I don't know, maybe it's just me. But all that rewriting, for just one metric, is just a wild concept to me, and something I'd hate to do. The Linux community is an interesting movie to watch, though. If some brave soul decides to write a production-quality, ZFS-like filesystem, or something like DTrace, in Rust, I'll start to sway my opinion. As those pieces of software are engineering marvels.
 
That is why I am asking, how are you going to debug AI code without experience in that language ?
The employees cannot tell the AI please fix error X,Y,Z.
An example of this is AI-generated science-fiction stories. Sections repeat, which is easy for another tool to catch, but there are more subtle problems, like continuity errors ("took a sip of coffee, then put the cup down while taking a sip") or pronunciation issues ("wound" as in an injury vs. "wound" as in wrapped the string around it).
 