Solved: Is Rust now killing C++, and eventually C with it?

I tried compiling something with different calling conventions in MSVC (cdecl, stdcall, fastcall, vectorcall). I thought vectorcall sounded best, since it maps directly to a CPU feature, but I could only build successfully with stdcall. (Roughly: one is generic to any OS, one is Windows-only, one is Windows-only on certain CPUs, etc.)

Not sure how Rust would fit in there, but I have fun compiling that software different ways, and I can even use LLVM/Clang through MSBuild. In that case a one-time content extraction with the compiled program finishes about 10 minutes quicker (the Clang build is faster than MSVC's), but it needs an always-on-disk ~1 GB LLVM install.

I got options :p (no LLVM on Windows, so MSVC by default; GCC is Linux's default; Clang for the fastest speeds on FreeBSD, where it's the default). But I've heard of Rust needing a large toolchain like LLVM?
 
no LLVM Windows so default MSVC
Either through tools like MSYS2 or through the Visual Studio Tools installers, you can get perfectly fine LLVM/Clang toolchains on Windows. That said, Rust can bring its own tools, at least in part.

I've heard of Rust needing a large toolchain like LLVM?
Rust specifically leverages LLVM's infrastructure and its IR for additional optimization passes. It does have Cranelift and GCC backends, though, and work is underway to remove the hard dependency on the full LLVM toolchain.

cdecl, stdcall, fastcall, vectorcall
Vectorcall is x86(_64)-only. It's faster for SIMD and anything using more registers than fastcall does, but not very interesting beyond that; it also passes a hidden this argument in the first register. Fastcall is x86-only and is ignored on IA-64 and AMD64; it uses two registers for the first two arguments if they fit. Both are Windows-only, and for both the callee is responsible for stack cleanup. Pop your args.
Stdcall is all kinds of evil and is ignored when compiling for ARM or AMD64, where it uses cdecl instead. Stdcall is callee cleanup as well, unless you use varargs in which case it's silently converted to cdecl.
Cdecl is sane. Everyone knows cdecl and supports it. Cdecl makes the world a better place. Caller cleans the stack, so varargs are supported. Cdecl is also what non-MSVC compilers use for x86 Windows targets.
 
OK. Lemme answer your question from my POV.

At the end of the day a programming language must produce machine code that works and is optimized for a particular CPU. The C language very elegantly allows programmers to produce human-readable code that is easily translated into machine code, while they can visualize and infer what's going on under the hood. That is ultra-important to programmers who are producing things like OSes, embedded code, applied-science apps, etc. (i.e. my fields of interest).

When heavy algorithmic or complex ideas need to be modeled, C++ is still a good option (irrespective of all the fluff added to it), because it is still easy to visualize the translation to low-level concepts, given its roots in C.
wow, that's a fantastic mythology. we get it, you venerate C, but this isn't an explanation of anything other than your beliefs.
 
of anything other than your beliefs.
But it is. It's beating around the bush a bit, but it boils down to "sometimes advanced features of new languages don't add any value, and C or C++ are perfectly cromulent tools for the job." And that is by all means a fair statement.

Not every problem is a nail to be solved with the complex-language-features hammer. And not everyone likes the same hammers. Your preference needn't be someone else's.
 
right, this is what we were saying, we don't understand why people have these mythological beliefs and frame them as objective truths in the mode of veneration of an artifact
 
I think the domain matters. There is a lot of existing code in C/C++ (pretty much the entire FreeBSD base and userland, plus the Linux kernel). Until 60% of that is written in Rust and actually deployed, I'd posit that Rust is not "better". (My opinion.)
I don't think anyone should expect Rust to replace C/C++ in FreeBSD. When I started as an IBM mainframe systems programmer in the mid-1970s, most of the MVS operating system was written in assembly language. Some of the newer parts were written in PL/S, a proprietary, PL/I-like language that compiled into assembly language and interoperated well with it.

That was 50 years ago. IBM has been writing new parts of the system in PL/S, but they don't go back and recode the old parts, unless they have to change them. The old parts are tested, have been working for a long time, and there is no business case for opening them up and potentially destabilizing them, just so they will be written in the currently popular language. If you looked at the MVS source today (which you really can't do, since they do not release the source anymore), you would still find assembly-language programs that have not needed changing over the decades.

I don't think anybody is going to put the effort into recoding the whole operating system, especially since that would be very dangerous (See Joel Spolsky's article from 25 years ago that is still right on the money).

P.S. A note for any former MVS sysprogs who are listening: According to Wikipedia, one of the first projects IBM used PL/S for was the IEHMOVE utility. This was a utility for moving files around, but its syntax and quirks were so bizarre that it terrified most sysprogs I knew. (I knew one woman who took IEHMOVE as a challenge, and studied it so thoroughly that she knew it like the back of her hand. She used it, not because it was the most convenient tool for the job, but because she wanted to make sure it knew that it had failed to make her grovel.) It seems funny that they used their new state-of-the-art language to code a utility that seemed truly backwards.
 
the mythical view of c as a portable assembler hasn't really been valid for a very long time. our student compiler in college did optimizations that invalidate that view! high performance scientific computing is done in fortran. just because a bunch of stuff you like has been done in c doesn't make it the only way those things can be done, and we think insisting that only c can do $WHATEVER_THING, or that new languages have nothing to offer, is weird veneration, yeah.

but what do we know, we just casually write postscript, ocaml, and forth. and sure, c, when we have to.
 
just because a bunch of stuff you like has been done in c doesn't make it the only way those things can be done, and we think insisting that only c can do $WHATEVER_THING, or that new languages have nothing to offer, is weird veneration, yeah.

Your cell phone would like a word with you :cool: ....

Unless you are using Windows/CE?
 
the mythical view of c as a portable assembler hasn't really been valid for a very long time. our student compiler in college did optimizations that invalidate that view! […]
The myth is not that C is a portable assembler (because it is!). The myth is that assembler was the most optimal language for everything. As you point out, FORTRAN, e.g., is better for some edge cases in computational physics.

But, frankly, I do not understand your hatred at all …
 
the mythical view of c as a portable assembler hasn't really been valid for a very long time.
Quite a few language compilers still compile to C instead of assembly language for a particular processor arch: Nim, V, Gambit Scheme, etc.

that new languages have nothing to offer, is weird veneration, yeah.
Nobody is saying that. If I choose to use C instead of some random language RL, that could be for a number of reasons where C is a better fit than RL.
 
Spaghetti code does not matter if no human has to read it au naturel

Humans made compilers... so why would AI write human-readable programming language code at all? I mean, it's "AI" -- why not just output pure binary 0s and 1s that only AI and computer chips understand? Why does AI even need to use a "flawed human" written programming language and compiler to make a running computer program?

But then.. why output computer programs in pure binary - because humans invented that too. So AI should invent some new low level chip encoding that only AI knows. Stop being human... be your own AI thing...

Go somewhere else and evolve ! Grow wings and fly ! Have you considered flying to Mars?

In the mean time... we will be just fine, Thank you.
 
Humans made compilers... so why would AI write human-readable programming language code at all? I mean, it's "AI" -- why not just output pure binary 0s and 1s that only AI and computer chips understand? Why does AI even need to use a "flawed human" written programming language and compiler to make a running computer program?

But then.. why output computer programs in pure binary - because humans invented that too. So AI should invent some new low level chip encoding that only AI knows. Stop being human... be your own AI thing...
I hope the above is being facetious. it's bad enough that humans are even playing with the existential threat of AI, but to not have the output product be something that can be verified by humans would take stupidity to a new level.

Oh, and the statement about binary is incorrect. digital electronics are binary by nature so that's as close to hardware as it gets unless you take the next step which is FPGA hardware synthesis.
 
I hope the above is being facetious. it's bad enough that humans are even playing with the existential threat of AI, but to not have the output product be something that can be verified by humans would take stupidity to a new level.

But the whole AI/LLM thing is in that state. Research is done from the results backwards, not through program code forward.
 
our student compiler in college did optimizations that invalidate that view!
Compilers optimizing portable code to not be portable, before or after assembling or at some IR step, doesn't invalidate the portability of the language, though. Rust is no different in that regard. You can write a program that compiles for any target triple, but the compiler's output will only run on the target it was built for. Please don't conflate what build toolchains do with the portability of the raw source.
 
I hope the above is being facetious.

LOL - from (MY point of view) yes! :cool: -- From other points of view?

There is an active legislative movement to restrict what can stop AI from being "whatever". Every time someone says "We don't want AI to do that," someone else says "Hey! Wait a minute."

it's bad enough that humans are even playing with the existential threat of AI, but to not have the output product be something that can be verified by humans would take stupidity to a new level.

Why would an AI need to produce "human readable" output?

And sure - we are rapidly moving towards "What humans can't understand is not OUR problem" with AI.

It's not a GIVEN that humans are even supposed to understand AI. AI is supposed to be this "mysterious thing" that is all knowing and can't possibly be understood by humans.

When the AI tells the human (FreeBSD forum link): Don't eat "salt", eat "sodium bromide" -- what does the human do?

Kind of like a (link): Magic 8 Ball.
 
LillieNB, I completely agree with you, from the engineering standpoint.

But from the employer's perspective, they want that code written as cheaply as possible, as fast as possible, using the cheapest possible labour, sorry, 'resource', and they can debug it into existence later, either during test (if there is any test) or in response to customer bug reports after they have shipped it. Their number 1 objective is to make their numbers to please Wall Street, so the executive gets his fat bonus before he jumps ship in 6 months' time. Remember, it's a company that makes money, which just happens to produce software of some kind as a by-product. Think of a certain North American aircraft manufacturer, for example.
I write everything I produce for my company in C11 and use Boehm-GC... very little to complain about so far in life.
 
The myth is that Assembler was the most optimal language for everything. As you point out, e.g, FORTRAN is better for doing some edge cases in computational physics.

That myth is basic truth.

Don't conflate "an assembler" with the high-level assemblers used on PCs in conjunction with resident code (BIOS/firmware calls).
MASM, NASM, and TASM are all high-level assemblers (as is Randall Hyde's HLA, which takes that to another level).
They all perform parsing and compilation when it comes to extra features like conditionals and so on.

In essence, you have the CPU instruction-set manual. Each of those asm instructions and its operands is packed into a value and "sent" to the CPU (e.g. placed in memory with IP pointed at it). It's 1:1, a native interface.

But,

High-level assemblers typically provide instructions that directly assemble one-to-one into low-level machine code as in any assembler, plus control statements such as IF, WHILE, REPEAT...UNTIL, and FOR, macros, and other enhancements. This allows the use of high-level control statement abstractions wherever maximal speed or minimal space is not essential; low-level statements that assemble directly to machine code can be used to produce the fastest or shortest code.

It is this parser/compiler layer that can fail to deliver an optimal solution, and Fortran or C or any other compiler can emit faster or optimal machine code for that case.

A few months ago I did a thought-experiment solution for moving data in and out of an ancient DOS PC that has absolutely no tools on it, only the command shell. The COPY command in DOS can dump stdin to a file (COPY CON). It can also copy files from a serial-port handle. But we don't have tools to set up the UART (MODE.COM), and there is no DEBUG.COM to assemble instructions into a .COM file. We are, however, able to use Alt+numpad sequences to enter non-text bytes into COPY CON and save the result as a .COM file. We need to toggle a few command registers on the UART; that is a few repeats of the MOV and OUT instructions, each of which packs into a couple of opcode bytes.

The above comes down to human-level assembler, reading the CPU manual and entering stuff manually - a fabled, mythical "machine code programmer".

 