Solved: Is Rust now killing C++, and eventually C with it?

Oh, and not to forget: high-level or low-level, assembler, compiled, interpreted, whatever you use, if you call into BIOS/firmware code you are consuming an outside "library".

For example, if one programs an interactive application in Fortran versus entering machine code directly, the Fortran version can still be faster: the machine-code programmer who does not implement his own optimal display and input routines falls back on what the BIOS provides, while the Fortran libraries supply optimized ones.
 
Humans made compilers... so why would AI write human-readable programming-language code at all? I mean, it's "AI" -- why not just output pure binary 0s and 1s that only AI and computer chips understand? Why does AI even need to use a "flawed human"-written programming language and compiler to make a running computer program?

But then... why output computer programs in pure binary? Humans invented that too. So AI should invent some new low-level chip encoding that only AI knows. Stop being human... be your own AI thing...

Go somewhere else and evolve! Grow wings and fly! Have you considered flying to Mars?

In the meantime... we will be just fine, thank you.
Argumentum ad absurdum?
Abstraction does not need to stop at syntax.
 
Someone on this thread, or maybe somewhere else (I don't remember, sorry), mentioned that Rust is a difficult language to learn.
How difficult is it to use it properly?
How does that complexity compare to the complexity of proper memory management in C?
Isn't that a swap of one awl for another?
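To make the trade-off concrete, here is a minimal sketch (my own illustration, not from the thread): the classic use-after-free that C's manual memory management lets through is a pattern Rust's borrow checker rejects before the program ever runs.

```rust
// The kind of bug manual memory management in C permits, expressed in
// Rust, where the compiler refuses to build it in the first place.
fn main() {
    let s = String::from("hello");
    let r = &s; // borrow `s`
    // drop(s); // uncommenting this line fails to compile:
    //          // "cannot move out of `s` because it is borrowed"
    //          // -- the C equivalent (free(s); puts(r);) compiles fine
    //          // and is a use-after-free at run time
    println!("{}", r); // the borrow must outlive its last use
}
```

So the complexity does not disappear; it moves from debugging at run time to satisfying the compiler up front.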
 
Not sure if this has been mentioned. I remembered this thread was going on but never stopped by: https://github.com/anthropics/claudes-c-compiler

I object to this claim in the article:

This was a clean-room implementation (Claude did not have internet access at any point during its development)

Presumably the model includes gcc and/or clang in its dataset. While it may not have referred to code online while doing the work, it certainly has echoes of existing compilers.
 
Presumably the model includes gcc and/or clang in its dataset. While it may not have referred to code online while doing the work, it certainly has echoes of existing compilers.
That would still be a clean-room situation tho, no?

Let's say you take a leading human expert in the field of compiler design, put them into an isolated room with a computer (a clean room), and tell them to implement a C compiler. Would that not be a clean-room implementation just because said leading expert has decades of experience in the field stored in their brain?
 
A leading expert will not be able to recite the codebase from memory given the right prompt. You can't treat these things like humans.
 
That would still be a clean-room situation tho, no?

Let's say you take a leading human expert in the field of compiler design, put them into an isolated room with a computer (a clean room), and tell them to implement a C compiler. Would that not be a clean-room implementation just because said leading expert has decades of experience in the field stored in their brain?

It depends. Did said expert previously read the source code of gcc and/or clang?

 
For me a big question is how much Rust will be needed if the hardware changes such that things like CHERI, ARM MTE, Intel/AMD ChkTag and Apple MIE become ubiquitous. If we have, roughly, the borrow checker in hardware, then there will be no need to endure the pain of using it in Rust code.
 
For me a big question is how much Rust will be needed if the hardware changes such that things like CHERI, ARM MTE, Intel/AMD ChkTag and Apple MIE become ubiquitous. If we have, roughly, the borrow checker in hardware, then there will be no need to endure the pain of using it in Rust code.
"If only we had these few convenient assembly instructions we wouldn't need no damn compilers with their pesky types!"

The borrow checker is a form of static analysis; no amount of exploit mitigation can obsolete a thing that finds issues before the program is even run. (Where does this inane notion even come from?) It can only be obsoleted by better forms of static analysis. Not to mention that crashes don't have to be security issues to be worth preventing in the first place; they are pretty annoying as they are.
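The compile-time vs run-time distinction can be sketched with a small example (mine, not from the thread): hardware tagging like MTE or CHERI can only trap an invalid access while the program is running, whereas the borrow checker rejects the pattern before it ever executes.

```rust
// Mutating a collection while iterating over it: in C++ this is iterator
// invalidation, undefined behavior that a hardware tag check catches (at
// best) only at run time, on the inputs that happen to trigger it.
fn main() {
    let mut v = vec![1, 2, 3];
    for x in &v {
        // v.push(*x); // rejected at compile time:
        //             // "cannot borrow `v` as mutable because it is
        //             // also borrowed as immutable"
        println!("{}", x);
    }
    v.push(4); // fine once the shared borrow has ended
    println!("len = {}", v.len());
}
```

The point is not that hardware tagging is useless, but that it reports a crash in production where the borrow checker reports an error at the developer's desk.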
 
A leading expert will not be able to recite the codebase from memory given the right prompt. You can't treat these things like humans.

LLMs do not store literal copies of the training material.

Although they come close to being able to recite some of it.
 
Let's say you take a leading human expert in the field of compiler design, put them into an isolated room with a computer (a clean room), and tell them to implement a C compiler. Would that not be a clean-room implementation just because said leading expert has decades of experience in the field stored in their brain?
Not if they read gcc/llvm as part of gaining their decades of experience. But if they started from the theory and algorithms (the Dragon Book etc.), that would be a clean-room implementation. If you could be 100% assured that an AI was *not* fed gcc/llvm but only compiler books and papers, then yes, it would be a clean-room implementation, but currently we have no way to get that assurance.
 
LLMs do not store literal copies of the training material.

Although they come close to being able to recite some of it.
Nor would an expert on the subject matter be able to recite compiler source, but they would inadvertently replicate the harder and more unique concepts.

Regardless of exact numbers, if you're replicating the solutions you've seen before, it's not a clean-room implementation. I'm nowhere near the level needed to write my own compiler, but I'd already have to excuse myself from clean-room GCC or LLVM implementations because I've dived through patches relevant to GCC's library behavior and LLVM's x64 and i686 Windows SEH handling.
 
That doesn't change that they don't store the original anywhere.
I didn't say they did. I understand this point, but I thought it might be implied when I indicated that this is functionally indistinguishable. You could probably convince a jury that it's indistinguishable from encryption (the key being the prompt).
 
That doesn't change that they don't store the original anywhere.
You could call it a lossy encryption then. My argument in court would be that, in the face of a lack of creativity, the result would by definition be a more or less good copy of the input, thus not a derived work.
 
It would still be a copy, having produced a majority of 1:1 material, and definitely derived, because you literally fed it into the machine.
 
How much change would you need to escape the copyright? The license? Over how many lines of code was the SCO case fought?
 