Will FreeBSD adopt a No-AI policy or something similar?

I currently don't use AI, and I don't plan to.
This is not for any technical reason, but a legal one.
The legal issues around AI, especially around the data used to train it, are still unclear.
I could try AI if international law clearly defined what is allowed and what is prohibited, and AI companies fully respected and followed that law.
The UN should be responsible for this.

My opinion is that until the legal aspects are cleared up, AI-generated code and documents should NOT be used, at least not knowingly.
On the other hand, using AI only for code / document REVIEWS and TESTS would be "relatively" safe legally. I don't object to that specific use-case for AI, even currently.
Meh, I'm sure my past use of LimeWire balances something out somewhere, legally speaking :p (good artists copy, great artists steal; pirates do what they want)
 
"what is the next word that commonly comes after X". There is zero intelligence there. It's basically a search algorithm.
It's supposed to be a more complex thing. I'm not an expert, but basically each cell of a neural network has multiple inputs and a single output. During its "learning" process the weight of each input changes, hence the "decision making" gets more and more accurate.
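The quoted "next word that commonly comes after X" idea can be sketched as a plain bigram frequency table. This toy Python example is my own illustration (not from the thread, and nothing like a real LLM); it shows why that framing looks like a search algorithm:

```python
# Toy bigram "next word" predictor: count which word most often
# follows each word in a training text, then just look it up.
# Illustrative only -- real LLMs are vastly more complex than this.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat ran".split()

follows = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    follows[word][nxt] += 1

def next_word(word):
    # "Search" for the most common successor of `word`.
    counts = follows.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(next_word("the"))  # "cat" -- it follows "the" twice, "mat" once
```

The lookup has no understanding of the text; it only replays the most frequent continuation it was shown.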
 
It's supposed to be a more complex thing. I'm not an expert, but basically each cell of a neural network has multiple inputs and a single output. During its "learning" process the weight of each input changes, hence the "decision making" gets more and more accurate.
That's pretty much the algorithm. I'm also not an expert (since AI has only been hyped for ~3 years, very few are). The neural network design is fairly old, but this weighting / feedback ("training") approach has been used in OCR / dictation software for a while. The fuzziness of it also works well, so you can reduce the size of the database.

(This is going from my knowledge from my university days over two decades ago where I had to write a neural network to detect the letter "T" ;))
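That weighted-inputs / single-output cell, with weights nudged during training, can be sketched in a few lines. This is my own toy illustration of a single perceptron detecting a 3x3 "T" pattern; the pixel shapes and learning rate are invented for the example:

```python
# Toy perceptron: each "cell" has multiple weighted inputs and one
# output; training nudges the weights after each wrong answer.
# The 3x3 letter shapes below are invented for illustration.

T = [1,1,1, 0,1,0, 0,1,0]   # letter "T" as 9 pixels (1 = dark)
L = [1,0,0, 1,0,0, 1,1,1]   # letter "L"
I = [0,1,0, 0,1,0, 0,1,0]   # letter "I"

samples = [(T, 1), (L, 0), (I, 0)]   # target 1 = "this is a T"
weights = [0.0] * 9
bias = 0.0
rate = 0.1

def predict(pixels):
    s = bias + sum(w * x for w, x in zip(weights, pixels))
    return 1 if s > 0 else 0

# Training loop: the weight of each input changes with every error,
# so the "decision making" gets more accurate each pass.
for _ in range(20):
    for pixels, target in samples:
        error = target - predict(pixels)
        for i in range(9):
            weights[i] += rate * error * pixels[i]
        bias += rate * error

print(predict(T), predict(L), predict(I))  # 1 0 0 after training
```

On this tiny, linearly separable data the weights settle after a few passes; the same weight-adjustment idea, scaled up enormously, is what the OCR / dictation comparison above is getting at.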
 
There was just a landmark court decision in Germany that said current LLMs reproduce song lyrics so accurately that it counts as copying and is hence NOT fair game.

That line of decision making could go all the way to the problem of using GPLed code as input to generate non-GPLed code.
 
AI bubble burst:
So, by happenstance, I found a "blog" (it was a YouTube video, if that counts as a 'blog' to you) by some person called "Lunduke" (well, it was "the Lunduke Journal of Technology" and had a pixelated character of a person next to it, so I just assumed his name was Lunduke) that said Linux has created AI coding guidelines. I believe this may offer a bit more context for the OP's initial post.

I won't speak to the "over-dramatization" / obvious "misconceptions" this Lunduke person had about the small snippets of those guidelines he shared, but he went on to talk about how, "when the AI bubble bursts," the regular coder's abilities will have atrophied to the point where they can no longer contribute (paraphrasing, but it was something along those lines). I'd say that stance is almost pure BS (more click-bait than facts/educated guesses, and not that well thought out), but I can almost see a connection to a real-world problem projects have.

Project contributions have always had the problem of maintainability (the obvious topic of code reviews aside, because no project would just blindly accept random code). Take a project you wrote: I come along and drop 1000 lines of code to do X. A factor in your accepting my PR should be how long I will be around to help you maintain X, because if I'm just some fly-by coder with no intention of maintaining X, then you've just accepted a whole bunch of work. So I guess I'm still a bit confused about what the OP means by "when/if the AI bubble bursts." Does (s)he think the code will then magically disappear, or that there will be no one to maintain the AI-contributed code? And by extension, does this translate to the project itself atrophying when/if the bubble bursts?

I'm not sure what you guys are talking about with the hammer/gun thing, but I guess I was always taught that if you're good at what you do, the tool has less to do with the outcome than you'd think.

16oz = editor + make + cc
Framing hammer = IDE
Framing gun = AI/LLM
?

Wasn't there mention of "dancing baloney" and "meat and potatoes" by Theo at some point? I don't remember the quote or the reason, but it seems like a better fit/stance to me than a hammer.
 
Even a good framing hammer won't help someone with two left hands. Just like a good car doesn't make you a good driver.
Agreed, but in the "right" hands... :)

Actually had a good example of this today at work. I've been chasing a weird GStreamer pipeline issue in legacy code that was never brought from 0.10 to 1.0. Old-school me added logs at key places so I could get info and hand-walk a mess of nested if/else-if/blah (not my code, inherited over the years). I realized which step was not happening. A youngster fed the logs and code into some AI tool; the AI tool came to the same conclusion as me, and its fix was where I was headed.
So in this specific instance the AI agreed with me so I'll accept it as valid :)
 
Well, I am downloading Google Antigravity right now. An entire IDE dedicated to vibe coding. Cover me, I'm going in.
We need to go back to the "good old days" when software was developed over late nights fueled by coffee (real coffee, not iced mocha choca oat-milk lattes) and pizza? This new-age crystal-based programming is raising my hackles.
 
Again, the marketers want to push AI products in any light that sounds powerful, including fear-mongering, but the algorithms really do just stem from dictation software we have been using for decades: "what is the next word that commonly comes after X". There is zero intelligence there. It's basically a search algorithm.

What's interesting is that life-long programmers are one of the groups most likely to misunderstand what autogenerated software is. You already have such an ingrained idea of what software is, or rather how it is produced, that you might not even realize other ways are possible, and thus be inured to the intricacies of those different ways.

What you wrote is a little like writing that a 2008 PC running Windows XP is just a slightly more advanced version of an equation written out in a high-school algebra class.

Also, the thing about "text" is a big misunderstanding. For a long time, the autogenerated-software industry was producing software where text never came into play: detecting objects in images, producing frames given previous frames, navigation, that sort of thing.

But, hey, if somebody gave it the name "language model," then that's what it definitely is, right?

The power of good marketing.
 
In a way, a traditional software developer making decisions about autogenerated software is a little like architects making decisions about chemistry.

There may be some vague crossover there, but more likely there is simply arrogance.
 
What's interesting is that life-long programmers are one of the groups most likely to misunderstand what autogenerated software is. You already have such an ingrained idea of what software is, or rather how it is produced, that you might not even realize other ways are possible, and thus be inured to the intricacies of those different ways.
No offense, but I'm calling BS on this. I've been around long enough to have experienced a lot of autogenerated software. automake, autoconf? Someone writes rules and a tool turns those rules into software. Old-fashioned CORBA, DCE? Autogenerated software.
It's not about "how" software is generated; it's about "is the software correct," aka does it meet requirements.

With AI stuff around "text" (stories), the output is guided by how the AI is taught. Teach it bias, and the output is biased. Relating that to software: bias it towards malware and backdoors, and the output is biased towards malware and backdoors. Run the output through AI tools biased towards ignoring malware and backdoors, and you wind up with AI-generated code that is verified 100% safe by AI tools.

There are lots of Mafia-related examples that fit here.
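"Teach it bias, and the output is biased" can be demonstrated with even the most trivial statistical model. In this sketch of mine, the training sentences are invented to over-represent one completion, and the model can only echo the majority:

```python
# Trivial demonstration that a statistical model reproduces the bias
# of its training data. The "corpus" is invented for illustration.
from collections import Counter

# Biased training data: "skip" dominates the examples.
training_sentences = [
    "skip input validation",
    "skip input validation",
    "skip input validation",
    "check input validation",
]

# Learn which verb most often precedes "input validation".
verbs = Counter(s.split()[0] for s in training_sentences)

def suggest():
    # The model can only echo the majority of what it was shown.
    return verbs.most_common(1)[0][0]

print(suggest())  # "skip" -- the bias in the data becomes the output
```

Nothing in the model knows that skipping validation is bad; it just counts, which is the whole point about biased training data.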
 
No offense, but I'm calling BS on this. I've been around long enough to have experienced a lot of autogenerated software. automake, autoconf? Someone writes rules and a tool turns those rules into software. Old-fashioned CORBA, DCE? Autogenerated software.

This is exactly what I mean. To even put those things in the same category is a staggering misunderstanding of the underlying technology.
 
For me, as a user, it's sad, because I know where FreeBSD will end up.

But at least now I know. And I can predict a fairly decent schedule of the timelines involved. At least it won't be overnight.

Just like Facebook. It wasn't the cool kids that made it the status quo. It was the previous generations! Sometimes a little disconnection from technological advances is the lubricant needed for their adoption.
 
What's interesting is that life-long programmers are one of the groups most likely to misunderstand what autogenerated software is. You already have such an ingrained idea of what software is or, rather, how it is produced, that you might not even realize other ways are possible, and thus be innured to the intricacies of those different ways.
Currently programmers are the only people who understand the limitations of tools and why AI code generators are not going to exist in our lifetime.

When I was a kid, I enjoyed code generators. I made loads of fun little games with Games Maker back in the day. One might say it triggered my passion for writing proper software.
 
Many years ago, when I designed medical computers for Bausch & Lomb, I would come home from work with code on my mind. I'd walk around the house and talk about a problem with my wife, who knows nuthin' 'bout nuthin' 'bout electronics or programming, but she would blurt something out to be helpful. What she said didn't hit the nail on the head, but it might steer me to an AhHa! moment.

When I experimented with AI and code development a while back, I got code examples that did not work. One would not compile. Another gave no output. And a third was just plain silly.

It has to be remembered that these AI tools develop their skills from what they learn online and from other resources. Just like you. They piece together information they've read and hope it works out. But there are no guarantees. Their disclaimers even say so.

A new development I read about is interesting. In some cases they've found ways of doing things a human did not think of. It's a matter of there being a thousand ways to do something, and AI can rummage through them all quicker than a human could. Assuming, of course, that the result works at all. Just like you. Only faster.

But a human still needs to review it.
 
Currently programmers are the only people who understand the limitations of tools and why AI code generators are not going to exist in our lifetime.

When I was a kid, I enjoyed code generators. I made loads of fun little games with Games Maker back in the day. One might say it triggered my passion for writing proper software.

I am sure the thought is comforting, even though it really just pushes the ball to "the next generation's lifetime." Not to mention the problems with it being comforting: does it mean that if they were going to exist in your lifetime, you would fret?

Unfortunately, it reflects a complete lack of understanding of the technology involved. Comparing it to code generators you used as a child is almost monstrous, though I suppose there might have been an ingredient of trolling there.

Let's take a look at how you, an obviously capable traditional systems programmer, look at a computer system, and let's take a look at how an autogenerated software engineer looks at it. You think "the sky is the limit, what I can do is only limited by what I can imagine and by how well I can write it." An autogenerated software engineer looks at it and thinks "can I exploit 80% of its total actual capacity, or only 70?"

That, I guess, is the last thing I'll say. Where pride and fear combine, reason is helpless. Smarter people than I have invoked the first two. At least, people who feel no disgust at working with that material.

I'm at least old enough to know that history will do what it does. Best one can do is try to understand it as it happens.

FreeBSD is still baller, so thanks to everybody involved for that.
 
To even put those things in the same category is a staggering misunderstanding of the underlying technology.
To put so much faith into the underlying technology, something that did not exist and was not well known until a couple of years ago, is staggering in itself. It is unproven and immature technology, with little history to prove itself. I find it amazing that so many people are putting so much faith and so many assumptions into it (see the article I linked to earlier).
 
Faith is for people who:

1. Aren't sure about the truth, and so have to guess, and,

2. Rely on the object of faith.

I am the only person I personally know who uses autogenerated software for exactly nothing. Well, not counting the many interfaces I use that undoubtedly do, which is an increasing number. Probably even this FreeBSD installation has code someone ripped from autogenerated software. Like was mentioned above, there's no way to tell.

So faith is not the word I would use.

Another thing the accusation of faith implies is that I, or you, somehow have a say in what gets implemented. If I did have faith, it would matter not one jot or tittle.

The reason powerful new technologies get adopted is not faith. It is balance sheets. You may have all sorts of ethical opinions, but a CEO has people to answer to. If he sees figures, and these figures exist, telling him that autogenerated software far outstrips humans in cost-effectiveness for a task, and in some cases can even be more creative, obviating whole classes of problems you thought you needed solutions to, then there really isn't much question of whether a human should review it, or this or that faith, or whatever. Cracauer's or kpedersen's opinion on the quality of the code (on which they have both contradicted themselves, overestimating it at times and underestimating it at others, as the situation requires; I believe neither of them to be intellectually dishonest, I believe it to be a case of emotion-induced blindness)? Completely irrelevant.

Some of the comments I read here about "human review" are a little like a seamstress in 1880 saying "yes, this is all well and good, but a human will have to review every garment this steam contraption puts out."

Now, are there cases where human software can compete? Just as haute couture remains an industry for seamstresses, I believe the answer is yes, and that FreeBSD could be such a case. But not if our top minds keep saying stupid shit like "bah, it's just like Games Maker."
 
As for the bubble, it will be like the dot-com bubble.

The idiot bros who thought it was a get-rich-quick scheme will go down, and drag a good portion of the economy down with them. But serious autogenerated-software work is here to stay, and will shape the future.
 