AI for writing documentation

zester, it seems that it is customary in your culture to insult others.

When did I insult anyone? Because of this?
I very frequently have the experience that some people get offended when they meet someone more intelligent than they are. I can imagine that people who act that way, and who are foolish enough to also personalize AI, do get upset.

How about I spell it out for you, because clearly your reading comprehension is shit.
people get offended when they meet someone more intelligent than them.

Meaning "people get offended when they meet someone more intelligent than them" must be a European thing!! And everyone who responded proved my point!!
 
Because in American culture we don't get offended by others who are smarter; it's not a thing that happens here. It would be seen as very strange; people would even think something is wrong with you. So my natural assumption was that it must be a European thing, because a European just said how common they found it to be in their country!!!!!

It wasn't an insult! It was an observation about how something would be seen as very odd and very uncommon, like how Europeans use the word "fag", which in the United States could get you killed or arrested for a hate crime.
 
Engineering is an end product of thinking while programming.
Indeed, and that speaks against your principle of writing documentation before coding.

Short comments in the code are better.

And my idea was to leave the final documentation to AI using code and comments.

But someone said it produces bad troff code. Perhaps a small program can check and correct it.
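One way to catch "bad troff code" automatically, as suggested above, is to run groff in parse-only mode and collect its diagnostics. This is only a sketch: the function name `check_troff` is made up for illustration, and it assumes the `groff` binary is available on the system.

```python
import shutil
import subprocess

def check_troff(path):
    """Parse a troff file with groff and collect any diagnostics.

    Returns a list of warning/error lines from groff, or None if
    groff is not installed on this system.
    """
    if shutil.which("groff") is None:
        return None
    # -z      : parse the input but suppress the formatted output
    # -w all  : enable (nearly) all warning categories
    result = subprocess.run(
        ["groff", "-z", "-w", "all", path],
        capture_output=True, text=True,
    )
    return [line for line in result.stderr.splitlines() if line.strip()]
```

An empty list means groff parsed the file without complaints; actually *correcting* the markup would still need a human (or another AI pass) to act on the reported lines.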
 
Yes, *this*. And if we let models write too much code for us, or do too much of our thinking, we lose this edge.

If AI ever truly reaches genuine creativity, the world will become a fundamentally different place. That moment wouldn't just be another technological milestone; it would be an evolutionary event. Creativity is effectively the threshold of artificial general intelligence: the point where a machine is no longer just processing patterns, but producing ideas in the same sense humans do.

Most people underestimate how profound that shift would be. It would mean we had created a synthetic form of intelligence that behaves less like a tool and more like a new kind of life. At that point, automation wouldn't simply replace tasks; it would replace the necessity of human labor itself.

For the first time in history, humans wouldn’t need to work in order to survive. We would simply live.



 
Maybe I should start banning some folks. You either follow the directions of the mods/admins or you're going to find yourself on the receiving end of the ban hammer. Is that clear enough for you?
 
Creativity is effectively the threshold of artificial general intelligence: the point where a machine is no longer just processing patterns, but producing ideas in the same sense humans do.
Perhaps most of the time humans are processing patterns, from the very beginning of their life.

These LLMs have a parameter called "temperature". If it is 0, generation is deterministic: the model always selects as the next word the one with the highest assigned probability. As the temperature is increased, it may select other words; the higher it is, the more often low-probability words are chosen. In the extreme case, delirium, nonsense, noise comes out.

In a few words: creativity increases with chaos; extreme creativity means speaking nonsense.
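The temperature mechanism described above can be sketched in a few lines. This is a toy model, not any particular LLM's implementation: the logits are invented numbers, and `sample_next` is a hypothetical helper name.

```python
import math
import random

def sample_next(logits, temperature):
    """Sample a token index from raw logits at a given temperature.

    temperature == 0 is greedy: always the highest-probability token.
    Higher temperatures flatten the distribution, so unlikely tokens
    are chosen more often; very high values approach pure noise.
    """
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return random.choices(range(len(logits)), weights=probs)[0]

# Toy next-token logits: token 0 is clearly the most likely.
logits = [4.0, 2.0, 1.0, 0.5]
print(sample_next(logits, 0))    # always 0 (deterministic)
print(sample_next(logits, 5.0))  # often not 0: the distribution is nearly flat
```

At temperature 0 the same input always yields the same token; as the divisor grows, the softmax flattens and the "chaos" the post describes takes over.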
 
Perhaps most of the time humans are processing patterns, from the very beginning of their life.

These LLMs have a parameter called "temperature". If it is 0, generation is deterministic: the model always selects as the next word the one with the highest assigned probability. As the temperature is increased, it may select other words; the higher it is, the more often low-probability words are chosen. In the extreme case, delirium, nonsense, noise comes out.

In a few words: creativity increases with chaos; extreme creativity means speaking nonsense.

Then that nonsense starts to evolve and converges into meaning. I am curious what AI would come up with if I asked it to create a new C++ code documentation system.
 
Then that nonsense starts to evolve and converges into meaning. I am curious what AI would come up with if I asked it to create a new C++ code documentation system.
Perhaps too complicated for today's AI, but that is also pattern matching: the phrase "code documentation system" is a pattern, otherwise it would not exist, and "new" means applying chaos to find something other than the C++ code documentation systems that already exist.
 
the "prompt" for this image is "dr house diagnosing tony soprano with lupus". you want to trust this system, which cannot keep these two men separate, to write documentation that has to be both correct and meaningful to humans? really?
[attached image: 1771261664463.png]
 
the "prompt" for this image is "dr house diagnosing tony soprano with lupus". you want to trust this system, which cannot keep these two men separate, to write documentation that has to be both correct and meaningful to humans? really?
Irrelevant to the theme here.
 
LLMs will inevitably take jobs, just as the printing press, the loom, washing machines, etc., did before them, and that will be deemed perfectly acceptable "progress" [by those who don't stand to lose their livelihoods and possibly have shares in big tech].

That's how it always works: it's a great idea if you don't stand to lose from it. Historically, the loss of a job role to a machine has not always meant an immediate improvement; in some cases quite the opposite. We live in a world of mass production and planned obsolescence. Experience and history tell us that proprietary vendors put profits before quality, and the goals are similar here. "Documentation" hardly matters if you're coding up a black box, so it goes to the lowest paid or it's skipped. "Tech support" can be monetised, so you build complex systems and sell certifications.

Unfortunately "AI" is not yet developed enough to take over these roles; it is, of course, being "talked up", and the stock values go up and up. All of the signs point to a bubble and another 2007-08 style crash. That's a bigger issue than the X number of Americans who may lose their jobs to LLMs.
 
Proving the shortcomings of AI is relevant here.
Image analysis and generation is a different issue.

you open by claiming that it's good because now you don't have to learn troff. seriously?
No, but because I saw how fast it recognized what the program does and wrote down documentation that I would never have written, because I do not want to spend too much time on it. If it makes errors, it corrects them in a dialog. It is a nice way to do the job. I have said that a thousand times; AI would have understood it, and you still do not get it. If you were the one writing documentation and I were your boss, I would fire you because you do not get it.
 
Image analysis and generation is a different issue.

it is not. it's the same process.

No, but because I saw how fast it recognized what the program does and wrote down documentation that I would never have written, because I do not want to spend too much time on it. If it makes errors, it corrects them in a dialog. It is a nice way to do the job. I have said that a thousand times; AI would have understood it, and you still do not get it. If you were the one writing documentation and I were your boss, I would fire you because you do not get it.
what?
 
I was looking into an obvious ../.. vulnerability introduced into a major web framework today, and it was committed by username Claude on GitHub. Vibe coded, basically.

So I started looking through Claude commits on GitHub; there are over 2m of them, and they account for about 5% of all open source code this month.

As I looked through the code I saw the same class of vulns being introduced over, and over, again - several a minute.

but yeah it's totally gonna be good at writing documentation. right.
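For readers unfamiliar with the `../..` class of bug mentioned above: it is path traversal, where a user-supplied filename climbs out of the directory the server intended to expose. A minimal sketch, with invented names and paths (the `BASE` directory and both function names are hypothetical, not from the framework in question), contrasting the vulnerable pattern with a common fix:

```python
from pathlib import Path

# Hypothetical directory the application intends to serve files from.
BASE = Path("/srv/app/static").resolve()

def unsafe_read(filename):
    # Vulnerable: a request for "../../../etc/passwd" joins and
    # escapes BASE entirely.
    return (BASE / filename).read_text()

def safe_read(filename):
    # Fix: resolve the joined path first, then verify it is still
    # inside BASE before touching the filesystem (Python 3.9+).
    target = (BASE / filename).resolve()
    if not target.is_relative_to(BASE):
        raise PermissionError("path traversal attempt blocked")
    return target.read_text()
```

The repeated-vulnerability pattern the poster describes is exactly the `unsafe_read` shape: string-joining user input onto a base path without re-checking where the resolved result actually points.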
 