What's with the avalanche of new things to learn - has it always been that bad?

It feels like life is an accelerating treadmill: one has to run faster just to stay in the same place.
But eventually one starts falling behind.
Until one gets off the treadmill altogether.

P.S.
I mean retirement, not the final rest :)
 
I thought about this some more and contemplated what you folks posted. On reflection, I think there isn't much that's special about the situation. Most of it is normal. New languages like Rust and Go have always been introduced. Kubernetes, containers, and VMs are just normal new technologies. Kernel security features I picked out of personal interest, so I can't blame anyone else for that. GPU programming is just an architecture with a weird instruction set. Manageable.

But there are two exceptions to "business as usual for programmers":
  • C++ is actually out of control, and there seems to be general agreement on that here. This hasn't happened to this degree in the history of computing.
  • Machine learning is the odd one out, because it has left the zone of "human-understandable computing". Your normal understanding of data structures and your debugging skills suddenly don't help you understand it. You can jump in anywhere with gdb and it buys you nothing. For a debugging-minded programmer like me, that is unusual.
 
New languages like Rust and Go have always been introduced.
Go is unlike any other language. It was created by Ken Thompson and Rob Pike. It's BSD licensed, so it's not yet another Google thing that Google can drop.

Go was carefully designed and seems to blend the best of C and Python. It's "OOP" done right, with composition over inheritance. The compatibility promise is not something you see in other languages like Rust or Python.

Go is also easy to learn. The concurrency primitives (goroutines and channels) are the best. It's not perfect, yet.
 
@cracauer FWIW, I am not sure that ML is unique, or even the first case, with respect to debugging.
For imperative languages, even very high-level ones, we can establish some mapping between what the source code says and what the processor does.
For declarative languages, functional languages, and logic languages like Prolog, there is no obvious mapping from the source code to how it gets "executed" / "interpreted" / "solved".
Debugging must follow the same concepts as the language.
ML should also provide some "thought reading" kind of debugging, but I imagine that coming up with proper debugging / introspection tools is harder than just doing the stuff in the first place.
As it has always been.
 
I was thinking about this all day. Is optically scanning bottles really Machine Learning, or just Machine Comparison?
There is no learning involved. The sheer speed of the operation was impressive, but that still did not make it smart.
In a QA environment all you really need is GO or NO-GO.

I really hope they don't 'fix' C!
But, just like C++, they are always producing new revisions.
Hundreds of changes.

When should it go into maintenance mode, with nothing but security fixes? Can they vote on that?
 
I was thinking about this all day. Is optically scanning bottles really Machine Learning, or just Machine Comparison?
There is no learning involved.
One may call the training of an AI 'learning', but you're right insofar as there is no thinking involved.
If you break it down to machine level, after all, it's just a giant heap of pure binary 'if...else...' decisions.
Just because some mechanism becomes extremely large and complex doesn't change the mechanism's core nature.

No need to tell you that ML goes back through almost the whole history of computing, to the 1950s.
Already in the 1980s, computer-game opponents were based on some kind of AI.
What's really revolutionary and new is the size and power of the machines now available for it.
As I see it: the large things, like ChatGPT, were started in the first place out of pure curiosity. 'Let's just see what we get when we build a really big machine, with lots of power.' Not much thinking ahead. Of course not; it was purely scientific. But to get money for even bigger machines, you need to take business people on board. Besides, amazing results always attract investors anyway. But business people don't just spend money, they must produce revenue. So it must be sold, and the sooner and the bigger, the better. And if something new is to be sold, there are only advantages: no other side of the coin, no flaws, no downsides, no disadvantages at all.
As a result, several kinds and levels of technology, simply lumped together as AI, have been placed on the market too soon. There's lots of confusion, because things are wildly mixed up, especially excessive promises, dreams and hopes. (It wasn't the shipyard that called the Titanic 'unsinkable'; it was the newspapers that exaggerated the new system of separate compartments.) Large parts of society cannot deal with it. Its results aren't reliable or trustworthy enough for it to be more than a toy for most people (which for many is fully sufficient).

Because - and that's the part that did not change - in every computer language, and it doesn't matter whether we talk assembler, Lisp, C, C++, Java, Rust, ... or LLMs, the core problem in principle always stays the same:
Until someone, or something, can read minds - and I hope that will never happen - one has to learn how to talk to the computer correctly so that it produces the results one actually wants.
Which always requires being crystal clear about what the machine is supposed to do in the first place. Every programmer knows that; no need to tell you. But every non-programmer dreams of using machines without thinking. That's where the money is. We see the results every day, thousands of times, everywhere around us.

As some have also already pointed out here:
If something is either started on the wrong foot, or misused for something it wasn't originally meant for, you cannot correct that with regulations, standards, patches, updates, extensions, features... - it only bloats it, complicates it, and makes it even less useful.
But the good side is that this will follow nature's course of evolution. Sooner or later it dies, like the dinosaurs that grew even bigger than too big to fail, and makes way for new things.😁
 
Go I'm kind of interested in, because it comes from the same group, or direct descendants thereof, that designed C. Brian Kernighan has his name on the book, so I am hoping it may actually have some real practical value; I haven't looked into it in any detail yet though.

C++ has been a living nightmare since barmy bootstrap first came out with templates, and multiple inheritance, and 'friends'... arrgh🤪, and then it just got worse from that point onwards. It IS out of control, with the endless addition of countless new features; apparently they now claim it is not just an OO language (that's old hat) but also supports functional programming! I actually tried using it a couple of times, but we dropped it in favour of C. With C++ you spend more time trying to decide which is the correct language feature to use, and then figuring out how to use it, than writing your actual program logic. Don't even get me started on UML and "object-oriented design".

I guess I'm some kind of luddite, or something. I'm a fan of Niklaus Wirth (and Pascal): "Algorithms + Data Structures = Programs". But when I get some time, I'll have a look at Go. Yeah, the concurrency features in Go sound interesting.

I'm not going to pretend I know anything about machine learning ;-)
 
Don't even get me started on UML and "object-oriented design".
I'm in complete agreement here.
About the only good thing I can say about them is:
They can make you stop and think about the data and the relationships between different objects and types, rather than just about the algorithms.

Go and Rust look interesting, but I need a "real project" to actually work on in order to understand them. Just doing tutorials and "hello, world" programs is good enough to get the fundamentals of how to build an executable, but it doesn't get you into the nitty-gritty.
 
But, just like C++, they are always producing new revisions.
Hundreds of changes.

The standards committee usually tries pretty hard to keep the changes backwards compatible and not to break existing code, and for the most part they are small incremental changes rather than large-scale features. Some of the things they add are things that projects used to write themselves before they made it into the language, like those bitops functions in C23, or other minor changes like the new printf format specifiers. Or they're new features that you only have to use if you want to. I think the ANSI committee does a pretty good job of limiting the scope and amount of change and keeping the language stable, to be honest. Usually nothing that a couple of sed or awk scripts won't sort out :-). After all... isn't this how most of the FreeBSD codebase has stayed relatively stable and able to be reliably compiled and run over a period of decades? Admittedly it's not perfect; there are times when you have to do some work to port code to a new C language version, but it's not usually on the scale of a major rewrite.
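
For anyone who hasn't poked at C23 yet, here's a rough sketch of the sort of thing I mean (assuming a C23-capable compiler, e.g. a recent clang or gcc with -std=c2x; the %b printf conversion also depends on the libc having caught up, so treat this as illustrative rather than gospel):

    /* build with something like: cc -std=c2x bits.c */
    #include <stdio.h>
    #include <stdbit.h>   /* new in C23: type-generic bit utilities */

    int main(void)
    {
        unsigned int flags = 0xB6u;

        /* Before C23 most projects carried their own popcount/clz helpers
           or leaned on compiler builtins like __builtin_popcount(). */
        printf("set bits      : %u\n", (unsigned) stdc_count_ones(flags));
        printf("leading zeros : %u\n", (unsigned) stdc_leading_zeros(flags));

        /* C23 also adds the %b conversion for printing in binary. */
        printf("binary        : %b\n", flags);

        return 0;
    }

Nothing there forces a rewrite; old code that carries its own popcount() helper keeps compiling exactly as before.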
 
The standards committee usually tries pretty hard to keep the changes backwards compatible and not to break existing code, and for the most part they are small incremental changes rather than large-scale features. Some of the things they add are things that projects used to write themselves before they made it into the language, like those bitops functions in C23, or other minor changes like the new printf format specifiers. It's not like they are adding whole new programming paradigms. I think the ANSI committee does a pretty good job of limiting the scope and amount of change and keeping the language stable, to be honest. Usually nothing that a couple of sed or awk scripts won't sort out :-)

Also, it looks like they are "porting" some things from POSIX to C, which I don't mind at all.
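
If I remember right, strdup()/strndup() and memccpy() are among the POSIX routines being pulled in, so a sketch like this should be plain standard C under C23 (no _POSIX_C_SOURCE dance needed), assuming your toolchain and libc have caught up:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        /* strdup() was POSIX-only for decades; C23 finally adopts it. */
        char *copy = strdup("hello from standard C");
        if (copy == NULL)
            return 1;

        printf("%s\n", copy);
        free(copy);
        return 0;
    }

On older systems the same code still builds; it just gets strdup() from POSIX instead of from the C standard itself.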
 
And John Carmack said he works in C++. https://cppdepend.com/blog/john-carmack-cpp-programming-legend
So clearly it is possible to do good work in C++. 😁
I was worried for a second.
Doom 3 BFG is written in C++, a language so vast that it can be used to generate great code but also abominations that will make your eyes bleed. Fortunately, id Software settled for a C++ subset close to “C with Classes”, which flows down the brain with little resistance... To sum up, only a subset of the C++98 standard is used.

I still think Objective-C would be the perfect language if it had namespaces.
 
Also, it looks like they are "porting" some things from POSIX to C, which I don't mind at all.
Agree completely.
I still think Objective-C would be the perfect language if it had namespaces.
Based on my limited experience, I agree. I think libraries meant for use by others are the best place for namespaces; they clearly delineate things. But if someone starts doing "import all from namespace", you end up wondering which free() is being called.
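
In plain C the closest thing we have is the boring prefix convention; it's clunky, but at least there's never any doubt whose function you're calling (mylib_ here is just a made-up example prefix):

    #include <stdlib.h>

    /* A hypothetical library using a name prefix as a poor man's namespace. */
    void *mylib_alloc(size_t n) { return malloc(n); }
    void  mylib_free(void *p)   { free(p); }

    int main(void)
    {
        void *buf = mylib_alloc(64);
        /* No question here about "which free is being called". */
        mylib_free(buf);
        return 0;
    }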
 
Anyone had a look at Dylan?

Yeah, there was big excitement about it when it had s-expression syntax (with the implication of full compile-time computing). But then they changed the syntax and "simplified" the macro system.

A former coworker of mine runs that site.
 
I've done everything in C, only because I got used to it after doing everything in assembly.

I've tried to learn C++ but it always seemed to complicate everything too much.

And every other language, when I did try to use it, I always had a voice in the back of my mind saying, "But I can already do this in C!"

I did try Go--and liked it--but the executables were so big compared to my C programs that I stopped using it, because the server I was using was so small.

Smalltalk is another interesting thing I never devote time to.

I always want to get good at Lisp. I put it off until I have time, but when I have time I don't know what to do with it and get bored.
 
I still think Objective-C would be the perfect language if it had namespaces.
I have to use Objective-C at work, and I think it would be the perfect language if it died (painfully) and nobody used it anymore. After an hour of coding in Objective-C, I need two hours of coding in C++ in order to relax my brain.
 
I played around with Smalltalk/V back when it was a way to get yourself a protected-mode runtime on MS-DOS. I have hazy memories of it working but being slow. We're going back in time a bit now; in the event, that project got done in assembler. Yeah, I think Smalltalk was what kicked off the whole OO fad, even before Bjarne's C++ 1.0. I remember a review in Byte (I think) saying Smalltalk was "a language that could only have been designed in southern California". 😁
 
I liked Stroustrup's original C++ in part because I had used the C preprocessor for the same reasons (fix idiocy, add features) he did. For example, I imported equality operators from Fortran, because while it's painfully easy to type "=" when you mean "==" - and viciously hard to debug - it's much harder to get it wrong when typing "EQ" instead of "==".

But once the theorists got hold of C++, I ran away.
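
For anyone who never saw that trick in the wild, it was nothing more exotic than this (EQ is just my own macro name, not anything standard):

    #include <stdio.h>

    /* Fortran-flavoured equality via the preprocessor, as described above.
       With EQ used for comparisons, the classic "if (x = 0)" typo simply
       doesn't come up. */
    #define EQ ==

    int main(void)
    {
        int x = 0;

        if (x EQ 0)
            printf("x is zero\n");

        return 0;
    }

Funnily enough, C's own <iso646.h> eventually went partway down the same road with and, or and not_eq, though it never offered a spelled-out name for == itself.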
 