C Plain C learning school forum?

Right so the C++ example is compressed in the link given, compared to C which is nicely spaced out?
Show me stuff with overloads et al. Then verbosity goes through the roof.
Anyway, I'm not here to argue against using a language, I was just vilifying C++ and Objective C as a bit of a joke.
I think you're missing the point. The C code is about 20 lines of code whilst C++ uses just two standard library calls. I work a lot with C and I get really really tired of having to reinvent the wheel time and again, and wish very hard that I could just use C++ containers and algorithms.

Are you sure that you mean overloading? You're not confusing overloading with overriding? My experience with overloading is, yet again, that the result is _shorter_ code. As an example, a few years ago I worked on a C code generator. One of the types used an array which would appear in general expressions. This resulted in code like

d = abc_add(abc_mul(a, b), c);

In C++ this would have been the "through the roof verbosity"

d = a*b+c;

Perhaps you mean the wonderful C overloading where all pointers can be converted to void*. This is a bit of a joke in terms of type safety, but it can result in shorter and more efficient code.
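
A rough sketch of what I mean, using qsort as the obvious standard-library example (just a throwaway snippet of mine): one function sorts any element type through void*, but the compiler cannot tell whether the comparator actually matches the elements.
Code:
#include <stdio.h>
#include <stdlib.h>

/* Comparator for qsort: both arguments arrive as const void*,
   so the same qsort call works for any element type... */
static int cmp_int(const void *a, const void *b)
{
    int x = *(const int *)a;
    int y = *(const int *)b;
    return (x > y) - (x < y);
}

int main(void)
{
    int v[] = { 3, 1, 2 };

    /* ...but nothing checks that cmp_int matches the element type. */
    qsort(v, 3, sizeof v[0], cmp_int);

    for (size_t i = 0; i < 3; i++)
        printf("%d\n", v[i]);
    return 0;
}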

Finally, C++ can do virtually everything that C can, so at the 'worst' you could compile C code with a C++ compiler and benefit from better type checking.
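
A tiny illustration of that type checking (again just a throwaway snippet): this is perfectly legal C, but a C++ compiler rejects the implicit void* conversion and makes you look at the allocation.
Code:
#include <stdlib.h>

int main(void)
{
    /* Fine in C; a C++ compiler reports an error for the implicit
       void* -> int* conversion on the malloc result. */
    int *p = malloc(sizeof *p);
    free(p);
    return 0;
}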
 
Would not the code crash on double de-referencing anyway?
Of course, when it crashes and where it crashes are a source of wonder for the programmer as they attempt to debug the mess.

It will certainly crash but in a controlled (and deterministic!) manner. The program would abort with a message like this:
Error: [Foo] no longer valid in main.c:12

Mostly because the first dereference is always valid: it doesn't take you to potentially deleted memory but to an index block that simply tells you whether the memory has been deleted, its type, and a few other bits. Then (in order to keep it type-safe), if this block reports all good, we dereference the pointer using ptr[0][0], because that indexing struct has the actual pointer at the top. These index blocks can only grow, so this isn't suitable for release builds; however, the way they work (large blocks rather than lots of small allocations) means that even on 32-bit or embedded targets you typically will never run out of memory or address space.
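
If that is hard to picture, here is a tiny stand-alone sketch of the idea. To be clear, this is not the actual stent API; ref_block, ref_new, ref_get and so on are names I've made up purely to show the shape of the double dereference.
Code:
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical index block: the handle points here, never at the
   object itself, so the first dereference is always safe. */
struct ref_block {
    void *ptr;      /* actual allocation, or NULL once freed */
    int   deleted;  /* validity flag checked before every use */
};

struct foo { int value; };

static struct ref_block *ref_new(size_t size)
{
    struct ref_block *rb = malloc(sizeof *rb);
    rb->ptr = malloc(size);
    rb->deleted = 0;
    return rb;  /* the block itself is never reused, only marked */
}

static void ref_delete(struct ref_block *rb)
{
    free(rb->ptr);
    rb->ptr = NULL;
    rb->deleted = 1;
}

/* The "ptr[0][0]"-style access: check the flag, then follow the real pointer. */
static void *ref_get(struct ref_block *rb, const char *file, int line)
{
    if (rb->deleted) {
        fprintf(stderr, "Error: no longer valid in %s:%d\n", file, line);
        abort();
    }
    return rb->ptr;
}

int main(void)
{
    struct ref_block *f = ref_new(sizeof(struct foo));
    ((struct foo *)ref_get(f, __FILE__, __LINE__))->value = 42;

    ref_delete(f);
    /* Use after free now aborts deterministically instead of corrupting memory. */
    ((struct foo *)ref_get(f, __FILE__, __LINE__))->value = 7;
    return 0;
}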

It is a large amount of macro hackery, so I can understand why many would avoid something like this, but it really does go a long way towards providing safe C code. I have written some considerable projects using it and have been pleasantly surprised by how effective it has been. Not once have I been stung by memory corruption or needed Valgrind. And I am certainly not a genius when it comes to programming.

If this does interest you, have a quick flick through this: https://github.com/osen/stent/blob/master/doc/libstent-2020.pdf

Of course, when it crashes and where it crashes are a source of wonder for the programmer as they attempt to debug the mess.

I hate this, especially if it isn't reproducible or deterministic! And most of all, I hate looking at my own software and knowing that somewhere, a memory corruption bug is waiting to kick me in the butt when a shareholder is watching.
And yet if other software I use (not written by me) crashes, it oddly doesn't concern me. I see it as "cute and quirky". Developers are weird ;)
 
It often takes discipline, and if people don't have it, I advise them to not program in C.
I've reached the conclusion I'm not smart or disciplined enough, but unfortunately, I sometimes have to work in C.

I rely on tools in these instances because I assume I'm not going to find all the leaks just by code inspection. I'm usually right.
 
I hate this, especially if it isn't reproducible or deterministic! And most of all, I hate looking at my own software and knowing that somewhere, a memory corruption bug is waiting to kick me in the butt when a shareholder is watching.
And yet if other software I use (not written by me) crashes, it oddly doesn't concern me. I see it as "cute and quirky". Developers are weird ;)
Worst one I ever had to troubleshoot was a stack overflow bug. The program would crash at random locations far, far from the actual problem in code that hadn't changed in years. I never would've found it if a QA person had not pointed me at the change in question. I guess I learned the value of git-bisect that day.

Some genius "fixed" a problem where we were running out of memory in a buffer allocated on the stack by simply making it 10 times bigger. The insidious thing is the code worked fine on Solaris and a bunch of other Unixes. It would crash on Windows, though.
 
I've reached the conclusion I'm not smart or disciplined enough, but unfortunately, I sometimes have to work in C.
I am definitely in this boat too. For me, the choice of C isn't due to performance or anything like that but more due to lack of any real alternative. Things like Python, Java and Rust are not programming languages but basically sacks of dependency managers and other mess XD.

Even though the software may crash, getting people to use C in place of alternatives is probably a selfish investment for me. It means their software won't drag in thousands of other packages just to split a sodding string.
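
For example, splitting that sodding string with nothing but the standard library (a throwaway snippet, obviously):
Code:
#include <stdio.h>
#include <string.h>

int main(void)
{
    char line[] = "one,two,three";

    /* Split on commas using only strtok from the C standard library. */
    for (char *tok = strtok(line, ","); tok != NULL; tok = strtok(NULL, ","))
        puts(tok);

    return 0;
}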

Oddly enough, C with libstent isn't really intended as a replacement for plain C. It is actually meant to replace things like Java in situations where C would normally be considered inappropriate due to its perceived difficulty.

C++ is getting worse. Once modules hit and many projects regress, I think it will be basically no better than Rust: all faffing with dependencies and no actual writing of code.

Some genius "fixed" a problem where we were running out of memory in a buffer allocated on the stack by simply making it 10 times bigger. The insidious thing is the code worked fine on Solaris and a bunch of other Unixes. It would crash on Windows, though.
Eeek. I notice that there are very few analysis / debug tools that can help with that too because no-one expects someone to be so oblivious to how the stack should be used!
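
For anyone who hasn't hit this: the "fix" boils down to something like the sketch below (sizes invented for illustration). Whether it survives at all depends on the platform's default stack limit, which is typically only about 1 MB on Windows but several megabytes on Linux and Solaris, so the same code can run fine on one OS and blow the stack on another.
Code:
#include <stdio.h>

void process_record(const char *input)
{
    /* Originally a ~200 KB scratch buffer; the "fix" was simply to
       make it ten times bigger. */
    char buf[2 * 1024 * 1024];

    /* A 2 MB stack frame fits under a multi-megabyte default stack
       but exceeds Windows' ~1 MB default, crashing far from here. */
    snprintf(buf, sizeof buf, "record: %s", input);
    puts(buf);
}

int main(void)
{
    process_record("hello");
    return 0;
}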
 
Developers are weird
Hey! I resemble that remark!

I'm probably repeating myself but I started life as an electronic engineer building computers with TTL chips and transistors. So when I program I still picture in my mind the flow of the address and data paths through the motherboard or whatever electronics is there. So assembly and--more so now--C are my languages of choice for everything I do when I can use C. When I do get enticed or forced into using another language, I tend to write that code just like I would in C but, somewhere along the line, I'll start realizing that and question why I'm not just doing it in C! Then everything is uniform and consistent.

Interestingly, at two jobs I did some work for recently, a PHP programmer and a <now I forgot what> programmer both remarked how much they liked how I wrote PHP and the-language-I-forgot, but all I did was use standard, everyday, run-of-the-mill, tried-and-true methods which I thought everybody used.

I'll mention again the time I spent three days trying to solve the problem of a pixel showing up on the display of a medical product right before a major demo to marketing. I wrote it in assembly on a Motorola 68000 and no amount of blurry eyed scanning could find the problem. My boss couldn't find it. Two co-workers couldn't. What could be causing it?!

It looked like there was a problem with
Code:
move b $00000000,d0
but moving that byte into d0 was exactly what I wanted to do!
|
|
|
|
While the demo was going on, I found the problem with that code (in case you did not): a missing dot between 'move' and 'b'. The assembler didn't report any error.
 
Heh, nasty! When you provide the example code like you did (i.e. just one line XD) it was easier to spot, but having to wade through many lines of assembler to find that must have been pretty tricky.

And I agree, the "write like C" way is certainly ideal. It really translates to "write code in a simple and boring way" which makes it much easier for others (and myself later) to understand.

What I particularly dislike is languages (or features) that forcibly prevent me from writing C-like code. The main one that comes to mind is the spaghetti JavaScript asynchronous "functions in a function, calling functions" stuff, which is generally mimicked by a number of C++ developers and asio/lambda fans. This is particularly painful for software with a single main loop (embedded, games, servers) because it breaks that architecture by continuing without completing an operation properly. It also infects a codebase, making all of it effectively asynchronous unless you plan to scatter "if(ready == 1)" everywhere.
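
For contrast, here is roughly the single-main-loop shape I mean (made-up names, just to show where the one readiness check lives):
Code:
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical non-blocking request; names invented for illustration. */
struct request { bool started; bool ready; int result; };

static void request_poll(struct request *r)
{
    /* Pretend the operation completes after a few ticks. */
    static int ticks;
    if (r->started && !r->ready && ++ticks >= 3) {
        r->ready = true;
        r->result = 42;
    }
}

int main(void)
{
    struct request req = { .started = true };
    bool running = true;

    /* One main loop owns all control flow: poll, update, render.
       Nothing yields to a callback buried elsewhere in the codebase. */
    while (running) {
        request_poll(&req);

        /* The readiness check lives in exactly one place. */
        if (req.ready) {
            printf("result: %d\n", req.result);
            running = false;
        }

        /* ... update game / server state, render, sleep ... */
    }
    return 0;
}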
 
This specific reply is not targeted to the OP's original question.

Ask yourself: do you want to pursue programming as a career, or are you just a curious person? If it's the latter, the language doesn't matter! C, VB6 or Wolfram, all of them are irrelevant. What do you want to achieve by learning a computer language? Developing is hard. It's not just syntax or libraries. If you haven't defined a project yet or nobody is going to pay you, you're just wasting your time.

There's an alternative approach. Pick an essential computer book such as Tanenbaum's "Structured Computer Organization". Take your time and read it. You don't have to read it cover to cover; skip the boring or difficult parts. You'll learn more about computers (x86, ARM, AVR) by studying such materials (also algorithms, data structures, etc.). Eventually you'll gain enough knowledge to choose the right languages and tools for the specific job. It may be C, or just running Anaconda in a browser.
 
I'm not a professional C programmer, but I know enough C/ASM to spend my time on MCUs as a hobbyist. All that said, learning C has a great benefit for FreeBSD users. Documentation can't be perfect; you can't reflect the functionality of the entire source tree in the manual pages. If you want to solve a fringe problem, are unsure about some flags, or just want to learn more, you have to refer to the source code. And if you want to read the source code, you have to know the C language.
 