Some use compiled because of library stability over longer periods.
A weak argument, but an argument. Source libraries (such as those used by Python and Perl) are also quite stable; I've been running a 10-year-old program that uses a 15-year-old Python->BerkeleyDB interface, and it is (unfortunately) still working. Unfortunately, because I need a swift kick in the behind to force me to rewrite that program from scratch (not just to use modern libraries; there are also lots of design decisions that are no longer appropriate to the requirements).
Type checking? Could be, as long as you don't pass void pointers.
You can get around type checking in compiled languages easily. Any time you see the string "void*", you just got around it. As a matter of fact, in straight C programming (not C++), there is actually darn little type checking for structures, since so many things have to be cast to void*. Sure, you get type checking for int versus float versus pointer, but that's typically not where the mistakes come from.
And you can enable type checking in Python. I use it all the time. Most of the time I appreciate it, because it does catch bugs. But I don't love it, since occasionally it gets between me and the simple or elegant solution. Duck typing is really a good thing, and strict type checking prevents it.
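To make that trade-off concrete, here is a minimal toy sketch (my own example, not from the thread) showing both sides: a type hint that lets a static checker catch a real bug, and a duck-typed function that a strict annotation would only get in the way of.

```python
import io

def total_length(items: list[str]) -> int:
    # A static checker such as mypy would flag total_length([1, 2, 3]):
    # ints have no len(). That's the bug-catching side of type hints.
    return sum(len(s) for s in items)

def first_line(stream) -> str:
    # Duck typing: this works on anything with a .read() method -- an open
    # file, an io.StringIO, a socket wrapper. Pinning `stream` to one
    # concrete type would needlessly reject the others.
    return stream.read().split("\n", 1)[0]

print(total_length(["ab", "c"]))                # -> 3
print(first_line(io.StringIO("hello\nworld")))  # -> hello
```

The checker only helps where the annotation exists; leaving `stream` unannotated is exactly the "escape hatch" that keeps duck typing alive alongside gradual typing.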
I didn't want to use the word fetish, but I think I did!
I think that's exactly the worry I have here. Wanting to program in straight C "just because" is idiotic if the goal is to be a good programmer, or an effective programmer, or create good programs. It is a fine hobby, as is building gothic cathedral models out of nothing but match sticks. It can even be a teaching tool for understanding how older programming languages work. But in production, it can only be a fetish.
Something like this: DB (sqlite!) => Statistical Analysis => Graph (on windows).
Depending on your work environment, you don't need to learn any programming for this, or perhaps just a tiny amount of superficial Python or R. There are integrated data analysis environments that have database interfaces, built-in IDEs, and built-in graphing tools. I used them all the time at work. Unfortunately, I can't recommend a specific one (since the one I use at work is proprietary), but I've heard that Jupyter is very good. Using tools like this, people who know very little programming and are really not software engineers can produce analyses and graphs within a few hours.
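The "tiny amount of superficial Python" really is tiny. Here is a minimal sketch of the DB => statistical analysis step using only the standard library (the table and column names are made up for illustration); the graphing step would hand the same list to a plotting library.

```python
import sqlite3
import statistics

# Made-up example data in an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE measurements (value REAL)")
conn.executemany("INSERT INTO measurements VALUES (?)",
                 [(v,) for v in (2.0, 4.0, 4.0, 6.0)])

# DB -> analysis: pull the column out and summarize it.
values = [row[0] for row in conn.execute("SELECT value FROM measurements")]
mean = statistics.mean(values)
stdev = statistics.pstdev(values)
print(f"mean={mean:.2f} stdev={stdev:.2f}")  # -> mean=4.00 stdev=1.41

# For the "graph (on Windows)" step you would pass `values` to a plotting
# library such as matplotlib:
#   import matplotlib.pyplot as plt
#   plt.hist(values); plt.show()
```

In a Jupyter notebook the plot appears inline, which is why these environments work well for people who aren't software engineers.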
Where's the root? Why do you think C is the root? What about Intel instruction sets and registers? There are also GPUs, FPUs, and Intel instruction set extensions, e.g. SIMD, SSE(2|3|4), AVX(2|512), and so forth and so on! Where is the limit?
There is something to be said for learning how to program in (binary) assembly, where you write a small program, turn it yourself into hex codes, type in the hex numbers, and see it run. About 40 years ago, I used to do a lot of that ... until we got an assembler and editor. It's something that people should try once, to understand computers better. And to appreciate all the tooling that we have today.
In the same fashion, everyone who is a carnivore should at some point in their life go into the forest with a rifle, and go from a happy, living game animal all the way to a dinner, by themselves. To understand where those hamburgers and roasts really come from: from something like Bambi. And shooting it isn't the hard part; taking it apart and removing all the bits that are not a delicious dinner is much more difficult. But having done it once, it is not necessary to repeat the operation frequently; the lesson has been learned.