FreeBSD as a development platform

Yes. You're right. My mistake. (I'm a big fan, advocate, and defender of TeX/LaTeX, and an opponent of word processors. :cool: )

But since I was already using LaTeX on my Amiga 2000 in the early 1990s, today I don't care much whether compiling a TeX file takes 1 second or 1.5 seconds 😁 or whether a toolchain takes up a couple of megabytes more....

But then, what isn't overkill - word processors especially?
Writing config files in XML is ultra-overkill (not to say nonsense).
But that would change the topic.
 
if a toolchain takes up a couple of megabytes more....
I got sick of gigabytes of yearly updates and version collisions because debuntu shipped EOL TeX Lives for months and nothing worked. Several years in a row.

Since then (was it 2014?) I've used plain text + vim + :hardcopy to print my formal letters. Like https://mro.name/2022/brief-dina4.txt. Everything else I write either by hand on paper or in HTML. Never looked back.
 
I find version collisions under Linux so annoying that, to me, Linux is for people whose interest in computers is not using them but fixing them. 😜

I also used TeX Live systems. Once they are discontinued, or you change systems (in my case from MS Windows to FreeBSD), you may face problems.
The main problem for me was different text encodings, which can be fixed quickly and easily under FreeBSD with the appropriate tools.
But since I use LaTeX purely with vim and within the shell, I haven't faced any version collisions since FreeBSD 11-something.

But of course it also depends on how/where you publish - and handwriting is something very good, exemplary (except that my handwriting is unacceptable chicken scratch 😁 ).

"plain text + vim + :hardcopy to print formal letters."
is something I find completely sufficient and exemplary.
After all, the result is nothing but typewriter output - absolutely okay in my eyes.
 
This has gone through the roof. I see that TeX does the same job as vim; it's the different products each can produce that make the gap between them.
 
I see that TeX does the same job as vim; it's the different products
Sorry, but I cannot follow you.
vim is a text editor, which produces text such as source code.
TeX/LaTeX is a markup language, which compiles a source.tex file into output.dvi (or *.ps, or *.pdf).
If you put nothing but \begin{document} some text \end{document} into source.tex, TeX will produce something similar to :hardcopy under vim - then I could understand your point.
But normally you start with a header containing something like \documentclass[ ..., which produces something vim (or any text editor) cannot.
So, what did I miss?
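To make the contrast concrete, here is a minimal sketch (file name, class, and options are just examples, not something from this thread): once you add a real header, TeX produces typeset output that no text editor's print command can.

```latex
% source.tex -- illustrative minimal example; class and options are arbitrary
\documentclass[a4paper,12pt]{article}
\begin{document}
\section{A heading TeX typesets}
Some text, set in Computer Modern with proper justification and
hyphenation -- something a plain \texttt{:hardcopy} cannot do.
\end{document}
```

Running it through, e.g., pdflatex source.tex yields fully typeset PDF output, whereas :hardcopy only prints the raw characters.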
 
Well, as you said, TeX normally starts with a header that vim treats as mere text. That extra TeX step is what you're missing. They do the same job unless you add that step.
 
vim is a text editor, which produces text such as source code.
vim has a command :hardcopy > foo.ps that creates a PostScript file. Give it a try if you haven't. That's what I send to my printer, put into an envelope, and mail to e.g. the tax authorities.

It looks like it was written with a typewriter, which I find quite appropriate. I never got a discount in all those years I sent them in Computer Modern.
 
Age might have something to do with it as well.

I think a lot of people don’t realize that C is THE world-changing high-level language.
 
From a popularity / use point of view, C has always been weird.

Computer scientists have a love-hate relationship with it. Systems people used to love it, while language people used to hate it, but both statements are over-generalizations. Today, most CS college programs no longer use it as the primary language of instruction.

Until the early 2000s, there were more programmers working in COBOL, and far more production code written in COBOL, than in C. But those are typically not the people you'd find on what today is Reddit (and back then was Usenet or BBSes). As a matter of fact, until the early 90s, most of the COBOL programmers in the world probably had no e-mail addresses that were reachable outside their company.

In the late 90s or early 2000s, I saw a study by Stanford software engineering researchers about which companies were the largest software engineering companies (counting by number of software engineers employed). Numbers 1...3 were General Electric, General Motors, and Boeing (with the first two tied for first place). The rest of the top 10 were industrial and financial companies, with the largest banks also represented. The only "computer" company in the top 10 was IBM, driven there by its large army of contract programmers who work for IBM's consulting customers. The largest "hip and trendy" computer company (one that sold software) was Microsoft, somewhere around place 30 or 50. Given that the large business/industry/finance companies mostly DO NOT write software in C, it is clear that C (and then quickly C++) was a minor player, except among OS implementers.

If I look at today's studies, I find that JavaScript, Python, and several others all beat C both in popularity (how much they are talked about) and in the number of engineers specializing in them.

C's accomplishment is that it unified the OS implementation language for the period from roughly the 90s until right now (when other languages, in particular Rust, are beginning to take over in OSes), in the same fashion that COBOL and Fortran unified the business and scientific programming worlds for the most part. And when I say "OS", that includes the big infrastructure applications, such as web servers, FOSS databases, and compilers. From there, C took a significant (but never quite dominant) fraction of industrial and web programming and of shrink-wrap consumer software; but it was never a major player in end-user software, and that's where much of programming in the real world happens.
 
But how can COBOL, which I never used, or any other programming language unify anything? In any case, Fortran made geeks use mainframes.
 
But how can COBOL, which I never used, or any other programming language unify anything? In any case, Fortran made geeks use mainframes.
I think that what ralphbsz meant was that a 'unified' programming language becomes standardized, meaning there's one 'official' version defined by a standards body like ANSI, ISO, or IEEE.

In the case of C/C++, it's governed by ANSI (IIRC), and any given compiler (gcc, cc, llvm, Borland, Visual C++, and others) kind of has to be able to compile the same source code, and the resulting binary kind of has to give the same output when executed.
 
Actually, what I meant is the following:

Before Unix and C, operating systems and their base systems were implemented in a mix of languages: a lot of assembly (usually using complex macro assemblers), ALGOL, PL/I and PL/S, BLISS, and a plethora of weird things. For example, PrimOS (for the Prime minicomputers) was implemented in Fortran (a bizarre choice, given the absence of pointers), and on the VAX, at least one utility was written in each of the languages Digital sold (rumor has it that the monitor utility, the equivalent of top and ps, was written in RPG-II).

Within about 10-20 years, C changed that. Suddenly the OS kernel and the basic utilities were all written in C. So C unified a whole set of related but different OS implementation languages. And right now we're seeing fragmentation again, as everyone is getting very tired of C: production OSes are getting new languages (see the discussion about Rust), and research OSes are being done in a variety of languages.
 
In the case of C/C++, it's governed by ANSI (IIRC), and any given compiler (gcc, cc, llvm, Borland, Visual C++, and others) kind of has to be able to compile the same source code, and the resulting binary kind of has to give the same output when executed.

No, it’s ISO, the International Organization for Standardization. ISO is made up of national bodies (ANSI for the USA, DIN for Germany, BSI for the UK, AFNOR for France, and many more). Unfortunately that means it isn’t entirely open - to be part of a national body you usually need to work for an affiliated company. There is an exception offered by the C++ Foundation, but that means you can participate but not vote. I don’t know as much about ISO C.
 
astyle IIRC, ANSI was the first to standardize C at all, but this was quickly "re-issued" by ISO, and since then ISO has governed the further development of the standard, as Paul Floyd explained.

ralphbsz Still, people hate the complexity that comes with that fragmentation. There have been countless efforts to allow using a single language for the whole stack when building an (application) system (IMHO weird things like Xamarin or Electron come to mind). I'd still say C's (unique?) achievement is offering abstractions that still map directly to (typical) hardware. It would be possible to use it for practically "anything" (given that flexible enough libraries were available). I don't say that's necessarily feasible (having nothing but void * for anything "generic" is, e.g., a strong downside), but possible...
 
Talking of PL/I: it was in 1976 or '77 that I took a class in it. It made later programming languages look like dinosaurs to me. I see that C changed the way of the mainframe.
 
No, it’s ISO, the International Organization for Standardization. ISO is made up of national bodies (ANSI for the USA, DIN for Germany, BSI for the UK, AFNOR for France, and many more). Unfortunately that means it isn’t entirely open - to be part of a national body you usually need to work for an affiliated company. There is an exception offered by the C++ Foundation, but that means you can participate but not vote. I don’t know as much about ISO C.
Who does? Once a person learns the basics, ISO adds something new to the language. Yes, ISO makes the rules, but some of them were accepted by C programmers before being standardized.
 