How do FreeBSD users LaTeX?

... end of the 1970s where there were computer users, not users of application boxes.
That's actually incorrect. In the 70s, there were lots of computers. Thousands and thousands of them. Nearly all of them were application boxes. Most people who used computers didn't know how to program, nor were they expected to. They knew how to type commercial transactions into punched cards, how to load punched cards into the reader, how to run certain programs (like "accounts receivable" or "billing"), how to mount and unmount tapes to update databases, how to use the printed output of the programs. Probably 99% or 99.9% of the people who interacted with computers on a day-to-day basis had no idea how to modify their functionality, but they were very good at getting work done. For example, updating the database of how many widgets were in the warehouse, and printing invoices to the customers who had bought widgets.

I was one of those computer users, but I was also one of the programmers. I took piles of punched cards in the evening, put them into the card reader, loaded the correct forms for invoices in the line printers, pulled the correct database tapes from the shelf, and ran "billing" and "logistics". Sometimes I would screw up, which is why we had backup tapes, and a shredder. But I was also the guy who would sit in front of a terminal during the day, and modify the COBOL source code when the algorithm for calculating VAT (=sales tax) changed, or improve how we re-ordered widgets when we got close to running out of stock. It was only because I was working in a small company that I was able to see both sides ... being a computer user and a computer programmer. Technically, my title was "director of data processing" (Vorstandsmitglied fuer Datenverarbeitung), but that still meant that I had to mount tapes on the evening shift, and supervise the folks who operated the card punches (which were being replaced by terminals for data entry, with the same folks using both).

The myth that in the 60s and 70s people actually understood their computers is just a myth. 99% of people involved with computers used them as black boxes. Of the remaining 1% of programmers, most didn't really know how the magic smoke in the chip works, but they knew how to write in COBOL, FORTRAN, or RPG-2, and get it to work. I was lucky, in that I took a series of "operating systems" classes in college (that was my minor), and built my own computers using microprocessors and wire wrap, and actually knew how to debug a program using a logic analyzer or oscilloscope. From there I drifted into making computers my profession.

There has always been a very small subset of people who try to understand computers at a deeper level, and who use them to build programs as a thing of beauty. Don Knuth and Leslie Lamport are two of the super-heroes of that very small group. It is interesting that they both were drawn to typesetting, which is what they are known best for, even though in reality, their accomplishments are much greater in other areas.

Frequently I ask myself why I am programming in this capricious, primitive, and complicated language when I only want to write a text.
The language is complicated, because it is simple. It is not the slightest bit capricious. However, it is primitive.

That's all completely on purpose. Don Knuth started to think deeply about what it means to typeset a piece of text. He started thinking about possible solutions. And he ended up deciding to solve the problem by creating a text macro expansion language (which is really all TeX is, with the minor side effect of leaving black ink on paper). And he made that macro expansion language very simple and logical. The description of how it does macro expansion fits on a few sheets of paper. One of the things he did, being the consummate theoretical computer scientist, was to include a complete (in fact, Turing-complete!) processor in that macro expansion language. And that macro expansion language, and its theory of operation, are documented in the source code of TeX itself. You can take the source code, run it through a program that's part of TeX, and it prints its own documentation (that's how the "TeX: The Program" series of books is generated; they are nothing but the source code of TeX in book form). So not only did he revolutionize how to typeset text, but he also revolutionized how to document programs, and how to write macro expansion engines. The reason the tool is usable at all is that Don is incredibly smart and a deep thinker (insert image of Don next to Rodin's "The Thinker" here), and he figures out the fundamental truths about the problems he works on (like when to tokenize during macro expansion, something that eventually trips up every TeX user).
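To make "macro expansion language" concrete, here is a tiny plain TeX sketch of my own (the macro and register names are made up, not anything from Knuth); everything, including the "programming", is just defining macros and letting TeX expand them:

\def\greet#1{Hello, #1!}      % \def declares a macro; #1 is its single argument
\greet{world}                 % expands to, and typesets, "Hello, world!"
\newcount\widgets \widgets=3  % even arithmetic runs through count registers ...
\ifnum\widgets>1 several widgets\else at most one widget\fi  % ... and expandable conditionals
\bye                          % end of the plain TeX job

Feed that to tex and you get a page saying "Hello, world! several widgets". Everything TeX ever does, LaTeX included, reduces to this kind of expansion.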

However, as is common when computer scientists try to solve real-world problems, the solution is borderline unusable. So much so that Don's friend Leslie ended up writing "a small macro package" to make TeX actually useful for real-world documents. Thence LaTeX.
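For contrast, here is roughly the kind of thing Leslie's "small macro package" lets you write (a minimal example of my own invention, not from the thread): you ask for a section and some emphasis, and the LaTeX macros underneath worry about fonts, spacing, and numbering.

\documentclass{article}
\begin{document}
\section{Widgets in stock}   % LaTeX picks the font, size, and numbering for the heading
We re-order \emph{before} the warehouse runs dry.
\end{document}

Under the hood it is still nothing but TeX macro expansion; \section and \emph are just rather elaborate macros.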

There is a joke that really explains how this works. The owner of a small business needs some software to do billing and accounts receivable. He goes and talks to a computer scientist, who tells him that it will take three years to produce that software. While that seems like a long time, and expensive, there is no alternative, so the business guy hires the computer scientist. After a year, he comes back to ask for a progress report, which is: I have nearly finished writing a source code editor; during the second year I will write a compiler for my new programming language, and in the third year the IO library and database. Having finished those preliminaries, writing "billing and accounts receivable" will take a week.

You think that is a joke? Yes, it is, but it is also the truth. For a real-world example, look at RMS. He was a second-rate computer scientist who wanted to write an operating system, one that fit his particular political opinions (which include free software, free beer, and free love, not necessarily in that order). He was looking at the great computer scientists of the past, who had created towering edifices of great engineering (namely operating systems). Obviously, he was wrong: neither Dijkstra nor Knuth wrote any famous OSes, and the guy who actually did (Fred Brooks) got neither girls nor booze out of the deal, and the resulting OS was a horrible mess and a giant cost overrun (and yes, I've had to use descendants of OS/360). But to an idealist like RMS, these details of reality don't matter. So what did he do to write that towering OS that would guarantee him a place in the pantheon? He started by writing an editor (emacs) and a compiler (gcc). What did he end up getting famously fired for? None of these projects, but for proselytizing having sex with underage girls. Ultimately, his dream OS ended up being delivered by slapping the "GNU" label on Linux.
 
Sometimes I would screw up, which is why we had backup tapes, and a shredder.
To cover up those screw ups, I presume.
built my own computers using microprocessors and wire wrap, and actually knew how to debug a program using a logic analyzer or oscilloscope.
Those were the good ol' days.

I found a cash register in my basement today, actually a touch-screen computer, from my restaurant that we kept as a backup. I tried to get FreeBSD installed on it, but it has a custom BIOS and wouldn't boot from a USB drive. So I took it apart and scavenged it for parts: a small hard drive, a compact power supply, an LED display. I felt good and right at home.
 
But what is the lesson of the story?

C programs, or in this case Pascal/WEB programs, are surely nice things, but a script in a language like Perl or Tcl may be easier to write, more compact, more readable, more flexible for transformations, and perhaps more efficient. It is like using libraries.

Not everyone wrote lengthy and complex programs in the '80s.
 