War of the workstations: How the lowest bidders shaped today's tech landscape – Liam Proven, The Register


… The first of the big lies is the biggest, but it's also one of the simplest, one that you've probably never questioned.

It's this: Computers today are better than they have ever been before. Not just that they have thousands of times more storage and more speed, but that everything, the whole stack – hardware, operating systems, networking, programming languages and libraries and apps – is better than ever.

The myth is …


Author's responses include:

… I'm really interested in stuff that not only doesn't run on any existing OS but completely replaces all existing OSes and their design concepts.

– and:

It's not about languages. Or rather, it is, in part, but it's not about languages on top of existing OSes.

The real point here is working out what are the 1970s assumptions buried in existing languages and OSes, removing them, and building clean new systems with those assumptions removed.

So for a start, that means throwing out absolutely everything built with C, burning it in a fire, and scattering its ashes on the wind, because C is a toxic waste dump of 50 years of bad ideas.

So, tell me, of your hundreds of experimental languages, how many don't use a single line of C anywhere in them, in their libraries, or in the OSes that host them?

Are any left?

The kind of assumptions I mean, for clarity, are not obvious local things that don't generalise, such as "there are 8 bits in a byte" or "this computer uses binary" or "this computer's users read and write in the Roman alphabet", but outdated 1970s technology like "drives" and "files" and "directories".

The deep assumptions. Only if we burn it all to the ground and rebuild on a more solid basis can we escape 1970s tech debt.


… the core of this article was adapted – with my editors' full knowledge and agreement, of course – from one of my FOSDEM talks. I plan to return with some more of the talk – and some new stuff! – in later pieces. …
 
I'd wager it is easier to get rid of C than COBOL. Any takers?
 
Speaking of workstations making computing worse, and how Apple screwed up: the Computer History Museum is currently restoring several Xerox Alto machines to full function. I don't know whether they're publicly visible yet, or being demoed or even accessible to visitors, but rumor has it that Al Kossow has two of them running. Al is the software curator at the museum (he usually deals with floppies and tapes), and it seems his new hobby is capacitors in power supplies and deflection coils on CRTs.
 
I'd wager it is easier to get rid of C than COBOL. Any takers?
Hmm, that's a tough one. Millions of lines of cobol in mainframes running transaction systems, payroll, accounting, etc.; huge investment in hardware by financial firms, banks, all the big corporations. Many with lost source code and only binaries available now, but I suppose it originated as cobol so it probably still counts as cobol.

Versus... pretty much everything on microprocessors. The Windows kernel. Linux and the BSDs, the cloud... Global comms infrastructure - internet routers, satcomms, fibre. TCP/IP stacks. Vast numbers of embedded systems. Automotive. Avionics, some anyway. IoT. Space. Satellites. Military. Medical. Air traffic control. Oh gee... I forgot mobile. Remember that many other languages are themselves written in C, e.g. Java, so you would have to retest everything. There is more: games, AI, EDA, chip manufacturing systems, robotics...

That's a tough call. Hard to say what the aggregate value of either would be, certainly billions of dollars.

Hmm. Training an AI to translate cobol into, say, java sounds harder than training an AI to translate C into, say, ironoxide. But there's a lot more of the C out there than cobol, and in a far wider variety of deployments on different platforms, all of which would have to be retested. Just think of all those tesla autopilot highway smashes if you screwed it up.

If financial systems weren't such a central part of this society - say cobol had only ever been used to write chess playing programs - then I would say C wins hands down. But because cobol is still central in so many financial systems, that does even it up somewhat. Certainly it keeps the mainframe hardware refresh cycle going; firms are more prepared to regularly cough up millions of dollars to ibm et al in order to avoid having to dump or transfer their cobol apps. However, there probably isn't much new cobol being written, and things that were written in cobol are likely to be incrementally moved away to the cloud because it's fundamentally orders of magnitude cheaper. Suppose someone writes a VM emulation of the mainframe hardware that can be deployed at scale onto commodity hardware, and runs the cobol code on that emulator.

Maybe in the final analysis it would be cheaper to translate all the cobol into something else rather than rewrite and re-test everything that has been done in C or that depends on software written in C. Or to move that code onto VMs running on commodity servers. It hasn't happened yet; people are still buying mainframes. But that doesn't mean it won't happen.

So I think I might come down on the side of C after all. Of course it's cheaper to just keep buying the hardware updates to run the cobol; that just comes out of this year's cashflow, and you know it's a diminishing cost base over time as you migrate off the mainframe.

Sounds like a google interview question....

Final thought: if god said "no more cobol!" and you were forced to write all the cobol apps in other languages, would it be possible? I think the answer is yes. You could do what you do in cobol in other languages. But if god said "no more C!" and you had to rewrite all the C code in other languages... well, maybe, but someone would just have to invent C again with a slightly different language design and a different name - call it ironoxide - and it would still essentially be the same thing.
 
I made fun of my best friend, back in the 1970s, for going to school to study COBOL.
Until he died a few years ago, he was still making big bucks at a local financial institution programming and managing COBOL.
 
As far as I know COBOL is a very simple language, but the ecosystem it runs over is thorough and very domain specific.
This article completely fails to address that issue and suggests that COBOL is used just because of language intrinsics, which is definitely not true.

It's like telling someone to take a course in Erlang because programmers for huge telco telephony systems are in demand. The Erlang language itself is maybe 1% of a typical system running on it.

In this kind of tight coupling the language will reflect the design patterns of the entire system, but it will certainly not reflect the complexity of said system.
 
I made fun of my best friend, back in the 1970s, for going to school to study COBOL.
Until he died a few years ago, he was still making big bucks at a local financial institution programming and managing COBOL.
I know a couple of people in the same position. International market for their skills. There are fewer and fewer of them. Mainframe assembler too.
 
As far as I know COBOL is a very simple language, but the ecosystem it runs over is thorough and very domain specific.
This is the key point. It's not just the cobol itself.
 
As far as I know COBOL is a very simple language ...
It is and it isn't. If you look at the early language (the 1960s or 1970s versions) from the viewpoint of a modern programmer, steeped first in structured programming (meaning subroutines with call and return, if statements, and for and while loops) and then in object-oriented programming, then COBOL is indeed ridiculously simple. On the other hand, it has easy-to-use built-in features that few other languages have. For example: define two records in the data division which share some field names, and then use "move corresponding". Right there you have the functionality of C's printf and of an object-oriented struct copy, rolled into one. Or try "perform C through F varying ... until ...". Being able to select at the call site (when you say perform) where the return is going to be (the through clause) is something other programming languages can't do without writing lots of extra code.
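
For anyone who hasn't used COBOL, here's a minimal sketch of both features. The record, field, and paragraph names are all made up, and it assumes a free compiler like GnuCOBOL rather than anything mainframe-specific:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. CORR-DEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  CUSTOMER-IN.
           05  CUST-ID    PIC 9(6)  VALUE 123456.
           05  CUST-NAME  PIC X(20) VALUE "A CUSTOMER".
           05  EXTRA-FLD  PIC X(10) VALUE "IGNORED".
       01  CUSTOMER-OUT.
           05  CUST-NAME  PIC X(20).
           05  CUST-ID    PIC 9(6).
       01  WS-IDX         PIC 9(2).
       PROCEDURE DIVISION.
       MAIN-PARA.
      *    Copies every field whose name appears in both records;
      *    field order and non-matching fields don't matter.
           MOVE CORRESPONDING CUSTOMER-IN TO CUSTOMER-OUT
           DISPLAY CUSTOMER-OUT
      *    Runs paragraphs 100-FIRST through 300-LAST three times;
      *    the THRU clause picks, at the call site, where control
      *    returns from.
           PERFORM 100-FIRST THRU 300-LAST
               VARYING WS-IDX FROM 1 BY 1 UNTIL WS-IDX > 3
           STOP RUN.
       100-FIRST.
           DISPLAY "STEP " WS-IDX.
       200-MIDDLE.
           DISPLAY "  MIDDLE".
       300-LAST.
           DISPLAY "  LAST".

Compile and run it with cobc -x corr-demo.cob && ./corr-demo and you can see both behaviours in a few lines.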

... but the ecosystem it runs over is thorough and very domain specific.
You don't actually need a mainframe (blackbird pointed at the Z architecture document) to run COBOL. Programs in the language itself can be compiled and executed on many other machines, including all Unix machines. But what language purists forget is that computer programs live in an ecosystem, both using things (like libraries) and being embedded in things (like batch processing). So a typical COBOL system will be using a database (with SQL embedded right in the COBOL source code), it will interact with a transaction monitor like CICS, the programs are all designed to be scheduled and run in batch using JCL and JES3, encryption and security are handled by RACF, and so on. And my detailed knowledge of that complex ecosystem is 35 years old; much has grown into it since.
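
To give a flavour of that embedding, here's what a DB2-style embedded SQL fragment looks like in COBOL source. The table and column names are invented for illustration, and it needs the SQL precompiler and a database behind it, so treat it as a sketch rather than a runnable program:

       WORKING-STORAGE SECTION.
      *    SQLCA supplies SQLCODE and friends after each statement.
           EXEC SQL INCLUDE SQLCA END-EXEC.
       01  WS-ACCT-ID   PIC 9(8) VALUE 12345678.
       01  WS-BALANCE   PIC S9(13)V99 COMP-3.
       PROCEDURE DIVISION.
      *    Host variables (the :WS- names) carry data between
      *    COBOL working storage and the database.
           EXEC SQL
               SELECT BALANCE
               INTO   :WS-BALANCE
               FROM   ACCOUNTS
               WHERE  ACCT_ID = :WS-ACCT-ID
           END-EXEC
           IF SQLCODE NOT = 0
               DISPLAY "SQL ERROR, SQLCODE = " SQLCODE
           END-IF.

The precompiler turns the EXEC SQL blocks into ordinary calls before the COBOL compiler ever sees them, which is part of why the ecosystem, not the language, is the hard thing to leave.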

Think of it this way: the ls program is an important part of Unix, and it is written in C. If someone were to get rid of C, we would have to re-implement ls in a different programming language. But we would also have to redo everything that ls uses (such as libraries, and all file-related system calls), everything that uses ls (like most shell scripts), and everything that uses the libraries we had to modify to get the new ls working. The ls program does not exist in a vacuum.

A good description of the interconnectedness of these things can be found in Brooks's "Mythical Man-Month", where he states roughly the following: if it takes 1 unit of time to write a certain program, it takes 3 units of time to write the same program as part of a system of multiple programs, or 3 units of time if it is a systems program (meaning it runs in the kernel), and the two factors multiply, so 9 units of time for an operating system, which is a system of systems programs.
 
These days, banks are trying to switch to Java, a language that was the hot stuff when I was in college. Java has a reasonably easy-to-understand syntax, but correcting a botched implementation of something is often a pain, because the correction tends to break everything, so it's a fix/break recursion with a lot of cursing. And that's on top of the JVM having all kinds of performance issues in comparison with executables produced by other languages... 😩
 
There is sad news on the page at the link above:


My first programming language was PL/1 (1982-1990). Next was Turbo-Pascal (1990-1995).
I didn't really like Pascal because of its limitations. It was more of an educational language, after all.
Although with Turbo-Professional (and subsequently Object Professional) it became more usable. My first serious program was written in Turbo-Pascal (with Turbo-Professional and TechnoJock's Object Toolkit).
 
I had a mate who was a mainframe programmer, I think he was a db2 admin too. One day long ago I proudly showed off my shiny new ibm pc to him, running turbo pascal. His reply was "THAT's not a computer, come with me!" He took me down to the machine room, which was in a huge multi-story hangar-like building. Lots of water cooling pipework all over the place. Huge disk drive packs, etc. Fan noise so loud I could hardly hear him speak. He said "THAT's a computer!" :cool:

It was like this...

https://www.youtube.com/watch?v=iQrLPtr_ikE


Of course the modern mainframe hardware isn't anything like the size of the old stuff; it's more like what you see in a data centre. It's basically power servers now. Like this. Just standard racks with a fancy door on the front.

https://www.youtube.com/watch?v=RnpvyJaX4Q4

I remember back around maybe the early 90s ibm had a box, about the size of a ps/2 model 80 or maybe an as400 box, that was a small 360 processor in hardware. You attached it to the ps/2 and it provided a way to run mainframe software. I don't know if they have anything like that now, probably not. The mainframe guys used to call the os/2 pc a "programmable terminal". Hahaha..
 
My first programming language was PL/1 (1982-1990). Next was Turbo-Pascal (1990-1995).
I didn't really like Pascal because of its limitations. It was more of an educational language, after all.
Turbo pascal was really groundbreaking when it first came out, though. Hit a button to compile and run, another button to get a debugger. I was learning pascal on a mainframe (actually a CDC, same family as cray), which was a hideous old development environment to work in (textjab, anyone?). My uni had just bought a new lab of ericsson pc's, ibm pc clones. I went down to the computer centre and got the ticket that allowed me to get on the PC's. The first time I tried turbo pascal was just about the last time I did any coursework on that mainframe. I was blown away by how good it was. I could see the PC was the future...
 
There is sad news on the page at the link above:
Very sad. I've still got his "pascal user manual and report" on my shelf...
 
These days, banks are trying to switch to Java, a language that was the hot stuff when I was in college. …
Java? Bah. Load of rubbish. Just keep debugging those garbage collector bugs! :) I think they'd be better off sticking with the cobol. I guess ibm has had a big push on java over the last 20-odd years though.
 
Purely by chance I spotted this today, from asianometry. He makes a lot of good videos; the one about tsmc and apple was particularly good. Anyway, this is about unix, so I thought it appropriate to this thread. I suppose you can say that proprietary commercial unix has largely gone, but unix has mutated into something with a larger installed base than windows now, probably the world's largest installed base of any o/s. Interesting video.

https://www.youtube.com/watch?v=HADp3emVABg
 