getting 50 now

PMc

Active Member

Thanks: 39
Messages: 117

#1
Happy new year everybody!

And, probably more important: this summer will be the 50th anniversary. Plans for a party, anybody?
 

yuripv

Active Member

Thanks: 67
Messages: 162

#2
Should we mourn instead, given Rob Pike's "Not only is UNIX dead, it's starting to smell really bad."? :)
 

Trihexagonal

Daemon

Thanks: 725
Messages: 1,295

#4
Should we mourn instead, given Rob Pike's "Not only is UNIX dead, it's starting to smell really bad."? :)
People just don't give age the respect it deserves these days.

Respectfully, and for the last time - since I've lost count of how many times I've suggested it - I propose we celebrate the 50th Anniversary of UNIX by fixing the freebsd.org site, since we didn't fix it for the 25th Anniversary of FreeBSD.


I would be in a panic and could not go to bed at night if I discovered my site code didn't validate, or that I had made some kind of error that messed it up. Last summer, when I renewed hosting, I accidentally deleted my domain and everything in it instead of an old domain I was letting go. I was exhausted at the time, but could not rest till it was right.
 

ralphbsz

Daemon

Thanks: 839
Messages: 1,367

#7
Should we mourn instead, given Rob Pike's "Not only is UNIX dead, it's starting to smell really bad."? :)
Rob Pike is a very smart guy. I've never met him personally, so I don't know whether he is a nice guy or not. He's said many interesting things. For example, in the late 90s or early 2000s he presented a paper at a respected computer science conference claiming that "systems software research is irrelevant". To get the context for his frustration, you have to understand that since the mid-80s, he led the Plan 9 research operating system team at Bell Labs (those are the people who had given the world Unix already).
All these incendiary things he says or writes have a large grain of truth in them, and they cause people to think and argue. You can't take them quite literally, though. Yes, he's right, Unix smells bad. There has been no major intellectual breakthrough in Unix in the last quarter century. There have been lots of tweaks and improvements, but all are conceptually minor. We can't ask Dennis what he would do today, and Ken is working on something prosaic or is retired (I don't know which). Rob Pike was also right: since 2000, systems software research has become mostly minor or irrelevant: minor tweaks (look at the programs of OSDI, SOSP, FAST and ASPLOS), nothing groundbreaking. Of the operating systems in significant use today, the youngest is ... Microsoft Windows (the current version is a descendant of Windows NT, which was started in the early 90s). The other major operating systems still in use (Unix, VMS, MVS and VM) are all at least a decade older.

Intellectually, Rob Pike is right: The world has become boring. In practice, he's wrong: There are computers everywhere; you can buy a full-function self-hosting computer (with development environment) for $5 (the Raspberry Pi 0); most of the population of the planet has access to computing resources and to an enormous amount of information, and so on. And Rob Pike is still gainfully employed, working on computers.
 

pyret

Member

Thanks: 27
Messages: 50

#8
The quote is technically correct. Unix is like the gasoline-powered vehicle, which hasn't changed since the late 1800s. Sure, there are now anti-lock brakes, power steering, air conditioning, etc., but it is still powered by an internal combustion engine and has four wheels. There is nothing on the horizon for new operating systems. It is just adding more features on the same old frame.

In a way we are going back to where we started: virtual machines and cloud computing are essentially what IBM had with VM. golang is based on work from the Alef and Limbo programming languages and Hoare's CSP, and some say it has ignored 40 years of language research. Functional languages are now having a resurgence, with Clojure and Elixir being newer entries, while Lisp itself was specified in 1958.
 

yuripv

Active Member

Thanks: 67
Messages: 162

#9
ralphbsz yep, I inserted that quote just for the fun of it, but look, it turned what seemed to be becoming a "let's drink to it" thread into something more interesting; at least your post is pretty informative :)
 

ralphbsz

Daemon

Thanks: 839
Messages: 1,367

#10
golang is based on work from the Alef and Limbo programming languages and Hoare's CSP, and some say it has ignored 40 years of language research.
Actually, in programming languages we have made progress (unlike in operating systems). OO programming and OO A&D were real game changers. Sure, people had been informally doing objects decades earlier (every operating system is full of "control structures"), but thinking about everything in terms of objects, state, inheritance, and modularity changed the way we think. Then Java came along and demonstrated that you can do interesting programming (like in C, more complex than what Fortran and COBOL allowed) without having to do all the memory management yourself. I had a lot of fun doing OO Java programming in the mid 90s, and it was really a breath of fresh air after a decade of C and non-OO C++. So in programming languages there was real progress, and the 90s through early 2000s brought good ideas from the research of the 70s and 80s to fruition.

Since then, it hasn't actually changed that much (intellectually), other than C++ getting another hundred features every two years, to the point that it is now collapsing under its own weight (so much so that even Stroustrup is putting the brakes on adding new features). I've looked at the Go manuals and done 50 lines of it for fun, and it looks much more pleasant than C++ while being pretty simple and minimalist. I should really do my next major project in it, just to become more modern and hip. But you are right: Go is not great research, it is good engineering design in the realm of programming languages.
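The CSP heritage mentioned above (Hoare, Alef, Limbo) shows up even in trivial Go. As a rough sketch - the function name and the squaring example are invented for illustration, not taken from any real codebase - goroutines communicating over channels look like this:

```go
package main

import "fmt"

// sumSquares runs the inputs through a small channel pipeline: one
// goroutine feeds the numbers in, another squares them, and the caller
// sums the results. Coordination happens purely by message passing,
// in the CSP style Go inherited from Hoare, Alef, and Limbo.
func sumSquares(nums []int) int {
	in := make(chan int)
	out := make(chan int)

	go func() { // feeder goroutine
		for _, n := range nums {
			in <- n
		}
		close(in)
	}()

	go func() { // squaring goroutine
		for n := range in {
			out <- n * n
		}
		close(out)
	}()

	sum := 0
	for v := range out {
		sum += v
	}
	return sum
}

func main() {
	fmt.Println(sumSquares([]int{1, 2, 3})) // prints 14 (1 + 4 + 9)
}
```

No locks, no condition variables: closing a channel ends the downstream `range` loop, so shutdown falls out of the communication structure itself. That minimalism is a fair part of what makes the language feel pleasant next to C++.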

EDIT: Just looked it up: according to Wikipedia, Rob Pike works on the implementation of Go (the programming language) now. And according to scuttlebutt on the web, Ken Thompson (one of the two fathers of Unix) worked for Rob Pike on the Go implementation team a year or two ago.
 

drhowarddrfine

Son of Beastie

Thanks: 1,038
Messages: 2,937

#11
It always bothers me when someone says something needs replacing just because it's old. If something better exists, then use it; but until then, Unix is the best of the lot, and no one has come up with anything better to replace it.
 

Trihexagonal

Daemon

Thanks: 725
Messages: 1,295

#12
It always bothers me when someone says something needs replacing just because it's old. If something better exists, then use it; but until then, Unix is the best of the lot, and no one has come up with anything better to replace it.
I don't care how old it is, and that goes for most things. If it still works the way I want and expect it to, it's probably still good enough for me. To me, it now being 50 means there have been eyes on it for 50 years. Not to mention all the history that goes with it.

Whenever I tell somebody what UNIX is I always make a point of the Bell Labs aspect. That's part of the attraction of it to me.

My Smell-O-Vision must be on the blink.
 

rufwoof

Active Member

Thanks: 64
Messages: 200

#13
A revolting stench for some is a sweet perfume to others. I like the Unix-like philosophy and dislike how Linux is becoming more like Windows over time.

Professor Hasenöhrl (University of Vienna) wrote a paper in 1904 containing the equation E = (3/8)mc² ... old foundations are the base upon which greater things can be built.
 
OP

PMc

Active Member

Thanks: 39
Messages: 117

#14
Rob Pike is a very smart guy. I've never met him personally, so I don't know whether he is a nice guy or not. He's said many interesting things. For example, in the late 90s or early 2000s he presented a paper at a respected computer science conference claiming that "systems software research is irrelevant". To get the context for his frustration, you have to understand that since the mid-80s, he led the Plan 9 research operating system team at Bell Labs (those are the people who had given the world Unix already).
All these incendiary things he says or writes have a large grain of truth in them, and they cause people to think and argue. You can't take them quite literally, though. Yes, he's right, Unix smells bad. There has been no major intellectual breakthrough in Unix in the last quarter century. There have been lots of tweaks and improvements, but all are conceptually minor. We can't ask Dennis what he would do today, and Ken is working on something prosaic or is retired (I don't know which). Rob Pike was also right: since 2000, systems software research has become mostly minor or irrelevant: minor tweaks (look at the programs of OSDI, SOSP, FAST and ASPLOS), nothing groundbreaking. Of the operating systems in significant use today, the youngest is ... Microsoft Windows (the current version is a descendant of Windows NT, which was started in the early 90s). The other major operating systems still in use (Unix, VMS, MVS and VM) are all at least a decade older.

Intellectually, Rob Pike is right: The world has become boring. In practice, he's wrong: There are computers everywhere; you can buy a full-function self-hosting computer (with development environment) for $5 (the Raspberry Pi 0); most of the population of the planet has access to computing resources and to an enormous amount of information, and so on. And Rob Pike is still gainfully employed, working on computers.
Well, I don't know Rob Pike, but anyway, please allow me to seriously disagree with you. First of all, I am just looking at the annual earnings of a company called "Apple" during the last few years, and they look quite impressive. Then, when I look inside the products of that company, I find the Berkeley dollar prompt - and that qualifies for me - I don't see anything smelling bad (in the aforementioned sense) there.

Then, what kind of "intellectual breakthrough in Unix" would you expect? Or, to put it differently: after you have crafted a seed (and the code within the seed), it is put into the ground, and the tree starts to grow. There is no major intellectual breakthrough happening afterwards, as none is needed anymore; still, over time the outcome may become very big. The whole trick lies in the seed, or more specifically, in the self-referencing scheme of the seed. This is creation, as opposed to development.

Then, stating a lack of groundbreaking further development may be correct within a limited view that perceives only computer science and engineering. But then, let's widen the view and look at what we call technology impact assessment - let's look at the social and cultural implications. And there I see a terrifying pace of innovation (sense-making or not), which makes people seriously lose their grip on reality - something that is absolutely contrary to the other 100,000 years of human development, and with still increasing pace. From here, I don't see a need for even more "breakthroughs" - instead I see a great difficulty in riding this future shockwave and steering the outcomes in some sane direction. We have no idea at all what an increasingly virtual world may do to our minds and our cultures in the long term, as these were trained over more than a million years for a tribal environment and still have difficulty adapting to an industrialized culture.

We have sown the wind; we will reap the whirlwind.

Now some other ideas running through my mind: if there is a real urge or pain, I suppose things will get developed. There was always pain in maintaining filesystems, and then there was ZFS (which I consider a quite perfect solution). So maybe there is not much pain otherwise, and the things that want to be done can be done. Which, in other words, means: the Unix thing is still up-to-date.

And yes, in some way the world has become boring. But isn't this indeed because things just work now, and we no longer need to study and investigate and team up and fight to get them?
 

pyret

Member

Thanks: 27
Messages: 50

#15
It always bothers me when someone says something needs replacing just because it's old. If something better exists, then use it; but until then, Unix is the best of the lot, and no one has come up with anything better to replace it.
The creators of Unix did come up with something better than Unix. It was called Plan 9. But it never got any traction, because people didn't see enough in it to make the change worthwhile - and maybe licensing, and probably other reasons too.
 
OP

PMc

Active Member

Thanks: 39
Messages: 117

#17
Sure, if a good team sits down and does a rewrite, bringing in all the experience and addressing the problems, that brings a substantial improvement. But then, why should anybody switch and take on the teething troubles, as long as there is no serious pain with the current state of affairs? If it ain't broke, don't fix it.
And you need a quite solid base of competent beta testers for such an effort. I remember the issue with the scheduler (I think I mentioned it here), which created a bit of disgust for me - and enough curiosity to look into the code and try to identify the problem. I got into correspondence with the original author, but that died off somehow - probably busy with other things (same as me). And so I have made myself a patch for my machines that at least solves my disgust, but nothing else has happened - and I have not found the time (and don't have the equipment) to do solid performance validation.
So if you don't have some enthusiastic influencers, not much will happen, even with smaller things.
 

ralphbsz

Daemon

Thanks: 839
Messages: 1,367

#18
PMc: I'm not disagreeing with you. From an engineering and financial viewpoint, Unix (in all its variations) is a great success. As a matter of fact, I personally often go months using no operating system other than various Unixes (which includes Linux, MacOS, and Android).

Rob Pike's viewpoint comes from a different direction. He was a member of one of the two premier systems software research organizations of the 70s through 90s, namely Bell Labs and Berkeley. They created great stuff - today we call it Unix, and in particular (relevant to this forum) the BSD flavor of Unix. But after Unix, there was very little new research. Berkeley had Sprite, Bell Labs had Plan 9, Tanenbaum had Amoeba, and little came from them in terms of new operating systems. Sure, they created spinout technologies ... for example, Sprite begat both the Tcl programming language and the idea of log-structured file systems. But since then, nobody has proposed and seriously followed through on a new paradigm of how to run a computer. For a researcher (not an implementor, engineer, marketer, or financial person) like Rob Pike, this is an important observation.

The question is not just about a rewrite. Linux did that: It wrote a completely new kernel, without using a single line of either Bell Labs or Berkeley code. Similarly, much of the user land has been rewritten multiple times; the Gnu compiler/linker and LLVM/Clang have nothing to do with Ken Thompson's original Unix C compiler (and are sadly lacking the ability to hack themselves into the system, at least as far as we know). But that is just an internal rewrite, leaving the interfaces at the outside surface the same, and copying many of the ideas about how to implement it. For example, Linux' VFS layer (the part of the kernel that takes file system IO requests and distributes them over multiple internal file systems) is mostly the same as in AT&T and Berkeley Unix, even though no lines of code were stolen. What has not been there is new concepts, new approaches, new capabilities.

You ask above: what intellectual breakthrough would I expect? Let me give you one example. At work, I use "Unix machines", which are typically clusters of 10^2 to 10^n individual computers (n is relatively large), with as many OSes, kernels, network stacks, local file systems, and so on. There is a huge collection of ad-hoc tools to distribute state (like files or executable programs) and resources (like processing power or RAM) among these machines. Every major user of computer clusters, clouds, or supercomputers has a different set of ad-hoc tools, and thousands of engineers around the world work hard on these tools. Yet, it is all very pedestrian, manual, inefficient, error-prone, and annoying. In the 1990s, various research groups (in particular the Berkeley group) had the vision of making a large number of potentially disparate machines into a single entity that a user could use flexibly. Cluster computing at large scales remains a big construction yard. Various companies (such as Sun and Apollo) tried to make a go of it and turn a profit. Sun ended up mostly succeeding as a business, but it ended up selling individual workstations and servers, and never delivered on the promise of a real cluster computer; Apollo eventually failed (it got absorbed by HP and the technology eaten). I think the closest we ever got to a homogeneous compute environment was actually not in the Unix ecosystem, but was Digital's VAXcluster (which died together with Digital, the company, and with the VAX). Yes, we have little bits and pieces of the technology, which make clusters tolerable, and which typically don't scale to large installations: NFS as a network file system, LSF for batch scheduling, and so on. But even for something as simple as "parallel distributed compilation/linking", there is no universal or generally accepted solution. Bazel might be the closest approximation, but it has a tiny market share.
For the harder general problem of a programming language that can be used for multi-threaded problems and works from small CPUs to million-node clusters, there is nothing in sight. Much research is needed here, and that research is just not getting done. And not research into little spot solutions to spot problems (as Bazel might be a solution to the parallel/distributed make problem), but an organic approach to making a set of heterogeneous machines feel and work like a single computer. I think that's the kind of problem Rob Pike has been bemoaning.
 

pyret

Member

Thanks: 27
Messages: 50

#20
An enterprise-grade cluster solution, and even an experimental operating system, are not going to be produced by open source without financial backing from corporations and/or universities. The cost is too high.
 
OP

PMc

Active Member

Thanks: 39
Messages: 117

#21
But since then, nobody has proposed and seriously followed through on a new paradigm of how to run a computer. For a researcher (not an implementor, engineer, marketer, or financial person) like Rob Pike, this is an important observation.
I agree. And then I find that this is not uncommon in technology. Let's consider civil air transportation: the last important breakthrough was the jet airliner, and that was 1955. Not much has changed since then (except they can no longer do a barrel roll, due to too many computers).

What has not been there is new concepts, new approaches, new capabilities.
My impression is still that those are not really needed - not in such a way that there would be pain without them, and/or that they would be eagerly grabbed and gain momentum/market share when brought into existence.

But then, my viewpoint is limited, to the things that come to my attention. For instance, I have no contacts whatsoever with people who would use such masses of computers (as my interest in Unix was always just private), or a practical idea what they might want to do with them.

At work, I use "Unix machines", which are typically clusters of 10^2 to 10^n individual computers (n is relatively large), with as many OSes, kernels, network stacks, local file systems, and so on. There is a huge collection of ad-hoc tools to distribute state (like files or executable programs) and resources (like processing power or RAM) among these machines. Every major user of computer clusters, clouds, or supercomputers has a different set of ad-hoc tools, and thousands of engineers around the world work hard on these tools. Yet, it is all very pedestrian, manual, inefficient, error-prone, and annoying. In the 1990s, various research groups (in particular the Berkeley group) had the vision of making a large number of potentially disparate machines into a single entity that a user could use flexibly. Cluster computing at large scales remains a big construction yard. Various companies (such as Sun and Apollo) tried to make a go of it and turn a profit. Sun ended up mostly succeeding as a business, but it ended up selling individual workstations and servers, and never delivered on the promise of a real cluster computer; Apollo eventually failed (it got absorbed by HP and the technology eaten). I think the closest we ever got to a homogeneous compute environment was actually not in the Unix ecosystem, but was Digital's VAXcluster (which died together with Digital, the company, and with the VAX).
Well, if I get your point correctly, such a thing has been tried, as I once came across it and played with it. It was called DCE (the Distributed Computing Environment). But that died away. It seemed to me the market was not interested, and/or the commercial companies preferred being at war with each other.

This is also a point: most of the things that got established did not come from commercial endeavours. The original Unix was an experiment, Linux as well, the Internet was a scientific/political approach, etc. - and most of them took a really long runway to get started. Only later do the commercial companies come along like locusts and look at what's in it for them, not doing much good.
 

Vull

Active Member

Thanks: 46
Messages: 120

#22
Should we mourn instead, given Rob Pike's "Not only is UNIX dead, it's starting to smell really bad."? :)
Unix isn't dead and this statement is a non sequitur. Change != progress. Unix is a tool for deploying software, and software doesn't usually benefit when everything needs to be rewritten every few years. Having to rewrite everything because of operating system changes doesn't necessarily add any real value to the software. That's not Unix-world, that's Windows-world. Real value is often taken away when old software needs to be rewritten; often nothing is gained, and programming time is spent or wasted on software translations which reap no real benefits. It's like re-inventing the wheel, although some people may see real profits as a result of this type of re-invention. One could argue that Microsoft and other big software players became dominant via marketing strategies that required reinventing the wheel every few years.

A screwdriver is a tool that hasn't changed much over the past 50 years. Does that mean that screwdrivers are dead and starting to stink? They still do the job they were designed to do. So does Unix, and so does the wheel.
 