FreeBSD is well situated

Every generation thinks it will be the last one because we humans have a difficult time coping with our own mortality.
Interesting thought. This may explain the hysteria instead of dealing reasonably with the problems.
However, I didn't say it was the end of humankind. I am convinced many will survive, and build something new, whatever it may look like. I said it's the end of modern civilization as we know it. That's a difference.
Currently the facts show that we're running into this doomsday scenario.
There is still some hope left; we may stop it. But hope alone ain't enough.
And looking at current politics doesn't get my hopes up.

Anyway, today I just read that things will get really hard from 2100 on. By then I won't be alive anymore, and I have no children. Which doesn't mean I accelerate climate change, on the contrary. I try my best to reduce my already small footprint even more, and to keep the issue in people's minds. On the other hand, I see day by day so many people who have children and don't give a shit. So, why should I bother?
 
So I've been using and working with UNIX and Linux for a long time...

I guess what I find confusing is that no one is stopping anyone from doing research. The original "UNIX teams" were all doing research projects - the Xerox Palo Alto Research Center (PARC), Bell Laboratories, even UC Berkeley with their Computer Systems Research Group (CSRG).

I imagined myself back in the 1990s, logged in to 4.3BSD running on a DEC VAX, reading USENET with C News or B News in the "comp.os.bsd" newsgroup. Suddenly a new USENET post arrives from "someone" I have never heard of before named "Linus Torvalds", who writes a USENET posting that reads:

My name is Linus Torvalds. I am going to write LINUX, and this is going to happen whether you like it or not! Deal with it!

I think I would have been like "... well ... okay! ... go for it, dude!" And then pressed the "next" button to proceed to my next USENET news posting.

Not really sure where today's "antics" come from on research projects? In the past we would "just do" when it came to research and development, and we didn't make a big deal out of it. If you succeed - great! Here is a gold star for you. In the meantime, stop dreaming about all of the money you plan to make, what food is going to be served at the "release party", and what your photo op is going to look like.
 
Not really sure where today's "antics" come from on research projects? In the past we would "just do" when it came to research and development, and we didn't make a big deal out of it.

Funding. Profits. Today's research is not research; today's research is step 0 towards profit.

Look at OpenAI: they called themselves "Open" so they could get the funding and scrape the open resources, and then they closed down into a proprietary system.

Basically, where I and the AI people part ways is that I don't believe in research in a computer science area that barely touches current applied computer science. The people who work on AI do not know how computers work. Ask your LLM engineer or data scientist what network sockets are, how exactly they work, or what PCI Express is. They slap their shit onto the already standing technology foundation and then claim they're going to change the foundation themselves, and when they're unable to do so (it is like a child saying he will build a space rocket), they move the goalposts to changing the software platform, changing the hardware platform, changing everything, because then their stuff will work.
 
we didn't make a big deal out of it
Me neither.
What bothers me is when something new is developed and people flap around, hyping things. Annoying. In German we say: "Nothing is eaten as hot as it's cooked." But above all, telling others that everything we have now is obsolete garbage, and that now everyone has to use this new thing, is a nuisance, is missionary, sometimes even religious.

Just like I let others be, I don't want to be urged into following the swarm, just because everybody else is doing it now. I decide for myself whether something is useful to me, whether I need or want it.
That's why I use FreeBSD, and not Windows, or any turn-key Linux distro.
 
Not really sure where today's "antics" come from on research projects? In the past we would "just do"
Nowadays people are only concerned about the money as in profits. The guys who made Unix were just trying to make a better OS for the company's sake. No outside investors. No thoughts of profit/loss when it was sold. No concern with how fast they got it done. Just get it done right.

When I worked for a medical company, their top-selling product was based on old hardware. They asked me to design a machine on new hardware. They didn't give me a deadline to get it done. They never asked how things were going. In fact, the one time they did ask how long it would take, they doubled that timeline, which gave me plenty of time to.....think!
 
Imagine a pocket calculator. Imagine you get a model that you know is making mistakes. ...
Q: What percentage of errors are you willing to accept to use this calculator for your work?
That's a good question, and we have some data on that. I know of three cases where good-quality reliable computers gave wrong answers. The best known one is the Pentium FDIV bug, where certain floating point division operations gave wrong answers (I think they were slightly wrong). Then there was a problem in the FPU of the VAX 11/780, and I don't remember the details and can't find them on the web; I think the square root operation was occasionally completely wrong.
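
For reference, here is a minimal sketch of the widely cited FDIV test case (the division 4195835 / 3145727), run on a non-flawed CPU. The "flawed Pentium" constant below is the value commonly reported for affected chips, not something I measured myself, so take it only as an illustration of the size of the error.

/* Minimal sketch of the classic FDIV check, run on a modern (non-flawed) CPU.
 * The operands 4195835 / 3145727 are the widely cited test case; the "buggy"
 * value is the result flawed Pentiums were reported to return. */
#include <stdio.h>
#include <math.h>

int main(void) {
    double x = 4195835.0, y = 3145727.0;
    double q = x / y;                      /* correct result: ~1.333820449... */
    double buggy = 1.33373906890203759;    /* reported result on flawed chips */

    printf("computed quotient: %.17f\n", q);
    printf("flawed Pentium   : %.17f\n", buggy);
    printf("relative error   : %.2e\n", fabs(q - buggy) / q);
    return 0;
}

The relative error printed at the end comes out around 6e-5, which is why the results looked "slightly wrong" rather than obviously broken.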

The last one is not published, and is very funny. A former colleague was working at a big university computer center as a systems programmer, and was asked to implement an accounting package for the newly released IBM VM/CMS (because early versions of VM had no functioning batch system, much less accounting for it). So what he did was to write a system program (what today we would call the kernel) that interrupted the CPU a few times per second, all the time, and recorded which program was running.

The problem with his code was that when it interrupted the running program, it saved all the CPU registers, then did memory accesses and calculations, and then restored all the CPU registers (like on a stack, except the IBM 360/370 architecture doesn't use stacks). The problem was that he forgot to save/restore the floating point registers, and modified them! So any floating point calculation had a chance to be wrong a few times a second. But a big computer does hundreds of thousands of floating point operations per second, and many don't keep intermediate results in the registers for long.

He found his bug a long time later, and it was never disclosed to the users (and I think it was only disclosed to his manager, much later). So for maybe half a year or a year, a small fraction of all floating-point calculations were totally wrong. AND NOBODY EVER NOTICED!
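
This is of course not the original VM/CMS code (all names below are made up), just a toy simulation in C of the failure mode: a "sampler" that runs periodically and scribbles on a floating-point "register" without saving and restoring it, so the interrupted program's partial result silently gets clobbered.

/* Toy simulation of the bug (hypothetical, not the original VM/CMS code):
 * the "sampler" runs periodically and clobbers a floating-point "register"
 * without saving and restoring it first. */
#include <stdio.h>

static double fp_register;          /* stands in for one hardware FP register */

static void accounting_sampler(void) {
    /* does its own FP work here, but never saved the interrupted value... */
    fp_register = 0.0;              /* ...so the user program's partial sum is gone */
}

int main(void) {
    double expected = 0.0;
    fp_register = 0.0;

    for (int i = 1; i <= 1000000; i++) {
        fp_register += 1.0 / i;     /* "user program" keeps its running sum here */
        expected    += 1.0 / i;     /* same sum, kept safely in memory */
        if (i % 300000 == 0)        /* the sampler fires a few times during the run */
            accounting_sampler();
    }

    printf("sum with clobbering sampler: %f\n", fp_register);  /* ~0.11 */
    printf("sum without interference  : %f\n", expected);      /* ~14.39 */
    return 0;
}

The second sum is what the program should produce; the first is what it actually produces with the forgetful sampler firing a few times during the run.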
 
The best known one is the Pentium FDIV bug, where certain floating point division operations gave wrong answers (I think they were slightly wrong).
That one was very famous, I remember. If I recall correctly, it was only slightly wrong. But as you know, errors add up quickly in larger calculations.
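
Just to put a rough number on that: a small sketch, assuming a per-operation relative error of about 6e-5 (roughly the size of the worst reported FDIV results), compounded over a chain of multiplications.

/* Sketch of how a small per-operation relative error compounds over a long
 * chain of operations. eps is an assumed per-step relative error, roughly
 * the size of the worst reported FDIV results. */
#include <stdio.h>
#include <math.h>

int main(void) {
    const double eps = 6e-5;              /* assumed per-operation relative error */
    double exact = 1.0, flawed = 1.0;

    for (int n = 1; n <= 1000; n++) {
        exact  *= 1.0001;                 /* arbitrary repeated multiplication */
        flawed *= 1.0001 * (1.0 + eps);   /* same step, slightly wrong each time */
        if (n % 250 == 0)
            printf("after %4d steps: relative drift = %.2e\n",
                   n, fabs(flawed - exact) / exact);
    }
    return 0;
}

After 1000 such steps the accumulated drift is already around 6%, even though each single step was off by only a few thousandths of a percent.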

😂 This story is indeed very funny. And it shows how even errors can go unnoticed.
The interesting question is why it stayed unnoticed.
All I recall is that some FPU operations were not used; people programmed their own, because the ones provided didn't do what was needed.
 