What goes into making an OS to be Unix compliant certified?

This article is actually by the guy who got Apple's Mac OS X certified as Unix, but I love this part toward the end.

If I were asked to do the same thing for Linux, it likely would take five years, and two dozen people.

Linux is pretty balkanized, has a lot of kingdom building, and you have to pee on everything to make it smell like Linux.

I could do the same in FreeBSD in about a year and a half, with a dozen co-conspirators to run the changes through.

A lot of the work would happen in the “ports” tree.
 
One interesting thing is that they were able to do this because at least $200M was at stake, and because they got explicit support from Steve. The cost of doing it was at least $20M (explicitly stated in the piece), and probably many times that (the engineers who did the work drew salaries).

Now try to imagine doing the same in the open-source world. Where would you find the people and the money for it?
 
"Unix Certified Compliant" does not seem (to me) to do much for the OS, but does both - make money for and justify the existence - of the certifying body.
 
"Unix Certified Compliant" does not seem (to me) to do much for the OS, but does both - make money for and justify the existence - of the certifying body.
This.

I do care that FreeBSD adheres to important standards (POSIX...)

I don't care at all about any "certification". If FreeBSD was a "UNIX™", that wouldn't buy me anything.
 
Unix certification is a requirement in some/many large organizations.
The number of organizations that have this requirement must be dwindling: the two operating systems that together have about 99% market share in the server space (Linux and Windows) are not certified, the one exception being the EulerOS Linux distribution, which is specific to Huawei.

20-25 years ago, the lion's share of Unix sales was commercial systems (AIX, HP-UX, SunOS/Solaris, and whatever Digital's OS du jour was called). At that time, quite a few customers (typically large government agencies) required SUS certification. Today, that would be impractical, unless you want to buy AIX.
 
I don't think that the UNIX certification even solved the original problem it was trying to solve in the first place. Apple binaries still can't run anywhere but on Apple. There are still little-endian vs. big-endian encoding battles being fought between dinosaurs that are still stuck on AIX and COBOL: "I sunk millions into the systems, gotta get SOME mileage out of them, and everyone else owes it to me to follow my lead! MY systems are THE correct decision!". 😩 🤷‍♂️
 
Unix certification is a requirement in some/many large organizations. It guarantees interoperability among systems.
In the UK, many companies still hold the slightly 90s view that "the only way is Microsoft". They wouldn't know what certification is if it slapped them in the face.

At universities with development courses, we went through a nasty phase in 2003-2010 when Microsoft made a big push of their technologies into academia (you may recall Ballmer screaming "Developers! Developers! Developers!" like a little fat child). A result of this is that students would almost start crying that they were being taught OpenGL rather than DirectX. Thankfully this has died off (now users are simply cattle to be milked by Microsoft), and many of the technical students are now really appreciating POSIX (more or less) platforms.

Luckily, IT services have (as usual) made a bit of a mess of the Windows image, so many students do find their way into the Linux or Macintosh labs. Ironically, Windows being fairly unusable for anything more than simple web browsing has had the best influence. That, and they actually get working roaming profiles, which, strangely, in my experience UK companies seem unable to ever get working correctly on Windows. It probably stems from primary school and that terrible RM stuff.

TL;DR: POSIX seems to be as far as most people care.
 
I don't think that the UNIX certification even solved the original problem it was trying to solve in the first place. Apple binaries still can't run anywhere but on Apple. There are still little-endian vs. big-endian encoding battles ...
I partly disagree.

Running binaries from one OS on another OS: that's somewhere between hard and outright impossible. Java got close, and for a while in the 90s simple Java programs could be shipped between machines. Recent experience has been that while Java code runs, real-world programs that solve real-world complexity problems are too dependent on their environment to work.

Little-endian versus big-endian: for many problem domains, this is a non-issue. If, for example, you do database queries, retrieve the results in a high-level language (not C or C++), communicate with other programs via text streams or a good RPC package, and do other I/O in text format, the issue doesn't even arise. For bit-banging code, it can be dealt with if you write your code carefully; experienced programmers have dealt with it so many times that it's getting boring. A slightly bigger problem is the 32 <-> 64 bit chasm, but again, we've learned how to deal with it.
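
To make "write your code carefully" concrete, here is a minimal C sketch (my own illustration, not from the post): encode integers into a fixed byte order with shifts instead of copying raw structs, and the host's endianness stops mattering.

    #include <stdint.h>

    /* Encode a 32-bit value as big-endian ("network order") bytes.
     * Shifting and masking behaves identically on any host, so this
     * is byte-order independent by construction. */
    void put_u32_be(uint8_t *buf, uint32_t v)
    {
        buf[0] = (uint8_t)(v >> 24);
        buf[1] = (uint8_t)(v >> 16);
        buf[2] = (uint8_t)(v >> 8);
        buf[3] = (uint8_t)v;
    }

    /* Decode it again; same result on little- and big-endian machines. */
    uint32_t get_u32_be(const uint8_t *buf)
    {
        return ((uint32_t)buf[0] << 24) | ((uint32_t)buf[1] << 16)
             | ((uint32_t)buf[2] << 8)  |  (uint32_t)buf[3];
    }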

I think Posix and SUS mostly solved a gigantic problem. I started programming on Unixes in the late 80s, when porting between BSD-style and SysV-style machines was annoying and tedious; a significant fraction of system calls were different. Beginning in the early 90s, the trick was simply to code to the published Posix standards, and many of the problems went away. Not all of them, and for esoteric edges of the system call universe (like file byte-range locking, async IO, signals) it was still not completely trivial, but 90% of the work of porting went away.
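
For the curious, file byte-range locking (one of those esoteric edges) looks roughly like this in standard Posix C. A sketch with minimal error handling, not production code:

    #include <sys/types.h>
    #include <fcntl.h>
    #include <unistd.h>

    /* Take an exclusive write lock on [start, start+len) of an open file.
     * F_SETLKW blocks until the lock is granted; returns -1 on error. */
    int lock_range(int fd, off_t start, off_t len)
    {
        struct flock fl = { 0 };
        fl.l_type   = F_WRLCK;   /* exclusive write lock              */
        fl.l_whence = SEEK_SET;  /* l_start is relative to file start */
        fl.l_start  = start;
        fl.l_len    = len;       /* 0 would mean "through EOF"        */
        return fcntl(fd, F_SETLKW, &fl);
    }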

Now, did certification solve these problems? Perhaps indirectly, by forcing Unix suppliers to actually become standards-conforming. But for the most part, they already conformed to 90% of the standards anyway. The same is true today: if you write C code with lots of system calls to Posix specifications, it will usually run on Linux and *BSD, even though neither is certified.
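
As a trivial illustration of that last point (my example, nothing from the post): the following uses only Posix calls, so it should build unchanged with cc on Linux, FreeBSD, or macOS, with no per-platform #ifdef.

    #include <fcntl.h>
    #include <unistd.h>

    int main(void)
    {
        /* open/write/close are specified by Posix, not by any one OS. */
        int fd = open("out.txt", O_WRONLY | O_CREAT | O_TRUNC, 0644);
        if (fd < 0)
            return 1;
        const char msg[] = "hello, Posix\n";
        ssize_t n = write(fd, msg, sizeof msg - 1);
        close(fd);
        return n == (ssize_t)(sizeof msg - 1) ? 0 : 1;
    }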
 