Is Object-Oriented Programming really an Advantage?

Well, you can argue that both hardware drivers and filesystems in operating system kernels are very object-oriented, including inheritance.

They just don't use an OO language. Except on macOS.
 
writing an interrupt driver in OO doesn't make sense,
It's possible: one particular interrupt could serve one particular device, there are multiple devices in the system (each using one object), and the ISR is then a method of a particular device.

I did that in VxWorks once, several decades ago. The compiler magic required for taking a C++ member function of an object and casting it to match the prototype for an ISR (which is probably void (*)(void*) or something bland like that) took half a day of figuring out, and a screen full of comments to explain. It's kind of tricky to cast the "this" pointer, which exists silently in each member function, to be explicitly visible in C. But it worked extremely well, and was ludicrously fast.

For a little while, our prototype system (which used multiple VME backplanes with a dozen Motorola mVME1xx boards and a dozen SCI coherent networked memory boards, from Dolphin Systems) was the fastest TCP/IP stack in the world, and we set some sort of record for the most CORBA transactions per second (half a million). Didn't hurt to have the (Lawrence-)Berkeley group helping with our network drivers.
 
They just don't use an OO language. Except on macOS.
In the late 1980s, there was a really nice guy from Stanford's SLAC accelerator control group who wrote a little text book: "Object oriented programming in C". And no, he didn't mean C++, he meant C. It's doable. The resulting code tends to look a bit verbose and scattered, but once you learn the idiom, it's pretty obvious. It relies on lots of calling via function pointers; and if you want virtual functions in the inheritance graph, you either put function pointers into your struct, or each struct has a pointer to another per-class struct that holds the function pointers (equivalent to the vtable).

Sadly, I can't remember the name of the guy. The book is probably in a cardboard box in the basement, not having been looked at in a few decades. Chuck or Charles Something?

If you look at OO this way, you quickly see that kernels tend to be very very OO. From there, it is only a small leap to use languages that explicitly support OO in the kernel, which can be done; I know some commercial projects that use C++ in the Linux kernel.

EDIT: Mer said the same thing above already, while I was typing.
 
In the late 1980s, there was a really nice guy from Stanford's SLAC accelerator control group who wrote a little text book: "Object oriented programming in C". And no, he didn't mean C++, he meant C. It's doable. The resulting code tends to look a bit verbose and scattered, but once you learn the idiom, it's pretty obvious. It relies on lots of calling via function pointers; and if you want virtual functions in the inheritance graph, you either put function pointers into your struct, or each struct has a pointer to another per-class struct that holds the function pointers (equivalent to the vtable).

Sadly, I can't remember the name of the guy. The book is probably in a cardboard box in the basement, not having been looked at in a few decades. Chuck or Charles Something?

If you look at OO this way, you quickly see that kernels tend to be very very OO. From there, it is only a small leap to use languages that explicitly support OO in the kernel, which can be done; I know some commercial projects that use C++ in the Linux kernel.
Absolutely... I've always strived to use an opaque pointer handle to an object with a well-defined API when I wrote .so libraries. In fact, my embedded code frequently uses OO idioms (once it reaches a certain degree of complexity) so that it makes sense to me... However, I still prefer writing classes in C++. Well, other than an application God class... I've never had a use for such things, although some OO guys are in love with the bloody things.
 
If one looks at some of the current frameworks like Qt and GTK (before v3?): objects in C.
macOS/iOS with Objective-C? Objects in C, done differently than in C++. My opinion: once you get past some of the syntactic things in Objective-C, it's not that hard to understand.
 
kent_dorfman766 good insight. One "problem" I've run into is "library C++ but someone wants to call it from C" or the converse of "Library in C and caller is C++". Doable, but you need to figure out the interface and compiler/linker options to constrain things.
 
kent_dorfman766 good insight. One "problem" I've run into is "library C++ but someone wants to call it from C" or the converse of "Library in C and caller is C++". Doable, but you need to figure out the interface and compiler/linker options to constrain things.
Calling C code from a C++ program is the preferable direction, and it's not hard. Just remember to wrap the C function declarations in an extern "C" block so they don't get name-mangled. I generally warn people against going the other direction, although it is also possible.
 
OOP in Crystal:

Code:
class Counter
  @x : Int32

  def initialize(x : Int32)
    @x = x
  end

  def setx(x : Int32)
    @x = x
  end

  def getx : Int32
    @x
  end

  def add1
    @x = @x + 1
  end
end

c : Counter = Counter.new(3)
c.setx(5)
c.add1
puts c.getx
 
Calling C code from a C++ program is the preferable direction, and it's not hard. Just remember to wrap the C function declarations in an extern "C" block so they don't get name-mangled. I generally warn people against going the other direction, although it is also possible.
Agree it's not hard. But if one is providing a shared library you need to think about it before you get complaints.
 
I ended up with a negative view of OO programming.

C++ of course invests most of the recent development (much of those 1000 pages) into facilities that are not object-oriented.
"I invented the term 'Object-Oriented', and I can tell you I did not have C++ in mind."

- Alan Kay
There's also a quote "Object orientation is in the mind of the programmer, not in the compiler" or something like that. I think it was Alan Cox, but I can't find a reference and am not sure.
 
C++ of course invests most of the recent development (much of those 1000 pages) into facilities that are not object-oriented.
This is unfortunately a bit true. It seems like the standards committee is hell-bent on pythonizing C++ sometimes. One of the best arguments I've ever read against using C++ in commercial development came from a forum years ago; I did a screen capture, but I doubt I could find it at 1:30 AM after drinking beer.

It was something to the effect that C++ is so complex that most experts will still do things in different ways, and it is too hard to find cohesiveness among a development team using the language.

I still prefer to use C++, but unless I'm running the dev effort and can dictate standards, it had better be a program/system that I'm solo-developing.
 
“I’m sorry that I long ago coined the term ‘objects’ for this topic because it gets many people to focus on the lesser idea. The big idea is ‘messaging’.”
Alan Kay @ Wikipedia
Note that Simula 67, a superset of Algol 60, came out six years before Smalltalk (Simula's design dates back to 1965). It introduced pretty much everything OO: objects, classes, subclasses, virtual procedures, inheritance, dynamic lookup, etc. It even had coroutines! It was the first OO language and influenced pretty much every other OO language.

Alan Kay in "The Early History of Smalltalk" writes this:
What Simula was allocating were structures very much like the instances of Sketchpad. There were descriptions that acted like masters and they could create instances, each of which was an independent entity. What Sketchpad called masters and instances, Simula called activities and processes. Moreover, Simula was a procedural language for controlling Sketchpad-like objects, thus having considerably more flexibility than constraints (though at some cost in elegance) [Nygaard, 1966, Nygaard, 1983].

This was the big hit, and I've not been the same since. I think the reason the hit had such impact was that I had seen the idea enough times in enough different forms that the final recognition was in such general terms to have the quality of an epiphany. ... For the first time I thought of the whole as the entire computer and wondered why anyone would want to divide it up into weaker things called data structures and procedures. Why not divide it up into little computers, as time sharing was starting to? But not in dozens. Why not thousands of them, each simulating a useful structure?
So that to me is Alan Kay's central idea: each of his objects simulates a little special-purpose computer! That these little computers communicate via messages is why he says "messaging" is the big idea. Note, though, that Simula 67 was already providing this, while almost every other OO language missed the boat by not providing coroutines to simulate independent objects. [Of course, concurrency brings its own set of problems, but Simula 67 and Smalltalk avoided them, as coroutines are easier to reason about.]

In the 60s/70s there were papers on "communicating sequential processes" and such, but the advent of C and C++ meant that we kind of resigned ourselves to using Unix processes communicating via byte streams, and concurrent threads controlling shared access via mutexes, because we were busy focusing on the problems brought about by C++'s OO complexity or C's weak typing.
 
That these little computers communicate via messages is why he says "messaging" is the big idea.
I guess that's why the language he and others created was named Smalltalk.

I always profit a lot from seeing things from the historical point of view of their development. It helps me a lot to grasp concepts, instead of just slaving over an abstract concept as it is today without understanding why someone created it in the first place.

I've just seen there is a port for GNU Smalltalk.
And it seems it was not "only" influenced by Simula, but also by Lisp.
It seems worth digging a bit into it.

Thank you for putting this in here! 👍
 
You don't need to link it, but it would be nice if you could remember at least the title and author.
I am very much interested in that. ty
I have spent about half an hour searching, and I can't find it. I know the following tidbits: it is about OO in C (not C++!). It was published in the late 80s; when I started my job at SLAC (in November 91), it was already in print. The author worked at SLAC, in the accelerator controls department, on the SPEAR accelerator. I think the author's first name is Chuck or Charles, but I'm not 100% sure. I think I have a copy of the book ... alas, there are about 20 cardboard boxes of older books in the basement, and searching through all of them would take me a day. Those boxes are supposed to be 100% cataloged, but I can't find anything resembling the book in my catalog.

Next step: I suspect that my copy of the book was acquired long after I stopped working at SLAC, and probably I found it at a used bookshop, or a library book sale, or when my office research library (yes, I've worked at places that had their own libraries) closed and gave away lots of books. If that's true, then the book is not in the boxes in the basement, but in a box of keepsakes from previous jobs. There are far fewer such boxes, and I might be able to find it in an hour or two.
 
I have spent about half an hour searching, and I can't find it.
Thank you very much!
Yeah, I know that. It can be really bothersome: you know there was a book you knew once, but you can remember neither the correct title nor the author anymore.
Giving you the tip to search for it in the catalogues of used book shops is unnecessary, because you already knew that and tried it.
"You cannot judge a book by its cover, but you can remember it by it." 🤓

Thanks a lot again. Maybe you'll suddenly remember it later; such things can happen a couple of days afterwards, while doing something completely different. 😎
 
I know the following tidbits: It is about OO in C (not C++!). It was published in the late 80s; when I started by job at SLAC (in November 91), it was already in print.
I doubt it's Object-Oriented Programming with ANSI-C by Axel-Tobias Schreiner, but that seems related to this particular topic. A quote from the preface:
[...] no, object-orientation will not replace sliced bread.
You can download it from the link; it is also available from Axel-Tobias Schreiner's website: Axel-Tobias Schreiner - books. There's more interesting UNIX stuff and Plan 9 too.
 
I think I have a copy of the book ... alas, there are about 20 cardboard boxes of older books in the basement, and searching through all of them would take me a day. Those boxes are supposed to be 100% cataloged, but I can't find anything resembling the book in my catalog.
1) Maybe it is a SLAC report? At least a site-specific search on www.slac.stanford.edu might jog your memory. 2) Our memory can mutate over time, and we may be 100% sure about something that never happened or existed! [I could've sworn all the simulation objects in an ethernet switch simulator I wrote years ago were derived from a base class SimThread, but it turned out I was doing explicit calls to "new SimThread(....)" in their constructors! I think I meant to do this, but for some reason I didn't.]
 