Frameworks, bytecode VMs and "do-x-with-y" ideas...

zirias@

Developer
I would like to talk about general directions in software development. Since the good old times when (at least around Unix) nearly everything was done in C -- with C++ gradually taking its share -- we have seen various ideas evolve.

I'd say the first radical idea that gained real relevance was Java. You no longer run your code on a real, existing machine; instead, your compiler targets a "virtual machine" that can be implemented on many real systems, executing the exact same bytecode everywhere. This sounds like a nice idea -- portability becomes a non-issue, at least in theory. What we have actually seen is the development of a huge ecosystem of Java libraries, some of them depending on "native" libraries; a typical Java application has to bundle all its dependencies, and complexity grows huge. Portability "out of the box" ends where your code interacts with system-specific APIs, which is not so different from the situation when just using C.

Then Microsoft came around, basically copying this idea with .NET. Newly added was the idea that different languages can compile for the same execution environment (the CLR). Oh, and of course the idea of delivering an implementation of the runtime only for Windows -- at least that was finally addressed in .NET Core. The resulting situation is very similar to Java: lots of libraries, and a deployed application bundles all of them. (Disclaimer: I develop for .NET in my job; it's not my intention to totally bash this platform, which can of course be useful.)

Both Java and .NET also offer plenty of libraries, frameworks, etc. to easily write backend/server applications and web apps/services. One particular idea in earlier versions of the ".NET Framework" always struck me as "doing it wrong" on an entirely new level (no idea whether something similar existed for Java as well) -- I'm talking about "WebForms" here. Back then, event-driven GUI applications were commonly built and well understood by most developers, and the GUI framework that came with .NET for that kind of application was "Windows Forms". So WebForms was meant to bring the exact same programming model to web applications, with a lot of "magic" happening behind the scenes. A web application written with WebForms was consequently slow, bloated (consuming a lot of bandwidth) and very opaque. Well, meanwhile, this whole idea has disappeared for good.

Speaking of web applications: you could probably write a book about these. One interesting (but IMHO misguided) development was to implement more and more client-side logic, of course using JavaScript. This led to the need for better, faster JavaScript engines. Now, with such engines available, the next glorious "do-x-with-y" idea popped up: why not write server/backend logic in the same language, with the same programming model, as the client/frontend logic? Node.js, the next dependency hell, was born -- with npm guiding you into its seventh circle. Now we see build processes that fetch thousands of small node packages just to complete a single build step.

Electron just hit the official ports tree. It's kind of the logical consequence: now that we can build web applications with "rich" client logic and a backend using the same programming model, why not reverse the stupid idea of WebForms and build desktop applications as if they were web apps? Surely it's a much better idea the other way around. Please don't mind the sarcasm. (Disclaimer: I also don't want to bash the work that went into the Electron port -- I even helped a bit with it. There are apps built on Electron that are potentially useful and should be available on FreeBSD for that reason; I'd just really prefer to see the idea disappear again in the future.) In fact, Electron takes the complexity another step further. It needs a lot of code from Chromium (after all, the programming model is "web apps", so you need a browser -- you just don't want the user to notice) and also mixes in Node.js. Final behemoth, mission accomplished.

Ok, this ended in a slightly unorganized rant, sorry for that. I just think: can't we all come to our senses and eliminate all the monsters? There's nothing wrong with native code and shared libraries (bundled or not, depending on the use case). C already gives you a nice opportunity for portable code when used correctly. If you want more (e.g. "safe" memory accesses), newer languages like Go or Rust may be good alternatives. If you want to isolate your applications (something the JVM and CLR can at least help with), there are the good old jails -- or "fancy" Docker stuff. So can we please find a sane way to handle complexity again?
 
An example of "why":
I'm involved in a commercial project. The application must have both a local GUI and a web interface. The decision makers thought there was not much sense in supporting two separate front ends, so they had me write a Qt WebKit front end, which is actually a browser with all the JavaScript/Node.js/Angular etc. stuff.
 
In my view, the problem is that there are too many people and companies wanting applications and not enough programmers to develop them. So programmers and companies make software products -- CMSes, frameworks, GUI tools, etc. -- that are easier for software "technicians" to use to create the applications these companies want, much to the chagrin of the programmers who know better.

These code generators, if I may call them that, put out lousy code in incompatible ways and quickly fall out of favor as someone else comes up with another code generator that works the same -- only better -- until someone develops a tool that fixes the tool that fixes the original tool, and everybody is happy until the next tool (or fool) comes along to start the cycle again.

In the meantime, thousands of older CS graduates are members of the Navel Observatory (yes, navel), looking for something to do, but are considered not up to date with the latest tools for fixing tools that fix tools.
 