Proposal for browser survival in FreeBSD

PMc

Aspiring Daemon

Reaction score: 291
Messages: 781

And the first thing is honesty with yourself. Otherwise we are all kernel driver writers, and we can also lay down a neural net, invest our bond portfolio with it, design a quantum computing algorithm and, why not, build a quantum computer itself. This is not the reality. I am sorry.
Argue for your limitations, and sure enough, they're yours. -- Richard Bach
 

drhowarddrfine

Son of Beastie

Reaction score: 1,469
Messages: 3,544

A few notes.

There is only one language that runs in a browser, and that's JavaScript. Unless you want to include the JavaScript-related WebAssembly, but it's not intended for general use; that is, it's intended as a compilation target, so that code written in another language can be turned into something the JavaScript runtime can execute. To repeat: no other programming language runs in the browser.

There is only one specification for browsers and it's written by the browser vendors themselves. The WHATWG people splintered off from the W3C a while ago but now the W3C publishes the specs from WHATWG. Note that w3schools is not related to the W3C in any way, shape or form.

So the best sources everyone uses for web development are the spec itself and the Mozilla Developer Network (MDN), which is supported and written by the browser vendors, plus CanIUse for compatibility checking.

While lots of consumer-facing software (in particular that written by amateurs for free) is very bad and very unstable
Take web development in general. I don't know about very large companies, but most sites you see are held together by spit, glue and the latest fan craze on reddit. Speed of development at all costs is the overriding motto.
 

mark_j

Well-Known Member

Reaction score: 124
Messages: 378

Javascript is a cancer. A world free of that junk would be much simpler and less prone to <insert latest browser exploit here>.
 

20-100-2fe

Active Member

Reaction score: 185
Messages: 175

Software is a house of cards. You change a tiny bit and it all falls apart.
Mine never did, in 30 years of professional software development.
And I'm no exception: all the colleagues I worked with achieved the same quality.
We have seen and fixed pieces of software that were that fragile, but only because they had been developed by incompetent people.
 

ralphbsz

Daemon

Reaction score: 1,480
Messages: 2,431

Javascript is a cancer. A world free of that junk would be much simpler and less prone to <insert latest browser exploit here>.
It's perfectly possible to write clean and stable code in JavaScript. It takes effort and a good engineering mindset, but it can be done. It has recently become easier with TypeScript, which is JS with type checking.

Now, is a lot of JavaScript in web pages junk? Sure, because it's written cheaply and sloppily. But that's by choice.
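As a minimal sketch of the point above (all names here are hypothetical): disciplined JavaScript can carry JSDoc type annotations that TypeScript's checker verifies statically, e.g. with a // @ts-check comment, without changing the runtime code at all.

```javascript
// Hypothetical sketch: plain JavaScript whose types TypeScript can
// check via JSDoc annotations, so a call like total("5") is rejected
// before the code ever runs in a browser.

/**
 * @typedef {{ name: string, price: number, quantity: number }} LineItem
 */

/** @param {LineItem[]} items @returns {number} total in cents */
function total(items) {
  // reduce over well-typed data; no runtime type guards needed
  return items.reduce((sum, item) => sum + item.price * item.quantity, 0);
}

const cart = [
  { name: "book", price: 1250, quantity: 2 },
  { name: "pen", price: 199, quantity: 3 },
];
console.log(total(cart)); // 3097
```

Prices are kept in integer cents deliberately, to avoid floating-point rounding; that kind of small engineering choice is what "clean and stable" JavaScript looks like in practice.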
 

mark_j

Well-Known Member

Reaction score: 124
Messages: 378

Javascript is bloated nonsense. It's what makes us need faster and faster CPUs to handle the guff that's on every web page these days. Take a look at the 'engines' required to interpret javascript; they're enormous and growing. As I said, the major way to break into any system is via a web browser, via a web page running this cancer called javascript.

I also challenge the notion that people can and are able to write good "clean and stable code" in javascript. Most of it seems to be re-used, re-purposed code where the authors have NO idea how it does what it does. Its main purpose seems to be to facilitate spying and selling of ads. It's got very few redeeming features.

Anyway, I digress from the topic at hand.
 

ralphbsz

Daemon

Reaction score: 1,480
Messages: 2,431

I agree that we are digressing.

And I agree that a lot of the code that runs in web pages today is junk.

But the problem is not JavaScript, the language. I know quite a few people who write JavaScript for a living, and are perfectly respectable software engineers. A significant fraction of it is NOT deployed in browsers, for example look at the Node.js initiative, which is explicitly intended to support the use of JavaScript outside of browsers. And I know people who can write well-engineered and reliable code that runs in browsers, and is neither inefficient nor attackable.

The basic problem is not JavaScript. Getting mad at JavaScript is like getting mad at guns, knives, clubs, and rocks, which are used by people to kill or hurt other people. The problem is that there is good money to be made by spying on people and hacking into their privacy. These are all illegal activities, and they happen to use JavaScript when on the web. The correct response is not to ban the tools that they (and many legitimate people) use, but to go after illegal activity.

By the way: How does this forum work? The web pages rely heavily on JavaScript! Look at the downloaded pages with an editor sometime. Without JavaScript, forums like these would be much more rudimentary and hard to use.
 
OP
Nicola Mingotti

Nicola Mingotti

Well-Known Member

Reaction score: 193
Messages: 494

We are wildly digressing, so I will refrain from arguing in favor of JavaScript, and also from contesting the view that there exists at all a class of people who can produce bug-free software. ;)
 

fernandel

Aspiring Daemon

Reaction score: 221
Messages: 893

Getting mad at JavaScript is like getting mad at guns, knives, clubs, and rocks, which are used by people to kill or hurt other people.
I do not know about JavaScript, but guns and other weapons were made for killing, throughout history and in the present too.
 

PMc

Aspiring Daemon

Reaction score: 291
Messages: 781

I do not know about JavaScript, but guns and other weapons were made for killing, throughout history and in the present too.
You have a point in that, but then, I think there is a greater picture to view.

I think arguing whether javascript is good or bad misses the actual point, which is that the underlying architecture is fundamentally unsuited to the task. HTTP was devised as a stateless protocol for the retrieval of static pages, but what we do now is 100% stateful and 90% distributed computing. If you bolt such "add-ons" onto a structure never designed for them, the outcome can only be ugly and problematic.

And javascript is not the only such issue. The whole matter of secure authentication and TLS is just big trouble. And the idea of discerning pages on the same host by different hostnames (instead of different paths) ridicules both HTTP and DNS.
In short, that whole patchwork has become a horror creation of Frankensteinian dimensions. But this is what happens when the "free market" is allowed to take possession of something as beautiful as the Internet once was.

Then, concerning that "spying": this is simply a lie.
Nobody is spying on You, because nobody is the least interested in You personally. I know this is hard to take, but we must face the truth: the only thing that FAANG and those folks are interested in is Make Money Fast.
You are just a means to that end. You are cattle.
And FAANG is not spying on You; they are only taking care of their livestock.
 
OP
Nicola Mingotti

Nicola Mingotti

Well-Known Member

Reaction score: 193
Messages: 494

I do not know about JavaScript, but guns and other weapons were made for killing, throughout history and in the present too.
Caveat: broadly digressing, and provocative.
Well, consider: if there were no weapons at all, the mayor of your village would still be whoever punches harder. Weapons are the only thing that gives a chance to the fake concept of equality. In some sense they are the core foundation of democracy.
 

fernandel

Aspiring Daemon

Reaction score: 221
Messages: 893

Caveat: broadly digressing, and provocative.
Well, consider: if there were no weapons at all, the mayor of your village would still be whoever punches harder. Weapons are the only thing that gives a chance to the fake concept of equality. In some sense they are the core foundation of democracy.
...whoever punches harder... but the village would still be here. I don't agree about the "core foundation of democracy", but nowadays, if you have nuclear power, they don't "bother" you; that has nothing to do with democracy, though.
But a debate about weapons is IMO not for a FreeBSD forum; it is better suited to a philosophical discussion.
 
OP
Nicola Mingotti

Nicola Mingotti

Well-Known Member

Reaction score: 193
Messages: 494

...whoever punches harder... but the village would still be here. I don't agree about the "core foundation of democracy", but nowadays, if you have nuclear power, they don't "bother" you; that has nothing to do with democracy, though.
But a debate about weapons is IMO not for a FreeBSD forum; it is better suited to a philosophical discussion.
Bah, nuclear: I would say it is the same. Instead of a village of people, it is a village of states.

Anyhow, I will go write some code. I find it definitely more interesting to build stuff than to talk about it.

Happy weekend, folks!
 

ralphbsz

Daemon

Reaction score: 1,480
Messages: 2,431

I think arguing whether javascript is good or bad misses the actual point, which is that the underlying architecture is fundamentally unsuited to the task. HTTP was devised as a stateless protocol for the retrieval of static pages, but what we do now is 100% stateful and 90% distributed computing. If you bolt such "add-ons" onto a structure never designed for them, the outcome can only be ugly and problematic.
Actually, there is a way to look at it that makes it very sensible. Today state is kept in the cloud, far away from (unreliable and insecure) end devices. If you order something online, or renew your car registration online, the state of the transaction (the book order from Amazon, the payment of your registration fee for a license plate number) is kept in the servers of Amazon and of the DMV. But that means that you need an application on your end device to interact with the server. In the very old days, that application was a card punch: you submitted your requests as punched cards, and got printouts back. Then we moved to terminals, where users had either an ASCII terminal or a 327x on their desk, and submitted requests by filling in forms or entering numbers on the terminal, and then got the results back on the screen. None of that is comfortable, nor does it scale well.

So what we do today is that we give users applications which run on their end devices, and that communicate with the servers over standard sockets (usually with authentication and encryption). There are two distinct modes of doing that. One is to make the user actually download and install the application, and then explicitly run it. That's for example how a lot of e-mail and navigation is done. The other way is much more lightweight, flexible, and dynamic: the user finds the service using the web (which after all has good search mechanisms), and then downloads an "application", namely a big chunk of JavaScript, using the http(s) protocol. That application then runs (in a standardized sandbox, that being the web browser), interacts with the user (like making them fill in fields, for example the name of the book they want to order, or the license plate they want to renew the registration on), and it interacts with the server via standard sockets (to find out how much the book costs and what it looks like, or to find out how much the registration fees are, and then to actually perform payment). In this way http(s) has become a protocol that is used for many things that are not really static pages any more: download blobs of executable code (namely the JavaScript that creates the actually visible page and interaction), and tunnel many other protocols.

If Tim Berners-Lee had proposed this in the early 90s, it would have seemed sensible then, and it still is. It has taken us 30 years to get to a reasonably standardized programming language for "instantly downloadable applications" (that being JavaScript) and reasonably standardized mechanisms for rendering output to the human and getting their input (those being HTML and the user interactions of JavaScript), and we went through several dark ages in the process (remember when, for example, everyone thought web applications would all be compiled Java classes, and all we really got was nervous text?).

The whole matter of secure authentication and TLS is just big trouble.
Yes, the design of https, TLS, SSL, authentication and all that is a patchwork, a mess. But it ultimately works.

And the idea of discerning pages on the same host by different hostnames (instead of different paths) ridicules both http and dns.
Dirty hack. Genius for the simplicity of setting up many virtual web hosts on a single server, but a giant mess to administer. I hate it. I've never set something like that up, but I've had to do maintenance on it. Unpleasant.

In short, that whole patchwork has become a horror creation of Frankensteinian dimensions. But this is what happens when the "free market" is allowed to take possession of something as beautiful as the Internet once was.
I've used networks-of-networks since 1982, and the "internet" (meaning TCP/IP-based networks with long-range connectivity) since 1986. It was never beautiful. It was always a horrible mess, hacked together with enthusiasm but without critical thinking. For example, the horror of trying to route an e-mail from a Bitnet host to a UUCP host via an intermediate Arpanet machine, and then replying to that e-mail. Or figuring out how to download from Simtel20 (because that machine had 36-bit words, so all the files that were ftp'ed from it were interesting). It has always been a mess. There have been well-organized, top-down designed networks in the world, but either they failed, or they were internal to organizations (such as IBM or the military) that have very strong command-and-control mechanisms.

Then, concerning that "spying": this is simply a lie.
Nobody is spying on You, because nobody is the least interested in You personally. I know this is hard to take, but we must face the truth: the only thing that FAANG and those folks are interested in is Make Money Fast.
That is mostly true. The companies that everyone is upset about (FAANG, plus their Chinese counterparts, and the ISPs) really have no interest in spying. All they want to do is sell their services. Sometimes you are the customer, sometimes you are the ingredient.

On the other hand, some organizations are really spying on you (and on all of us), namely a variety of non-existing agencies.
 

20-100-2fe

Active Member

Reaction score: 185
Messages: 175

I've not worked with punch cards, but I've seen quite a bit of the evolution of HMI.
My perception of it is that user experience has driven the evolution of technology all that time.
Now, front-end technology is mature enough in terms of user experience, and it continues its evolution to be as satisfying for the developer as for the user.

ECMAScript, even with the improvements brought by TypeScript, is still a piece of sh*t compared to say, Java.
But currently available ECMAScript frameworks allow developers to bring smiles to their end-users' faces.
And this is ultimately why we get up every morning and cope with technology all the day - for the smile of our users.
If it were only for money, we could earn it with another job.
If it were only for the fun of technology, we would soon stop because technology doesn't make sense in itself.
What makes our efforts sensible and meaningful is the contribution we can make to a collective work thanks to our command of technology.
And the smile of a user is a much more tangible measure of the value of our contribution than a number on our bank account statement.

 

PMc

Aspiring Daemon

Reaction score: 291
Messages: 781

Actually, there is a way to look at it that makes it very sensible. Today state is kept in the cloud, far away from (unreliable and insecure) end devices. If you order something online, or renew your car registration online, the state of the transaction (the book order from Amazon, the payment of your registration fee for a license plate number) is kept in the servers of Amazon and of the DMV. But that means that you need an application on your end device to interact with the server.
Ah, I see. You have the experience and perspective to perceive the continuity from the mainframe to the cloud.
I was always looking at something entirely different.

With the mainframe we have a big and expensive central provider machine and small and (rather) inexpensive consumer machines (terminals). The same is true of radio stations, newspapers, in fact all the traditional media.
And this is so by technical necessity: it doesn't work any other way.

With the Internet things are completely different. On the Internet every attached piece can act as a server or client, and there is no technical distinction between them. I think this is a very essential difference, and at this point it could have been possible to get rid of the centralized structures in toto. (Some technologies do actually use this: torrents, bitcoin).
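That symmetry can be sketched in a few lines (a toy illustration; `Peer`, `serve`, and `fetch` are hypothetical names, not a real protocol): each node can both answer requests and issue them, and nothing in the code distinguishes the two roles.

```javascript
// Hypothetical sketch: two peers on an equal footing. Each can act
// as a server (answer requests) and as a client (make requests);
// "server" and "client" are roles per exchange, not kinds of machine.
class Peer {
  constructor(name) {
    this.name = name;
    this.files = new Map(); // content this peer offers to others
  }
  serve(path) {
    // server role: answer a request for a local resource
    return this.files.get(path);
  }
  fetch(other, path) {
    // client role: request a resource from another peer
    return other.serve(path);
  }
}

const alice = new Peer("alice");
const bob = new Peer("bob");
alice.files.set("/notes.txt", "run your own services");
bob.files.set("/song.ogg", "la la la");

// each peer serves the other; no dedicated "server" machine exists
console.log(bob.fetch(alice, "/notes.txt")); // "run your own services"
console.log(alice.fetch(bob, "/song.ogg"));  // "la la la"
```

This is the shape that torrents and similar peer-to-peer systems exploit: the centralized provider/consumer split of the mainframe era is a social and economic convention on the Internet, not a technical necessity.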
But this concept is highly disruptive on a social level: it breaks our understanding of hierarchy and dependency.

And so, as I perceive it, after some time of experimentation (usenet et al.) there was a fallback to hierarchical structures. We do not need an e-mail provider, because we have all the technology to run our own e-mail. We do not need facebook, because we have all the technology to run our own webserver. But people value dependency over knowledge, and so, with the spread of the Internet, those providers (of technically superfluous things) came into being, and happened to become the richest and most powerful people in the world.
"The slaves shall serve", as Aleister Crowley put it.

There was once a promising technology called DCE (the Distributed Computing Environment), which brought the idea of computing onto a generally distributed level. It was by no means complete, but it addressed the issue that plain equality means chaos, so in a distributed landscape you need even more of a solid organizational structure. But here too the market was not ready for it, and then somehow the Open Group and IBM hosed it.
 