I think arguing whether JavaScript is good or bad misses the actual point, which is that the underlying architecture is fundamentally unsuited to the task. HTTP was devised as a stateless protocol for retrieving static pages, but what we do now is 100% stateful and 90% distributed computing. If you bolt such "add-ons" onto a structure never designed for them, the outcome can only be ugly and problematic.
Actually, there is a way to look at it that makes it very sensible. Today, state is kept in the cloud, far away from (unreliable and insecure) end devices. If you order something online, or renew your car registration online, the state of the transaction (the book order from Amazon, the payment of your registration fee for a license plate number) is kept on the servers of Amazon and of the DMV. But that means that you need an application on your end device to interact with the server. In the very old days, that application was a card punch: you submitted your requests as punched cards, and got printouts back. Then we moved to terminals, where users had either an ASCII terminal or a 327x on their desk, submitted requests by filling in forms or entering numbers on the terminal, and got the results back on the screen. None of that was comfortable, nor did it scale well.
So what we do today is give users applications which run on their end devices and communicate with the servers over standard sockets (usually with authentication and encryption). There are two distinct modes of doing that. One is to make the user actually download and install the application, and then explicitly run it. That's how a lot of e-mail and navigation is done, for example.

The other way is much more lightweight, flexible, and dynamic: the user finds the service using the web (which after all has good search mechanisms), and then downloads an "application", namely a big chunk of JavaScript, using the HTTP(S) protocol. That application then runs in a standardized sandbox (that being the web browser), interacts with the user (making them fill in fields, for example the name of the book they want to order, or the license plate whose registration they want to renew), and interacts with the server via standard sockets (to find out how much the book costs and what it looks like, or how much the registration fees are, and then to actually perform payment).

In this way HTTP(S) has become a protocol used for many things that are not really static pages any more: downloading blobs of executable code (namely the JavaScript that creates the actually visible page and interaction), and tunneling many other protocols.
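To make that concrete, here is a minimal sketch of such a downloaded "application": a few lines of browser JavaScript that render state fetched from the server and submit an order back to it. The endpoint paths, field names, and element IDs are invented for illustration, not any real API.

```javascript
// A minimal sketch of the pattern described above: browser-side JavaScript
// that builds the visible page and talks to the server over HTTP(S).
// All endpoints and field names here are hypothetical.

async function showBook(isbn) {
  // Ask the server for the current state of the catalog entry.
  const response = await fetch(`/api/books/${isbn}`);
  if (!response.ok) throw new Error(`server said ${response.status}`);
  const book = await response.json();

  // Render the result into the page; the "app" the user sees is just
  // DOM nodes filled in by this downloaded script.
  document.querySelector('#title').textContent = book.title;
  document.querySelector('#price').textContent = `$${book.price}`;
}

async function orderBook(isbn) {
  // All state about the order lives on the server; the browser only
  // submits a request and displays the outcome.
  const response = await fetch('/api/orders', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ isbn }),
  });
  return response.json();
}
```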
If Tim Berners-Lee had proposed this in the early 90s, it would have seemed sensible then, and it still seems sensible now. It has taken us 30 years to get to a reasonably standardized programming language for "instantly downloadable applications" (that being JavaScript) and reasonably standardized mechanisms for rendering output to the human and getting their input (that being HTML and the user interactions of JavaScript), and we went through several dark ages in the process (remember when, for example, everyone thought web applications would all be compiled Java classes, and all we really got was nervous text?).
The whole matter of secure authentication and TLS is just one big source of trouble.
Yes, the design of HTTPS, TLS, SSL, authentication, and all that is a patchwork, a mess. But it ultimately works.
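As a small illustration of "it ultimately works": all of that patchwork sits behind one standard-library call. A hedged Node.js sketch (example.com is just a placeholder host) in which the handshake, certificate-chain validation, and hostname check all happen without the application doing anything:

```javascript
// The entire TLS patchwork is hidden behind one call to the standard
// library. Node.js example; example.com is a placeholder.
const https = require('node:https');

https.get('https://example.com/', (res) => {
  // By the time this callback runs, the server's certificate has already
  // been verified against the system's trusted CAs; a bad certificate
  // would have surfaced as an 'error' event instead.
  console.log('status:', res.statusCode);
  console.log('cipher:', res.socket.getCipher().name);
  res.resume(); // drain the body; we only wanted the handshake
}).on('error', (err) => {
  console.error('TLS or network failure:', err.message);
});
```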
And the idea of distinguishing pages on the same host by different hostnames (instead of different paths) makes a mockery of both HTTP and DNS.
A dirty hack. Genius in its simplicity for setting up many virtual web hosts on a single server, but a giant mess to administer. I hate it. I've never set something like that up myself, but I've had to do maintenance on it. Unpleasant.
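For anyone who hasn't had to maintain one: here is a minimal sketch of what name-based virtual hosting boils down to, written as a toy Node.js server. The hostnames are made up; real setups push the same dispatch logic into Apache or nginx configuration, which is where the administrative mess lives.

```javascript
// One server, one IP address, many "sites", distinguished only by the
// Host header the client sends. The hostnames below are placeholders.
const http = require('node:http');

const sites = {
  'books.example': (req, res) => res.end('the bookshop\n'),
  'dmv.example':   (req, res) => res.end('renew your registration\n'),
};

const server = http.createServer((req, res) => {
  // Same machine, same IP, same port: only the Host header decides which
  // virtual host answers. DNS simply points all the names at this one box.
  const name = (req.headers.host || '').split(':')[0];
  const handler = sites[name];
  if (handler) {
    handler(req, res);
  } else {
    res.statusCode = 404;
    res.end('no such site here\n');
  }
});

server.listen(8080);
// Try it: curl -H 'Host: books.example' http://localhost:8080/
```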
In short, that whole patchwork has become a horror creation of Frankensteinian dimensions. But this is what happens when the "free market" is allowed to take possession of something as beautiful as the Internet once was.
I've used networks-of-networks since 1982, and the "internet" (meaning TCP/IP-based networks with long-range connectivity) since 1986. It was never beautiful. It was always a horrible mess, hacked together with enthusiasm but without critical thinking. For example, the horror of trying to route an e-mail from a Bitnet host to a UUCP host via an intermediate Arpanet machine, and then of replying to that e-mail. Or figuring out how to download from Simtel20 (because that machine had 36-bit words, so all the files ftp'ed from it were interesting). It has always been a mess. There have been well-organized, top-down designed networks in the world, but either they failed, or they were internal to organizations (such as IBM or the military) with very strong command-and-control mechanisms.
And as for that "spying": this is simply a lie.
Nobody is spying on You, because nobody is in the least interested in You personally. I know this is hard to take, but we must face the truth: the only thing FAANG and those folks are interested in is Make Money Fast.
That is mostly true. The companies that everyone is upset about (FAANG, plus their Chinese counterparts, and the ISPs) really have no interest in spying. All they want to do is sell their services. Sometimes you are the customer, sometimes you are the ingredient.
On the other hand, some organizations really are spying on you (and on all of us), namely a variety of agencies that officially do not exist.