Firefox AI enshittification

A FreeBSD port of Waterfox would be nice. A lot of work, though. I haven't tried LibreWolf yet. For the time being I've set those settings that scottro posted. 'Enshittification' needs to enter the Oxford English Dictionary. :-)
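I don't know exactly which settings scottro posted, but for anyone finding this thread later, the AI-related prefs mostly live under browser.ml.* in about:config. A minimal user.js sketch; pref names have shifted between Firefox versions, so treat these as a starting point and verify them in your own about:config:

```
// Disable the AI chatbot sidebar and its text-selection shortcuts
user_pref("browser.ml.chat.enabled", false);
user_pref("browser.ml.chat.shortcuts", false);
// Disable the local ML/inference machinery wholesale
user_pref("browser.ml.enable", false);
```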
 
Yeah, all this AI stuff is getting out of hand. If it's branded "A.I." it should be strictly opt-in. It's nonsensical to forcibly integrate it into every part of computing and development. They may as well integrate a coin-toss option into everything, since it's always 50/50 on accuracy. "Ask a coin toss bot" should be the option name.
 
Given the corporate direction that browsers are going in, I've decided that when I have time (assuming I ever do have time) I'm going to download and do a clean-room build of Chromium, disabling anything in it that I feel facilitates data mining.

One big one that most people are unaware of is that browsers no longer wait until you hit <ENTER> when you type in a URL or search term; they send individual keystrokes to Google/Amazon/etc. as you type. I also want the option to disable all the on-init loading of JavaScript telemetry that happens when you start the browser, even before you visit any site.
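For anyone who wants to shut the keystroke-sending off in Firefox specifically, that behavior is the search-suggestions machinery, and it's controlled by a couple of well-known prefs. A user.js sketch (these names are current as far as I know, but check them in about:config):

```
// Stop sending keystrokes to the default search engine as you type
user_pref("browser.search.suggest.enabled", false);
user_pref("browser.urlbar.suggest.searches", false);
// Cut down on the phone-home traffic at startup
user_pref("toolkit.telemetry.enabled", false);
user_pref("toolkit.telemetry.unified", false);
user_pref("datareporting.healthreport.uploadEnabled", false);
```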
And BonziBuddy did this same stuff and was widely considered malware. It just shows that if you pile on a dump truck of shit all at once, people will notice, but if you put the steaming cow feces on one shovelful at a time, no one will notice.
 
You have it all wrong, friend.

Everybody notices.

The secret is "put a smiley face on it and tell people they are nazi if they don't accept!"

Before they could start introducing these things in earnest, a lot of work had to be put into demolishing people's sense of private space and general decency.

The secret is: "get them to ask for the anal probe!"

Hilariously, everything that is happening now would have been a Nazi's wet dream. "If we had known it could be done like this, we would not have bothered with the whole 'war' business."

Rant=off.
 
I talk to average non-techies (as a techie would describe them) all the time. They don't notice. See: https://xkcd.com/2501/
 
It's plastered all over everything constantly. They might as well pave the streets with "Big Brother is watching" tiles. Nobody can go five sentences without saying "AI" or some synonym.

They notice.
 
The idea of putting internet-connected microphones and even "security" cameras in the house, connected to cloud services, is wild, yet everyone has them. Even the printer I have advertises cloud connectivity for cheaper ink as a convenience :p

I'm not sure what my point is. I avoid cloud-connected stuff on the basis that I don't want someone else spying from afar, and I run self-hosted stuff to stay decentralized.

Social media: people would rather use Mastodon hosted by someone else than take an RPi and host their own instance that can communicate with others over decentralized protocols, as if using someone else's Mastodon instance were "better" than the more popular Facebook. Or Reddit: Lemmy is still someone else's platform.

Email I treat like a physical home address: it's not realistic to run around the world grabbing your own packages, so you have to rely on someone's shipping service (USPS, UPS, FedEx, etc.). Running a mail server is more effort than it's worth: I just don't use the popular free ones and pay for a provider ;)

AI I'd run on a local computer in-house and have other devices connect to that. I'm not sure how locally hosted AI works, though; I wonder if it would get into the mass-scraping thing that websites have Anubis trying to block?
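To make the "other devices connect to that" part concrete: a common setup is something like llama.cpp or Ollama serving a model over HTTP on the LAN, with clients hitting it via plain requests. A minimal sketch, assuming an Ollama server on a box at 192.168.1.50 with a model already pulled (the hostname and model name are assumptions for illustration; 11434 is Ollama's default port):

```python
import json
import urllib.request

# Hypothetical LAN box running `ollama serve`
OLLAMA_URL = "http://192.168.1.50:11434/api/generate"

def ask_local_ai(prompt: str, model: str = "llama3.2") -> str:
    """Send a prompt to the in-house model and return its reply."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(OLLAMA_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

print(ask_local_ai("Summarize why telemetry opt-out matters."))
```

Nothing in that loop talks to the internet, which is the appeal; the scraping question is really about where the model file came from in the first place.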
 

The whole technology behind it is actually pretty awesome. You could probably get some nifty programs up and running with a few days of studying NumPy and dedicating an old CPU to it. Technology isn't bad; it's only bad when totalitarian scumbags get their hands on it.
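To give a flavor of the "few days of NumPy" claim: the core of these systems is just fitting weights to data by nudging them downhill on an error measure. A toy single-neuron classifier, nothing more than a sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 points in 2D, labeled by which side of a line they fall on.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w = np.zeros(2)   # weights
b = 0.0           # bias

for _ in range(500):                          # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))    # sigmoid "neuron"
    grad_w = X.T @ (p - y) / len(y)
    grad_b = float(np.mean(p - y))
    w -= 0.5 * grad_w
    b -= 0.5 * grad_b

acc = np.mean((p > 0.5) == (y == 1.0))
print(f"learned weights {w}, accuracy {acc:.0%}")
```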

On the mass scraping, it is important to understand just how data-hungry these programs are. If you toss a coin twice, the odds of not getting 50/50 are pretty high. But if you toss it a million times, the deviation will be very small. These programs work the same way. If they adjust their weights based on 100 outcomes, it is not very effective. But if they adjust based on a billion billion billion outcomes, they can get pretty accurate. I suspect most of these companies don't even generate these programs themselves. They just harvest and sell data.
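The coin analogy is easy to check yourself: the deviation from 50/50 shrinks roughly with the square root of the number of tosses, which is exactly why more data means better estimates.

```python
import random

random.seed(1)
for n in (10, 1_000, 100_000, 1_000_000):
    heads = sum(random.getrandbits(1) for _ in range(n))
    print(f"{n:>9} tosses: {heads / n:.4f} heads "
          f"(deviation {abs(heads / n - 0.5):.4f})")
```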

If, on the other hand, by "host locally" you mean running a local instance of some existing model, then yes, you would be relying on these mass scrapers in a direct way. You would presumably be downloading a binary, which is the product of algorithmic selection refined by that data.

This isn't software like you are used to. It's not some guy or team writing code, and then you downloading that code or a binary compiled from it. What they write is an algorithm, or a suite of algorithms, that processes data. The algorithms by themselves are of no value. "AI" is just a fancy word for intensive statistical analysis. You aren't writing a program; you are statistically deriving it.

If you mean generating your own programs, then you would definitely have to solve two problems: A. what is your target data, and B. how to harvest it.
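And problem B is where the Anubis fight comes from: most scrapers don't ask permission. The polite version of harvesting is about ten lines of stdlib Python, checking robots.txt before fetching (the site URL and user-agent string here are just placeholders):

```python
import urllib.request
import urllib.robotparser

SITE = "https://example.com"          # placeholder target
UA = "my-little-harvester/0.1"        # hypothetical user agent

rp = urllib.robotparser.RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()

url = SITE + "/some/page"
if rp.can_fetch(UA, url):
    req = urllib.request.Request(url, headers={"User-Agent": UA})
    with urllib.request.urlopen(req) as resp:
        print(resp.read()[:200])       # first few bytes of harvested data
else:
    print("robots.txt says no; a polite harvester stops here")
```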
 
Yes, it says Nightly in my dwm's equivalent of a taskbar. It shows as 140.0.4esr if I do Help => About (as you already saw).
It's been pretty stable for me. I *think*, though my aging memory could be wrong, that very rarely, after an update, it loses some of my saved logins.

LibreWolf, as both package and port Makefile (updated with gitup yesterday), shows as 144.0.2.
 
Waterfox claims to remove a lot of Firefox crapware, telemetry, etc. Hope for the best.


They say they make money via the default search engine... I always change it to something hopefully innocuous, although I don't know if I can really trust any of them.
 
If you are not hosting your own search engine, you are doing it wrong.
Would that require frequent mass scraping of websites to populate a useful search index? What about obscure websites? (If I heard right, Google Chrome sends browsed URLs to the Google search index; without something like that, how would small or new sites get indexed?)
 
You're thinking about it wrong. Self-hosted meta-search engines like SearXNG still use Google, Brave, DuckDuckGo, Startpage, or whatever engine(s) you select in the SearXNG control panel.

I host it in a container on a completely separate network, on a VLAN subnet that has no access to the local network. The container also connects to the internet via a dedicated VPN tunnel in Estonia. This way, there is no way for a search engine to tie whatever you search for to your machine, your IP, your browser, or pretty much anything, because you are effectively not even using those engines. It would be like me calling you on your phone, asking you to google something for me, and you just telling me what the results are.

The best thing about SearXNG is that you can select multiple search engines, and when you search for something, you get the results from all of them. The user interface is clean, no ads, no BS, and it integrates into any browser by simply putting a search string in your browser's search options.

While I'm out of my local network, I use WireGuard to connect to my OPNsense instance, so I have my local search engine everywhere I go. I also configured this for my entire family; they are completely unaware that they use it, and the search results are always perfect. Ever since I started using it, YouTube has stopped bombarding me with suggestions based on what I searched for. Best of all, it's free and open source.
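If anyone wants to script against their instance rather than just use the browser search bar, SearXNG can return JSON, though you have to enable the json format in settings.yml first. A sketch against a hypothetical instance on the LAN (address and port are assumptions):

```python
import json
import urllib.parse
import urllib.request

SEARX = "http://192.168.10.5:8080"    # hypothetical SearXNG container

def search(query: str) -> list[dict]:
    """Query the local SearXNG instance and return its aggregated results."""
    params = urllib.parse.urlencode({"q": query, "format": "json"})
    with urllib.request.urlopen(f"{SEARX}/search?{params}") as resp:
        return json.loads(resp.read())["results"]

for hit in search("freebsd librewolf port")[:5]:
    print(hit["title"], "->", hit["url"])
```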
 