Google Sabotaging Firefox

Too many allegations and too little proof in that article. Not one single real-world example. People turned their backs on Firefox because it had become a slow starter; it took two minutes before I saw the first window. And anyone who believes that the Quantum update was a complete overhaul also believes in the Easter Bunny. On my systems the incremental update to Quantum was less than 10 MB. They disabled some of the most time-consuming telemetry stuff - that's my allegation. However, Firefox becomes a quick starter - two seconds to the first window - if I switch off all this idiotic telemetry, and that is proven. I don't use Chrome, though, for other reasons. I have come to like Epiphany very much - very quick, perfect page rendering, perfect page introspection, and not bloated.

As regards JavaScript, I do not share the aversion many have. JavaScript is a tool which does its job like a hammer. I would never abandon all the hammers in my house because hammers could be used inappropriately by others to hurt somebody.
 
JavaScript is a tool which does its job like a hammer. I would never abandon all the hammers in my house because hammers could be used inappropriately by others to hurt somebody.

I would compare scripts to good nails and bad nails.

Disable the hammer from pounding each nail by default until it can be determined which of the good nails are actually needed, then pound only those.
 
Rule #1: Don't ascribe to malice what can be explained by incompetence. Another rule is: Most conspiracies don't actually exist. If you have to imagine a conspiracy to explain something, you are probably wrong.

And as far as JavaScript goes: it is, for better or worse, simply necessary today. Static HTML web pages are simply not feature-rich enough for what most people try to do with the web. Nearly everything we do on the web that is more complicated than looking at static text and static pictures requires a client-side programming language. People who implement web-based stuff need to rely on that being present.

It would be nice if JavaScript were completely standardized (not just the core language, but the client-side functions), and all browsers and web servers could be tested for compliance with a standard. But that's not the world we live in, at least not yet. About 10 years ago I tried to learn AJAX (using asynchronous JavaScript in a web browser to communicate with servers, for example to implement database query or remote editing tools), and at that time 90% of the effort had to go into hacking around the half dozen incompatible implementations out there. It was a mess, and I gave up. Rumor has it that today the situation is better, but by no means good enough; I have stopped trying to implement browser-based stuff. It would be ideal if the combination of HTTP, HTML, XML, JavaScript and JS client libraries were standardized by a coherent standards body. But that's simply not the world we live in, with standardization fragmented and sometimes dysfunctional. So implementors have to test all combinations, and due to lack of manpower, things get dropped on the floor.
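
To give a flavor of what that hacking looked like, here is the kind of shim one needed back then just to obtain the request object (a sketch from memory, not code from any particular project):
Code:
// Classic cross-browser AJAX shim, circa 2008: every engine
// exposed the XMLHttpRequest object differently.
function createRequest() {
    if (window.XMLHttpRequest) {
        return new XMLHttpRequest();                       // Firefox, Safari, Opera, IE7+
    }
    if (window.ActiveXObject) {
        try {
            return new ActiveXObject("Msxml2.XMLHTTP");    // newer IE
        } catch (e) {
            return new ActiveXObject("Microsoft.XMLHTTP"); // IE5/IE6
        }
    }
    return null;                                           // no AJAX support at all
}
And that was only step one; event handling and response parsing diverged in much the same way.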
 
YouTube page load is 5x slower in Firefox and Edge than in Chrome because YouTube's Polymer redesign relies on the deprecated Shadow DOM v0 API only implemented in Chrome. You can restore YouTube's faster pre-Polymer design with this Firefox extension: https://addons.mozilla.org/firefox/addon/youtube-classic …
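
For context, the difference between the two Shadow DOM generations can be sketched like this (a minimal illustration, not YouTube's code):
Code:
// A div to host a shadow tree.
var host = document.createElement("div");

// Shadow DOM v0: the deprecated API Polymer 1.x relied on.
// Only Chrome ever shipped it; other engines need a slow polyfill.
if (host.createShadowRoot) {
    var rootV0 = host.createShadowRoot();
}

// Shadow DOM v1: the standardized replacement all major engines implement.
if (host.attachShadow) {
    var rootV1 = host.attachShadow({ mode: "open" });
}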

They have probably never heard of Unix, BSD, and a console with links, w3m, lynx, ...
 
Rule #1: Don't ascribe to malice what can be explained by incompetence. Another rule is: Most conspiracies don't actually exist. If you have to imagine a conspiracy to explain something, you are probably wrong.

I would add greed in there.

One of what used to be my favorite sites is now nothing more than a source of income for someone who is more interested in revenue from script-driven ads than in the payload, or the quality of the nail. Several members posted about landing on the site and getting Google warnings about malicious downloads and drive-by downloads infecting their computers. They had hammered every nail they encountered, no matter where they went.

It only took one script to browse the forums with full functionality, and one additional script for what, to me, amounted to full functionality for the site. Two good nails out of the box; the rest were covered in rust. I never saw the warning pages or was the least bit worried about tetanus.

Scripts that facilitate Bitcoin mining, Meltdown, Spectre, etc. are all what I would consider bad nails. NoScript is my hammer of choice.
 
The problem with all "standard" browsers is that they are consumer-centric.
This means that, as with all consumer software, you get bloat and crapware.

Just look at Nero CD burner versions < 6.x vs the latest version; as soon as it becomes popular, it becomes horrific to use.

That's the world we live in. Just reduce your dependence on the web browser. Use a proper email client, use IRC, etc. Online shopping and the rest should be so quick that the terrible consumer software shouldn't really affect you; you'll have already closed down the browser and got on with your life.

There is more to computers than the internet. I tend to prefer exploring that stuff instead.
 
The problem with all "standard" browsers is that they are consumer-centric. ... There is more to computers than the internet. I tend to prefer exploring that stuff instead.

The problem is rather that all software platforms (Mac, Windows, ..., Android) are built around business and low-quality software.
That works against performance and quality.

Even Linux fans are getting locked into their web browsers (file transfers, WebDAV, web uploads, ...).

The less people know about sockets, low-level programming, and the rest, the better for business.
 
My pet project for the past 10 years has been writing a web application system authoring toolkit (wasat) using PHP, HTML, CSS, SQL, and ECMAScript (JavaScript), primarily to write cross-platform compatible business applications in which software and services can be used by FreeBSD, GNU/Linux, Mac OS, and Microsoft Windows clients, and can also be served by all the above platforms, excluding Windows. At first I tried to support multiple popular browsers for the client end of things as well, including Chrome, Firefox, Internet Explorer, Safari, and Opera. I would write software to run on either Firefox or Safari as the model browser, and then test and debug the same software until it would also work as well on the other four browsers. This final phase of testing and debugging quickly became my biggest problem area, particularly regarding deployment on Internet Explorer (IE), and I soon realized that I was spending more time trying to support all these less-than-fully-compatible browsers than I was spending on writing the software itself. So, after a month or three, I eventually abandoned support of all browsers other than Firefox. I continue to support Firefox only.

Mainly out of curiosity, and perhaps, arguably, some small degree of masochism, I recently tried installing and using Chromium on FreeBSD, strictly as an end-user, and with no great ambition towards supporting it as part of the wasat project, but quickly abandoned that effort too. It was problematic at best, and if Google doesn't care enough about FreeBSD to assist in porting their version of that browser to the FreeBSD platform, then I see no reason why I should care very much about using their browser either.

As a programmer I care as much about stability over extended periods of time as I care about cross-platform compatibility. Long before starting the wasat project, it was apparent to me that Microsoft was a destabilizing influence against software development. Since that time I've come to view Google as an impediment to software stability as well.

The last time (last October) that I tested my software on a Windows 10 client, I noticed that Firefox was still a slow starter on that platform, just as it was when I tested it on Windows XP 10 years ago. I haven't noticed the same problem on other platforms, only on Windows. Firefox starts in an acceptable amount of time on all the other systems I've tried it on.

Using ECMAScript, and particularly the XMLHttpRequest API, is essential to writing smoothly-running and network-efficient web applications, in my opinion and in my experience with the wasat project. I don't know of any way to implement drag-and-drop or dynamically-changing drop-down list features without it. When not being used, my ECMAScript objects are quiet and/or entirely absent, and consume no resources in either the client browser or the server. The resource-saving features of ECMAScript far outweigh whatever resource usage they entail. Contrarily, the ZDNet link in the opening post of this thread uses ECMAScript extensively to drive and animate advertisements, so I can easily see why ECMAScript is sometimes viewed negatively. But, like any good tool, it's all about how it's used, and why it's used, so I'm not going to remove ECMAScript from my toolkit.
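
As a minimal sketch of the dynamically-changing drop-down idea (not actual wasat code; the URL, the element ID, and the JSON response format are all made up for illustration):
Code:
// Populate a <select> on demand via XMLHttpRequest.
// Assumes the server returns a JSON array of strings.
function fillCityList(country) {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "list.php?country=" + encodeURIComponent(country), true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            var select = document.getElementById("cities");
            select.options.length = 0;                // clear old entries
            JSON.parse(xhr.responseText).forEach(function (city) {
                select.add(new Option(city));         // one option per entry
            });
        }
    };
    xhr.send(null); // once finished, the object can be collected; no idle cost
}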

Buggy ECMAScripts are a nightmare, and can be as bad as, or worse than, resource-greedy, advert-driving ECMAScripts, but none of this is the fault of ECMAScript per se.
 
There is more to computers than the internet. I tend to prefer exploring that stuff instead.

I frequent three forums where I only play alliterative word games, plus this and one other tech-related forum, and one A.I. forum. I rarely use email, and I use no form of instant messaging.

I do, however, love shopping on eBay, but I have enough stuff already.

The majority of my time online is spent working on my chatbot Demonica. If I can get motivated today, by Monday I will rise in rank to #4 out of the 142,000 botmasters registered there.

That's about the extent of my online activity, at $59 a month no less.
 
And anyone who believes that the Quantum update was a complete overhaul also believes in the Easter Bunny.
Good arguments. I am not saying Mozilla is going in the right direction as far as privacy or speed.
I tend to believe conspiracy theories because I live in the real world. It is a very different place from what is in writing.
We have reams of paper at work telling us how safe we are. In real life we have had two work-related deaths in 15 years.

To think that the Chrome team is undermining Mozilla is a given to me. I just like to see it in writing.
To see it from an active Moz developer would mean more.
I live in the Oops world.
Things get classified as accidents which were actually sabotage. You see, it's very hard to prove sabotage.
To bring those charges you have to be 100% correct. Nobody wants that hassle. We let it slide.
 
Although my first name is not Thomas, I doubt everything unless I have seen it myself. And the most suspicious news for me is the kind that wants me to believe something everybody likes to believe. The ex-Mozilla executive could have simply said that the popular Google page X had some generally incompatible HTML, CSS or JS features activated which broke the thingy Y and the accelerator Z of FF and was not easy to work around, couldn't he?
 
Maybe the ex-dev is jealous because Chrome is now in Windows. Clickbait for ZDNet.
I agree some shred of evidence would be helpful.
Dirty tricks don't often become public knowledge, would be my retort.
 
... and I soon realized that I was spending more time trying to support all these less-than-fully-compatible browsers than I was spending on writing the software itself. So, after a month or three, I eventually abandoned support of all browsers other than Firefox. I continue to support Firefox only. ...

I develop everything on Mac against Safari, and before deployment I give it a test with Firefox on Windows. The point is that almost all modern popular browsers are WebKit-based and claim in the User-Agent header to be Safari-compatible. The only popular exception is desktop Firefox. So in effect I develop for WebKit, and I cannot remember a single occasion of JS incompatibility with Firefox.

The problematic area for me is mostly the differing design of the UI elements. So in general I create a basic design for buttons, text fields and areas, and drop-down menus in CSS which in theory would look the same in all browsers. Only recently I saw that Epiphany (which tells in the UA that it is Safari) draws a slightly ugly frame around SELECT elements where others did not, and I used a JS snippet to help Epiphany forget the frame.
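
That kind of workaround can be sketched roughly like this (an illustration only, not the actual snippet; the UA test and the style values are assumptions, and whether the UA string contains "Epiphany" may vary between versions):
Code:
// Hypothetical sketch: give SELECT elements a plain border in Epiphany.
if (navigator.userAgent.indexOf("Epiphany") !== -1) {
    var selects = document.getElementsByTagName("select");
    for (var i = 0; i < selects.length; i++) {
        selects[i].style.border = "1px solid #c0c0c0"; // replace the ugly frame
        selects[i].style.outline = "none";
    }
}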
 
almost all modern popular browsers are WebKit-based
Chrome and Opera are based on Blink, which is a fork of WebKit, but the two camps are different now.
which in theory would look the same in all browsers.
Much discussion is now about whether the UI should resemble the UI of the device rather than look the same in all browsers, but now I'm getting OT.

I did not read the article, but my thought is that what Google has done is unintentional and caused by the bubble they live in. When Chrome first came out, Google stated they created their own browser not to compete with the others but because, back then, browsers were too slow and inconsistent. Firefox stole about 30% of market share from IE, but that's as far as they could get on their own. Chrome essentially put the boot on Microsoft's throat, and that was a good thing.

Google today still states that everything they do is to improve browsers, networking and computing, and, it is true, many things are much better now due to their pushing these advancements. However, some of the things they do step on toes and push others aside. They take a "lead, follow or get out of the way" attitude, and that can ruffle feathers. Worse, they seem oblivious to this fact and forge ahead with things others aren't sure they want. They may want them but feel it's forced upon them -- such as AMP.

AMP, in some instances, appears to speed up web pages, but it also wrests control away from the creators. It's non-standard to an extent and controlled by Google. The advantages AMP gives may be great for some, but it gives me the impression they ignore what the rest of us think or want. It might not be malicious, as Microsoft so often was, and just part of a long-range, glorious plan to make the internet a wonderful place, but we aren't aware of that plan and where it's heading, and, in my mind, it makes me wonder how much longer I'll be needed on the technical side of improving the web before they kick me to the curb.

So Google will say, "This is great! We're implementing it and so should you", and then we find out we have to, or we'll get hurt in search results. Otoh, Google is often right. If we implement their methods, our sites wind up being better in some way, but it's as if we're being forced.

I really don't want to chant the internet mantra of "evil". I just feel they have a well-thought-out plan but are running roughshod over everyone, and I don't like that either.
 
Chrome and Opera are based on Blink, which is a fork of WebKit, but the two camps are different now. ...
As long as they're claiming to be Safari as well in their User-Agent header, I do not need to care, do I?

Much discussion is now about whether the UI should resemble the UI of the device rather than look the same in all browsers, but now I'm getting OT.

My take on this is that I go with customized UI elements for web applications with a definite target audience (for example my device control front ends), and simply stay with the elements provided by the device for public web pages with an unknown audience.

For example, I ship my electrochemical measurement devices (potentiostats) with the latest FreeBSD release + GNOME 3 + Epiphany. The UI is provided by a web frontend, and the measurement software is linked to a web backend. The system can either be used headless and controlled by any web client over the net, or simply controlled locally (localhost) by the Epiphany browser. Here the point is that in either case the UI looks exactly the same (see the screenshot below). Unless you take out a magnifying glass, you won't notice a difference between Epiphany on FreeBSD and Safari on Mac. All that said, nothing of this would be remotely possible without JavaScript -- which was sort of the initial question: why do we want JavaScript?

[Screenshot: CyStat.png]


When it comes to cross-platform development of native applications, I try hard to use the platform's look and feel for the elements -- here, for example, software for evaluation of the measurement curves, running on macOS or Windows 10. One notable difference is the position of the main menus.

Note, I do not claim that this is the only way of doing things, only that this is the way I do it.

[Screenshot: CVA-Mac.jpg]

[Screenshot: CVA-Win.jpg]
 
obsigna They aren't claiming to be Safari. They're pretending. This has to do with applications, software and web sites which test to see which user agent/browser you are using and deliver pages based on that. Some sites will not serve anything at all ("best viewed with ...."). Removing that will actually break some things. This is why it's always better to target standards and not browsers or devices, but, of course, one needs to test in all of them if one wants to be 100% sure. At my place, we liked the Chrome/Chromium dev tools, so things got tested there first, but sometimes we'd be sitting in Firefox and I'd use the Firefox tools; eventually we'd look in both. Then Safari, because we'd have to bump the graphics guy off his machine for that. Lastly, we'd dust off the Windows machine in the corner, clean all the spit off it, and test in IE and Edge. We kept that machine in a separate room that had padding to keep the screaming noise down.

I'll look at the rest of your post later. Your software looks like cool stuff. I'm cooking and the Blues hockey team is in the playoffs!
 
This is why it's always better to target standards and not browsers or devices, but ...
... but there are no workable standards for the browser ecosystem. Really, in practice there aren't: while HTML5 is standardized to some extent by the W3 consortium, and HTTP is standardized by the IETF as an internet RFC, and basic Javascript is standardized by ECMA, in reality the ecosystem is a gigantic mess of incompatibility (at least, it's better today than 5 or 10 years ago, when it was even worse). That's why it is easiest to just assume one or a few browsers. I have only worked for very large computer companies in the last ~20 years, and they all say that internal applications are only tested on one or at most two browsers (typically Firefox and Chrome, or Safari and Chrome). My wife works for a small company, and her (rudimentary) IT and HR departments simply say: "We know the stuff works in Windows 10 with Edge, and we don't have time/money to make it work for anything else".

... and the Blues hockey team is in the playoffs!
Strangely our hockey team hasn't been eliminated yet. Old joke: What do you do if you are swimming in the ocean and get attacked by a shark? You just hand them a hockey stick, because that makes a shark completely helpless.
 
while HTML5 is standardized to some extent by the W3 consortium
Actually, it's the WHATWG that controls that now.
it is easiest to just assume one or a few browsers
Nothing is added to the standard until there are at least two implementations by current browsers.
IT and HR departments simply say: "We know the stuff works in Windows 10 with Edge, and we don't have time/money to make it work for anything else".
And now Microsoft has dumped Edge just like they dumped IE, but Firefox and Chrome and the others live on. That's why one should never target browsers.

A couple of months ago, I said the Blues would get to the Stanley Cup Finals. I don't know if they will win but they'll get there. I was concerned after they lost two games in a row but we'll see how things go tonight.

My prediction is based on what I perceive: finishing in first place no longer means anything. Just get into the playoffs and push hard from there. The teams that finish first are worn out by the end, but the others took it a little easier and practiced their craft. The 2011 Cardinals backed into the playoffs even though they had to win the last game of the season. Atlanta just had to win one of their last two games but lost both of them. Then the Cards went on to win the World Series as a wildcard team.
 
... but there are no workable standards for the browser ecosystem. Really, in practice there aren't: while HTML5 is standardized to some extent by the W3 consortium, and HTTP is standardized by the IETF as an internet RFC, and basic Javascript is standardized by ECMA, in reality the ecosystem is a gigantic mess of incompatibility (at least, it's better today than 5 or 10 years ago, when it was even worse). That's why it is easiest to just assume one or a few browsers. I have only worked for very large computer companies in the last ~20 years, and they all say that internal applications are only tested on one or at most two browsers (typically Firefox and Chrome, or Safari and Chrome). My wife works for a small company, and her (rudimentary) IT and HR departments simply say: "We know the stuff works in Windows 10 with Edge, and we don't have time/money to make it work for anything else".

...
I can readily sympathize with the small company's staff. Although I no longer do any kind of strict testing and debugging on them, in general my pages still display okay on Chrome and Safari. IE was always clunky, but probably still works for the most part, and with Edge, everything seems okay at first, but then Edge gets into some unresolved timing issues with warp-scrolling of database displays -- problems which no other browser has and which I have no plans, time, or any particular need to resolve. My stuff fortunately isn't intended for the mass public who use Edge, although Windows 10 itself is no problem at all as a client system. People can always install Firefox and still keep their favorite browsers on their desktops if they wish, no problem. I decline to care about what happens with the Microsoft or Opera browsers; user-agent identifiers can easily be faked, and there seems to be little or no standardization in the default HTML formatting values for the almost-too-numerous-to-count default formatting settings and DOM objects, like the always-pesky file-select or "browse" buttons.

Using PHP I generate every page as HTML5, using strict XHTML syntax, so that every page can be automatically syntax-checked with an XML parser before it's served. The ECMAScript is all objectified and carries the "use strict" directive.
 
How are you serving your pages? As HTML or XHTML? iow, as text/html or application/xhtml+xml?
Code:
<!DOCTYPE html>
<html>
<head>
<meta http-equiv="Content-Type" content="application/xhtml+xml; charset=UTF-8" />
<meta http-equiv="X-UA-Compatible" content="IE=EmulateIE8" />
<meta http-equiv="Pragma" content="no-cache" />
<meta http-equiv="cache-control" value="no-cache, no-store, must-revalidate" />
<meta http-equiv="Expires" content="-1" />
...
Every page starts like this. XML syntax checking is done on the fly using xml_parser_create_ns('UTF-8') and xml_parse(). Syntax checking is the reason for using XHTML.
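
The same kind of well-formedness check can be sketched client-side as well (just an analogue for illustration, not the PHP pipeline above; the parsererror test is common but not bulletproof across browsers):
Code:
// Client-side analogue of the server-side xml_parse() check:
// DOMParser produces a <parsererror> element on malformed XML.
function isWellFormedXhtml(markup) {
    var doc = new DOMParser().parseFromString(markup, "application/xml");
    return doc.getElementsByTagName("parsererror").length === 0;
}

console.log(isWellFormedXhtml("<p>ok</p>")); // true
console.log(isWellFormedXhtml("<p>broken")); // false: unclosed tag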
 
Vull You need to check in a browser's dev tools that you are actually serving that Content-Type. Using the meta tag alone does not guarantee the server will serve it that way (I just don't recall the details, because we always set ours in the server itself).
 
Thanks drhowarddrfine. How do you go about setting that up in the server? I'm using apache24. As I said above, I do this mainly just to assist in debugging my HTML syntax -- making sure I have my tags paired up right, and that sort of thing. In that regard, the PHP xml_parse() function seems to work well and consistently, but there might well be other benefits or side effects of using XHTML I simply don't know about.

EDIT: Here's a screenshot to show how I use XHTML for syntax debugging:
 

[Attachment: ss.png -- screenshot of the XHTML syntax debugging]
Vull I have not used Apache in a long time, so I don't recall the proper way to set that.

The best way to test your HTML/XHTML is using the W3C validator.

You can see what your server is actually delivering by using the dev tools in Firefox. Press Ctrl-Shift-C and click on Network. That should display the files being served. Click on the HTML one. Click on Headers on the right side and look for "content-type". If you are serving HTML, it will be listed as "text/html". If it truly is XHTML, it should say "application/xhtml+xml".
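
The same check can be done from the dev tools console (a quick sketch; fetch() simply re-requests the current page):
Code:
// Log the Content-Type header the server actually sends for this page.
fetch(location.href).then(function (response) {
    console.log(response.headers.get("content-type"));
    // e.g. "text/html; charset=UTF-8" vs "application/xhtml+xml"
});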
 