(Solved) Can't view this website

Fellow FreeBSD users, I am pulling my hair out trying to figure out why this website cannot be viewed. Can anyone help shed some light on this issue?

I am using FreeBSD 13.1.

I can view the website on Linux distros and on Windows, but NOT on FreeBSD.

See the attached screenshot of the site when I try to view it.

Attachments

  • Today06052022.png (18.2 KB)
They are playing games on the server side with the "user agent". Try using something that looks like Linux:
Code:
chrome --user-agent="Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/90.0.4430.212 Safari/537.36"
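If Firefox is your browser, the same spoof can be made persistent (as far as I recall) by creating the string preference general.useragent.override in about:config, for example:
Code:
general.useragent.override    Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/90.0.4430.212 Safari/537.36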
 
I've said it elsewhere here: in this day and age, I find it astounding that a web site would do such a thing based on the user agent string, a practice that was labeled an amateur blunder decades ago.
I fully agree with that. I'm currently working on a project which involves making regular HTTP requests to an HTTP server. Nothing special about that. The HTTP client is home-grown though. During testing, we found a lot of rejections simply because of the user agent string.

I fail to understand the reasoning behind this on every single level. After all, the user agent string is just that: a simple string. You can set it to whatever you like. This cannot (and must not) be used for any kind of "security" aspect. The only reasonable explanation I could come up with is to prevent "bots" from successfully making requests, but again: just change the agent string and you're done.
If the reasoning is instead metrics: you screw up your own metrics by forcing people to set whatever user agent is necessary to "pass".
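To make the point concrete, here is a minimal sketch (assuming libcurl; the URL and the agent string are arbitrary placeholders, not from any real project) of how trivially a client can claim to be any browser:
C:
#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if (curl) {
        /* Placeholder URL - use whatever host you are testing against. */
        curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");
        /* The User-Agent is just a string; the server cannot verify it. */
        curl_easy_setopt(curl, CURLOPT_USERAGENT,
            "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 "
            "(KHTML, like Gecko) Chrome/90.0.4430.212 Safari/537.36");
        CURLcode res = curl_easy_perform(curl);
        if (res != CURLE_OK)
            fprintf(stderr, "request failed: %s\n", curl_easy_strerror(res));
        curl_easy_cleanup(curl);
    }
    curl_global_cleanup();
    return 0;
}
One setopt call and the "bot protection" is defeated, which is exactly the point.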

The web has become a truly ridiculous wasteland. There are few things that give me less pleasure than working with "modern web technologies".
 
jbodenmann The only thing that ever made sense was adjusting CSS or JavaScript per user agent to work around problems with a particular browser. We gave up on that long ago because we strictly followed the standards and IE went away (for our users), so we never bothered with it. But, as you said, some users will change the string, so that wouldn't help anyway.

Then again, any user who changes their user agent string and doesn't get what they expect shouldn't expect to get what they expect.
 
The world is not black & white. It all depends on what we do with the user agent information.

For example, my embedded web server offers HTTP SHA-256 Digest Access Authentication (according to RFC 7616) besides the obsolete HTTP MD5 Digest Access Authentication (according to RFC 2617). The problem is that, with the exception of Mozilla and Opera, the other browser producers obviously didn't recognize that MD5 is obsolete, and some of these browsers bail out on the SHA-256 authentication offer.

For this reason I check the user agent for the browser version:
C:
      /* Offer SHA-256 digest auth only to browsers known to accept it:
         Firefox >= 93 and Opera >= 80; everything else falls back to MD5. */
      shaDigestReady = (fields.UserAgent.content
                     && ((s = strstr(fields.UserAgent.content, "Firefox/")) && strtol(s+8, NULL, 10) >= 93
                      || (s = strstr(fields.UserAgent.content, "OPR/"))     && strtol(s+4, NULL, 10) >= 80));
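For context: per RFC 7616 the server advertises both algorithms in its 401 response, preferred one first - roughly like this (realm, nonce and opaque values are placeholders, not taken from my server):
Code:
HTTP/1.1 401 Unauthorized
WWW-Authenticate: Digest realm="device", qop="auth", algorithm=SHA-256, nonce="...", opaque="..."
WWW-Authenticate: Digest realm="device", qop="auth", algorithm=MD5, nonce="...", opaque="..."
A browser that understands SHA-256 should answer the first challenge and the rest fall back to MD5; the check above exists because some of them bail out instead.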

You will have a hard time convincing me that there is anything wrong with this.
 
You will have a hard time convincing me that there is anything wrong with this.
The thing that is "wrong" here is that we (as in humanity) apparently need to distinguish clients on the server side at all. This makes little to no sense. There are standards for a reason. It just seems that the web world either defines its standards poorly, maintains them poorly or implements them poorly - most likely a combination thereof.

Web technologies have become so accessible that people without the necessary background, skills & experience think they can pull off what others study for years and dedicate their lives to.
I'm not saying that things should not be accessible, but the more accessible a technology becomes, the more it tends to get bent left and right to "just get this feature landed as quickly as possible".
This ripples through all the layers. Suddenly web browsers release more frequently than any other piece of software, which means there is little to no time to properly care for security, which means yet more frequent releases are needed to patch the issues that were missed - while adding more issues to be discovered and fixed a few days later.
Obviously this is no different from the regular software development workflow. I'm complaining about the fact that we seem to need/require/want extremely fast-moving software rather than taking it slow and keeping things nice & neat.

These days, web browsers are pretty much an entire operating system. Why? Why is this necessary? Because we want shiny new bling-bling features, and we want them yesterday, and we want everybody with a coffee mug to be able to "make" something. To accomplish that, software gets engineered poorly, release management is done poorly, and security is handled poorly. And to get everything done even faster, let's pull in 48719 dependencies without actually auditing, authoring or maintaining them. Again: because you can't. It's beyond anyone's manpower. We have seen governments trying to do this, and they failed. So I get that it's a manpower and logistics issue. But whenever there are resource limits in the real world, it's time to take a step back and re-evaluate what is actually necessary and what is just popular demand, and regulate from there.

The web has become a disease. And we'll most likely not get away from that without a skilled group of people dedicating themselves to a rebuild. I doubt that what we currently have is fixable - because it's a systemic problem, not a technological one.
We keep piling stuff on top of each other. This used to work well in the past because things moved much more slowly. But these days somebody starts building something new on some "technology" that was "invented" a week earlier, implemented by a group of juniors who think they are seniors and audited by nobody.
 
Try setting this in Firefox's about:config:
Code:
privacy.resistFingerprinting true
This changes the user-agent string:
Code:
Mozilla/5.0 (Windows NT 10.0; rv:91.0) Gecko/20100101 Firefox/91.0

This is it... Setting that variable via about:config fixed the issue in the browser; I can now view the site from FreeBSD 13.1. Thanks getopt, and all of you, for your feedback - I really appreciate it.

The browser that I use 100% of the time is Firefox
 
Try setting this in Firefox's about:config:
Code:
privacy.resistFingerprinting true
This changes the user-agent string:
Code:
Mozilla/5.0 (Windows NT 10.0; rv:91.0) Gecko/20100101 Firefox/91.0
I'd like to mention that this also changes the browser time zone to GMT, and as a result the browser time and the computer time differ for anyone who is in a time zone other than GMT.
This can be and will be used for user tracking and identification.
Because of that, I believe it is better to use one of the user-agent switchers: set it to Win10 but stay in your time zone.
I'm always travelling as Win10 with Chrome or Edge, to look like a billion other sheep. Just don't stand out.
What's under the hood is a different story.
 
The thing that is "wrong" here is that we (as in humanity) apparently need to distinguish clients on the server side at all. This makes little to no sense. There are standards for a reason. It just seems that the web world either defines its standards poorly, maintains them poorly or implements them poorly - most likely a combination thereof.
We either have to live with what we get or we simply desist.

This reminds me of a joke our mathematics professor told when I was studying chemistry at university. In one of the lessons he touched on number series that converge but never reach zero, and he used it to explain the difference between mathematicians and physicists.

At a university party, the girls and the boys were to line up in two rows at opposite ends of the dance hall and, at each drum beat, move towards each other by half of the current distance. The music and dancing would start as soon as they met in the middle.
  • The mathematicians desisted, because the halving rule yields exactly such a converging series: the distance never becomes 0.00000...

  • The physicists figured that after the 5th beat of the drum the distance between the girls and the boys would be near enough for practical purposes (see the quick calculation below).
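To put a number on the physicists' intuition - a quick back-of-the-envelope, assuming a starting distance of, say, 10 m: the gap halves at every beat, so $d_n = d_0/2^n$, and after the 5th beat $d_5 = 10\,\mathrm{m}/2^5 \approx 0.31\,\mathrm{m}$, which is indeed near enough for practical purposes.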
I eventually graduated as a physical chemist, and with said joke in mind, I won't desist from using a system that is good enough for practical purposes just to wait for it to converge to a hypothetical ideal state in the infinitely distant future.
 
Actually bookmarked this in my Firefox! A post in another thread told me to "get spoofy wit it" :p
 