Python: Writing portable shebangs for Python scripts

I have recently switched some of my machines to FreeBSD.
I have also adjusted my "make your life easier" scripts to work on FreeBSD.
Some of them are written in Python, and I am looking for a way to write shebangs so that the scripts work on both Linux and FreeBSD just by calling them.
On Ubuntu, there is always a python3 symlink in /usr/bin/ pointing to the installed version of Python 3.
Moreover, this symlink is managed by the system, so it always points to the currently installed version of Python 3.
My shebangs so far have looked like this: #!/usr/bin/env python3.
However, on FreeBSD there is no python3 symlink in /usr/local/bin/.
There is only a binary whose name contains a particular version of Python, for example, python3.11.
Of course, I can create the symlink manually.
However, it will break on the next Python update: for version 3.12, the new binary will be named python3.12.

I wonder if someone has a better way of handling portable shebangs for Python scripts.
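For reference, a minimal sketch of the kind of script I mean (the prints are only there to show which interpreter actually ran it):
Code:
#!/usr/bin/env python3
# The shebang asks env(1) to run whatever "python3" is first on PATH,
# instead of hard-coding a path to a specific interpreter.
import sys

print(sys.executable)   # which binary actually ran this script
print(sys.version)      # and its version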
 
pkg install python3
Code:
This is a meta port to the Python 3.x interpreter and provides symbolic links
to bin/python3, bin/pydoc3, bin/idle3 and so on to allow compatibility with
minor version agnostic python scripts.
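If you want to see what the symlink ends up pointing at, something like this works (the paths in the comments are just what I would expect, not verified here):
Code:
#!/usr/bin/env python3
# Quick check: where does "python3" resolve to after installing the meta port?
import os
import shutil

path = shutil.which("python3")      # e.g. /usr/local/bin/python3
print(path)
if path:
    print(os.path.realpath(path))   # e.g. /usr/local/bin/python3.11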
 
Does that python3 symlink continue working after system/pkg updates?
 
I use two solutions. Traditionally, the same one Kent Dorfman does: I make sure that every computer I use has a /usr/local/bin/python3 that works. Why python3? Because I never want to run Python 2 by mistake any longer, since all my scripts are now converted to version 3. Why /usr/local/bin and not /usr/bin? Because I used to do most of my personal development on FreeBSD, and that's where it puts the python executable. This means that on every Mac and Linux machine I have to create that symlink by hand. This is somewhat automated: if a Python project has makefiles (many do), the makefile will check that there is something executable at /usr/local/bin/python3.
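Roughly this check, sketched here in Python rather than make, just to show the idea:
Code:
#!/usr/bin/env python3
# Sketch: verify that /usr/local/bin/python3 exists and is executable,
# and complain if it does not.
import os
import sys

target = "/usr/local/bin/python3"
if not (os.path.isfile(target) and os.access(target, os.X_OK)):
    sys.exit(f"{target} is missing or not executable; create the symlink first")
print(f"{target} looks fine")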

My new solution is to just use "#!/usr/bin/env python3" instead. That works as long as PATH is set up correctly. And then, on every machine I have a .bashrc (or similar) setup file which makes sure PATH is correct. The advantage of this is that it has fewer moving parts and can all be done without root access (which I have on all the machines I develop on, but it's nicer not to use it). If this breaks (because there is no python3 on PATH), it's a bit harder to debug.
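When it does break, a one-liner like this shows what env(1) would (or would not) find, using nothing outside the standard library:
Code:
#!/usr/bin/env python3
# Sketch: report which "python3" is first on PATH, i.e. what
# "#!/usr/bin/env python3" would actually run in this environment.
import shutil

print(shutil.which("python3") or "no python3 on PATH")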

But that ignores another, much more serious problem: access to libraries. My code often relies on Python libraries, which can be installed by pkg (on FreeBSD) or apt and yum (on Debian and Red Hat), or, if that is not available, from pip. My own libraries are also installed with pip. I used to always do system-wide installs, very deliberately. This used to work reasonably well, but not great. The problem is that pkg/apt/yum and pip don't play nicely with each other, updates get confused, and things occasionally break. This can be handled (with a few outages, which are typically easy to fix) by using good discipline: (a) prefer pkg/apt/yum for all updates, and only use pip if there is no OS package; (b) only use pip to upgrade packages that are not installed by pkg/apt/yum.
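When I'm not sure where a library came from, a quick check like this helps; the package name "requests" is just an example, and on Debian at least, apt and system-wide pip installs end up under different directories, so the path is a useful hint:
Code:
#!/usr/bin/env python3
# Sketch: show where an installed library actually lives on disk.
# "requests" is only an example package name.
import importlib.util

spec = importlib.util.find_spec("requests")
print(spec.origin if spec else "requests is not installed")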

Recently (in the last few years), the Python community has broken this further with the introduction of virtual environments and by no longer allowing pip to do system-wide installs. This sounds like a "solution", but in reality it just takes the existing versioning problems (which I never experienced in Python anyway) and multiplies them into each virtual environment. It also has a big design flaw, as python venv is designed around one virtual environment per login user as the default. But some projects are shared between multiple users (some of them being cron jobs, web server CGIs, and services started by root), while in other cases one user may have several projects that need different libraries. For now, I have used the sword of Solomon (a.k.a. --break-system-packages) to hack through this problem, and I continue using pip to do all library installs system-wide. This is dumb and makes me angry, but for now I don't know a better solution.
 