I know you'll hate me for saying this, but Python is a mess. I love programming in Python, but holy crap-a-reno is dealing with all the modules a pain-in-the-ass-a-who. Then Python 3 came out. Now I have a complete mess of modules on my machine. Some modules are system-wide and others are installed just for me. I have no idea how it got so screwed up. It's probably that my Mac is 6 years old and it's just been a slow module death. I'd love to wipe it all clean and start over, but I suspect that would just make it worse.
I'm sure there is some logic behind it, and I'm sure all the Python pros are laughing, but getting Python working for the average user is nuts. I'm trying to walk some people through it now and it's just impossible.
But here's the trick: use virtual environments religiously. Make a virtual environment for every project - ideally inside that project's source directory, or wherever you're deploying it. This isn't immediately obvious coming from Python 2 since it used to be kind of a third party thing with a couple different approaches, but with Python 3 it's easy and it works out of the box: `python3 -m venv .venv` creates a virtual environment in a directory named ".venv" (I am of course subtly trying to trick you into doing it my way :b). This environment has plain old Python and not much else. Now you can pip install stuff to there like `./.venv/bin/pip install my-package`, and run stuff inside the environment like `./.venv/bin/python3 example.py` or `./.venv/bin/some-launcher-script`. If your own project is using setuptools and everything it can work quite smoothly! (And then you realize setuptools solves about half of the problems you need it to solve if you haven't dutifully broken the non-Python parts of your project into their own build system and everything gets gross again, but it's good for a while).
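To make the workflow above concrete, here's a minimal sketch (the `myproject` directory name is just for illustration). The point is that you never need to "activate" anything if you call the venv's own binaries directly:

```shell
# Create a project with its own virtual environment in .venv:
mkdir -p myproject && cd myproject
python3 -m venv .venv

# The venv has its own pip and python3; use them directly:
./.venv/bin/pip --version
./.venv/bin/python3 -c 'import sys; print(sys.prefix)'   # points inside .venv
```

Everything installed via `./.venv/bin/pip` lands inside `.venv` and never touches the system or user site-packages.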
Anyway, make a bunch of virtual environments and you can forget the system and user Python environments ever existed, I promise :)
Like the other comments pointed out, use a virtual environment for this kind of situation. Dylan's comment seems to explain it all for you.
Didn't it occur to anyone that if you need tools like this, your fundamental project is broken? I guess it's just too large to stop at this point.
I looked into PyInstaller and it broke horribly on macOS Catalina.
I'll look at virtual environments next. Wish me luck.
Actually for passing around easy to run executables, I would never use Python.
Probably the fastest solution would be to switch language, heh.
(using Python for web development mostly, where this is not an issue)
Well, I guess the best approach then is indeed the virtual env stuff, and what Matej said should be a possible one-liner to go (after Python 3.x + pip/pipenv is installed for whoever needs to run it).
Curious to hear what solution will work out eventually
Install the exact version you need using pyenv (https://github.com/pyenv/pyenv) and manage all your projects in virtual envs.
I've also found pipx incredibly useful for installing Python tools (they get installed in their own virtual envs with their dependencies).
Since I started using these techniques I haven't had any problems with modules or conflicting dependencies.
Ferdi van de Kamp
But virtualenv is the way to go; this should work on both Mac and Windows.
When someone else wants to use your code, they simply create a virtualenv and then run `pip install -r requirements.txt`, and voilà! There is a virtualenv with the exact same versions as yours. These commands work across operating systems (it's my setup at work with a mixed environment of Windows, Mac and Linux).
I hope you don't throw in the towel; Python is a wonderful environment for programming!
# cd into your project directory, then create a virtual env called `venv`:
$ python3 -m venv venv
# activate the virtual environment:
$ . venv/bin/activate
# from now on, you are in the venv as shown by the (venv) prefix in your command line.
# you can use pip to install any module you need:
(venv) $ pip install requests
# when you want to leave your virtual environment:
(venv) $ deactivate
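To produce the `requirements.txt` mentioned earlier, you freeze the packages installed in your venv. A minimal sketch (run inside an activated venv):

```shell
# Record the exact versions of everything installed in this venv:
pip freeze > requirements.txt

# On another machine, inside a fresh venv, reproduce the environment:
pip install -r requirements.txt
```

`pip freeze` pins exact versions (`requests==2.23.0` style lines), which is what makes the reinstall reproducible.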
Some other thoughts:
- Don't use Python 2.x (it's been end-of-life since January 2020), only use Python 3.x
- Never run `sudo pip install ...`, that would install Python modules in the system and might clash with your OS package manager
Regarding the distribution of your tools, it is indeed a really big pain point, especially in a cross-platform world (if you were just targeting Linux, you could use a packaging method such as Snap by Canonical). If your tool is comprised of several modules but only uses the standard library, Python can create a zip archive that can be run as a Python program (<https://docs.python.org/3/library/zipapp.html>). Otherwise, I've heard good things about Nuitka (mentioned above).
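The zipapp approach is simpler than it sounds. A minimal sketch (the `myapp` directory and file contents are made up for illustration; `zipapp` just needs a directory with a `__main__.py`):

```shell
# A tiny stdlib-only "tool" with the entry point zipapp requires:
mkdir -p myapp
printf 'def main():\n    print("hello from a zipapp")\n' > myapp/app.py
printf 'from app import main\nmain()\n' > myapp/__main__.py

# Pack the directory into a single runnable archive:
python3 -m zipapp myapp -o mytool.pyz

# Anyone with a Python 3 interpreter can now run the one file:
python3 mytool.pyz    # prints: hello from a zipapp
```

The catch, as the comment says, is that this only works cleanly when your tool sticks to the standard library; compiled dependencies don't travel inside a `.pyz`.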
I switched all of my scripts to Racket, and I just compile those trivially.
No more pip/apt/package-manager-of-the-day/venv bullshit, and a clean, expressive language that didn't reinvent the wheel and even allows you to compute stuff with good performance when needed (CPython is a joke).
Basically, compiling apps is not really a problem when you're out of the C or C++ world. Racket, Rust, and Go all have streamlined ways to do so.
Similar to Windows, Python on OSX has no "standard" install. Many people on OSX use Homebrew (the right way for a system-wide Python), scientists use Anaconda, and then there are some legacy Python installs here and there. Before you start setting up your next try, you should get a clean slate.
If you can guarantee that your customers / coworkers always have an internet connection, you can just distribute an environment.yml file and everybody can sync their environment. Only if you need explicit offline installers will you sometimes run into problems. The other thing Anaconda can do for you is synchronize your build environment, by getting everybody the same GCC + sysroot.
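For reference, an environment.yml is just a short declarative file. A sketch of what one might look like (the name, channel, and pinned versions here are made up for illustration):

```yaml
# Hypothetical environment.yml; recreate the environment anywhere with:
#   conda env create -f environment.yml
name: mytool
channels:
  - conda-forge
dependencies:
  - python=3.8
  - requests
```

Everyone running `conda env create -f environment.yml` against the same file gets the same resolved environment, which is what makes the "just sync" workflow work.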