title: Python packaging is better than you think · created: 2022-06-20 17:48 UTC · updated: 2022-06-20 17:48 UTC
Alternative titles: "Stop saying Python packaging is terrible", "Python packaging for the 99 %"
Proof that there's an audience for this: https://twitter.com/juanluisback/status/1538936104824492033
Let's unbundle what people really mean when they say "Python packaging is bad":
- bootstrapping Python for development
- OS-specific
- surprisingly, it's more difficult on Linux, since there are too many options and Python is also a core part of the operating system
- hard only because there is no canonical method and the documentation is lacking
- problem solved by Anaconda
- pyenv works too; it is more intrusive and Unix-only, but offers a wider range of Python versions (see the sketch below)
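A minimal sketch of bootstrapping a development Python with either tool (the environment name and version numbers are just examples):

```bash
# Option 1: Miniconda/Anaconda (conda manages the interpreter itself)
conda create -n py310 python=3.10
conda activate py310

# Option 2: pyenv (builds CPython from source under ~/.pyenv)
pyenv install 3.10.5
pyenv local 3.10.5   # pin this version for the current directory
```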
- diagnosing packaging problems
- a real mess because bootstrapping is hard and therefore people end up with chaotic Python installations
- takes skill, but there are some simple tricks: `which python` (tells you where it comes from), `which pip`, `python -m pip` to make sure, and `import sys; print(sys.prefix)` to be really sure (see the example session below)
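For example, a quick diagnosis session might look like this (the exact paths will of course vary per machine):

```bash
which python                                # which interpreter is first on PATH?
which pip                                   # does pip match that interpreter?
python -m pip --version                     # run pip through the interpreter to be sure
python -c "import sys; print(sys.prefix)"   # which environment is really active?
```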
- installing system-wide binaries based on Python
- use pipx or fades and forget about it
- avoid system Python like the plague
- you could use a virtual environment for this, but you'd have to remember to activate it, which is not very convenient: skip it if you don't need it! (see the pipx sketch below)
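A sketch of the pipx workflow; black is just an example of a Python-based CLI tool you might want globally available:

```bash
python -m pip install --user pipx   # one-time setup
pipx ensurepath                     # make sure the pipx bin directory is on PATH
pipx install black                  # each tool gets its own isolated environment
black --version                     # but the command is available everywhere
```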
- managing environments
- absolutely not OS-specific after the bootstrapping is done
- only two kinds of environments exist (both sketched below)
- conda environments, managed by conda
- Python environments, managed by the stdlib venv, virtualenv, pyenv-virtualenv, or PEP 582 __pypackages__ directories
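The two kinds side by side, as a sketch (environment names and packages are arbitrary):

```bash
# A conda environment, managed by conda
conda create -n analysis python=3.10 numpy
conda activate analysis

# A Python virtual environment, managed by the stdlib venv module
python -m venv .venv
source .venv/bin/activate   # on Windows: .venv\Scripts\activate
```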
- dealing with non-Python dependencies
- the Python-native solution for non-Python dependencies is bundling shared libraries inside wheels, and it mostly works!
- however, wheels can be quite fat (TensorFlow, PyTorch), lack specificity (GPU vs non-GPU builds, etc.), not be available for certain packages (RAPIDS), or lead to incompatibilities (Cartopy and rasterio)
- conda solves this, and it will be difficult for pip to solve it in the general case. Use conda, it's fine! (example below)
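For instance, conda-forge ships the compiled C/C++ libraries (GEOS, PROJ, GDAL) that Cartopy and rasterio need, so a single command pulls in everything in compatible versions:

```bash
# conda installs the GEOS/PROJ/GDAL libraries alongside the Python bindings
conda install -c conda-forge cartopy rasterio
```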
- declaring environment dependencies
- unlike Node.js, Python cannot install or import several versions of the same package in the same environment
- that might be a good thing though! security patches are applied uniformly (too long to discuss here)
- but of course this leads to version conflicts, which must be handled somehow (see below)
- libraries doing weird things with their dependencies is not Python's fault (upper version pins, for example, are now frowned upon)
- pip solves dependency resolution these days! even though the backtracking is often not verbose enough for a good diagnosis
- mamba is a blazing-fast drop-in replacement for conda
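When conflicts do surface, a couple of commands help; the pins below are purely illustrative and "somelib" is a made-up package name:

```bash
pip install "pandas>=1.4" "somelib==0.1"   # the resolver backtracks, and fails loudly
                                           # if the pins cannot be satisfied together
pip check                                  # verify the installed set is still consistent
mamba create -n myenv python=3.10 pandas   # mamba: same syntax as conda, faster solver
```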
- installing environment dependencies
- conda, pip, Poetry, and PDM work fine; there are probably others
- but there's lots of outdated advice out there: Pipenv, for example, is largely dead
- conda and pip don't interoperate very well, so they need to be combined with care (one workable pattern is sketched below)
- pip-tools and Poetry are currently lagging behind in terms of standards adoption and bug fixing, but they are excellent projects and will get there in time
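One pattern that avoids most conda/pip friction is to let conda lay down the binary-heavy packages first and only then let pip fill in the rest, always through `python -m pip` inside the activated environment. A sketch, with example package names (the pip-installed one is hypothetical):

```bash
conda create -n myproject python=3.10 numpy pandas   # compiled stack from conda
conda activate myproject
python -m pip install my-pure-python-dep             # pure-Python extras from PyPI
```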
- publishing packages
- nowadays most needs are solved by a PEP 621 `pyproject.toml` (see the sketch below)
- you can use setuptools, Flit, Hatch, or PDM, and your metadata will look 90 % the same
- a separate tool (namely twine) is needed for publishing; is that really so bad?
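A minimal PEP 621 `pyproject.toml` sketch; the project name, version, and the choice of Flit as the backend are just examples, and swapping in setuptools, Hatch, or PDM changes little besides the `[build-system]` table:

```toml
[build-system]
requires = ["flit_core>=3.4"]
build-backend = "flit_core.buildapi"

[project]
name = "mypackage"
version = "0.1.0"
description = "An example package"
readme = "README.md"
requires-python = ">=3.8"
dependencies = ["requests"]
```

Building and uploading then takes two commands:

```bash
python -m pip install build twine
python -m build          # produces dist/*.whl and dist/*.tar.gz
twine upload dist/*      # the separate publishing step mentioned above
```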
- hot-reloading
- editable installations are now standardized (PEP 660), so they're not a problem for the majority (see below)
- unless you're using Meson, like SciPy does, in which case there's still no good solution
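The standardized editable-install workflow is just one command, run from the project root:

```bash
python -m pip install -e .   # changes to the source are picked up without reinstalling
```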
Thanks a lot @benjyw for adding pex to the mix! I think "packaging Python applications in binary form" is a whole different topic though, probably deserving its own blog post. However, I'd say 99 % of people struggle with more basic stuff.