If the environment created from
$ python -m pip install -r requirements_original.txt
is then frozen to requirements_from_pip_freeze.txt with
$ python -m pip freeze > requirements_from_pip_freeze.txt
and that requirements_from_pip_freeze.txt is then used to build a clean Python virtual environment (here called pip-example) with the October 2020 dependency resolver turned on (--use-feature=2020-resolver), the install will break:
(pip-example) $ python -m pip install --use-feature=2020-resolver -r requirements_from_pip_freeze.txt
Requirement already satisfied: pkg-resources==0.0.0 in /home/feickert/.venvs/pip-example/lib/python3.7/site-packages (from -r requirements_from_pip_freeze.txt (line 19)) (0.0.0)
Processing /home/feickert/.cache/pip/wheels/8e/28/49/fad4e7f0b9a1227708cbbee4487ac8558a7334849cb81c813d/absl_py-0.9.0-cp37-none-any.whl
Collecting astunparse==1.6.3
Using cached astunparse-1.6.3-py2.py3-none-any.whl (12 kB)
Requirement already satisfied: wheel<1.0,>=0.23.0 in /home/feickert/.venvs/pip-example/lib/python3.7/site-packages (from astunparse==1.6.3->-r requirements_from_pip_freeze.txt (line 2)) (0.34.2)
Collecting cachetools==4.1.1
Using cached cachetools-4.1.1-py3-none-any.whl (10 kB)
Collecting certifi==2020.6.20
Using cached certifi-2020.6.20-py2.py3-none-any.whl (156 kB)
Collecting chardet==3.0.4
Using cached chardet-3.0.4-py2.py3-none-any.whl (133 kB)
Collecting gast==0.3.3
Using cached gast-0.3.3-py2.py3-none-any.whl (9.7 kB)
Collecting google-auth==1.20.0
Using cached google_auth-1.20.0-py2.py3-none-any.whl (91 kB)
Requirement already satisfied: setuptools>=40.3.0 in /home/feickert/.venvs/pip-example/lib/python3.7/site-packages (from google-auth==1.20.0->-r requirements_from_pip_freeze.txt (line 7)) (49.2.0)
Collecting google-auth-oauthlib==0.4.1
Using cached google_auth_oauthlib-0.4.1-py2.py3-none-any.whl (18 kB)
Collecting google-pasta==0.2.0
Using cached google_pasta-0.2.0-py3-none-any.whl (57 kB)
Collecting grpcio==1.30.0
Using cached grpcio-1.30.0-cp37-cp37m-manylinux2010_x86_64.whl (3.0 MB)
Collecting h5py==2.10.0
Using cached h5py-2.10.0-cp37-cp37m-manylinux1_x86_64.whl (2.9 MB)
Collecting idna==2.10
Using cached idna-2.10-py2.py3-none-any.whl (58 kB)
Collecting importlib-metadata==1.7.0
Using cached importlib_metadata-1.7.0-py2.py3-none-any.whl (31 kB)
Collecting Keras-Preprocessing==1.1.2
Using cached Keras_Preprocessing-1.1.2-py2.py3-none-any.whl (42 kB)
Collecting Markdown==3.2.2
Using cached Markdown-3.2.2-py3-none-any.whl (88 kB)
Collecting numpy==1.19.1
Using cached numpy-1.19.1-cp37-cp37m-manylinux2010_x86_64.whl (14.5 MB)
Collecting oauthlib==3.1.0
Using cached oauthlib-3.1.0-py2.py3-none-any.whl (147 kB)
Collecting opt-einsum==3.3.0
Using cached opt_einsum-3.3.0-py3-none-any.whl (65 kB)
Collecting protobuf==3.12.2
Using cached protobuf-3.12.2-cp37-cp37m-manylinux1_x86_64.whl (1.3 MB)
Requirement already satisfied: setuptools>=40.3.0 in /home/feickert/.venvs/pip-example/lib/python3.7/site-packages (from google-auth==1.20.0->-r requirements_from_pip_freeze.txt (line 7)) (49.2.0)
Collecting pyasn1==0.4.8
Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting pyasn1-modules==0.2.8
Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting requests==2.24.0
Using cached requests-2.24.0-py2.py3-none-any.whl (61 kB)
Collecting requests-oauthlib==1.3.0
Using cached requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting rsa==4.6
Using cached rsa-4.6-py3-none-any.whl (47 kB)
Collecting scipy==1.5.2
Using cached scipy-1.5.2-cp37-cp37m-manylinux1_x86_64.whl (25.9 MB)
Collecting six==1.15.0
Using cached six-1.15.0-py2.py3-none-any.whl (10 kB)
Collecting tensorboard==2.3.0
Using cached tensorboard-2.3.0-py3-none-any.whl (6.8 MB)
Requirement already satisfied: wheel<1.0,>=0.23.0 in /home/feickert/.venvs/pip-example/lib/python3.7/site-packages (from astunparse==1.6.3->-r requirements_from_pip_freeze.txt (line 2)) (0.34.2)
Requirement already satisfied: setuptools>=40.3.0 in /home/feickert/.venvs/pip-example/lib/python3.7/site-packages (from google-auth==1.20.0->-r requirements_from_pip_freeze.txt (line 7)) (49.2.0)
Collecting tensorboard-plugin-wit==1.7.0
Using cached tensorboard_plugin_wit-1.7.0-py3-none-any.whl (779 kB)
Collecting tensorflow==2.3.0
Using cached tensorflow-2.3.0-cp37-cp37m-manylinux2010_x86_64.whl (320.4 MB)
ERROR: Cannot install numpy==1.19.1, -r requirements_from_pip_freeze.txt (line 11), -r requirements_from_pip_freeze.txt (line 14), -r requirements_from_pip_freeze.txt (line 18), -r requirements_from_pip_freeze.txt (line 26), -r requirements_from_pip_freeze.txt (line 28) and -r requirements_from_pip_freeze.txt (line 30) because these package versions have conflicting dependencies.
The conflict is caused by:
The user requested numpy==1.19.1
h5py 2.10.0 depends on numpy>=1.7
keras-preprocessing 1.1.2 depends on numpy>=1.9.1
opt-einsum 3.3.0 depends on numpy>=1.7
scipy 1.5.2 depends on numpy>=1.14.5
tensorboard 2.3.0 depends on numpy>=1.12.0
tensorflow 2.3.0 depends on numpy<1.19.0 and >=1.16.0
To fix this you could try to:
1. loosen the range of package versions you've specified
2. remove package versions to allow pip attempt to solve the dependency conflict
ERROR: ResolutionImpossible For help visit: https://pip.pypa.io/en/stable/user_guide/#fixing-conflicting-dependencies
The above example might seem scary, but --use-feature=2020-resolver is a good thing in general and will make things less painful going forward, as it treats requirements.txt as actual requirements.
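To see why the resolver balks: tensorflow 2.3.0 declares numpy<1.19.0 and >=1.16.0 (per the log above), while the frozen file pins numpy==1.19.1, so no single version can satisfy both. A minimal pure-Python sketch of that version check (the parse and satisfies helpers here are invented for illustration; real tools implement full PEP 440 version semantics via the packaging library):

```python
def parse(version):
    # Crude numeric parsing of an "X.Y.Z" version string; real tools
    # handle full PEP 440 versions (pre-releases, epochs, etc.).
    return tuple(int(part) for part in version.split("."))


def satisfies(installed, lower, upper):
    # True if lower <= installed < upper, mirroring a constraint
    # like numpy>=1.16.0,<1.19.0.
    return parse(lower) <= parse(installed) < parse(upper)


# The frozen pin numpy==1.19.1 falls outside tensorflow's range,
# while e.g. 1.18.5 would satisfy every constraint in the error above.
print(satisfies("1.19.1", "1.16.0", "1.19.0"))  # False: the frozen pin conflicts
print(satisfies("1.18.5", "1.16.0", "1.19.0"))  # True: a version both accept
```

The legacy resolver installed the conflicting pin anyway; the 2020 resolver performs exactly this kind of check across all requirements and refuses.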
If we had instead run
$ python -m pip install --use-feature=2020-resolver -r requirements_original.txt
then pip correctly produces an environment that satisfies requirements_original.txt, and then
$ python -m pip freeze > requirements_lock.txt
$ python -m pip install --use-feature=2020-resolver -r requirements_lock.txt
will work beautifully!
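For context, pip freeze essentially enumerates every installed distribution as a name==version pin. A rough sketch of that idea using only the standard library's importlib.metadata (Python 3.8+; this is an approximation for illustration, not pip's actual implementation):

```python
from importlib import metadata


def freeze():
    # Roughly what `pip freeze` does: emit one name==version pin
    # per installed distribution, sorted for stable output.
    return sorted(
        f"{dist.metadata['Name']}=={dist.version}"
        for dist in metadata.distributions()
        if dist.metadata["Name"] is not None
    )


if __name__ == "__main__":
    print("\n".join(freeze()))
```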
This seems like a step towards viewing requirements.txt files as core requirements (think mv requirements_original.txt requirements_core.txt) and then doing
$ python -m pip install -r requirements_core.txt
$ python -m pip freeze > requirements_lock.txt
to produce much more reproducible environments, now that you have something closer to a real lock file (though you're still not pinning down to hashes).
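As a purely hypothetical illustration of the distinction (the pins shown here are invented for the example), the core file loosely pins only direct dependencies, while the generated lock file pins the full environment exactly:

```
# requirements_core.txt -- direct dependencies only, loosely pinned
tensorflow>=2.3,<2.4

# requirements_lock.txt -- pip freeze output, every package pinned exactly
numpy==1.18.5
tensorflow==2.3.0
```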
Use a real dependency manager like Poetry for your Python applications!
Also, future versions of pip that use the --use-feature=2020-resolver approach will make creating new virtual environments from requirements.txt files much better and will make those builds more reproducible.
This is a change that should be celebrated and we should all thank the pip
team!
Just to check my understanding: in the original install, pip used numpy 1.19.1 even though TensorFlow requires numpy<1.19.0, but the current version of pip doesn't raise an error for this. Then when you simulate the October resolver it raises the error?
Surely for new builds this will be a good thing, since originally the environment may have led to errors due to incompatible versions? But I guess for installing previous configurations of the environment this could break reproducibility.
Thanks for the pointer to Poetry. I'll give it a look!