So lately I’ve found myself surprised and perplexed by a few changes the Nixpkgs Python ecosystem has gone through, and I’d like to discuss them with more people (besides those who commented on some PRs).
`build-system` vs. `nativeBuildInputs`
What information or accuracy do we gain from this division? What was wrong with having only `nativeBuildInputs`? I do appreciate the idea of making `nativeBuildInputs` provide only executables that don’t pollute `PYTHONPATH`, but if that’s not ready yet, why bother adding the `build-system` attribute?
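For concreteness, here is roughly how the two styles compare in a package expression. This is a minimal sketch with illustrative names (`example` and the `setuptools` backend), not a real Nixpkgs package:

```nix
# Minimal sketch; pname/version are placeholders, and
# buildPythonPackage / setuptools are assumed to be in scope.
buildPythonPackage rec {
  pname = "example";
  version = "1.0.0";
  pyproject = true;

  # New convention: the PEP 517 build backend goes here ...
  build-system = [ setuptools ];

  # ... whereas previously it would have been listed as:
  # nativeBuildInputs = [ setuptools ];
}
```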
dependencies
v.s propagatedBuildInputs
Hmm, OK, but that’s just a semantic change, so why bother? The only place I found `dependencies` being evaluated is here:
Even if the plan is to eventually treat these lists differently, treating them differently shouldn’t break packages (at least in the beginning), and requiring people to change their expressions should happen only after the benefit of the change is apparent.
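Again a sketch for concreteness; `requests` is a stand-in for any runtime dependency:

```nix
# Same placeholder package as above.
buildPythonPackage rec {
  pname = "example";
  version = "1.0.0";
  pyproject = true;
  build-system = [ setuptools ];

  # New spelling of runtime (install_requires) dependencies ...
  dependencies = [ requests ];

  # ... which used to be written as:
  # propagatedBuildInputs = [ requests ];
}
```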
Putting dependencies both in `dependencies` and in `build-system`
This topic is related to this review comment, which for some reason links to this tracking issue: Python: Remove test & runtime dependencies from build time closure tracking issue · Issue #272178 · NixOS/nixpkgs · GitHub.
The general questions / considerations that guide me towards this discussion are:
- Why would an upstream package author put the same dependency both in `setup_requires` and in `install_requires`?
- If an upstream author doesn’t do that, but they `import` one of the `install_requires` dependencies in `setup.py`, is that a good reason to add the dependency to both the `build-system` and the `dependencies` list?
- What if upstream `import`s the dependency in their `setup.py` in order to dynamically locate the `include/` directory in which some headers of the dependency are available? In that case we have to patch the package so that we can manually specify the `include/` directory of the `hostPlatform`’s package, and in that case, again, we don’t need to put the same package in both `dependencies` and `build-system`.
- What if upstream uses an executable from a package that also provides headers? An example is `scipy`, which uses the `f2py` executable from `numpy`; we also support cross compiling it by specifying the headers of the host `numpy`, and thankfully what the build platform’s `f2py` generates is compatible with the host platform’s `numpy` (see the sketch after this list).
- Is the (seemingly semantic) distinction between `build-system` and `nativeBuildInputs` supposed to help distinguish between the above two scenarios?
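To make the `scipy`/`numpy` scenario concrete, here is a hedged sketch of a package that lists the same dependency in both attributes, roughly mirroring how `scipy` in Nixpkgs declares `numpy`; the expression below is simplified and is not the actual `scipy` derivation:

```nix
# Simplified sketch, not the real scipy expression.
buildPythonPackage rec {
  pname = "scipy-like";
  version = "1.0.0";
  pyproject = true;

  # numpy is needed at build time for its f2py executable;
  # when cross compiling, this resolves to the build platform's numpy.
  build-system = [ meson-python numpy ];

  # numpy is also imported at runtime;
  # this resolves to the host platform's numpy.
  dependencies = [ numpy ];
}
```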
How to deal with multiple versions of packages?
So the `numpy` 2.x release came a few months ago, and although it introduced few apparent backwards-incompatible changes, many packages probably broke simply due to version constraints in `setup.py` and `pyproject.toml`. Hence the attributes `numpy_2` and `numpy_1 == numpy` were introduced, and now it’s an open question how to deal with the propagation of `numpy == numpy_1` through dependent packages.
I raised this issue on my (currently draft) attempt to add a few Python packages that support only `numpy_2`, which exposes a clash between these two principles:
- Spawning a `python3.withPackages (ps: [ ps.XXX ])` interpreter must be able to `import XXX`, i.e. not raise an `ImportError`.
- Using a package `XXX` that requires either the dependency `YYY_1` or `YYY_2` should not require you to use `packageOverrides` and rebuild your whole set of `python3Packages` (see the sketch after this list).
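For reference, the `packageOverrides` escape hatch that the second principle wants to avoid looks roughly like this (`XXX` is the same placeholder as above); it rebuilds every member of the set that depends on `numpy`, which is exactly the cost at issue:

```nix
# Sketch: making numpy 2.x the set-wide default just to use XXX.
let
  python' = pkgs.python3.override {
    packageOverrides = self: super: {
      numpy = super.numpy_2;  # the numpy_2 attribute discussed above
    };
  };
in
python'.withPackages (ps: [ ps.XXX ])
```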
Another interesting case to compare is `qtpy`, which is a compatibility layer between four different Python Qt packages. Spawning a `python3.withPackages (ps: [ ps.qtpy ])` interpreter hence always fails with an `ImportError`, but that, of course, is the point of the package: you should be able to choose a Python Qt implementation and add it to your `ps: []` list.
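In other words, only the invocation that pairs `qtpy` with a concrete binding works; `pyqt6` here is just one of the possible choices:

```nix
# Fails at import time: qtpy finds no Qt binding.
python3.withPackages (ps: [ ps.qtpy ])

# Works: qtpy picks up PyQt6.
python3.withPackages (ps: [ ps.qtpy ps.pyqt6 ])
```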
In comparison, `scipy` explicitly states that it can be built against `numpy` 2.x, and it specifies plain “`numpy`” in its `install_requires`, so `pip` users can uninstall the `numpy` 2.x that gets installed after they `pip install scipy` and then `pip install numpy==1.26`. With Nix, however, we currently don’t have that privilege.
These two examples raise the question: should `install_requires` be imitated blindly in the `dependencies` attribute? What if a package doesn’t support a new version of a dependency, and this isn’t even explicitly described with a version constraint? Should Nixpkgs try to fix that elegantly, so that users avoid rebuilding attributes as much as possible?
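For the half of this question where the breakage is only a metadata pin, Nixpkgs already has `pythonRelaxDeps`, which loosens upstream version constraints at build time. A hedged sketch of a user-side override (the attribute name `somepkg` is a placeholder, and depending on the Nixpkgs revision you may also need `pythonRelaxDepsHook` in `nativeBuildInputs`):

```nix
# Sketch: strip an upstream "numpy<2" pin instead of downgrading numpy.
somepkg.overridePythonAttrs (old: {
  pythonRelaxDeps = [ "numpy" ];
})
```

Of course this only helps when the package does work with the newer version and merely pins it too strictly; it does nothing for the opposite case, where the package silently breaks without declaring any constraint.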
How long to wait for packages to support new versions of their dependencies?
Somewhat related to the above, this general question sets up another competition between priorities:
- Always put the latest version of every package as the default version of the original attribute.
- Make sure that as many leaf Python packages as possible build.
In an attempt to update `quantities: 0.15.0 -> 0.16.0`, we noticed that the upstream maintainer of a dependent package is aware that their package doesn’t support the latest version of the dependency. That upstream issue is two weeks old. Hence I ask:
- For how long will we wait for them?
- What if someone really wants to use the latest `quantities`? How many rebuilds should they suffer because some other leaf package they don’t care about requires an old version of `quantities`? (A sketch of the override they’d need follows.)
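For the second question, the user-side workaround today is again an override that rebuilds everything in the set depending on `quantities`. A sketch, using `lib.fakeHash` as an explicit placeholder for the real source hash:

```nix
# Sketch: forcing quantities 0.16.0 for one user's environment.
let
  python' = pkgs.python3.override {
    packageOverrides = self: super: {
      quantities = super.quantities.overridePythonAttrs (old: rec {
        version = "0.16.0";
        src = pkgs.python3Packages.fetchPypi {
          pname = "quantities";
          inherit version;
          hash = pkgs.lib.fakeHash;  # placeholder: substitute the real hash
        };
      });
    };
  };
in
python'.withPackages (ps: [ ps.quantities ])
```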