One benefit of the current process is that we serve as a crucible for python updates, and can alert respective upstreams about issues which they may not be aware of due to extreme pinning being the norm.
The ideal scenario would be something like this issue, where an upstream was alerted and eventually took care of updating to a maintained fork of a dependency.
However, many python library owners don’t care about how their package interacts with the rest of the ecosystem, so sometimes you get interactions like this.
But this process of alerting upstreams does create an “only distros pester me about addressing dependency technical debt, so obviously the issue is with distros” dynamic.
Personally, I’ve resigned myself to using python only for scripts that rely on the standard library or well-maintained dependencies like requests. Anything more, and the python ecosystem becomes borderline unmaintainable given enough time and cpython interpreter versions.
Beware: rant
What do python packages have to do with the maintainability of Nixpkgs?
A few hundred of them are used just in the build process of certain packages, some of which even export python-related packages; so the dependencies need to be packaged in some manner.
And of the PRs in nixpkgs, 20k out of 143k have the “topic: python” label; so the burden of maintaining the package set (in its current state) is quite high.

Additionally, the python ecosystem doesn’t really allow you to do something like “pin docutils for the sphinx module, and everything should work fine”. No, some other package will bring in docutils; so if that other package and sphinx exist in the same environment, and the unpinned docutils gets ordered before sphinx’s pinned version, then things will fail.
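To make the conflict concrete, here’s a toy sketch (not real pip, and the version bounds are made up for illustration): two packages in one environment both depend on docutils with mutually exclusive bounds, and since a single environment can only hold one copy of docutils, no candidate version satisfies both.

```python
# Hypothetical docutils releases to choose from.
CANDIDATES = ["0.16", "0.17", "0.18", "0.19", "0.20"]

def ok(version, constraint):
    """Naive check of a single '<=X' or '>=X' bound (toy version compare)."""
    op, bound = constraint[:2], constraint[2:]
    v = tuple(int(x) for x in version.split("."))
    b = tuple(int(x) for x in bound.split("."))
    return {"<=": v <= b, ">=": v >= b}[op]

# e.g. sphinx pins docutils<=0.17 while another package wants >=0.19
# (illustrative bounds, not the packages' actual requirements).
constraints = ["<=0.17", ">=0.19"]

solutions = [v for v in CANDIDATES if all(ok(v, c) for c in constraints)]
print(solutions)  # -> [] : the shared environment is unsatisfiable
```

Real resolvers are far more sophisticated, but the underlying constraint is the same: one interpreter environment, one version per package, so mutually exclusive pins simply cannot coexist.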
People usually rebut the pinning complaint with venv-like solutions, but those only work on a per-application basis; venvs can’t freely compose with other environments that have mutually exclusive version bounds.
I personally would be fine doing a mass pruning of the nixpkgs python package set down to just what’s needed to enable applications, and moving most of the module-specific logic into something like poetry2nix or dream2nix, where there’s more freedom to shape the “python landscape” since it’s a per-project concern.
Also, the pip version resolver is just a band-aid. It works well for small-to-medium projects, but for large projects (such as home-assistant) it takes 12+ hours to resolve the dependencies.