Python Application Template

I made a cookiecutter template which generates a Python project with what I consider to be best practices. It incorporates automatic Docker image/wheel upload and CI/CD caching with Cachix. I would like to get some feedback from the community to see whether any additional changes would be worthwhile.

https://gitlab.com/gcoakes/python-nix-template

1 Like

I also have a similar tool. If you’re on unstable, you can run:

nix-shell -p nix-template --run "nix-template python -p <pname> -v <version> -l <license> default.nix"

and you should get something similar to:

{ lib, buildPythonPackage, fetchPypi }:

buildPythonPackage rec {
  pname = "pname";
  version = "version";

  src = fetchPypi {
    inherit pname version;
    sha256 = "0000000000000000000000000000000000000000000000000000";
  };

  propagatedBuildInputs = [ ];

  pythonImportsCheck = [ "pname" ];

  meta = with lib; {
    description = "CHANGE";
    homepage = "https://github.com/CHANGE/pname/";
    license = licenses.license;
    maintainers = with maintainers; [ ];
  };
}

Once 20.09 is released, I’ll probably alter it to handle something like:

nix-template python --from-url "https://github.com/psf/requests"

or

nix-template python --from-url "https://pypi.org/project/requests/"

and it will calculate the sha256 hash, src, and most of the template for you.
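
For illustration, the GitHub case might emit something along these lines (purely a sketch of possible output, not a committed design; fetchFromGitHub and the filled-in values are assumptions, and the hash shown is just the placeholder):

{ lib, buildPythonPackage, fetchFromGitHub }:

buildPythonPackage rec {
  pname = "requests";
  version = "2.24.0";  # assumed: filled in from the latest tag

  # assumed: the tool would prefetch the source and hash for you
  src = fetchFromGitHub {
    owner = "psf";
    repo = "requests";
    rev = "v${version}";
    sha256 = "0000000000000000000000000000000000000000000000000000";
  };

  propagatedBuildInputs = [ ];

  pythonImportsCheck = [ "requests" ];

  meta = with lib; {
    description = "CHANGE";
    homepage = "https://github.com/psf/requests/";
    license = licenses.asl20;
    maintainers = with maintainers; [ ];
  };
}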

2 Likes

Does that produce nix expressions exclusively or does it also generate project files? I had some hesitations about cookiecutter because it allows writing arbitrary code for pre/post hooks, but I don’t know of anything which has similar capabilities.

1 Like

Right now, it just generates some bare-bones templates. It’s mostly meant to avoid boilerplate when adding new packages.

My use case was having to add new packages to fix Python package builds. Having something that could create the skeleton of a file, in the correct location, went a long way toward mitigating the “annoyance” of adding a new package.

2 Likes

What about the mach-nix gen .. command?

1 Like

mach-nix works great for local development. But for nixpkgs, the problem has to do with the coherence of all Python packages. What I mean by that is that there’s usually a lot of patching we have to do to get packages to work well with each other.

Mach-nix has the “freedom” to selectively choose which versions of packages get included in an application, but nixpkgs can only have a single version per interpreter; otherwise you will get “subtle” runtime issues.
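
For comparison, per-application pinning with mach-nix looks roughly like this (a sketch assuming mach-nix’s mkPython interface; the tag and the pinned versions are only illustrative):

let
  mach-nix = import (builtins.fetchGit {
    url = "https://github.com/DavHau/mach-nix";
    ref = "refs/tags/3.0.2";  # illustrative tag, pin whichever release you use
  }) { };
in
mach-nix.mkPython {
  # mach-nix resolves these against PyPI metadata for this one application,
  # independent of whatever versions nixpkgs itself carries
  requirements = ''
    requests==2.24.0
    jsonschema<3
  '';
}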

1 Like

Assuming there were a dependency resolver available in nixpkgs which wouldn’t require IFD (like the one in mach-nix), could you imagine having multiple versions of packages available?

Instead of defining the exact dependency tree of each package upfront, one would leave it up to the resolver to pick the correct dependencies based on a list of selected top-level packages. A call to python.withPackages could trigger the resolver for the packages passed to it.

1 Like

I’ll just give you an example. When jsonschema bumped to 3.0, there was a time when half of the latest downstream packages wanted 2.0 and the other half needed 3.0. Since there were significant API changes between the two versions, you couldn’t mix packages.

AFAIK, Python always follows the same import logic (traverse sys.path, return the first instance of a module), so there’s no way to dynamically choose a version of a given dependency at runtime.

If Python somehow had a way to map __file__ to different sys.paths, then maybe there would be a way to have many versions of the same package at runtime, but I don’t think that will ever come to be. Also, that would essentially push Python into a packaging paradigm similar to node packages, in which you have to wait 30 minutes for pip to install 2-8 GiB worth of dependencies.

What follows is largely based on assumptions; sorry if I’m straying from what you’re actually proposing:

That sounds like a very fragile house of cards, and we would also need a large persistent package set of python packages for the resolver to do its “magic”. This sounds very similar to node-packages.nix.

I’m not the biggest fan of these large generated package sets because they become huge, both in terms of lines of code and store space. If we had a single global python-generated.nix, it would have penalties similar to the node-packages set, such as a long (~30 min) wait for the tool to finish resolving the dependencies, determining the hashes, and regenerating the file.

One of the “benefits” the current nix-python ecosystem has is that we patch the source to make the packages more cohesive, but this comes at the cost of manual work.

1 Like

Yes, Python has a global package scope at runtime, so a Node.js-like model with local dependencies is not possible. And that’s not what I’m suggesting.
The dependency resolution would not take place at runtime, but during Nix evaluation.
Therefore any closure of any Python package or environment would always contain only one specific version of each dependency.

I’ll make a simple example of how this model could look. Let’s say we have two different libraries which both depend on cryptography:

  • httpx wants “cryptography <= 3”
  • requests wants “cryptography >= 1”

Usually httpx and requests would have one specific version of cryptography inside their propagatedBuildInputs. But here, they would instead just carry their requirement specifiers inside passthru or something similar.
The moment python.withPackages (ps: [ ps.httpx ps.requests ]) is called, some Nix function (our resolver) analyzes the requirements of these two libraries.
Due to “<= 3, >= 1” it will pick cryptography version 3 and add it to the list of selected packages. If one were to add another package which depends on “cryptography == 2”, the resolver would pick version 2 of cryptography instead, since now the best pick for “<= 3, >= 1, == 2” is 2.
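
A very rough sketch of what the core of such a resolver could look like in plain Nix (everything here is invented for illustration: the constraint format, the available list, and the resolve function; a real implementation would have to parse full PEP 440 specifiers):

let
  # Hypothetical: the cryptography versions available in the package set.
  available = [ "1.9" "2" "3" ];

  # Hypothetical constraint format, e.g. { op = "<="; version = "3"; }.
  satisfies = version: c:
    let cmp = builtins.compareVersions version c.version;
    in if c.op == "<=" then cmp <= 0
       else if c.op == ">=" then cmp >= 0
       else if c.op == "==" then cmp == 0
       else throw "unknown operator: ${c.op}";

  # Pick the highest available version that satisfies every constraint.
  resolve = constraints:
    let
      ok = builtins.filter (v: builtins.all (satisfies v) constraints) available;
    in
      builtins.foldl'
        (a: b: if builtins.compareVersions a b >= 0 then a else b)
        (builtins.head ok)
        ok;
in {
  # httpx (<= 3) + requests (>= 1)                -> picks "3"
  forTwo = resolve [ { op = "<="; version = "3"; } { op = ">="; version = "1"; } ];
  # plus a package that needs cryptography == 2   -> picks "2"
  withPin = resolve [ { op = "<="; version = "3"; }
                      { op = ">="; version = "1"; }
                      { op = "=="; version = "2"; } ];
}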

No rebuild of any package would be triggered with that model; the build inputs of individual packages would remain untouched.
Of course, this model is only applicable to runtime dependencies, not build-time dependencies.
The same thing could be done with buildPythonPackage, if it were redesigned just a little bit.
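
As a sketch, an individual package could carry its specifiers in a hypothetical passthru.requirements attribute (invented for illustration; today this information lives in propagatedBuildInputs as concrete derivations):

{ buildPythonPackage, fetchPypi }:

buildPythonPackage rec {
  pname = "requests";
  version = "2.24.0";

  src = fetchPypi {
    inherit pname version;
    sha256 = "0000000000000000000000000000000000000000000000000000";
  };

  # Hypothetical: instead of propagatedBuildInputs pointing at one concrete
  # cryptography derivation, only the specifier is recorded here, and
  # python.withPackages would resolve it to a concrete version later.
  passthru.requirements = [ "cryptography >= 1" ];
}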

Now, if we had this model which allows us to have several different versions of packages available in nixpkgs, we could just add a few versions of each package, instead of putting a lot of effort into patching packages to make them all work with only one unique set of dependencies.

I’m not saying that a massive number of versions should be provided. But maybe just having 2-3 different versions of each package could take a lot of load off maintainers, due to:

  • less patching necessary
  • easier upgrades, because the risk of introducing conflicts is much lower and conflicts could easily be solved by just keeping the previous version around
  • easier additions of new packages, for the same reason
  • probably more work could be automated, since less patching by humans is required

In general, I believe it could make nixpkgs more flexible and allow for a rolling-upgrade style. New versions of libraries could be added on top, while old versions are slowly phased out. There wouldn’t be these hard cuts where a massive number of packages has to be updated simultaneously every once in a while.

2 Likes