I have a set of Python packages that I am adding to nixpkgs. The main motivation is that I am trying to make sure that nixpkgs has the top 360 most downloaded Python packages (http://py3readiness.org/).
Should I include everything in one pull request with 15 commits?
Or is it better practice to do it in batches?
I should mention that I have already added 20 or so packages to nixpkgs, so I am relatively experienced, and I have made sure that all tests run and follow best practices. Is there anything I can do to make the code reviewers' job easier?
As you said, you’re experienced with nixpkgs, so you probably know how to make good commits.
I would expect to see one PR, instead of multiple PRs depending on other PRs. It’s easier to review the intended end result than to piece together the PRs and verify them in isolation. At least, that’s how I’d approach it.
When updating N packages, I’ve often seen a single PR with N commits, each with a message like:
foopkg: init at X
barpkg: init at Y
bazpkg: 1.0 → 2.0
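As a concrete sketch of that one-commit-per-package convention (the package name, version, and path below are made up for illustration, mirroring nixpkgs’ python-modules layout):

```shell
# Illustrative only: a throwaway repo demonstrating one commit per package,
# using the "<attr>: init at <version>" message style described above.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email "you@example.com"
git config user.name "Example"

# Hypothetical package path, following the nixpkgs python-modules layout.
mkdir -p pkgs/development/python-modules/foopkg
echo '{ }' > pkgs/development/python-modules/foopkg/default.nix
git add pkgs
git commit -q -m "python3Packages.foopkg: init at 1.0.0"

git log --oneline
```

Each package landing in its own commit means a reviewer can step through `git log` one package at a time, and an individual commit can be reverted without touching the others.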
I wouldn’t have multiple PRs open that depend on a particular merge order between them. That said, if you have one commit ready to go that packages foopkg and you’re still working on the rest, it doesn’t hurt much to send a PR for just that package and get it reviewed and merged while you continue working locally on the remainder.
I might do something similar myself, since I’d like to get the PyPI packages localstack and localstack-client (a cloud testing and mocking tool for AWS) into nixpkgs; localstack is currently ~2000th in popularity and has a large dependency closure.
I want to add that the more packages we add to Nixpkgs, the harder it gets to keep a working package set. It’s already extremely hard to keep everything at its most recent version without breaking other packages.
Also, in the case of pure Python packages, pypi2nix may be a good solution. I think it will work quite well for localstack.
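For reference, a minimal sketch of what a pypi2nix invocation might look like. The flags here are assumptions about the tool’s CLI and may differ between versions, so check `pypi2nix --help` for the version you actually have:

```shell
# Assumed invocation (flags are an assumption, not verified against a
# specific pypi2nix release): -V selects the Python version, -e names a
# PyPI package to translate into Nix expressions.
# Guarded so the sketch is a no-op where pypi2nix is not installed.
if command -v pypi2nix >/dev/null 2>&1; then
  pypi2nix -V python3 -e localstack
else
  echo "pypi2nix not installed; see the pypi2nix README"
fi
```

The generated expressions are best treated as a starting point; packages added to nixpkgs itself are usually hand-written with `buildPythonPackage` so they fit the shared python-modules set.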