I can relate to this issue, and I am sure others can too. I have had several PRs open for weeks while some get merged in less than an hour. I think there are a few reasons for this. I also want to state up front that I am sure the maintainers are swamped.
So my question for the maintainers is: what can we do to make accepting PRs easier for you?
One method that I know of is to add yourself to the ofborg known users. This lets you trigger builds of your own PRs instead of a maintainer having to issue the commands manually, and it allows you to “prove” that the package builds properly (making merging simple for the maintainer).
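For illustration, triggering a build as a known user is done by commenting on the PR with the bot's build command; the exact bot handle has changed over time, and `hello` here is just a stand-in for whatever package attribute your PR touches:

```
@GrahamcOfBorg build hello
```

The bot then builds that attribute on its builders and reports the result back on the PR, so reviewers can see a green build without running anything themselves.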
But I think the biggest barrier to PR attention has to do with several factors:
complexity: is it a simple refactor, an update, or the init of a new package? If it’s not trivial, it will be harder to get attention.
dependencies: do many packages depend on this package? If so, it has the potential to impact many users if not built properly, so it needs extra attention to make sure it is done right.
trust: have the maintainers seen you submit many PRs before? I know that my first few PRs took a long time to get right due to agreed-upon standards (capitalize this, no periods, etc.).
All of this takes time, and keep in mind that nixpkgs receives around 20 issues and 50 PRs per day. I wonder if GitHub is a bottleneck?
Yeah, it’s definitely annoying. If you have any specific ones to list here, I would be happy to take a look at them. There is definitely an ongoing struggle in handling our massive PR load in a robust way. I think we’ve gotten better, but we have also had lots of new contributors — which is great, but requires more time to go through.
For the last month or two I have not spent much time reviewing and merging PRs, mostly because:
all these tiny (Python) packages that keep being added cause a lot of work for relatively little gain, at least for those that can merge
issues with Hydra/staging. Instead of individual Python package updates, I prefer to do a batch upgrade to reduce the amount of work. Often, however, there are transient failures, or there is simply too much load.
it becomes work with little joy for those that review/merge.
the more packages that end up in Nixpkgs, the harder it gets to perform actual improvements.
there are too few people that can actually merge, and too few that maintain the packages they contributed.
Which is a pity, because there may be users out there willing to lend a hand but too shy to ask, or not sure they have the required skills. Or just needing a nudge to get started.
Could we consider having a “community manager” or something like that? Someone responsible for granting the required access and getting new committers started? This could surely be discussed live at NixCon.
It would help to have a system that reports metrics on every contributor. Then we could create some rules based on that, and even automate them: after N contributions, get access to ofborg; after N+M, merge access; …
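The rule sketched above could be expressed as a simple threshold function. The tier names and threshold values below are purely illustrative assumptions, not anything nixpkgs actually uses:

```python
# Hypothetical access tiers based on merged-contribution counts.
# The concrete numbers stand in for the "N" and "N + M" in the post.
OFBORG_THRESHOLD = 10   # "N" merged contributions -> ofborg known-user
MERGE_THRESHOLD = 50    # "N + M" merged contributions -> merge access


def access_tier(merged_pr_count: int) -> str:
    """Map a contributor's merged-PR count to an access tier."""
    if merged_pr_count >= MERGE_THRESHOLD:
        return "merge"
    if merged_pr_count >= OFBORG_THRESHOLD:
        return "ofborg"
    return "contributor"
```

A bot could run something like this against contribution metrics and open the access-granting PRs automatically, leaving only the final approval to a human.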
Does it actually require that? I thought becoming a known user pretty much just requires creating a PR on the ofBorg repository, since there is little to no security risk involved. I think a recommendation would only be needed for trusted users (known user + Darwin tests).
As I’ve said on IRC today I think “known user” status could actually be replaced by “has that user contributed to nixpkgs before”.
We don’t have a formal process for that, but I think you would qualify as a maintainer based on your past contributions. I will ask on IRC. @zimbatm, do we have a list of people that can add members to the nixpkgs organization, or is it just domenkozar?
Not my own, but the (Python) TensorFlow derivation is currently pretty useless, since it fails to open a shared library when tf.contrib is used. Since many machine learning projects use at least some function from tf.contrib, they currently don’t work. A PR that solves this problem has been lingering for three weeks now: