When we package software that has no stable release yet and is under active development, we could update the package after each commit, which could happen multiple times a day.
That does not seem useful, as each update costs time to verify and build.
What should the criteria be for updating such a package?
I think the changes should be significant for the user (new features, a bugfix that affects users), or the package hasn’t been updated in some time (say, a month).
I’m a maintainer of bcachefs, and @eadwu has done very frequent updates lately. That’s why I came up with this topic.
I’m not sure if anyone actually uses the package and if it is worth the effort.
What do you think?
bcachefs is a curious case since it is indeed very unstable and regularly gets relatively crucial fixes, but also new features and, with them, bugs. That is why I personally use a pinned version that I upgrade when it’s very out of date, when I want a new feature, or when I need a bugfix.
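The pinning described above can be done with an overlay that overrides the package’s source to a fixed revision. This is only a minimal sketch: the `version`, `rev`, and `sha256` values are placeholders, not real ones, and the attribute name `bcachefs-tools` is assumed.

```nix
# Sketch: pin bcachefs-tools to a known-good revision via an overlay.
# rev and sha256 below are PLACEHOLDERS — substitute a commit you have tested.
final: prev: {
  bcachefs-tools = prev.bcachefs-tools.overrideAttrs (old: {
    version = "unstable-pinned";
    src = prev.fetchFromGitHub {
      owner = "koverstreet";
      repo = "bcachefs-tools";
      rev = "<known-good-commit-hash>";
      sha256 = "<hash of that source tree>";
    };
  });
}
```

Because the revision and hash are fixed, the build stays reproducible and survives channel updates until you deliberately bump the pin.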
I’m not sure we can come up with a sensible packaging strategy for bcachefs as long as bugfixes and new features land on the same branch and no revisions are designated ‘likely working’, since no promise of stability (some of which we want even on unstable) can be made at all.
Well, I don’t have many packages, but I generally wait for new releases/tags. I think a release marks a clear checkpoint of the product.
If there is no such thing, users craving the bleeding edge should have a script/derivation/some other way, like Gentoo’s live (9999) ebuilds, which always fetch the latest commit.
But I don’t think this would be repo-worthy… Updating the repo twice a month is plenty IMHO.
I don’t think we need hard criteria – it’s the maintainer who’s putting in the work, and if they want to update it twice a week or whatever, that’s up to them.

For unstable packages I maintain, I’d update them if there was something in a newer version that made it better for me or other users. If 100 commits had been pushed but they were all refactoring, I wouldn’t.
TBH, I think unstable packages would make more sense if they weren’t built by hydra and the source revision was mutable. That way the package always fetches whatever is latest on master, and is a source build. Currently, unstable packages are like packages that use snapshots off the source repository.
So like a git package in Arch Linux’s AUR or like how @gurkan mentioned Gentoo’s bleeding edge ebuilds.
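A mutable-source package of that kind could look roughly like the following. This is a sketch, not an endorsed pattern: the repository URL is an assumption, and because no `rev` is given, evaluation follows whatever the branch tip currently is.

```nix
# Sketch: an AUR-"-git"-style build that always tracks the master branch.
# NOT reproducible — each evaluation may fetch a different commit,
# so Hydra could not cache it and rollbacks would not be exact.
{ pkgs ? import <nixpkgs> {} }:

pkgs.bcachefs-tools.overrideAttrs (old: {
  version = "git";
  src = builtins.fetchGit {
    # URL assumed for illustration; use the project's actual repository.
    url = "https://evilpiepirate.org/git/bcachefs-tools.git";
    ref = "master";  # no rev given: follows the current branch tip
  };
})
```

The trade-off is exactly the one raised in the reply below it in this thread: you get the freshest code at the cost of determinism.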
> TBH, I think unstable packages would make more sense if they weren’t built by hydra and the source revision was mutable. That way the package always fetches whatever is latest on master, and is a source build. Currently, unstable packages are like packages that use snapshots off the source repository.
I wouldn’t like that – then system builds would no longer be deterministic. I couldn’t roll back my channel and get the same version of unstable packages.
IMHO new packages still in a dev phase with fast iterations are best distributed through NUR.
Packages that are updated very often add to the PR pile in the nixpkgs repo, especially when the maintainers of those packages don’t have the right to merge on their own.
youtube-dl (really just a random example), with more than 30 updates in 2019 alone, is a bit of a pain.
As maintainers, we may also send a friendly nudge upstream to tag a patch release if there are good things ready to be shipped.
youtube-dl might be a bit of an exception as it is one of those programs which have to put out a new release whenever an online service decides to make a breaking change.
Is that really an issue though? Unless you’re measuring PR count for some reason, I don’t see why it would matter. Reviewing and merging a trivial update PR should take very little time for a committer.