Best way to manage sha256s?


I use flakes everywhere nowadays; managing dependencies with flakes is easy. I just run nix flake lock --update-input and things are updated. But sometimes I need to use fixed-output derivations. For example:

rust-toolchain = fenix.fromToolchainFile {
  file = ./rust-toolchain.toml;
  sha256 = lib.fakeSha256;
};

Because fenix fetches a manifest from the internet, I need to specify its sha256.


((vscode.override { isInsiders = true; }).overrideAttrs (oldAttrs: rec {
  src = builtins.fetchTarball {
    url = "";
    sha256 = lib.fakeSha256;
  };
}))

To get the latest vscode insider build.

If I want to update those, I need to edit the file to replace the sha256 with fakeSha256, run nix, and copy the real hash from the error message back into the file. This is inconvenient; I want to just run a command like nix flake lock --update-input and be done with it.
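That manual dance can be sketched as a shell snippet (the file name and pinned hash here are hypothetical, just to show the replace step):

```shell
# Hypothetical demo of the manual update dance: reset the pinned hash to the
# fake value so the next nix build fails and prints the real hash to copy back.
cat > example.nix <<'EOF'
sha256 = "sha256-oldPinnedHashValue=";
EOF

# Step 1: blank out the hash; nix will then report the correct one in its error.
sed -i 's/sha256 = "[^"]*"/sha256 = lib.fakeSha256/' example.nix
cat example.nix
```

Steps 2 and 3 (running nix and copying the reported hash back) still have to be done by hand, which is exactly the annoyance.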

The solution?

The idea I came up with is to separate the sha256s into a separate nix file that’s easy to search and replace, then have a script do the steps to update them. Then I remembered niv. And yeah, it turns out niv does support adding arbitrary URLs as dependencies, and I can get the sha256 from sources.nix.
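For illustration, here is a sketch of what reusing niv’s recorded hash could look like (the source name "rust-manifest" is hypothetical; nix/sources.nix is the file niv generates, and its entries carry the sha256 recorded in sources.json):

```nix
let
  # Generated by niv; "rust-manifest" would be a plain-URL source
  # added with `niv add` (name is illustrative).
  sources = import ./nix/sources.nix { };
in
fenix.fromToolchainFile {
  file = ./rust-toolchain.toml;
  sha256 = sources.rust-manifest.sha256;
}
```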

Problem is, niv won’t update them. You see, niv expects you to give it a URL template; when you update, you assign new values to the variables in the template to generate a new URL for niv to fetch. If the URL is fixed, niv does nothing.

I opened a feature request for this, but since niv is not actively maintained anymore, I doubt it will ever be implemented.

Request for help

So, how does everyone else do this? Is there a better way? Maybe flakes should support non-flake inputs?


Edit: also see

Using updateScript means I need to manually write update scripts for each of these cases where a fixed-output derivation is used. I just don’t want to do that, and wish there were a scalable way of doing this. niv comes very close to that goal.

And I know about nix-update; it doesn’t look like it would do what I want either.


To throw some other options into the mix (though I haven’t checked if they’d work for you):


They look interesting, but I think neither supports arbitrary URLs?

Anyway, I’ve already whipped something together quickly that does what I want. It even knows to use ETag and Last-Modified headers to detect changes, which I think none of those tools do. If there’s interest I might polish it and make it public; if not, I’ll just keep it to myself.
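The ETag/Last-Modified trick can be sketched like this (a hypothetical helper, not the author’s actual tool): remember the validator headers from the previous fetch and send them back as conditional request headers, so an unchanged URL answers 304 Not Modified and re-hashing can be skipped entirely.

```python
# Sketch: build the conditional request headers from the validators cached
# at the previous fetch. A server supporting them replies 304 if the
# resource is unchanged, so we avoid re-downloading and re-hashing it.
def conditional_headers(cached: dict) -> dict:
    """cached: dict with optional 'etag' / 'last_modified' from the last fetch."""
    headers = {}
    if cached.get("etag"):
        headers["If-None-Match"] = cached["etag"]
    if cached.get("last_modified"):
        headers["If-Modified-Since"] = cached["last_modified"]
    return headers

print(conditional_headers({"etag": '"abc123"',
                           "last_modified": "Tue, 01 Jan 2030 00:00:00 GMT"}))
```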


I’ve been using nvfetcher (GitHub: berberman/nvfetcher, “Generate nix sources expr for the latest version of packages”) for years to solve this; it supports arbitrary URLs, among a variety of more specific things (e.g. docker images).

Maybe flakes should support non-flake inputs?

They do with flake = false



oooh, this is very nice.

flake = false;

I didn’t know they did! I think this is basically what I want? I found out I can even set type = "tarball" to unpack the downloaded file (I don’t think this is documented).

Do we still need a tool like nvfetcher when flakes support non-flake inputs?

Ah wait. The narHash I get from a flake input isn’t the same as sha256 for fetchers :frowning:

OK, I tried nvfetcher. The interface is kind of cryptic, but I think I figured it out:

This is the nvfetcher.toml I was using:

src.manual = "0"
fetch.url = ""

I ran nvfetcher and got an error:

Command line: nix-prefetch-url
Exit code: 1
error: store path '7q7y75jlkzv2risc1qj5qz37gg9ycfcd-download?build=insider&os=linux-x64' contains illegal character '&'

It is quite disappointing that it doesn’t even know it needs to sanitize store path names.

Edit: It also doesn’t seem to support refreshing the hash of a fixed URL. I think I need to manually change src.manual to trigger an update.

OK, since non-flake flake inputs exist, this is pretty much solved: just use flake inputs! You can even set type = "tarball" to unpack the file fetched.
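A sketch of that for the vscode insiders case (the URL is illustrative; type = "tarball" forces the download to be unpacked):

```nix
{
  inputs.vscode-insiders = {
    type = "tarball";  # unpack the fetched file
    url = "https://update.code.visualstudio.com/latest/linux-x64/insider";
    flake = false;
  };

  outputs = { self, vscode-insiders, ... }: {
    # vscode-insiders is the unpacked tree in the store; refreshing is just
    # `nix flake lock --update-input vscode-insiders`.
  };
}
```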

It does; src.manual is the escape hatch for when you don’t want automated updates for a specific source, so by setting it you chose not to refresh hashes.

The configuration is indeed somewhat cryptic; it mixes nix-prefetch and nvchecker options, but nvchecker gives it a very rich set of “new version” heuristics.

Here’s how I scrape an HTML page for whether a “new version” of a package exists, for example:
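A sketch of what such a config can look like (the package name, URL, and regex are all illustrative, and the src.webpage/src.regex key names are an assumption about nvfetcher’s nvchecker passthrough, so double-check against its README):

```toml
# Illustrative only: watch a download page, treat the first version-looking
# match as the latest version, and fetch the matching tarball.
[mytool]
src.webpage = "https://example.com/mytool/download.html"
src.regex = "mytool-([0-9.]+)\\.tar\\.gz"
fetch.url = "https://example.com/mytool/mytool-$ver.tar.gz"
```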

I’m somewhat surprised by the lack of sanitization as well, though. Probably worth filing an issue upstream.

There is still a niche for tools like nvchecker: flake inputs lack any ability to distinguish between higher-level “versions”; you really do just get a hash for whatever the current thing at the URL happens to be, with no way to update to a different URL when the version changes.

Fine for a lot of use cases, but not quite all. I wonder if some of the recent developments around making flake inputs support something like semver can tackle that problem too.