Building a dynamic derivation

I am currently creating a derivation for Shadow. I am using this PKGBUILD from AUR as a template.

I managed to make it work correctly, but I don’t know how to handle the dynamic part, where it downloads a YAML file, parses it, extracts the SHA of the image, and downloads the latest version accordingly. They update the file at this URL almost every week, so it seems that having a static hash could be an issue.

Is this feasible with mkDerivation?


The best you can do is hope they don’t remove old versions and do a double-stage setup.

Double-stage is a common practice: pair a static derivation with an update script.
The static derivation uses explicit URLs and hashes and is stored in Nixpkgs.
The update script is attached to the derivation (via passthru.updateScript) and is run occasionally to refresh that static information.

Sorry for pointing you to an issue, but I don’t yet know where the updateScript documentation lives: https://github.com/NixOS/nixpkgs/issues/61935
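
As an aside, update scripts for packages that live in Nixpkgs are normally run through maintainers/scripts/update.nix; if I remember the invocation correctly, something like this runs the script for a single package from a Nixpkgs checkout (here shadow stands for whatever attribute name the package ends up with):

nix-shell maintainers/scripts/update.nix --argstr package shadow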

You could implement updateScript for Shadow like this (not tested):

...
src =
  # source.json lives next to this expression and is committed with it
  let source = builtins.fromJSON (builtins.readFile ./source.json);
  in fetchurl {
    url = "https://update.shadow.tech/launcher/preprod/linux/ubuntu_18.04/${source.path}";
    # the yml publishes the sha512 as base64, which is what the SRI "sha512-..." form expects
    hash = "sha512-${source.sha512}";
  };
...
passthru.updateScript = pkgs.writeScript "update-shadow" ''
  #!${pkgs.runtimeShell}
  # re-fetch the upstream metadata and convert it to JSON
  # (this writes to the current working directory; you may want to make the path explicit)
  ${pkgs.curl}/bin/curl https://storage.googleapis.com/shadow-update/launcher/preprod/linux/ubuntu_18.04/latest-linux.yml \
    | ${pkgs.yq}/bin/yq -j . > source.json
'';
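
For reference, latest-linux.yml looks like electron-builder’s update metadata, so the generated source.json should contain at least something along these lines (field names assumed from that format, values are placeholders):

{
  "version": "<version string>",
  "path": "<installer file name>",
  "sha512": "<base64-encoded sha512 of the installer>"
}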

Then you just have to run the update script regularly (or set up some automation to do it for you). If the package is only for personal use, you can use IFD (import-from-derivation) instead:

with import <nixpkgs> { };
let
  # builtins.fetchurl downloads the yml at evaluation time (impure, but fine for personal use);
  # runCommand converts it to JSON, and readFile on its output triggers IFD
  source = builtins.fromJSON (builtins.readFile (
    runCommand "transform" { buildInputs = [ yq ]; }
      "cat ${
        builtins.fetchurl "https://storage.googleapis.com/shadow-update/launcher/preprod/linux/ubuntu_18.04/latest-linux.yml"
      } | yq -j . > $out"
  ));
  # source is now an attrset with all the upstream metadata
in stdenv.mkDerivation { ...
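
To finish that sketch, the parsed attrset can feed fetchurl the same way as in the updateScript variant above (untested; version, path and sha512 are the fields the yml appears to expose, and the install phase below is only a placeholder):

in stdenv.mkDerivation {
  pname = "shadow";
  version = source.version;
  src = fetchurl {
    url = "https://update.shadow.tech/launcher/preprod/linux/ubuntu_18.04/${source.path}";
    hash = "sha512-${source.sha512}";
  };
  # a real expression would unpack and install the way the PKGBUILD does;
  # as a placeholder, just copy the download into the output
  dontUnpack = true;
  installPhase = ''
    mkdir -p $out
    cp $src $out/
  '';
}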

In general, if upstream doesn’t provide historical releases, it is a PITA to package, because it is impossible to upstream it to Nixpkgs.


A similar method would be a local overlay that is regularly updated via the update script. But then Nixpkgs is not up to date, which may be fine for your use case.
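
For example, an overlay along these lines (the paths and attribute name are made up) keeps the expression and its source.json outside of Nixpkgs while the update script refreshes them in place:

# ~/.config/nixpkgs/overlays/shadow.nix
self: super: {
  shadow-launcher = super.callPackage ./shadow { };  # directory holding the expression and source.json
}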

In a way, the Let’s Encrypt module faces a similar problem because it clearly can’t have all the certs available for nginx, etc. So it uses a known location for the dynamic content and sets up services to update it.

In general, if upstream doesn’t provide historical releases, it is a PITA to package, because it is impossible to upstream it to Nixpkgs.

Well, if you are legally allowed to host some of the old versions yourself, you could host a mirror of the packages; the TeXLive expression in Nixpkgs does that (though its open-source nature helps there). But this is annoying for you to maintain…
