How to speed up deploys of a personal website deployed with `nix copy`?

I have a VPS running personal server stuff. It runs NixOS, and I deploy to it by building a NixOS system locally, sending the closure to it with `nix copy -s`, then SSHing in and calling `switch-to-configuration` on the derivation in the store.

One of the services it provides is hosting for my website, which is statically generated. I have Nix expressions that build the directory to serve and plumb them directly into the server’s `configuration.nix`, which has worked well so far. As the site gains more images and media, though, deploys are taking much longer than I’d like:

  1. I have git repos for the site generator and content. I make changes and push.
  2. I tell my build scripts to update which revision they’re pointing to. This runs `nix-prefetch-git`, which takes a while, since it re-downloads the content repo in particular each time.
  3. The derivation for the built site gets rebuilt by the call to `nix build` that builds the whole NixOS system. (This seems to fetch the files from git again?)
  4. Despite being really similar to the previous version of the site, I have to upload the whole derivation again. My upload speed is not especially good, so this is “go make a coffee” amounts of time.

The big one is point 4. I see a few ways around it:

  1. Build on the server. Doesn’t feel good from a secrets-management perspective.
  2. Point the webserver at some directory like `/var/www` and `rsync` site content there separately, instead of regenerating the entire server’s NixOS. Doable, but it feels like a betrayal of the ideals of NixOS.
  3. Something else that lets me capture a delta against the previous site content derivation. This could work, but might create long chains of dependencies in the store, as each deployed site content derivation would depend on the previous one. I don’t know how to achieve this. (Content would be deduplicated in an optimised store, so it’s fine from that perspective.)
> Despite being really similar to the previous version of the site, I have to upload the whole derivation again. My upload speed is not especially good, so this is “go make a coffee” amounts of time.
>
> The big one is point 4. I see a few ways around it:
>
> 3. Something else that lets me capture a delta against the previous site content derivation. This could work, but might create long chains of dependencies in the store, as each deployed site content derivation would depend on the previous one. I don’t know how to achieve this. (Content would be deduplicated in an optimised store, so it’s fine from that perspective.)

These dependencies are not runtime dependencies, though, so you can GC the old stuff anyway.

Also, you can use a fixed-output derivation: build once, checksum the result, then reuse the same build process but with a now-known output hash. Another derivation (via binary patching) can then produce the same output path. Note that the binary-patching part is a build that will need to be done on the server, but the secrets are no longer necessary at that point.
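A minimal sketch of that fixed-output idea; the content path, generator invocation, and hash are all placeholders:

```nix
{ pkgs ? import <nixpkgs> {} }:
pkgs.stdenv.mkDerivation {
  name = "site-content";
  src = ./content;                            # assumed content checkout
  # nativeBuildInputs = [ siteGenerator ];    # assumed generator package
  buildPhase = "site-generator --out $out";   # assumed generator interface
  dontInstall = true;

  # Fixed output: the hash comes from an earlier, ordinary build of the same
  # content (e.g. `nix hash path ./result`). Any builder that produces
  # byte-identical output, including a binary-patching one running on the
  # server, realises the same store path.
  outputHashMode = "recursive";
  outputHash = "sha256-...";                  # placeholder
}
```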

  4. Follow how Nix itself does things: create multiple derivations and symlink everything together, so that only the new sub-derivations and a symlinks-only top level need to be uploaded (see the sketch below).
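For instance, a sketch using `pkgs.symlinkJoin`, with `siteGenerator`, its command-line interface, and the repo layout all assumed:

```nix
{ pkgs ? import <nixpkgs> {}, siteGenerator }:  # siteGenerator is an assumed package
let
  # Media changes rarely, so it gets its own derivation; its store path (and
  # hence the upload) only changes when the images themselves change.
  images = pkgs.runCommand "site-images" {} ''
    mkdir -p $out
    cp -r ${./content/images}/. $out/
  '';

  # The HTML is cheap to rebuild and small to upload.
  html = pkgs.runCommand "site-html" { nativeBuildInputs = [ siteGenerator ]; } ''
    site-generator --in ${./content} --out $out  # assumed generator interface
  '';
in
# The top level contains only symlinks, so after an HTML-only change just the
# new html derivation plus this tiny join get copied to the server.
pkgs.symlinkJoin {
  name = "site-root";
  paths = [ images html ];
}
```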

(probably there are more options)

Thanks for your help. Sub-derivations and symlinks sound pretty promising, but I think I’m not understanding something about Nix.

Suppose I have two versions C1 and C2 of my site content repo. The only changes are in Markdown files, so the images and other media are unchanged.

I can set up a Nix derivation D that takes the site generator G (which doesn’t change in this example) in `nativeBuildInputs` and the content repo as `src`, and produces outputs `out` and `images`. If I build D with G and C1, and then build D again with G and C2, won’t each `images` output have a different hash, even though the files inside each are the same?
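Concretely, I mean something like this (the generator interface is made up):

```nix
{ pkgs ? import <nixpkgs> {}, siteGenerator, contentRepo }:
pkgs.stdenv.mkDerivation {
  name = "site";
  src = contentRepo;                      # C1 or C2
  nativeBuildInputs = [ siteGenerator ];  # G
  outputs = [ "out" "images" ];
  buildPhase = ''
    # assumed generator interface: HTML to $out, media to $images
    site-generator --html-out "$out" --media-out "$images"
  '';
  dontInstall = true;
}
```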

If not, what am I missing? Am I trying to build this the wrong way?

Well, I guess you will have to have your images in a separate derivation, and maybe even a single fixed-output derivation per image. I am not sure what your generation pipeline should look like; I would start with a simple script that takes the images-raw storage directory and makes sure that an images-nix directory (inside the repository used as an input to the site generator, I guess) contains GC-root symlinks to the results of `nix-prefetch-url` on `file:///` URLs for all images in images-raw.
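The expression such a script could generate might look roughly like this (the paths are assumptions; the hashes are whatever `nix-prefetch-url` printed). Since prefetching already put each file into the store under its fixed-output path, nothing is fetched again at build time:

```nix
{ pkgs ? import <nixpkgs> {} }:
{
  # One fixed-output fetch per file in images-raw; an unchanged image keeps
  # the same store path across site versions, so it never needs re-uploading.
  "logo.png" = pkgs.fetchurl {
    url = "file:///srv/site/images-raw/logo.png";  # assumed location
    sha256 = "...";  # hash printed by nix-prefetch-url
  };
  # ...one attribute per image in images-raw
}
```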

1 Like

The best way to be more incremental in Nix is to cut the builds into smaller derivations. For example, the Styx Static Site Generator is interesting in that regard.

It would be nice if Nix supported binary diffing between two derivations during copy.


I think I’ll try using Hakyll only for the HTML/CSS, write a program that generates fixed-output Nix expressions for each image in the repo, and then symlink the lot together.

Thanks for the pointers.

I have written and released a tool called `nix-freeze-tree` that makes splitting the derivation output much easier.
