Consider a use case where we are:
- installing, with Nix, multiple tools for our development environment
- pinning nixpkgs as described in *Towards reproducibility: pinning Nixpkgs* (nix.dev documentation)
- using this same Nix setup in our automation (e.g. CI or CD, which use many of the same tools as development, such as `go` or `terraform`). Doing this is good for increasing the parity between development, testing, and production environments.
- versioning each tool separately. In the spirit of breaking down large tasks and submitting small pull requests, we don't want to update the versions of all of our development tools at once.
E.g.:

```nix
let
  go = (import (fetchTarball "https://github.com/NixOS/nixpkgs/archive/<COMMIT-HASH-1>.tar.gz") { }).go;
  terraform = (import (fetchTarball "https://github.com/NixOS/nixpkgs/archive/<COMMIT-HASH-2>.tar.gz") { }).terraform;
  # many (e.g. 20 or more) other tools
in # ...
```
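For context, a minimal `shell.nix` sketch of this setup might look like the following. This is an illustration, not my actual file: the commit hashes are placeholders, and `mkShell` is taken arbitrarily from the first pinned nixpkgs.

```nix
let
  # Each tool is pinned to its own nixpkgs revision (hashes are placeholders).
  pkgs1 = import (fetchTarball "https://github.com/NixOS/nixpkgs/archive/<COMMIT-HASH-1>.tar.gz") { };
  pkgs2 = import (fetchTarball "https://github.com/NixOS/nixpkgs/archive/<COMMIT-HASH-2>.tar.gz") { };
in
pkgs1.mkShell {
  # Each entry here forces a download of a separate full nixpkgs tarball.
  buildInputs = [ pkgs1.go pkgs2.terraform ];
}
```

Note that each distinct `fetchTarball` call downloads and unpacks a complete nixpkgs snapshot, which is what causes the problem described below.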
**Problem:** downloading each of these many tarballs can add a lot of bloat to automation environments. For example, downloading the nixpkgs archive at commit `1c4d0f130b0536b68b33d3132314c9985375233c` gave me 47.7 MB zipped, 181.7 MB unzipped. Multiply this by the number of tools you're using through Nix and things get unwieldy pretty quickly: long build times and large storage in caches (e.g. Docker container registries). The overwhelming majority of the files inside these tarballs are never used.

What are possible solutions that ameliorate this problem while making little or no compromise?