For practice, I’m trying to package a very simple bash script with two dependencies (https://github.com/fcambus/ansiweather/blob/ed4e421e76effe7e6c64eb89ad4e46a043e5dbd7/ansiweather). It depends on jq and bc at runtime, so these need to go in buildInputs. Unfortunately, when I build the package and run the result, the script cannot find jq or bc. I’m building the derivation by adding it to nixpkgs via an overlay and then running nix-build '<nixpkgs>' -A ansiweather. Running ./result/bin/ansiweather then fails with ERROR: Cannot find jq binary. Can someone point out where the error is? Below is my derivation:
buildInputs only makes the dependencies visible at build time. E.g., if you compile a C program, the headers and libraries of the derivations in buildInputs are made available while building the derivation. During linking, the full library paths (e.g. /nix/store/[hash]-name-version/lib/libblah.so) are then embedded in the binary. After the derivation is built, buildInputs are not otherwise visible to the program anymore. There are two straightforward solutions to your problem:
1. Replace occurrences of jq and bc by ${jq}/bin/jq and ${bc}/bin/bc when building the derivation, so that the script in the resulting derivation contains the full paths to these binaries.
2. Wrap the script with wrapProgram (from makeWrapper), prefixing its PATH with stdenv.lib.makeBinPath [ jq bc ] so that jq and bc can be found at runtime.
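A sketch of the first approach (the version, hash placeholder, and exact --replace patterns are illustrative, not the poster’s actual derivation — the patterns depend on how the script invokes the binaries):

```nix
{ stdenv, fetchFromGitHub, jq, bc }:

stdenv.mkDerivation {
  pname = "ansiweather";
  version = "unstable";  # illustrative

  src = fetchFromGitHub {
    owner = "fcambus";
    repo = "ansiweather";
    rev = "ed4e421e76effe7e6c64eb89ad4e46a043e5dbd7";
    sha256 = stdenv.lib.fakeSha256;  # replace with the real hash
  };

  # Hardcode the store paths of the runtime dependencies into the script.
  # Beware that plain --replace substitutes every occurrence of the
  # substring, so check how the script actually refers to jq and bc.
  postPatch = ''
    substituteInPlace ansiweather \
      --replace jq ${jq}/bin/jq \
      --replace bc ${bc}/bin/bc
  '';

  installPhase = ''
    install -Dm755 ansiweather $out/bin/ansiweather
  '';
}
```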
Many thanks for the responses! I used approach 2 since it seemed to be more robust. I didn’t manage to find any documentation for stdenv.lib.makeBinPath. It’s a useful tool and should be documented somewhere.
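For reference, a sketch of approach 2, showing only the relevant attributes (makeWrapper provides the wrapProgram shell function). stdenv.lib.makeBinPath [ jq bc ] evaluates to "${jq}/bin:${bc}/bin", which the wrapper prepends to PATH before running the real script:

```nix
{ stdenv, fetchFromGitHub, makeWrapper, jq, bc }:

stdenv.mkDerivation {
  pname = "ansiweather";
  # version, src, etc. as in the derivation above...

  nativeBuildInputs = [ makeWrapper ];

  installPhase = ''
    install -Dm755 ansiweather $out/bin/ansiweather
    # Wrap the script so that jq and bc are on its PATH at runtime.
    wrapProgram $out/bin/ansiweather \
      --prefix PATH : ${stdenv.lib.makeBinPath [ jq bc ]}
  '';
}
```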
Suppose that you have a program that depends on jq and zlib. Adding these derivations to buildInputs has, among other things, the following effects:
Making their store paths available in the build sandbox.
Adding their bin directories to PATH (here: ${jq}/bin).
Adding their library directories to NIX_LDFLAGS (here: ${zlib}/lib).
Adding their include directories to NIX_CFLAGS_COMPILE.
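A minimal sketch that makes this observable (attribute names illustrative; the exact contents of these variables vary by stdenv version) — building it writes the environment that stdenv set up for the buildInputs into $out:

```nix
with import <nixpkgs> {};

stdenv.mkDerivation {
  name = "buildinputs-demo";
  buildInputs = [ jq zlib ];
  # During the build, PATH, NIX_LDFLAGS, and NIX_CFLAGS_COMPILE contain
  # the jq and zlib store paths set up by the stdenv setup hooks.
  buildCommand = ''
    { echo "PATH=$PATH"
      echo "NIX_LDFLAGS=$NIX_LDFLAGS"
      echo "NIX_CFLAGS_COMPILE=$NIX_CFLAGS_COMPILE"
    } > $out
  '';
}
```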
This makes these dependencies visible during the build process. You can sort of emulate this (without the sandboxing) with nix-shell -p zlib jq --pure and then inspecting the environment with export or env.
They are not visible after building in the sense that there is no global namespace that is visible to your program that has jq or zlib. So if your program uses these dependencies at runtime, it has to encode their full paths. E.g. a binary could link to the dynamic library /nix/store/byijk75wjm55sjngzl2zls2hgrg49lal-zlib-1.2.11/lib/libz.so.1 or execute /nix/store/p0lbj9rz96shr4lyh4js60zyizs70h6q-jq-1.6-bin/bin/jq.
However, if a program links to libz.so.1 or executes jq without a fully-specified path, it will not work, since these are not visible to the program anymore (jq is not on PATH, and the directory with libz.so.1 is not in LD_LIBRARY_PATH).
This is also the reason why precompiled binaries with dynamic linking typically do not work out-of-the-box on NixOS. They have to be patched (with e.g. patchelf) to use fully-specified library dependencies.
In other words, buildInputs tells the build environment about possible runtime dependencies. It is up to the build process (Makefiles, CMake scripts, compilers, linkers, etc.) to store the absolute paths to dependencies in the build result. The linker does this pretty well by storing the shared library paths in the DT_RUNPATH entry of ELF files (thanks to our patches), but more dynamic platforms might require some help (typically hardcoding program paths with a patch, or wrapping executables in a script that passes runtime dependency paths through environment variables).
nativeBuildInputs fill a similar role for telling the build environment about build-time dependencies but their job is easier since they do not need to persist when the build environment ceases to exist.
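A sketch of how the two attributes split (the package choices are illustrative):

```nix
{ stdenv, cmake, pkgconfig, zlib }:

stdenv.mkDerivation {
  pname = "example";
  # version, src, etc. ...

  # Build-time tools: they run during the build, and their paths do not
  # need to survive once the build environment is gone.
  nativeBuildInputs = [ cmake pkgconfig ];

  # Possible runtime dependencies: the build process is expected to record
  # their store paths in the result (e.g. in DT_RUNPATH).
  buildInputs = [ zlib ];
}
```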
https://github.com/abathur/resholved is also relevant for packaging shell scripts! Still a WIP according to its README, but it looks like a principled solution to this problem.