Something you can do is wrap them using `makeWrapper`. That page is slightly wrong in that it's not actually available in the standard shell environment but is instead exposed via a setup hook on the `makeWrapper` package. Anyway, you can use it to ensure `PATH` contains all of the dependencies the script needs, so if the script runs commands by looking them up in the path they'll just work and won't be reliant on your user environment.

The annoying part is that you can't automatically determine what the dependencies are, or verify that you haven't missed any (well, not without using something like the aforementioned `resholve`, which parses the whole script). But if you're willing to figure out what dependencies to declare, you can do something like
```nix
runCommandLocal "my-script.sh" {
  script = ./my-script.sh;
  nativeBuildInputs = [ makeWrapper ];
} ''
  makeWrapper $script $out/bin/my-script.sh \
    --prefix PATH : ${lib.makeBinPath [ bash jq otherDeps ]}
'';
```
This will produce a package with `$out/bin/my-script.sh` that's a wrapper that sets up `PATH` before calling your real script. Your real script should be using `/usr/bin/env` in its shebang (e.g. `#!/usr/bin/env bash`).

This approach means you can use your scripts unmodified (as long as they're using `/usr/bin/env` in their shebang), and therefore the scripts can also be run as-is without Nix.
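For reference, the wrapper that `makeWrapper` generates is just a small shell script, roughly along these lines (store paths abbreviated here; the exact contents depend on your nixpkgs version and the flags you pass):

```shell
#!/nix/store/…-bash-5.2/bin/bash
# --prefix PATH : … becomes a PATH export ahead of the real script
export PATH="/nix/store/…-jq-1.7/bin:/nix/store/…-bash-5.2/bin${PATH:+:$PATH}"
exec "/nix/store/…-my-script.sh" "$@"
```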
Alternatively, if your goal here is "provide dependencies via Nix" rather than specifically getting the scripts themselves into the Nix store, you could just use `nix-shell` as your script interpreter in order to declare the dependencies inline, and then share your scripts however you want. This does tie the scripts to Nix, though, so they can't be used elsewhere.
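A minimal sketch of that pattern (assuming `jq` is the script's only dependency; the second `#!nix-shell` line is where the packages are declared):

```shell
#!/usr/bin/env nix-shell
#!nix-shell -i bash -p jq
# nix-shell reads the second line above: -i bash means "run this file
# with bash", and -p jq puts jq on PATH for the duration of the script.
jq --version
```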
A third option is to modify your scripts to use `@foo@` tokens for all dependencies and use `substituteAll` to replace them with the actual store paths at build time.
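That could look roughly like this (a sketch, assuming a script that invokes `@jq@` instead of plain `jq`; any extra attributes you pass to `substituteAll` become the replacement values for the matching `@…@` tokens):

```nix
# my-script.sh contains e.g.:  @jq@ '.name' package.json
pkgs.substituteAll {
  src = ./my-script.sh;
  dir = "bin";              # install the result as $out/bin/my-script.sh
  isExecutable = true;      # preserve the executable bit
  jq = "${pkgs.jq}/bin/jq"; # replaces every @jq@ token in the script
}
```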