Very strange runtime error when running a built script in a shell

That’s still progress of some sort! :slight_smile:

There may ultimately be some sort of incompatibility between 2.x and 3.x that will cause trouble, but I think there are still a few simple-ish things to try before coming to that conclusion.

Can you try:

  • removing `export SPARK_ENV_LOADED=1` if you still have it set
  • removing `PATH` from your `shellHook`
  • in your override of `spark`, appending `procps` to its `buildInputs` with something like `buildInputs = old.buildInputs ++ [ pkgs.procps ];` (see the sketch just after this list)
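Roughly what I mean for that last one, as a sketch (assuming a `shell.nix` shaped like this; `spark'` is just a placeholder name, so adjust to match your actual override):

```nix
# A sketch, not a drop-in file: assumes a shell.nix and that the nixpkgs
# attribute is `spark`; adjust to match your setup.
{ pkgs ? import <nixpkgs> { } }:

let
  # procps provides `ps`, which spark's scripts call at runtime.
  spark' = pkgs.spark.overrideAttrs (old: {
    buildInputs = (old.buildInputs or [ ]) ++ [ pkgs.procps ];
  });
in
pkgs.mkShell {
  buildInputs = [ spark' ];
  # Deliberately no PATH tweaks and no SPARK_ENV_LOADED export here.
}
```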

For a little context:

I think you’re just running into some wonkiness around a ~gap in the Nix ecosystem roughly caused by the fact that shell scripts aren’t compiled:

  1. It’s easy for packagers to miss dependencies in shell scripts because Nix doesn’t have a process that’ll break/fail due to missing commands at build time.
  2. When a package’s scripts contain bare command invocations, we either have to:
    • add (~leak) all of the script’s runtime dependencies to the user or system `PATH`
    • find some way to patch in a fixed PATH at build time

I suspect #2 explains why the nixpkgs derivation for `spark` builds a fixed `PATH` into `conf/spark-env.sh`, and #1 explains why `ps` is missing (there may well be other missing commands, too).
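To make #2 a little more concrete, here’s an illustrative derivation showing the generic build-time fixed-PATH pattern with `wrapProgram` from makeWrapper. This is *not* spark’s actual derivation (which writes its PATH into `conf/spark-env.sh` instead), just the same idea in its most common form:

```nix
# Illustrative only: bake a fixed PATH into a packaged script at build time.
# Assumes a ./demo.sh next to this file.
{ stdenv, lib, makeWrapper, coreutils, procps }:

stdenv.mkDerivation {
  pname = "demo-script";
  version = "0.1";
  src = ./.;
  nativeBuildInputs = [ makeWrapper ];
  dontBuild = true;
  installPhase = ''
    install -Dm755 demo.sh $out/bin/demo
    # Prefix a known-good PATH so bare invocations like `ps` resolve even
    # in a minimal environment; the script text itself is left untouched.
    wrapProgram $out/bin/demo \
      --prefix PATH : ${lib.makeBinPath [ coreutils procps ]}
  '';
}
```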

(I have been building resholve to address this ecosystem gap [i.e., resholve breaks a build when a dependency is missing, and rewrites bare invocations to absolute store paths], but the initial Nix API is focused on straightforward packages of shell scripts. The Nix integration needs more work before it can ~easily tackle cases like this.)
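If you’re curious what that looks like for a simple case, here’s a sketch against resholve’s documented Nix API (exact attribute names may differ from whatever version you have, so treat this as a shape, not gospel):

```nix
# A sketch of resholve's Nix API for a straightforward script package.
{ pkgs ? import <nixpkgs> { } }:

pkgs.resholve.writeScriptBin "busiest-procs" {
  # The build fails if any invoked command can't be resolved from `inputs`;
  # resolvable commands get rewritten to absolute store paths.
  inputs = [ pkgs.procps pkgs.coreutils ];
  interpreter = "${pkgs.bash}/bin/bash";
} ''
  # Top 5 process names by count.
  ps -eo comm= | sort | uniq -c | sort -rn | head -5
''
```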
