Use proot or similar to build complex npm/Java libraries that try to download arbitrary executables

Many npm projects (and, as I understand it, Java suffers from similar issues, to the point that people prefer to package the .jar rather than compile from source) try to download arbitrary binaries when we install them, which of course link against libraries/loaders outside of the Nix store. As a result, using npm dependencies in Nix is quite a challenge, as most of the time the build will fail… which is quite frustrating (see e.g. some of the discussions around Electron). The usual patchelf approach is not effective because the installer downloads and runs the file in the same step.

I was however thinking: why couldn’t we use proot-like methods (used, for instance, with great success in Termux) to fake the path to /lib while the libraries are downloaded and run? This way we could run these binaries without any trouble when downloading the libraries. And it’s still possible to patch the libraries later, before we actually build our software, so we can get rid of the proot.
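For instance, such an invocation could look roughly like this (a hypothetical sketch, not an actual recipe: the glibc store path is a placeholder, and real binaries would likely need more bind mounts than just /lib):

```shell
# Sketch: give downloaded binaries the FHS paths they expect, only for the
# duration of `npm install`. The store path below is illustrative.
glibc=/nix/store/<hash>-glibc
proot \
  -b "$glibc/lib:/lib" \
  -b "$glibc/lib:/lib64" \
  npm install
```

proot’s `-b path:location` option makes `path` visible at `location` inside the guest, which is exactly the “fake /lib” trick used by Termux.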


Pros:

  • we could run basically any script to download the libraries without manual intervention
  • we don’t need to maintain thousands of Nix patches for the libraries
  • it’s generic enough that we should be able to apply this method to other languages quite trivially


Cons:

  • if the npm script changes over time, the downloaded files may have a different hash… so this method may require updating the hash from time to time (no idea how frequently) and may not be as sound as downloading a .zip archive directly. But still better than nothing.
  • it’s harder to nicely and individually package each library needed during compilation: the cache may grow quickly, since each package will have its own set of libraries.

What do you think? Is it something that has been considered before?

What you want is a development shell instead of a build. Builds intentionally do not allow downloading anything from the network; they are supposed to be reproducible. Nix already has methods for proot-like stuff (its own sandbox, and buildFHSUserEnv (Linux only)).

Well, if I want to package a program (e.g. an Electron-based program) I do want a build, not just a shell. And FHSUserEnv is not perfect either, as the integration inside these environments is quite bad (you can’t even run a program from the host inside the FHSUserEnv, you get tons of unwanted deps, you can’t nest an FHS inside an FHS…). What I proposed above tries to enable a proot/FHSUserEnv-like system only for a few seconds, just long enough to download the dependencies, and then patches the dependencies to drop you out of the FHS.
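Concretely, the “patch afterwards” step could be sketched like this (simplified and hypothetical: real code would also need to fix RPATHs with `--set-rpath`, and `$NIX_CC/nix-support/dynamic-linker` is the stdenv convention for locating the ELF interpreter):

```shell
# Sketch: after the proot-assisted `npm install`, rewrite the interpreter of
# every downloaded ELF binary so it no longer needs the FHS/proot layer.
interpreter=$(cat "$NIX_CC/nix-support/dynamic-linker")
find node_modules -type f -executable -print0 |
while IFS= read -r -d '' bin; do
  # patchelf fails on non-ELF files (scripts, data); ignore those.
  patchelf --set-interpreter "$interpreter" "$bin" 2>/dev/null || true
done
```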

Builds forbid network access unless the output of the derivation is fixed (that’s why we can still download the source of a program)… so as soon as npm is “deterministic enough” to always download the same stuff, it should be good enough. (I already discussed this point in the first of my cons above.)
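To make that concrete, this is the fixed-output derivation pattern: Nix allows the network precisely because it checks the result against a declared hash. The attribute names (`outputHash`, `outputHashAlgo`, `outputHashMode`) are standard, but the rest of this fragment is a hypothetical sketch:

```nix
# Sketch of a fixed-output derivation fetching npm dependencies.
# The hash must be updated whenever npm resolves different files.
stdenv.mkDerivation {
  name = "my-app-node-modules";
  src = ./.;
  nativeBuildInputs = [ nodejs ];
  buildPhase = "npm install";          # network allowed: output hash is fixed
  installPhase = "cp -r node_modules $out";
  outputHashAlgo = "sha256";
  outputHashMode = "recursive";
  outputHash = lib.fakeSha256;         # replace with the real hash after a first build
}
```

This is exactly where the first con bites: any change in what npm resolves breaks the declared hash.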