1. I’m wrapping a piece of non-free software in an FHS wrapper.
2. The source is an archive that is extracted by Nix.
3. Its contents are actually another proprietary archive/installer, which I run as part of the build process.
4. This gives me the final derivation.
5. I wrap that derivation into an FHS env.
All good, and it works fine. But:
the derivation at step 3 is supposed to be fixed. There should be no difference based on which dependencies I used to run that installer; in the end it unpacks exactly the same content.
Currently, though, if I for example update the nixpkgs I use, all of the steps above are forced to happen again, even though the output of step 3 does not change.
I hope I’ve described this in an understandable way; please correct my incorrect terminology.
The question is: how do I make the derivation at step 3 not depend on its build inputs?
A derivation is “fixed-output” when it contains outputHash, outputHashMode and outputHashAlgo. When you add them to a derivation it becomes fixed-output, which means it is allowed to access the network, and Nix will always check that the hash of the output path stays the same.
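For illustration, a minimal sketch of what that looks like, assuming the installer is a self-extracting script that takes an install directory (the names, the flag, and the hash are all placeholders):

```nix
{ lib, stdenv }:

stdenv.mkDerivation {
  pname = "vendor-app-installed";   # placeholder name
  version = "1.0";

  src = ./VendorInstaller-1.0.run;  # placeholder; see the requireFile note below

  # Run the vendor installer straight into $out.
  buildCommand = ''
    sh $src --install-dir "$out"
  '';

  # These three attributes make the derivation fixed-output: Nix only
  # verifies that $out matches outputHash, so once the path exists in the
  # store, changing the build inputs does not force a rebuild.
  outputHashMode = "recursive";   # hash the whole output tree, not a single flat file
  outputHashAlgo = "sha256";
  outputHash = lib.fakeSha256;    # build once, then paste in the real hash Nix reports
}
```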
The source archive is not available for automatic download (it’s behind a registration wall), so step 2 relies on requireFile.
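Roughly what that step looks like, with a made-up file name and hash; this would stand in for the `src` placeholder in the sketch above:

```nix
# Nix never downloads this itself: if the file is missing from the store,
# the build aborts and prints `message` with manual instructions instead.
src = requireFile {
  name    = "VendorInstaller-1.0.run";   # placeholder name
  sha256  = "0000000000000000000000000000000000000000000000000000";
  message = ''
    The installer is behind a registration wall. Download
    VendorInstaller-1.0.run from the vendor site, then run:
      nix-store --add-fixed sha256 VendorInstaller-1.0.run
  '';
};
```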
I should actually try that. But I’m expecting a bunch of hardcoded FHS paths to be present in various scripts in that software, so the process probably won’t be trivial.
Also, technically, does patchelf count as tampering with the software, which would be an EULA violation? All of this is under a custom proprietary license.
I wonder what happens if I add those hashes, but it turns out that the software’s custom installer actually does behave differently with different versions of the build dependencies?
Would that result in nobody else being able to build that derivation unless they use the exact same nixpkgs version I used when calculating the hashes?
Yeah, if the fixed-output derivation is actually not deterministic, it will fail for people, possibly only sometimes. This is unfortunately a pretty real problem.
On the other hand, since fixed-output derivations only care about the output, you can possibly munge the output to “fix” non-deterministic results.
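For example (hypothetical paths), you can normalize the usual suspects at the end of whatever runs the installer, so the tree always hashes the same:

```nix
# Hypothetical cleanup inside the fixed-output derivation: drop files that
# embed dates or paths, and clamp timestamps so the output hash is stable.
postInstall = ''
  rm -rf "$out/install.log" "$out/.cache"       # placeholder paths
  find "$out" -exec touch -h --date=@1 {} +     # set every mtime to a fixed timestamp
'';
```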
Right, so where’s the issue? requireFile is a fixed-output derivation by definition.
If all of those scripts are available after unpacking and it doesn’t download anything global-FHS-dependent at runtime, you should be able to just patch the shebangs. The autoPatchelfHook might even do that automatically, but I’m not sure.
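As far as I know, autoPatchelfHook only rewrites ELF interpreters and rpaths; shebangs are handled by the separate patchShebangs helper, which a normal stdenv build already runs over $out during fixupPhase. Roughly, if you need to target a specific directory explicitly (placeholder path):

```nix
# Hypothetical: rewrite hardcoded #!/bin/... shebangs in the unpacked
# scripts to store paths; autoPatchelfHook takes care of the ELF binaries.
nativeBuildInputs = [ autoPatchelfHook ];

postFixup = ''
  patchShebangs "$out/opt/vendor/scripts"
'';
```

Note that this kind of patching embeds store paths that change with nixpkgs, so it would belong in a second, non-fixed derivation on top of the fixed-output one; otherwise the declared output hash would stop matching.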
IANAL.
Technically, it could. Though I’d argue that dynamically linking the software against all the deps of the FHSEnv would be much the same, and that is also allowed (or at least tolerated) somehow.
So, first we need the archive; its contents are the “installer” (this archive is a requireFile derivation).
Then we need to run the “installer”, which will once again unpack a bunch of stuff from itself.
It’s the contents of this second step, what the “installer” produces, that I want to be a fixed-output derivation too, since it basically just extracts files from its internal archives.
So basically that is what I do now: one derivation requires the installer archive (downloaded manually from the site) and runs that installer as its install phase. The result of that installation is always the same and depends only on the installer archive, so this derivation can be a fixed-output derivation.
Another derivation takes this fixed-output one and wraps it in buildFHSEnv.
Now when you update nixpkgs, only that wrapper gets rebuilt, and the whole installation from the initial source archive does not happen again (which saves ~3 GB of space).
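For anyone finding this later, a sketch of that layout; every name, flag, and hash below is a placeholder:

```nix
{ lib, stdenv, requireFile, buildFHSEnv }:

let
  # The manually downloaded archive (fixed-output via requireFile).
  installerArchive = requireFile {
    name    = "VendorInstaller-1.0.run";
    sha256  = "0000000000000000000000000000000000000000000000000000";
    message = "Download VendorInstaller-1.0.run from the vendor site and add it with nix-store --add-fixed sha256 <file>.";
  };

  # Running the installer; fixed-output, so a nixpkgs bump does not re-run
  # the ~3 GB installation as long as the declared hash still matches.
  installed = stdenv.mkDerivation {
    pname = "vendor-app";
    version = "1.0";
    src = installerArchive;

    buildCommand = ''
      sh $src --install-dir "$out"
    '';

    outputHashMode = "recursive";
    outputHashAlgo = "sha256";
    outputHash = lib.fakeSha256;   # replace with the real hash after the first build
  };
in
# Only this cheap wrapper gets rebuilt when nixpkgs changes.
buildFHSEnv {
  name = "vendor-app";
  targetPkgs = pkgs: [ installed ];           # plus whatever runtime libs it expects
  runScript = "${installed}/bin/vendor-app";  # placeholder entry point
}
```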