I’m playing with openai-whisper and I encountered the CUDA situation on NixOS
I replaced torch with torch-bin in the openai-whisper Nix file and got it to work with CUDA, but I’d like to make this declarative in my configuration file. If there is a simpler way to have CUDA enabled, I’d be happy to hear about it.
But I’d prefer to remove torch and replace it with torch-bin in the original package, and I can’t figure out how to remove something from a list. I tried pkgs.lib.lists.remove pkgs.python3Packages.torch oldAttrs.propagatedBuildInputs, but it fails with the following error message:
error: value is a set while a list was expected
at /nix/store/faqxpr0wwzk5gbrqlsklh24hif2bgby0-source/configuration.nix:326:37:
325| name = "openai-whisper-cuda";
326| propagatedBuildInputs = (pkgs.lib.lists.remove pkgs.python310Packages.torch oldAttrs.propagatedBuildInputs) ++ pkgs.python310Packages.torch-bin;
| ^
327| });
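For what it’s worth, the error itself is only about the second operand of ++: pkgs.python310Packages.torch-bin is a derivation (an attribute set), while ++ expects lists on both sides. A minimal sketch of the overrideAttrs attempt with that fixed (attribute names are taken from the snippet above; whether this override alone is enough to get CUDA working depends on your nixpkgs revision):

    openai-whisper-cuda = pkgs.python310Packages.openai-whisper.overrideAttrs (oldAttrs: {
      name = "openai-whisper-cuda";
      # drop the source-built torch and append torch-bin; the right-hand
      # side of ++ must be a list, hence the brackets around torch-bin
      propagatedBuildInputs =
        (pkgs.lib.lists.remove pkgs.python310Packages.torch oldAttrs.propagatedBuildInputs)
        ++ [ pkgs.python310Packages.torch-bin ];
    });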
Edit: This can be done in one line using nix-shell -p 'openai-whisper.override { torch = python3.pkgs.torch-bin; }'
Several different override functions exist. .overrideDerivation overrides the call to derivation and should seldom be used. .overrideAttrs overrides the call to stdenv.mkDerivation and is the one you typically use. In this case you have a Python package, which additionally offers .overridePythonAttrs, overriding the call to buildPythonPackage. Finally, there is .override, which overrides the arguments passed to the package expression as a whole, and that is the one you are interested in here.
python3.withPackages(ps: with ps; [ openai-whisper ]);
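Putting those two together, a hedged sketch of the declarative version in configuration.nix (environment.systemPackages is just one possible place to put it; adapt to however you install your Python environment):

    environment.systemPackages = [
      (pkgs.python3.withPackages (ps: [
        # swap the source-built torch for the prebuilt CUDA-enabled wheel
        (ps.openai-whisper.override { torch = ps.torch-bin; })
      ]))
    ];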
Note that there is also torchWithCuda that can be used instead of torch-bin.
It is also possible to set the Nixpkgs configuration config.cudaSupport to true. This will be picked up by torch and so you don’t need any override at all.
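For the configuration.nix route, a minimal sketch (allowUnfree is my assumption here: the CUDA toolchain is unfree, so you will usually need it alongside cudaSupport):

    nixpkgs.config = {
      cudaSupport = true;  # picked up by torch and other CUDA-aware packages
      allowUnfree = true;  # assumption: the CUDA toolkit itself is unfree
    };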
Ok, so using torchWithCuda requires rebuilding a lot of things, while using torch-bin only required downloading one big PyTorch package (about 900 MB), and it worked as expected.
These things are likely available on Cachix (Nix binary cache hosting).
It’s maybe worth mentioning that a consistent way to use CUDA-enabled, source-built packages is to obtain them from import nixpkgs { config.cudaSupport = true; }.
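As a sketch of that approach for a standalone expression or ad-hoc shell (the pkgsCuda name is mine; allowUnfree is again assumed to be needed for the CUDA toolchain):

    let
      pkgsCuda = import <nixpkgs> {
        config = {
          cudaSupport = true;
          allowUnfree = true;
        };
      };
    in
    pkgsCuda.python3.withPackages (ps: [ ps.openai-whisper ])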
Unfortunately, this doesn’t seem to compile on 24.05:
0k8r95cvmybiiqkl57f3jmr9-python3.11-whisper-20231117/bin
shrinking RPATHs of ELF executables and libraries in /nix/store/klk287z3dyamgj2bgjxbxyfnvhrasglm-python3.11-whisper-20231117-dist
checking for references to /build/ in /nix/store/klk287z3dyamgj2bgjxbxyfnvhrasglm-python3.11-whisper-20231117-dist...
patching script interpreter paths in /nix/store/klk287z3dyamgj2bgjxbxyfnvhrasglm-python3.11-whisper-20231117-dist
Rewriting #!/nix/store/5w07wfs288qpmnvjywk24f3ak5k1np7r-python3-3.11.9/bin/python3.11 to #!/nix/store/5w07wfs288qpmnvjywk24f3ak5k1np7r-python3-3.11.9
wrapping `/nix/store/f01xfyvh0k8r95cvmybiiqkl57f3jmr9-python3.11-whisper-20231117/bin/whisper'...
Executing pythonRemoveTestsDir
Finished executing pythonRemoveTestsDir
Running phase: installCheckPhase
no Makefile or custom installCheckPhase, doing nothing
Running phase: pythonCatchConflictsPhase
Found duplicated packages in closure for dependency 'triton':
triton 2.1.0 (/nix/store/l1r6fvi6n4rrdg589h9qr2890s6l9zpc-python3.11-triton-2.1.0)
dependency chain:
this derivation: /nix/store/f01xfyvh0k8r95cvmybiiqkl57f3jmr9-python3.11-whisper-20231117
...depending on: /nix/store/f5v7zr3dvnijgwrblj1668xh2rk7vilq-python3.11-torch-2.3.0
...depending on: /nix/store/l1r6fvi6n4rrdg589h9qr2890s6l9zpc-python3.11-triton-2.1.0
triton 2.1.0 (/nix/store/0hycn87xrc34iayra6pzwrg9aaxj10kh-python3.11-triton-2.1.0)
dependency chain:
this derivation: /nix/store/f01xfyvh0k8r95cvmybiiqkl57f3jmr9-python3.11-whisper-20231117
...depending on: /nix/store/0hycn87xrc34iayra6pzwrg9aaxj10kh-python3.11-triton-2.1.0
Package duplicates found in closure, see above. Usually this happens if two packages depend on different version of the same dependency.
error: builder for '/nix/store/ka3in58i5hzh9i90k5pi93p17sxccn6c-python3.11-whisper-20231117.drv' failed with exit code 1;
last 10 log lines:
> dependency chain:
> this derivation: /nix/store/f01xfyvh0k8r95cvmybiiqkl57f3jmr9-python3.11-whisper-20231117
> ...depending on: /nix/store/f5v7zr3dvnijgwrblj1668xh2rk7vilq-python3.11-torch-2.3.0
> ...depending on: /nix/store/l1r6fvi6n4rrdg589h9qr2890s6l9zpc-python3.11-triton-2.1.0
> triton 2.1.0 (/nix/store/0hycn87xrc34iayra6pzwrg9aaxj10kh-python3.11-triton-2.1.0)
> dependency chain:
> this derivation: /nix/store/f01xfyvh0k8r95cvmybiiqkl57f3jmr9-python3.11-whisper-20231117
> ...depending on: /nix/store/0hycn87xrc34iayra6pzwrg9aaxj10kh-python3.11-triton-2.1.0
>
> Package duplicates found in closure, see above. Usually this happens if two packages depend on different version of the same dependency.
For full logs, run 'nix log /nix/store/ka3in58i5hzh9i90k5pi93p17sxccn6c-python3.11-whisper-20231117.drv'.
It takes approx. 15 minutes of build time to get to this point, so iterating on experiments will suck. Does somebody have suggestions?
You’re likely seeing conflicts because you end up with both torch and torch-bin in the closure; something other than triton must be propagating the regular torch, which conflicts with torch-bin. You can override the entire python3Packages package set to use torch-bin instead of torch to avoid this:

    overlays = [
      (fi: pre: {
        pythonPackagesExtensions = pre.pythonPackagesExtensions ++ [
          (py-fi: py-pre: { torch = py-fi.torch-bin; })
        ];
      })
    ];
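In a NixOS configuration.nix this would sit under nixpkgs.overlays; the same overlay with the argument names spelled out (final/prev is purely a naming choice):

    nixpkgs.overlays = [
      (final: prev: {
        pythonPackagesExtensions = prev.pythonPackagesExtensions ++ [
          (pyFinal: pyPrev: {
            # every package in python3Packages now sees torch-bin as torch
            torch = pyFinal.torch-bin;
          })
        ];
      })
    ];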
There’s a flag to disable checking for conflicts; I don’t remember the name, but you can grep for it.
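If it helps, I believe the relevant buildPythonPackage flag is catchConflicts (treat the name as an assumption and grep nixpkgs to confirm); a sketch of disabling the check for just this package:

    openai-whisper-no-conflict-check = pkgs.python3Packages.openai-whisper.overridePythonAttrs (old: {
      # assumption: catchConflicts is what drives pythonCatchConflictsPhase;
      # setting it to false skips the duplicate-package check shown above
      catchConflicts = false;
    });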
As a reminder, you may not necessarily need the torch-bin hack, but I don’t know what your use case is.