Overlays: remove a package from a list (using torch-bin instead of torch)

Hey :wave:t3:

I’m playing with openai-whisper and I encountered the CUDA situation on NixOS :sweat_smile:

I replaced torch with torch-bin in the openai-whisper nix file and got it to work with CUDA, but I’d like to make it declarative in my configuration file. If there is a simpler way to have CUDA enabled, I’d be happy to hear about it :slight_smile:

I currently have this:

  nixpkgs.overlays = [
    (self: super: {
      openai-whisper = super.openai-whisper.overrideDerivation (oldAttrs: {
        name = "openai-whisper-cuda";
        propagatedBuildInputs = [
          pkgs.python3Packages.ffmpeg-python
          pkgs.python3Packages.more-itertools
          pkgs.python3Packages.numpy
          pkgs.python3Packages.torch-bin
          pkgs.python3Packages.tqdm
          pkgs.python3Packages.transformers
        ];
      });
    })
  ];

but I’d prefer to remove torch and replace it with torch-bin in the original package’s inputs; I can’t figure out how to remove something from the list. I tried pkgs.lib.lists.remove pkgs.python3Packages.torch oldAttrs.propagatedBuildInputs, but it fails with the following error message:

error: value is a set while a list was expected

       at /nix/store/faqxpr0wwzk5gbrqlsklh24hif2bgby0-source/configuration.nix:326:37:

          325|             name = "openai-whisper-cuda";
          326|             propagatedBuildInputs = (pkgs.lib.lists.remove pkgs.python310Packages.torch oldAttrs.propagatedBuildInputs) ++ pkgs.python310Packages.torch-bin;
             |                                     ^
          327|         });
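
For what it’s worth, that error comes from the ++ operator on that line: it expects a list on both sides, but torch-bin is a single derivation (an attribute set), so it has to be wrapped in a list. An untested sketch of the corrected line, assuming the same torch derivation is actually present in oldAttrs.propagatedBuildInputs:

propagatedBuildInputs =
  (pkgs.lib.lists.remove pkgs.python3Packages.torch oldAttrs.propagatedBuildInputs)
  ++ [ pkgs.python3Packages.torch-bin ];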

Edit: This can be done in one line using nix-shell -p 'openai-whisper.override { torch = python3.pkgs.torch-bin; }'


Several different override functions exist. .overrideDerivation overrides the call to derivation and should seldom be used. .overrideAttrs overrides the call to stdenv.mkDerivation and is the one typically used. In this case you have a Python package, which additionally offers .overridePythonAttrs, overriding the call to buildPythonPackage. Finally, there is .override, which overrides the arguments passed to the package expression as a whole, and that is the one you want here.

Not tested but this should be sufficient:

final: prev: {
  openai-whisper = prev.openai-whisper.override {
    torch = final.python3.pkgs.torch-bin;
  };
}
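
Wired into a NixOS configuration, that might look like this (untested sketch; attribute names as in the question):

nixpkgs.overlays = [
  (final: prev: {
    openai-whisper = prev.openai-whisper.override {
      torch = final.python3.pkgs.torch-bin;
    };
  })
];

environment.systemPackages = [ pkgs.openai-whisper ];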

Alternatively, if you want to build a Python environment, you could use e.g.

final: prev: rec {
  python3 = prev.python3.override {
    self = python3;
    packageOverrides = final_: prev_: {
      openai-whisper = prev_.openai-whisper.override {
        torch = final_.torch-bin;
      };
    };
  };
}

and then

python3.withPackages (ps: with ps; [ openai-whisper ])

Note that there is also torchWithCuda, which can be used instead of torch-bin.

It is also possible to set the Nixpkgs configuration option config.cudaSupport to true. This will be picked up by torch, so you don’t need any override at all.
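
In a NixOS configuration that is a one-line setting (a minimal sketch; note that enabling it rebuilds many packages from source):

nixpkgs.config.cudaSupport = true;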


Neat! I’ll try that!

I think CUDA - NixOS Wiki deserves some love too :thinking: Using torchWithCuda seems very practical.


Is it possible with nix-shell to change a dependency of a package, without having to go through a file?

Ok, so using torchWithCuda requires rebuilding a loooot of things, while using torch-bin only required downloading one big package for PyTorch (900 MB), and it worked as expected.


The former builds from source and the latter uses released wheels.


These things are likely available on Cachix - Nix binary cache hosting.
It’s maybe worth mentioning that a consistent way to use CUDA-enabled, source-built packages is to obtain them from import nixpkgs { config.cudaSupport = true; }.
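
A minimal sketch of that pattern, for example in a shell.nix (the <nixpkgs> path and the package selection are just illustrative):

let
  # Import nixpkgs with CUDA support enabled globally
  pkgs = import <nixpkgs> { config.cudaSupport = true; };
in
pkgs.mkShell {
  packages = [
    (pkgs.python3.withPackages (ps: with ps; [ openai-whisper ]))
  ];
}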


This is the golden thread for CUDA PyTorch. I finally got PyTorch working with CUDA thanks to this thread.


One-liner to use torch-bin :partying_face:

nix-shell -p 'openai-whisper.override { torch = python3.pkgs.torch-bin; }'
