Openai-whisper failing on larger models (NVIDIA / CUDA issues)

I assume it’s the same with prime.offload?

The nix-community cache has recently begun including CUDA-enabled packages, and it's quite useful to have in general, so I suggest enabling it and seeing whether that helps:

nixpkgs.config.cudaSupport = true;

nix.settings = {
  substituters = [
    "https://nix-community.cachix.org"
  ];
  trusted-public-keys = [
    # Compare to the key published at https://nix-community.org/cache
    "nix-community.cachix.org-1:mB9FSh9qf2dCimDSUo8Zy7bkq5CX+/rkCWyvRCYg3Fs="
  ];
};
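If you'd rather not rebuild your whole system with global `cudaSupport`, a narrower option is to override only torch. This is an untested sketch: `pythonPackagesExtensions` and torch's `cudaSupport` override argument exist in recent nixpkgs, but check that they match your channel.

```nix
nixpkgs.overlays = [
  (final: prev: {
    # Extend every Python package set so only torch is built with CUDA,
    # instead of flipping nixpkgs.config.cudaSupport globally.
    pythonPackagesExtensions = prev.pythonPackagesExtensions ++ [
      (pyFinal: pyPrev: {
        torch = pyPrev.torch.override { cudaSupport = true; };
      })
    ];
  })
];
```

Note that per-package overrides like this are less likely to be covered by the binary cache, so expect some local compilation.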

PS: If you don't need packages for a specific Python version, it's usually better to use python3Packages instead of python3xxPackages, since the former simply points at whichever python3 version is current in your system's nixpkgs.

environment.systemPackages = with pkgs; [
  python3Packages.pytorch-bin
  python3Packages.openai-whisper
];

This way you don't need to change your config every time the default Python version changes, and you avoid installing two Python versions (the system default plus another from python3xxPackages).
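Relatedly, if you want to run whisper from the command line with these modules importable, the usual idiom is to wrap them in `python3.withPackages`, which builds one interpreter environment containing both (a sketch, using the same attribute names as above):

```nix
environment.systemPackages = [
  # A single Python environment; `ps` is the package set
  # matching whatever python3 currently points to.
  (pkgs.python3.withPackages (ps: with ps; [
    pytorch-bin
    openai-whisper
  ]))
];
```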