For ollama, you can see from https://github.com/jmorganca/ollama/blob/325cfcd9ffa8c5e19c7258471a864dcd8508d49a/docs/linux.md that it can be installed by downloading a single binary and putting it in the right directory. So something like the derivation below should work for a quick trial. (It's probably better to try to build it from source, though.)
{ fetchurl, lib, stdenv, autoPatchelfHook }:

stdenv.mkDerivation rec {
  pname = "ollama";
  version = "0.1"; # Change this!

  src = fetchurl {
    url = "https://ollama.ai/download/ollama-linux-amd64";
    hash = "sha256-WxRimPMHV2qbePUu9EVniApXy2NrLK97LsXO+Burdkk=";
  };

  nativeBuildInputs = [
    autoPatchelfHook
  ];

  dontUnpack = true;
  dontBuild = true;
  dontConfigure = true;

  installPhase = ''
    mkdir -p $out/bin
    install -m755 -D $src $out/bin/ollama
  '';

  meta = with lib; {
    homepage = "https://ollama.ai/";
    description = "Enter Something Here";
    platforms = platforms.linux;
  };
}
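If you save this as, say, ollama.nix (the filename is just an assumption here), one way to test-build it is:

nix-build -E 'with import <nixpkgs> {}; callPackage ./ollama.nix {}'

and then run ./result/bin/ollama.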
You may have to add some libraries to buildInputs, though. For packaging binaries, you can also consult this: Packaging/Binaries - NixOS Wiki
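For example, if autoPatchelfHook reports a missing shared library (libstdc++ is a common one for binaries like this), a sketch of what adding it could look like; the exact packages depend on what the binary actually links against:

  buildInputs = [
    stdenv.cc.cc.lib # just an example: provides libstdc++ if autoPatchelfHook says it's missing
  ];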