I would like to package the web frontend Ollama-WebUI for the Large Language Model (LLM) execution backend Ollama, to make it easy for people to run open-source ChatGPT alternatives locally. However, I'm struggling to include a web server as the underlying binary to serve the web app (which compiles into a static page).
Using `writeShellApplication` works, but I can't set the `version` attribute:
```nix
let
  # ollama-webui doesn't tag versions.
  # version = "0.0.1";
  version = "0820d846190569334151d2970169f77240a0916d";

  # We just package the JS frontend part, not the Python reverse-proxy backend,
  # for simplicity.
  ollama-webui-frontend-static-page = buildNpmPackage rec {
    pname = "ollama-webui-static-page";
    inherit version;

    src = fetchFromGitHub {
      owner = "ollama-webui";
      repo = "ollama-webui";
      rev = version;
      hash = "sha256-3Lf7v5tEVrhHrEcpdJYV8EmoDSeHKVenLbtMyWUmjVU=";
    };

    npmDepsHash = "sha256-N+wyvyKqsDfUqv3TQbxjuf8DF0uEJ7OBrwdCnX+IMZ4=";

    # "npm run build" creates a static page in the "build" folder.
    installPhase = ''
      cp -R ./build $out
    '';
  };
in writeShellApplication {
  name = "ollama-webui";
  # inherit version;
  runtimeInputs = [ nodePackages.http-server ];
  text = ''
    ${nodePackages.http-server}/bin/http-server ${ollama-webui-frontend-static-page}
  '';
}
```
What is a better alternative? I tried `symlinkJoin` with `wrapProgram`, but didn't get it to run, and I think I'm also not able to specify the `version` attribute that way.
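For context, the `symlinkJoin` variant I had in mind looked roughly like this (untested sketch; it assumes `makeWrapper` is in scope and reuses the `ollama-webui-frontend-static-page` derivation from the `let`-binding above):

```nix
# Untested sketch: wrap http-server so it always serves the static page.
# Note that symlinkJoin is runCommand-based, so there is no obvious place
# for a "version" attribute here either.
symlinkJoin {
  name = "ollama-webui";
  paths = [ nodePackages.http-server ];
  nativeBuildInputs = [ makeWrapper ];
  postBuild = ''
    wrapProgram $out/bin/http-server \
      --add-flags ${ollama-webui-frontend-static-page}
  '';
}
```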