7900XT with ollama + rocm

I'm on a 7900 XT and getting the following error running ollama + rocm, on the nixos-unstable channel:

    Jan 21 02:08:35 latitude2 ollama[25521]: time=2025-01-21T02:08:35.025+08:00 level=INFO source=routes.go:1310 msg="Listening on 127.0.0.1:11434 (version 0.5.4)"
    Jan 21 02:08:35 latitude2 ollama[25521]: time=2025-01-21T02:08:35.026+08:00 level=INFO source=routes.go:1339 msg="Dynamic LLM libraries" runners=[cpu]
    Jan 21 02:08:35 latitude2 ollama[25521]: time=2025-01-21T02:08:35.026+08:00 level=INFO source=gpu.go:226 msg="looking for compatible GPUs"
    Jan 21 02:08:35 latitude2 ollama[25521]: time=2025-01-21T02:08:35.027+08:00 level=WARN source=amd_linux.go:61 msg="ollama recommends running the https://www.amd.com/en/support/linux-drivers" error="amdgpu version file missing: /sys/module/amdgpu/version stat /sys/module/amdgpu/version: no such file or directory"
    Jan 21 02:08:35 latitude2 ollama[25521]: time=2025-01-21T02:08:35.028+08:00 level=INFO source=amd_linux.go:391 msg="skipping rocm gfx compatibility check" HSA_OVERRIDE_GFX_VERSION=10.1.1
    Jan 21 02:08:35 latitude2 ollama[25521]: time=2025-01-21T02:08:35.028+08:00 level=INFO source=types.go:131 msg="inference compute" id=GPU-184f51a642a76786 library=rocm variant="" compute=gfx1100 driver=0.0 name=1002:744c total="20.0 GiB" available="5.6 GiB"

The fix should hit unstable soon; see Nixpkgs PR #373234 ("ollama: 0.5.4 -> 0.5.5"). In the meantime you can run it via Docker to confirm the GPU works: Run Ollama with Docker: CPU, NVIDIA, and AMD GPU support | Tech Undocs
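For reference, the Docker workaround is roughly the following, assuming the official `ollama/ollama:rocm` image and the standard ROCm device nodes (`/dev/kfd` and `/dev/dri`); adjust paths and model names as needed:

```shell
# Run the ROCm build of Ollama in Docker, passing the AMD GPU device
# nodes through to the container and persisting models in a named volume.
docker run -d \
  --device /dev/kfd \
  --device /dev/dri \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama:rocm

# Then run a model inside the container to confirm the GPU is picked up
# (watch the container logs for the "inference compute" line).
docker exec -it ollama ollama run llama3.2
```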


Thank you for the update, will do :slight_smile: