That repo is neat, but yes, it does have quite a few more requirements than the small amount of work I’ve put in above to get diffusers packaged. You’re right, though; it would make the flake much more accessible if I could package the 2.4GB version.
The entire goal of this small repo is to showcase how simple it is to share reproducible environments with Nix in a fun and interesting way so I will do my best in the coming week to package the repo you linked.
Good news! diffusers integrated that change into their library. I’ve gone ahead and enabled the change by default, so the repo only requires 3.2GB of VRAM now.
Go ahead and run:
nix run --impure github:collinarnett/stable-diffusion-nix#jupyterLab
You can probably use ROCm as long as you compile torch with ROCm support. I don’t have an AMD card to test with, so I can’t give you a definitive answer.
I’m working on improving this repo by updating diffusers to the latest version. I have a working version with an up-to-date diffusers locally, but unfortunately it requires compiling quite a few large packages. I’m waiting for this issue to be addressed on nixpkgs-unfree so that people won’t have to compile those large packages themselves. That way you could just swap the pytorch package that diffusers takes as an input for one that supports ROCm and be good to go.
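For reference, ROCm builds of torch in nixpkgs are normally enabled through an override rather than a separate package. A minimal overlay sketch of what that swap might look like (the `rocmSupport`/`cudaSupport` flags and the `pythonPackagesExtensions` hook are assumptions about the nixpkgs revision in use, and I can’t test this myself without an AMD card):

```nix
# Hypothetical overlay: rebuild torch with ROCm instead of CUDA.
# Attribute names may differ between nixpkgs revisions.
final: prev: {
  pythonPackagesExtensions = prev.pythonPackagesExtensions ++ [
    (pyFinal: pyPrev: {
      torch = pyPrev.torch.override {
        rocmSupport = true;   # build against the ROCm stack
        cudaSupport = false;  # drop the CUDA build
      };
    })
  ];
}
```

Everything downstream of torch (diffusers included) would then pick up the ROCm build, which is why avoiding the recompile via a cache matters so much here.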
I will do my best to publish an update to the repo this weekend that will let you use ROCm easily.
I am also considering merging my changes into nixifiedai, but I will work on the above first.