Anyone have a flake for playing with Stable Diffusion?

I’m planning to have a play with Stable Diffusion over the weekend:

Just curious if anyone out there already has a Nix flake up with a devShell or some apps for playing with?

2 Likes

You need to run conda to download a lot of dependencies; I’m not sure this would be easy to use with flakes :frowning:

I followed Getting Stable Diffusion Running on NixOS - Xe and it’s working well, but you need to type a few commands. I don’t see how this could be turned into a flake :confused: I hope someone can come up with a simpler system for using it :+1:t3:
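For what it’s worth, even the manual route from the guide could at least be wrapped in a devShell so the base tooling is pinned. A hypothetical, untested sketch (package names are assumptions; the guide’s conda/pip steps would still run inside the shell):

```nix
{
  inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";

  outputs = { self, nixpkgs }:
    let
      pkgs = nixpkgs.legacyPackages.x86_64-linux;
    in {
      devShells.x86_64-linux.default = pkgs.mkShell {
        # Only pins the base tooling; the guide's manual steps
        # (conda env, pip installs) still happen inside this shell.
        packages = [ pkgs.python3 pkgs.git pkgs.wget ];
      };
    };
}
```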

2 Likes

:wave: :wave: It’s a WIP, but there’s already a working setup that gets the “full” Stable Diffusion up and running with the “sd-web-ui” interface, assuming you have it cloned and the model installed to the right directory.

Please see the README for details, usage, wip notes, and credits.

call for help: to make this less hacky and more reliable, there are more python packages that need attention or initial packaging. If you have some spare time and want to take a whack, again see the README.

7 Likes

I haven’t really had a chance to de-pip and add more nix derivations, but I did find out that what I had works with the AUTOMATIC1111 web ui fork as well:

I’ve been able to do img2img, use some upscalers, and apparently this fork supports 4GB GPU models too… so this might be more accessible to a lot more folks.

1 Like

I have a flake for the https://github.com/bes-dev/stable_diffusion.openvino one. It uses OpenVINO; any recent CPU will run it very fast. I’m using poetry2nix. Here is my fork with flakes: GitHub - tfmoraes/stable_diffusion.openvino
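For anyone curious, the poetry2nix approach looks roughly like this. A hypothetical sketch following the pattern from the poetry2nix README, not the actual contents of the fork:

```nix
{
  inputs = {
    nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
    poetry2nix.url = "github:nix-community/poetry2nix";
  };

  outputs = { self, nixpkgs, poetry2nix }:
    let
      system = "x86_64-linux";
      p2n = poetry2nix.legacyPackages.${system};
    in {
      # Builds the project from pyproject.toml + poetry.lock,
      # resolving the Python dependency set through Nix.
      packages.${system}.default = p2n.mkPoetryApplication {
        projectDir = ./.;
      };
    };
}
```

The nice part of this route is that the lock file, not pip at runtime, drives the dependency closure.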

3 Likes

Here are my five cents - a nix-only (without pip, conda, etc.) flake with support for AMD ROCm

4 Likes

@gbtb Awesome! Thank you! Also, I totally just spent many hours packaging a very similar set of packages. I really should’ve kept on top of Discourse more. :upside_down_face:

Maybe we can collaborate, or I can just port some of my stuff into your flake. I’ve got a few things that you might find useful:

  1. A hacky patch to the codebase that allows overriding the model load path with env var(s)
  2. A reasonable patch to setup.py that adds the invoke and pretrain scripts, so that buildPythonPackage can properly wrap them for running.
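For (1), the idea can be sketched roughly like this - the function and default path are hypothetical, only the WEIGHTS_* env var naming is real:

```python
import os

# Hypothetical sketch: let an environment variable override the
# hard-coded checkpoint path, falling back to the default.
DEFAULT_CKPT = "models/ldm/stable-diffusion-v1/model.ckpt"

def resolve_weights_path(model_name: str = "stable_diffusion_1_4") -> str:
    # e.g. WEIGHTS_stable_diffusion_1_4=~/models/sd-v1-4.ckpt
    override = os.environ.get(f"WEIGHTS_{model_name}")
    if override:
        return os.path.expanduser(override)
    return DEFAULT_CKPT
```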

This gets me pretty darn close to:

export WEIGHTS_stable_diffusion_1_4=~/code/stable-diffusion/v1-5-pruned.ckpt
nix run github:colemickens/invoke-ai

There are still issues with:

  1. ldm python module only loads when it’s available via the current working dir
  2. the pretraining dumps output files straight into the ldm module directory, yay
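A hypothetical workaround sketch for both issues: copy the ldm module tree out of its read-only location into a writable working directory, so `import ldm` resolves via the cwd and pretraining can dump its output files there harmlessly. (The fake checkout here is just for demonstration.)

```shell
# For demonstration, fake a checkout containing the ldm module:
ldm_src=$(mktemp -d)
mkdir -p "$ldm_src/ldm"
touch "$ldm_src/ldm/__init__.py"

# The actual workaround: copy into a scratch dir and run from there.
workdir=$(mktemp -d)
cp -r "$ldm_src/ldm" "$workdir/"
chmod -R u+w "$workdir/ldm"   # copies of store paths start read-only
cd "$workdir"                 # invoke/pretrain would run from here
```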
3 Likes

Well, we can definitely coordinate our efforts :slightly_smiling_face:. My two goals when I started were 1) to get SD running with AMD ROCm and 2) to do as much as possible in nix/nixpkgs, because otherwise I could just use it from within a Docker container. For InvokeAI it’s done already; for the AUTOMATIC1111 fork we’re almost there.
What’s next is debatable. I would like to contribute some missing packages back to nixpkgs after the blocker issue is resolved. Patching init scripts and models to put their weights inside specific directories seems interesting too, as it may allow packaging SD and the GUIs directly with all required models (even though I’m not sure that getting these blobs into nixpkgs is feasible, technically and legally).

Thanks for this!

Currently testing on my own all-AMD machine: the devShell build seemed to work well, and I’m currently in the middle of the InvokeAI model downloads - all smooth so far :slight_smile:

You can also give GitHub - tfmoraes/stable_diffusion.openvino a try for CPU-only rendering. It’s working well: compared to my NVIDIA 1060, the Ryzen 5 5600X is only about twice as slow, which is still great.

1 Like