Stable Diffusion Using Nix Flakes

Hello :wave:

Introduction

I made stable-diffusion-nix, a repository that uses Nix flakes to quickly get up and running with Stable Diffusion in a Jupyter notebook.

Setup

The only requirements before starting are:

  • Nix with flakes enabled
  • An NVIDIA GPU
  • A Hugging Face account and user access token (used for HF_TOKEN)

Running

Users here will be familiar with the setup process for flakes, but after that, running is as simple as:

nix run --impure github:collinarnett/stable-diffusion-nix#jupyterLab

Place your Hugging Face user token in the HF_TOKEN variable at the top of the notebook, and you should be able to generate images immediately.
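
If you just want to see what the notebook does under the hood, the core of it is a standard diffusers call. The snippet below is illustrative: the model ID and output handling are assumptions based on the usual diffusers workflow, not copied from the notebook.

# Illustrative sketch of the diffusers workflow the notebook wraps; the model ID
# and variable names are assumptions, not copied from the notebook.
from diffusers import StableDiffusionPipeline

HF_TOKEN = "hf_..."  # your Hugging Face user access token

# Downloads the weights from the Hugging Face Hub on first run.
pipe = StableDiffusionPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4",
    use_auth_token=HF_TOKEN,
)
pipe = pipe.to("cuda")

# Depending on the diffusers version, the result exposes .images or ["sample"].
image = pipe("a photograph of an astronaut riding a horse").images[0]
image.save("astronaut.png")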

Info

This is built using the following awesome projects:

  • nixGL - A wrapper tool for Nix OpenGL applications.
  • jupyterWith - Declarative and reproducible Jupyter environments
  • flake-utils - Pure Nix flake utility functions

Please let me know if there are any problems, since I haven’t been able to test this extensively.

10 Likes

Cool!

Would it be a lot of effort to provide a flake for GitHub - basujindal/stable-diffusion: Optimized Stable Diffusion modified to run on lower GPU VRAM? This allows using stable-diffusion with 2.4 GB of VRAM instead of 10, making it a lot more accessible.

6 Likes

That repo is neat, but yes, it does have quite a few more requirements than the small amount of work I’ve put in above to get diffusers packaged. You’re right though: packaging the 2.4 GB version would make the flake much more accessible.

The entire goal of this small repo is to showcase how simple it is to share reproducible environments with Nix in a fun and interesting way, so I will do my best in the coming week to package the repo you linked.

2 Likes

That will be a good opportunity for me to try JupyterLab; I hear a lot about it, but I still don’t understand how it works or what to do with it :sweat_smile:

Good news! diffusers integrated that change into their library. I’ve gone ahead and enabled it by default, so the repo only requires 3.2 GB of VRAM now :grinning_face_with_smiling_eyes:

Go ahead and run

nix run --impure github:collinarnett/stable-diffusion-nix#jupyterLab

to get the latest version!
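
If you want the same low-VRAM behaviour in your own scripts (assuming the integrated change is diffusers’ attention slicing, which is the option that brings memory down to roughly this level), it is a single call on the pipeline:

# Sketch of enabling attention slicing on an existing pipeline; this assumes the
# change mentioned above is diffusers' attention-slicing option.
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4",
    use_auth_token="hf_...",
).to("cuda")

# Compute attention in slices instead of all at once, trading a bit of speed
# for a much lower peak VRAM usage.
pipe.enable_attention_slicing()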

3 Likes

Super cool!!! Thanks :pray:t3:

Is it possible to use a local weights file downloaded from HuggingFace instead of the API? What does the API do exactly? :thinking:

Is it possible to use an AMD card with this? I assume it is nvidia only. (Per requirements)

You can probably use ROCm as long as you compile torch with ROCm support. I don’t have an AMD card to test with, so I can’t give you a clear answer.
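
If you do try it, here is a quick way to sanity-check whether your torch build has ROCm support and can see the card (just an illustrative snippet, not something from the repo):

# Illustrative check for a ROCm-enabled torch build; not part of the repo.
import torch

print(torch.version.hip)          # a HIP version string on ROCm builds, None otherwise
print(torch.cuda.is_available())  # ROCm devices are exposed through the torch.cuda API
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))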

I’m working on making this repo better by updating diffusers to the latest version. I have a working repo with an up-to-date diffusers locally, but unfortunately it requires compiling quite a few big packages. I am waiting for this issue to be addressed on nixpkgs-unfree so people don’t have to deal with compiling these large packages. That way you could just swap the pytorch input used by diffusers for one that supports ROCm and be good to go.

I will do my best to publish an update to the repo this weekend that makes using ROCm easy.

I am also considering merging my changes into nixifiedai but I will work on the above first.

1 Like