Setting up an S3 binary cache

I spent days figuring out how to set up a private binary cache and avoid paying for Cachix. I hope this helps someone!

https://fmnxl.com/blog/setting-up-nix-binary-cache/
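For reference, the core of such a setup boils down to a couple of commands, roughly like the following (bucket name, key name, and region are placeholders, and AWS credentials are assumed to be in the environment):

```shell
# Generate a signing key pair for the cache (key name is a placeholder)
nix-store --generate-binary-cache-key example-cache-1 cache-priv-key.pem cache-pub-key.pem

# Copy a store path (and its closure) to the S3 bucket, signing as we go
nix copy --to 's3://example-nix-cache?secret-key=cache-priv-key.pem&region=eu-west-1' ./result
```

Clients then add the bucket URL to `substituters` and the contents of `cache-pub-key.pem` to `trusted-public-keys` in their `nix.conf`.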

5 Likes

Why is it better to pay AWS rather than support Cachix? :sweat_smile:

5 Likes

Don’t you need to pay for S3 usage as well?

Just wanted to mention that there are a lot of other providers that support the S3 API, including some with flat fees, such as Wasabi.

That said, I’d recommend that everyone just get Cachix :wink: . It works really seamlessly and integrates well with GitHub Actions. Plus, the authors contribute a lot to the Nix ecosystem, so I’d rather see the money go to them.

</ fanboyism> :stuck_out_tongue:.

5 Likes

I know I’m contributing to evil by paying for S3.

But I think it would benefit the community if we had the freedom of choice in implementing a private cache, while keeping the simplicity of something like Cachix, be it another cloud provider or some machine running in someone’s bedroom. I’m thinking of something like Minio (https://github.com/minio/minio) on a Raspberry Pi.

At the moment Cachix couples the API with its storage backend. I believe they can be separate things, and that in the spirit of open source we should feel free to want better things/code/APIs, while also wanting to reduce cost.

Needless to say, at the same time I am thankful to Domen for Cachix and his contributions to Nix in general, and it is great that people are happy to pay to sustain Cachix.
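For what it’s worth, Nix’s S3 substituter already works against any S3-compatible endpoint via the `endpoint` query parameter, so the storage behind a cache could indeed be Wasabi, Minio on a Pi, or anything else speaking the protocol. A hypothetical client-side `nix.conf` line (host and bucket name are placeholders):

```
substituters = s3://nix-cache?endpoint=https://minio.example.com&region=us-east-1
```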

8 Likes

I guess that makes sense; the storage platform could be anything.

1 Like

Sometimes it’s a matter of compliance or a corporate policy. Enterprise adoption is important.

3 Likes

I appreciate the write-up! It’s great to just have more information out there on some of the more advanced topics in Nix :clap:

I had written something similar

Although personally I use Cachix for my OSS projects, I am trying to convince my employer to adopt it for our enterprise use. It will likely be a difficult sell, as our infrastructure is on Google Cloud (plus there are a bunch of compliance shenanigans).

I’ve been looking for a straightforward way to replicate this S3 solution on Google Cloud Storage.

I suspect I will just have to manually copy each folder using the Google Cloud CLI.
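If it helps, `gsutil` can read from S3 as well as GCS, so a one-shot mirror can be sketched as follows (bucket names are placeholders, and this assumes your AWS credentials are set up in `~/.boto`):

```shell
# Mirror an S3 bucket into a GCS bucket; -m parallelizes, -r recurses
gsutil -m rsync -r s3://example-nix-cache gs://example-nix-cache
```

Google’s Storage Transfer Service can do the same thing as a managed job, which may be preferable for a large cache.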

1 Like

Thanks for your article! If you haven’t noticed, I actually included a link to it at the top of the page :slight_smile:

Isn’t there a straightforward way to migrate from S3 to Google Cloud? I came across Storage Transfer Service…

On a side note, I just found out that home-manager has a config for setting up caches and public keys in ~/.config/nix/nix.conf (https://github.com/jonascarpay/nix/blob/da4bfa0f89850c686f3fc01ff251e0110cd3e7a9/home-modules/caches.nix)
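For anyone not using home-manager, a module like the one linked ultimately just writes lines of this shape into `~/.config/nix/nix.conf` (the cache.nixos.org entries are the real public ones; the `example-cache-1` entries are placeholders for a private cache):

```
substituters = https://cache.nixos.org s3://example-nix-cache?region=eu-west-1
trusted-public-keys = cache.nixos.org-1:6NCHdD59X431o0gWypbMrAURkbJ16ZPMQFGspcDShjY= example-cache-1:<public-key-here>
```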

1 Like

Whoops :slight_smile:
I scrolled too fast!

I was actually interested in the Tweag article you linked, but it’s written in a very unwelcoming way. All I got was “if you want to support multiple signers, you need a multi-user Nix installation”.

Just for info, there is an open-source, self-hosted object-storage server called Minio. It serves buckets from your own machines over the S3 API. Maybe that could be enough for a Nix cache.
I use it to mirror the contents of a local folder.

4 Likes

I actually have gotten a binary cache to work using Minio. The hardest part was figuring out that Minio needed to be served over HTTPS for it to work. I could probably do a full write-up with exact step-by-step instructions if that is of interest to people.
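Until a full write-up exists, here is a rough sketch of what such a setup might look like. All hostnames, bucket names, and credentials below are placeholders, and Minio must be reachable over HTTPS (either via its own certificates or a TLS-terminating proxy in front of it):

```shell
# Run the Minio server (placeholder credentials; serve it over HTTPS)
MINIO_ROOT_USER=nix MINIO_ROOT_PASSWORD=changeme minio server /srv/minio

# Create a bucket with the Minio client
mc alias set local https://minio.example.com nix changeme
mc mb local/nix-cache

# Push a closure; Nix picks up credentials from the usual AWS env vars
export AWS_ACCESS_KEY_ID=nix AWS_SECRET_ACCESS_KEY=changeme
nix copy --to 's3://nix-cache?endpoint=https://minio.example.com&region=us-east-1' ./result
```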

7 Likes

That would be my preferred setup. Please do when time permits.

1 Like

I am in the process of doing the same; such a blog post would be very helpful :blush:

I use Minio for other purposes behind Caddy. This way I get effortless HTTPS certificates. Caddy is far easier to configure than Nginx.
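As an illustration, a Caddyfile for this kind of setup can be as short as the following (hostname and port are assumptions; Caddy provisions and renews the certificate automatically):

```
minio.example.com {
    reverse_proxy localhost:9000
}
```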

@griff : I am really interested by your writeup

1 Like

The one thing I like about using a proxy like Minio or even Nginx is that, if you are inside a firewalled network, you can expose the S3 or GCP bucket transparently, with the server handling authentication and authorization (authN+Z).
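A hedged sketch of that idea with Nginx: keep the bucket reachable only from the proxy host and let clients hit a plain HTTPS endpoint. Hostname, bucket, and region below are placeholders, and this assumes the bucket policy allows unsigned reads from the proxy’s IP rather than per-request signing:

```
server {
    listen 443 ssl;
    server_name cache.internal.example.com;
    # TLS certificate/key directives omitted for brevity

    location / {
        proxy_pass https://example-nix-cache.s3.eu-west-1.amazonaws.com/;
        proxy_set_header Host example-nix-cache.s3.eu-west-1.amazonaws.com;
    }
}
```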

I have a little demo of using WireGuard / Tailscale to set up a small VPN that does just that.
I just host the VPN server on AWS, and my laptop can use the cache without any knowledge of S3 keys, but it’s all private.