Setting up an S3 binary cache

Why is it better to pay AWS rather than support Cachix? :sweat_smile:

8 Likes

Don’t you need to pay for S3 usage as well?

1 Like

Just wanted to mention that there are a lot of other providers that support S3, including ones with flat fees such as Wasabi.

That said, I’d recommend that everyone just get Cachix :wink:. It works really seamlessly and integrates well with GitHub Actions. Plus, the authors contribute a lot to the Nix ecosystem, so I’d rather see the money go to them.

</fanboyism> :stuck_out_tongue:

11 Likes

I know I’m contributing to evil by paying for S3.

But I think it would benefit the community if we had the freedom to choose how a private cache is implemented, while keeping the simplicity of something like Cachix. Be it another cloud provider, or some machine running in someone’s bedroom. I’m thinking of something like Minio (https://github.com/minio/minio) on a Raspberry Pi.

At the moment Cachix couples the API with its storage backend. I believe they can be separate things, and that in the spirit of open source we should feel free to want better things/code/APIs, while also wanting to reduce cost.
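For what it’s worth, Nix’s built-in S3 support can already point at any S3-compatible endpoint, so the storage side is swappable today. A rough, untested sketch of what that could look like against a self-hosted Minio (bucket, host, keys, and credentials below are all placeholders):

```bash
# Sketch only: pushing to a self-hosted, S3-compatible store
# (e.g. Minio on a Raspberry Pi). All names and keys are placeholders.

# Nix's S3 store reads the usual AWS environment variables.
export AWS_ACCESS_KEY_ID=nixcache
export AWS_SECRET_ACCESS_KEY='<minio secret key>'

# Upload store paths to the bucket (sign them beforehand,
# e.g. with `nix store sign --key-file ...`).
nix copy --to 's3://nix-cache?endpoint=minio.example.org:9000&scheme=https' ./result

# Client machines then add the same bucket to /etc/nix/nix.conf:
#   substituters = https://cache.nixos.org s3://nix-cache?endpoint=minio.example.org:9000&scheme=https
#   trusted-public-keys = cache.nixos.org-1:6NCHdD59X431o0gWypbMrAURkbJ16ZPMQFGspcDShjY= example-cache:<public key>
```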

Needless to say, at the same time I am thankful to Domen for Cachix and his contributions to Nix in general, and it is great that people are happy to pay to sustain Cachix.

14 Likes

I guess that makes sense; the storage platform could be anything.

1 Like

Sometimes it’s a matter of compliance or a corporate policy. Enterprise adoption is important.

5 Likes

I appreciate the write-up! It’s great to just have more information out there on some of the more advanced topics in Nix :clap:

I had written something similar
https://fzakaria.com/2020/07/15/setting-up-a-nix-s3-binary-cache.html

Although I personally use Cachix for my OSS projects, I am trying to convince my employer to adopt it for our enterprise use. It will likely be a difficult sell, as our infrastructure is on Google Cloud
(plus there are a bunch of compliance shenanigans).

I’ve been looking for a straightforward way to actually replicate this S3 solution on Google Cloud Storage.

I suspect I will just have to manually copy each folder using the Google Cloud CLI.
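Two rough directions I’m considering (an untested sketch; bucket names and keys are placeholders, and I haven’t verified that Nix’s S3 store works cleanly against GCS’s interoperability endpoint):

```bash
# (a) Mirror an existing S3 cache bucket into GCS with gsutil
#     (needs S3 credentials configured in ~/.boto):
gsutil -m rsync -r s3://my-nix-cache gs://my-nix-cache

# (b) Point Nix directly at GCS through its S3-compatible XML API,
#     using HMAC interoperability keys for the bucket:
export AWS_ACCESS_KEY_ID='<GCS HMAC access id>'
export AWS_SECRET_ACCESS_KEY='<GCS HMAC secret>'
nix copy --to 's3://my-nix-cache?endpoint=storage.googleapis.com&scheme=https' ./result
```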

2 Likes

Thanks for your article! If you haven’t noticed, I actually included a link to it on top of the page :slight_smile:

Isn’t there a straightforward way to migrate from S3 to Google Cloud? I came across Storage Transfer Service…

On a sidenote, I just found out that home-manager has a config for setting up caches and public keys in ~/.config/nix/nix.conf (https://github.com/jonascarpay/nix/blob/da4bfa0f89850c686f3fc01ff251e0110cd3e7a9/home-modules/caches.nix)

1 Like

Whoops :slight_smile:
I scrolled too fast!

I was actually interested in the Tweag article you linked, but it’s written in a very unwelcoming way. All I got was “if you want to support multiple signers, you need a multi-user Nix installation”.

Just for info, there is an open-source, self-hosted object storage server, Minio. It uses the S3 API to serve buckets on your own machines. Maybe that could be enough for a Nix cache.
I use it to mirror the content of a local folder.

4 Likes

I actually have gotten a binary cache to work using Minio. The hardest part was figuring out that Minio needed to be served over HTTPS for it to work. I could probably do a full writeup with exact step-by-step instructions if that is of interest to people.
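Until then, here is roughly what the server side could look like with the stock minio and mc binaries (a sketch only; names, paths, and passwords are placeholders rather than my exact setup):

```bash
# Run the Minio server; put a TLS-terminating reverse proxy in front,
# since (as noted above) the cache apparently needs to be served over HTTPS.
MINIO_ROOT_USER=admin MINIO_ROOT_PASSWORD=changeme \
  minio server /var/lib/minio --address :9000

# Create the bucket and allow anonymous, read-only downloads so that
# substituters need no credentials:
mc alias set local http://localhost:9000 admin changeme
mc mb local/nix-cache
mc policy set download local/nix-cache   # newer mc releases: `mc anonymous set download`
```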

9 Likes

That would be my preferred setup. Please do when time permits.

2 Likes

I am in the process of doing the same; such a blog post would be very helpful :blush:

I use Minio for other purposes behind Caddy. This way I get effortless HTTPS certificates. Caddy is far easier to configure than Nginx.
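For reference, the kind of Caddy front-end I mean can be as small as this (a sketch; cache.example.org and the backend port are placeholders):

```bash
# Minimal Caddyfile: Caddy obtains and renews the TLS certificate itself
# and simply proxies to the local Minio instance.
cat > /etc/caddy/Caddyfile <<'EOF'
cache.example.org {
    reverse_proxy localhost:9000
}
EOF
systemctl reload caddy
```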

@griff: I am really interested in your writeup.

1 Like

The one thing I like about using a proxy like Minio or even Nginx is that, if you are inside a firewalled network, you can expose the S3 or GCP bucket transparently, with the server doing the authentication and authorization (authN+Z).

I have a little demo that uses WireGuard / Tailscale to set up a small VPN that does just that.
I just host the VPN server on AWS, and my laptop can use the cache without any knowledge of the S3 keys, but it’s all private.
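To illustrate the idea (not my actual demo config): if the bucket policy only admits requests coming from the proxy host, a plain Nginx reverse proxy on the VPN server is enough, and clients on the VPN never handle S3 credentials. Host names, certificate paths, and the bucket below are placeholders.

```bash
cat > /etc/nginx/conf.d/nix-cache.conf <<'EOF'
server {
    listen 443 ssl;
    server_name cache.internal.example.org;
    ssl_certificate     /etc/ssl/cache.crt;
    ssl_certificate_key /etc/ssl/cache.key;

    location / {
        # The S3 bucket policy only allows this host (or its VPC endpoint),
        # so no keys are ever handed out to clients on the VPN.
        proxy_pass https://my-nix-cache.s3.amazonaws.com/;
        proxy_set_header Host my-nix-cache.s3.amazonaws.com;
    }
}
EOF
nginx -s reload
```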

Hi @fmnxl, it seems your blog has gone offline.

I’m looking into setting up a GitLab pipeline for projects at my company, where introducing Cachix wouldn’t be possible just yet. So I’m very interested in this topic.

2 Likes

I encountered the same; however, I found a copy of it on web.archive.org.

2 Likes

The article link is broken.

1 Like

FYI, I’ve written up my investigations into setting up an S3 binary cache here: https://jcollie.github.io/nixos/2022/04/27/nixos-binary-cache-2022.html

1 Like

(Sorry about the bump, but I thought it was relevant.)

I had the same issue and made Attic, which is a more user-friendly solution that supports global deduplication and garbage collection. It can be backed by either local storage or S3 [1].

[1]: Well, don’t actually use S3; use Backblaze B2 or Cloudflare R2 instead, which are compatible.
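For anyone evaluating it, the basic client workflow looks roughly like this (server name, endpoint, cache name, and token are placeholders; see the Attic documentation for details):

```bash
attic login my-server https://attic.example.org '<token>'
attic cache create my-cache
attic push my-cache ./result   # upload store paths; deduplicated server-side
attic use my-cache             # configure the cache as a substituter locally
```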

4 Likes