Create a private PaaS on Hetzner Cloud or your own servers in minutes. Leverage NixOS and Nix Packages to build a reproducible and auditable private cloud for your projects.
The nix-infra 0.16.0-beta release adds support for deploying to existing servers. This allows you to run it on your home lab or with providers other than Hetzner Cloud. The only requirement is that you have SSH access with key-based authentication.
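As a quick preflight before pointing nix-infra at an existing server, you can confirm that key-based SSH works without a password prompt. This is a generic sketch, not part of nix-infra itself; the host and user names are placeholders.

```python
# Preflight check: does key-based SSH to this host succeed without
# falling back to password prompts? Host/user below are placeholders.
import subprocess

def has_key_auth(host: str, user: str = "root") -> bool:
    """Return True if an SSH key login to user@host succeeds non-interactively."""
    try:
        result = subprocess.run(
            ["ssh",
             "-o", "BatchMode=yes",               # never prompt interactively
             "-o", "PasswordAuthentication=no",   # keys only
             "-o", "ConnectTimeout=5",
             f"{user}@{host}", "true"],
            capture_output=True, timeout=15,
        )
        return result.returncode == 0
    except (FileNotFoundError, subprocess.TimeoutExpired):
        # ssh binary missing, or host unreachable within the timeout
        return False

# Usage (placeholder host): has_key_auth("203.0.113.10", user="admin")
```

If this returns `False`, fix your `~/.ssh/config` or `authorized_keys` setup before running any deploys.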
Learn more about nix-infra and download the latest release.
Project templates to help you get started with your configuration:
- nix-infra-machine – create a fleet of standalone machines
- nix-infra-ha-cluster – create a high-availability cluster
Full disclosure: I develop and test on macOS. While I have released binaries for Linux, there might be some housekeeping left to do. Feel free to open an issue on GitHub.
If you hate the AI hype train, you may want to stop reading here… There are also three MCP servers (experimental) that might be fun to try out.
First out is your AI sysops assistant. It turns out LLMs are pretty good at Linux admin work and can relieve you of some of the laborious effort of keeping your servers healthy. There are two variants: one for interacting with the cluster PaaS, and one focused on your fleet of standalone machines:
- nix-infra-cluster-mcp
- nix-infra-machine-mcp
No credentials are shared with the LLM, and commands proposed by the LLM are parsed and verified for safety. Obviously this isn't foolproof in any way. Assume the LLM will destroy your environment at any time and prepare accordingly.
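The actual safety checks inside these MCP servers are not described here, but the general idea can be sketched as a gate that parses each proposed command and refuses anything obviously destructive. Everything below is hypothetical and illustrative, not nix-infra's real implementation; the denylist is a made-up example.

```python
# Hypothetical sketch of a command-safety gate: parse the LLM-proposed
# command and reject anything whose binary is on a denylist. A real
# verifier would be far stricter (allowlists, argument inspection, etc.).
import shlex

DENYLIST = {"rm", "mkfs", "dd", "shutdown", "reboot"}  # illustrative only

def is_safe(command: str) -> bool:
    """Return False for unparseable, empty, or known-destructive commands."""
    try:
        tokens = shlex.split(command)
    except ValueError:
        return False          # unparseable quoting => refuse outright
    if not tokens:
        return False
    binary = tokens[0].rsplit("/", 1)[-1]  # strip any path prefix
    return binary not in DENYLIST

print(is_safe("systemctl status nginx"))  # True
print(is_safe("rm -rf /"))                # False
```

Note that a denylist alone is easy to bypass (shell wrappers, aliases, pipes), which is exactly why you should still assume the worst, as above.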
Another useful MCP server helps with configuration and testing:
- nix-infra-dev-mcp
This tool allows limited access to your project configuration. It can also edit the project's app modules and integration tests. Finally, it can spin up a test environment on your target servers and run your integration tests there.
Are LLMs useful? Surprisingly, yes. I have been testing with Opus 4.5, and it has proven both fun and productive. A year ago, you couldn't get much use out of LLMs with NixOS, but the SOTA models have improved a lot.
The declarative nature and atomic updates of NixOS provide a huge benefit for AI-assisted workflows, and I truly believe this could level the playing field in the build-vs-buy decision for platform services.
If you made it this far, thank you for your patience, and I hope you get a chance to try out nix-infra!
Ref: Initial announcement