Hello,
On this page we can find information such as:
- cache.nixos.org contains 120 TB of packages for 3 architectures
- Hydra executes over 350,000 builds a week, but many packages are very small
As a nixos-unstable user, I notice that the amount of data that has to be downloaded to update a workstation is ridiculous: a few gigabytes per week for my Plasma workstation.
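For anyone who wants to measure this on their own machine, here is one way to look at the numbers (a sketch assuming a flakes-enabled Nix on NixOS; exact flags and output may differ between Nix versions):

```
# Closure size of the current system generation, i.e. roughly what a
# from-scratch download of the whole system would cost.
nix path-info -Sh /run/current-system

# Size of the whole local store, for comparison.
du -sh /nix/store
```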
I’m convinced that, as a community, we should strive to make NixOS more sustainable with regard to resource usage.
The only idea I have in mind right now would be to deduplicate the cache repository at the file level, exactly like `nix-store --optimise` does for a local store, but maybe it already works this way?
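For reference, this is what the local equivalent looks like today; whether anything similar runs on the cache side is pure speculation on my part:

```
# One-off deduplication of the local store: identical files are replaced
# by hard links into /nix/store/.links, and the freed space is reported.
nix-store --optimise

# The NixOS option to do this automatically on every build:
#   nix.settings.auto-optimise-store = true;
```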
The other idea would be to download deltas of packages; this would drastically reduce bandwidth usage and download time at the same time.
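As a rough illustration of what I mean, zstd can already produce a binary delta between two files, and something like this could in principle be applied to NARs. This is purely a sketch, not something the cache supports today, and the file names are made up:

```
# Server side: build a small patch that turns the old NAR into the new one.
zstd --patch-from=hello-1.0.nar hello-1.1.nar -o hello-1.1.nar.patch

# Client side: reconstruct the new NAR from the old NAR plus the patch.
zstd -d --patch-from=hello-1.0.nar hello-1.1.nar.patch -o hello-1.1.nar
```

For large files, the zstd documentation recommends adding `--long` with a matching window size on both the compression and decompression side.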
Because of reproducibility, it’s hard to skip rebuilds. Guix has a “graft” system for package updates with really minor changes, which avoids recompiling the whole dependency graph, but I suppose it kills reproducibility?
However, the number of weekly builds could be reduced or optimized by only building the package sets that most people actually use. I don’t exactly understand how it works, but when I update my nixpkgs flake, it downloads a JSON file from the registry, and a new nixpkgs tarball becomes available every few days.
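For the curious, these two commands show the moving parts I’m referring to (real commands on a flakes-enabled Nix; the interpretation of their output is just my understanding):

```
# List the registry entries, including the global one that maps
# "nixpkgs" to a branch of github:NixOS/nixpkgs.
nix registry list

# Show the revision and tarball the nixpkgs flake currently resolves to.
nix flake metadata nixpkgs
```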
So, I don’t really have any game-changing ideas, but this is an important topic for me, and I always feel a bit guilty contributing to nixpkgs because of it. I’m opening the idea up to the community; maybe we can find a solution together to improve the current state of Nixpkgs.