Pylightnix - A lightweight nix-like DSL in Python for ML

Hi. Let me announce a Pylightnix project which is a Python DSL for nix-style build systems.

Pylightnix doesn’t have a tight connection with Nix and has somewhat different goals, but since it borrows most of its ideas from Nix, I think it is worth describing the project here.

So, Pylightnix is an attempt to address some data-deployment problems we see in machine learning applications. Its features are:

  • Written in mypy-typed Python, compact codebase
  • Provides an API for a two-stage build process (derivations and realizations)
  • Uses Python both for evaluating build expressions and for ‘realizing’ them into build artifacts
  • No build isolation beyond basic safety checks, and no multi-threading
  • Support for non-deterministic builds: a derivation may have multiple realizations, and a mechanism guarantees self-consistency across a derivation’s closure
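The two-stage model in the list above can be illustrated with a toy sketch. This is plain Python, not the actual Pylightnix API; the names `derive` and `Store` are made up for this example:

```python
import hashlib
import json

def derive(config: dict) -> str:
    """Stage 1: hash an immutable configuration into a derivation id."""
    blob = json.dumps(config, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()[:16]

class Store:
    """Stage 2: map derivation ids to their (possibly many) realizations."""
    def __init__(self):
        self.realizations = {}  # derivation id -> list of build results

    def realize(self, config: dict, builder):
        drv = derive(config)
        if not self.realizations.get(drv):
            # Builders may be non-deterministic, so every result is kept
            # as a separate realization rather than assumed to be unique.
            self.realizations.setdefault(drv, []).append(builder(config))
        return self.realizations[drv][0]

store = Store()
out = store.realize({"name": "hello"}, lambda c: {"out": c["name"].upper()})
```

The point of the split is that stage 1 is cheap and pure (only configurations are hashed), while stage 2 may run arbitrary, possibly non-deterministic builders.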

The project is in its early stage. I have tried to write some documentation, but it is probably not very good yet.
Still, I think the bare minimum is ready (well, except for the garbage collector). Anyway, I am going to leave it as-is for some time and switch to applications. I would be glad to have a discussion.

Thank you

3 Likes

Wish you best on your endeavors, sir!

Thanks for sharing this work. How does it compare to Spack ?

I didn’t know about Spack, thanks for the link! At first glance, Spack is a Nix competitor. Its codebase is quite big, its history goes back to 2013, and it is also focused on software deployment. Spack is written in Python, but I would think twice before using it as a library in a small project: 300K+ lines of Python 2 code would certainly add risks.

Spack’s docs highlight its ‘variants’ solver feature. If I understand the idea correctly, in Nix we may declare libfoo = callPackage foo.nix {}; libfoo_with_jpeg = callPackage foo.nix {jpeg=true;}, while Spack allows us to declare a ‘jpeg’ variant and then let the solver search. Maybe that is a good thing for HPC, but for now I think that in complex systems it is safer to write the dependency graph explicitly.

In contrast, Pylightnix is designed to be used as a library, and I see it more as an experiment management tool for data science than as a software deployment tool. SW deployment is also possible, of course, but users would have to provide their own build isolation (BTW, I didn’t see any comments on isolation in Spack).

Also, in Spack I can’t find any comments about non-deterministic builds; does it handle them at all? To the best of my knowledge, Nix is aware of this problem but tries to avoid it. Pylightnix attempts to make the situation manageable and to handle cases like ML model training, which is full of randomness.

“In Spack I can’t find any comments about non-deterministic builds, does it care of them at all?”

I cannot help you here.

Thank you for investigating the comparison !

1 Like

This will be a little terse. I don’t mean it to be rude. But I wanted to give you feedback before I forget (because I think you may be addressing something I’ve been thinking about a lot) but I don’t have time to phrase it carefully atm…

After skimming your post and docs in the repo, I don’t know what problem(s) this solves. I say this for a selfish reason: I have a problem–a sort of (non-ML) data (authoring/composition and) deployment problem–that I’m currently working on.

Pylightnix might solve all, part, or none of my problem. But this post says it attempts to address some data deployment problems you see in machine learning without describing those problems, and then turns to implementation details. The repo, for its part, only says that it is a “purely-functional solution for data deployment problem”, spends some time comparing itself to Nix, and then turns to implementation details.

Even if you are 100% confident that only people working on ML will find pylightnix useful, I’d still suggest you name these problems. Maybe a good goal is: try to outline the problem so that someone who stumbles on your post or repo today, and then finally runs into one of the problems it addresses for the first time 3 months from now, will remember that they saw your project and know to go looking for it.

2 Likes

If you’re interested in reproducible experiment managers for ML, check out ck and Popper – ck is sort of a meta-package manager, designed for building reproducible ML workflows. Popper is more for reproducing single papers.

On Spack (disclaimer: I’m the lead developer):

+300K lines of Python2 code would certainly add risks.

Spack is ~56k SLOC of Python for the core and ~90k SLOC for the builtin package repository. There are also some vendored external packages (which we include so that users do not have to install dependencies – it works right out of the repo). All that code is from years of substantial contributions, so I am not sure I understand why it adds rather than mitigates risks. It’s not designed as a library, but we should probably pull a few things out of it as libraries given the work that’s gone into them.

I didn’t see any comments on isolation in Spack

Every build gets its own process with a cleaned environment, and builds have compiler wrappers injected into them to force includes, RPATHs, library search paths, etc. to point to dependencies. We clean out many user environment variables that affect builds (LD_LIBRARY_PATH, etc.)

We do not use a chroot environment. You can bring external packages into a build with the external packages mechanism. If you do this, it makes builds “impure” by Nix standards but it allows us to do things like use preinstalled, often proprietary packages on HPC machines (like the system MPI library, compilers, etc.)

I can’t find any comments about non-deterministic builds

I can’t track down exactly what is meant by “non-deterministic” builds in the Nix community, so I don’t know how much help this will be. But, we try to enable reproducible builds.

The concretizer (resolver) creates a DAG with package, version, compilers, compiler versions, build options, target architecture, etc., and that DAG is hashed recursively. We call this a “concrete” spec, i.e. one with all parameters filled in. Package recipes are templated by these parameters, and rebuilding packages with the same concrete DAG should be reproducible. So, given a spack.lock file for a spack environment (which contains the concrete DAG), you should be able to reproduce a build deterministically. It’s not bitwise reproducible in the Debian sense (AFAIK neither is Nix) but there should be no variation in the options given to the build or the commands run. We do provide a spack install --dirty flag in case users insist on preserving their environment settings in the build environment, but we discourage its use.
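The recursive DAG hashing described above can be sketched as a Merkle-style hash. This is an illustration only, not Spack’s actual algorithm; the `spec_hash` helper and the spec layout are made up for this example:

```python
import hashlib
import json

def spec_hash(spec: dict) -> str:
    """Merkle-style hash of a concrete spec: hash each dependency first,
    then hash this node's own parameters together with those hashes."""
    dep_hashes = sorted(spec_hash(d) for d in spec.get("deps", []))
    node = {k: v for k, v in spec.items() if k != "deps"}
    blob = json.dumps({"node": node, "deps": dep_hashes}, sort_keys=True)
    return hashlib.sha256(blob.encode()).hexdigest()[:16]

# A tiny two-node DAG: libfoo depends on zlib.
zlib = {"name": "zlib", "version": "1.2.11", "compiler": "gcc@9.2"}
libfoo = {"name": "libfoo", "version": "2.0", "compiler": "gcc@9.2",
          "deps": [zlib]}
```

Because the hash covers the whole DAG, changing any parameter of any dependency changes the hash of every referrer, which is what lets a lock file identify a build precisely.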

You can also “re-concretize” the abstract spack.yaml file from one platform on another, to get a functionally equivalent (but not identical) environment. We’d like to improve our solver to the point that it could produce a resolution that is “as close as possible” to one from another platform, modulo platform-specific constraints, but that is not done yet.

See the recent FOSDEM talks on Spack’s concretizer and our archspec library for packaging optimized binaries if you’re interested in more details on the motivation for Spack.

2 Likes

@abathur, thank you for your response. I accept your points. My original plan sounded like ‘take a shortcut by not writing documentation’ :slight_smile: In my opinion, Pylightnix, despite being of a different scale, shares many features with Nix, so I hoped that the community here would be able to deduce the main idea and provide me with some early feedback. Actually, I think that is exactly what happened.

I should have stated more clearly that while I want to address the data-deployment problem, Pylightnix doesn’t offer a solution to it yet, because it currently doesn’t include any code to, e.g., sync network storage. I still plan to implement that in some way later.

For now I have updated the README, where I attempted to describe the features for real. I am also re-stating Pylightnix as a lightweight immutable data manipulation library until I have anything related to network deployment.

I also wrote a tutorial which I hope gives an idea of how this library could be used: https://github.com/grwlf/ultimatum-game/blob/bd1efa80bc0abf7fdf61bd5ba8bbb0645f166d64/docs/Pylightnix.md . (Please don’t be shy about pointing out grammar mistakes!)

1 Like

@tgamblin, thank you for the explanation and especially for the ck and Popper links. These projects look related to Pylightnix; I’m going to read more about them.

Spack is ~56k SLOC of Python for the core and ~90k SLOC for the builtin package repository

I agree that SLOC is a better measure. I am going to write a brief review of related projects in the future, and will probably use this measure there. For now I’ve added links to the ‘related work’ section of the README.

On build determinism:

I can’t track down exactly what is meant by “non-deterministic” builds in the Nix community

There is a long-running effort in Nix to make compilers produce deterministic results. I checked GitHub and found ~30 issues [1] which may be related. And here is an example compiler ticket, #4012: Compilation results [interface files] are not deterministic · Issues · Glasgow Haskell Compiler / GHC · GitLab, with a long discussion (I didn’t read all of it and may be missing the point). If I understand the situation correctly, deterministic builds would greatly reduce the need for binary caches and save compilation time, because they would allow users to rebuild missing components without rebuilding their referrers. Unfortunately, this seems hard to achieve even when we know what we want, not to speak of compiler authors who don’t care about the problem.

In Pylightnix I admit that this problem has no good solution, so we have to deal with non-determinism. I attempted to: 1) allow multiple realizations of the very same derivation; 2) introduce ‘matchers’, which purely select the ‘best’ realizations out of those available. Matchers may select one or many realizations to include in another derivation’s dependencies, or ask the core to build new realizations by executing the builder.

It should be possible to reduce this system to the one Nix uses (as I see it; I didn’t check very carefully) by using a trivial matcher in every Pylightnix derivation. A trivial matcher allows only a single realization and runs the builder if there are no realizations at all.
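A minimal sketch of the matcher idea (illustrative Python; `match_only` and `match_best` are hypothetical names, not Pylightnix’s real signatures):

```python
def match_only(realizations):
    """Trivial, Nix-like matcher: accept exactly one realization."""
    if len(realizations) > 1:
        raise RuntimeError("ambiguous: more than one realization")
    return realizations or None   # None asks the core to run the builder

def match_best(realizations, score, n=1):
    """Select the n best realizations by a user-supplied score,
    e.g. a model's accuracy read from one of its artifacts."""
    if len(realizations) < n:
        return None               # not enough builds yet; run the builder
    return sorted(realizations, key=score, reverse=True)[:n]
```

With `match_only` everywhere, the store degenerates to the one-realization-per-derivation discipline; `match_best` is the kind of matcher one would use for repeated, randomized ML trainings.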

[1] - https://github.com/NixOS/nixpkgs/issues?utf8=✓&q=is%3Aissue+is%3Aopen+deterministic

1 Like

Thanks, Sergey. The README is much improved. I think it communicates most of what I’m looking for on a first encounter.

I skimmed the new tutorial, but I’ll need to find time later to give it a proper read.

Hi. I’ve recently implemented a ‘promise’ feature in Pylightnix. I’d like to describe it here briefly and share some thoughts.

  1. ‘Promises’ are special attributes of a derivation which refer to files that we promise will be
    created by this derivation. When the builder script (in Pylightnix, the realize function in Python)
    exits, the core checks that the corresponding files exist. Later, any other derivation which
    depends on our derivation may access those promises and be sure that the files exist.

    In informal Nix expression language, this may look like the following:

    mydrv = mkDerivation {
      name = "mydrv";
      promise = {
        artifact = "./usr/bin/artifact";
      };
      buildCommand = ''
        mkdir -p $out/usr/bin
        echo "bla" > $out/usr/bin/artifact  # Produce the artifact
      '';
    };
    

    Later, referrers may access the artifact either as "${mydrv}/usr/bin/artifact" or as
    "${mydrv.promise.artifact}". The difference is that in the latter case the presence of the
    file is guaranteed by Nix, thanks to the promise section.

    The corresponding Pylightnix constructs may be seen here:
    https://github.com/stagedml/pylightnix/blob/ada8a65602be3b843451ede809df923d0b03d57b/docs/demos/MNIST.md#stage-2-the-recognizer
    (near accuracy:PromisePath = [promise, 'accuracy.txt'])

  2. Did anybody discuss a feature like this for Nix? Maybe it is already implemented? I think it could be very useful here, because it would allow catching ‘missing-file’ errors early.
    One may think of Nix expressions as a kind of type system for build scripts (which are typically written in shell). In this analogy, the promise feature adds a kind of type classes to that system.
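The check from item 1 can be sketched in plain Python (a toy model; `realize_with_promises` is a made-up name, not the Pylightnix API):

```python
from pathlib import Path
import tempfile

def realize_with_promises(build, promises):
    """Run a builder into a fresh output directory, then verify that
    every promised relative path was actually created."""
    out = Path(tempfile.mkdtemp())
    build(out)
    missing = [p for p in promises if not (out / p).is_file()]
    if missing:
        raise FileNotFoundError(f"builder broke its promises: {missing}")
    return out

def builder(out: Path):
    # Mirrors the informal Nix example above: produce $out/usr/bin/artifact.
    (out / "usr" / "bin").mkdir(parents=True)
    (out / "usr" / "bin" / "artifact").write_text("bla\n")

out = realize_with_promises(builder, ["usr/bin/artifact"])
```

A builder that forgets to create a promised file fails at realization time, instead of a referrer failing later with a ‘missing-file’ error.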

1 Like