Writing modules for NUR

I have created a rather simple Nix module for tox-node: nur-tox/tox-node.nix at 72726c07ee5dcdeba00e847dfbf5c1e3f9bfede9 · tox-rs/nur-tox · GitHub. Then I added this to configuration.nix:

let
  nur-no-pkgs = import (builtins.fetchTarball "https://github.com/nix-community/NUR/archive/master.tar.gz") {};
in
{
  imports =
    [ # Include the results of the hardware scan.
      ...
      nur-no-pkgs.repos.tox.modules.tox-node
    ];
}

But when I try to use my module, I get the following error:

error: unable to fork: Cannot allocate memory

Hmm. How much memory does the system you are evaluating on have?

4 GB. Shouldn’t that be enough?

Not sure. Currently we need to evaluate nixpkgs twice in order to make import-from-derivation work (importing a Nix expression that is first downloaded from a URL). Maybe that is enough to make evaluation fail on systems with less RAM. Could you try enabling a swap file and check again?
I can import the module on my system. The growing memory demand is a problem of nixpkgs/NixOS itself, because every module needs to be evaluated.
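If you want to try the swap route, a swap file can be declared directly in configuration.nix via the `swapDevices` option; a minimal sketch (the 4096 MiB size is just an example value):

```nix
{
  # Declarative swap file; NixOS creates and activates /swapfile for you.
  # size is in MiB -- 4096 MiB (4 GB) here is only an example.
  swapDevices = [
    { device = "/swapfile"; size = 4096; }
  ];
}
```

After a `nixos-rebuild switch` the swap file should be active, which may give evaluation enough headroom on a 4 GB machine.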

Indeed, with swap on and no other apps launched it works.

In case you are deploying to a server, you can also evaluate on a bigger machine and use NixOps to build your system. Apart from that I don’t have a better short-term solution. The long-term fix has to happen in nixpkgs itself, by not loading all modules by default, for example as described here: [RFC 0022] Minimal module list by edolstra · Pull Request #22 · NixOS/rfcs · GitHub

Interestingly, modules in nixpkgs do not seem to have this RAM usage problem. So perhaps an easier fix would be adding tox-node to nixpkgs.

Like I said: I think it is because importing NUR modules causes nixpkgs to be evaluated a second time, which increases RAM usage. Maybe adding nur-combined via a nix-channel, or importing the repository from a local git checkout, could solve this.
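For the local-checkout variant, the import in configuration.nix could look like the sketch below. The checkout path is a placeholder (point it at wherever you cloned https://github.com/nix-community/NUR); using a local path avoids the fetchTarball download at evaluation time:

```nix
let
  # Import NUR from a local git checkout instead of builtins.fetchTarball,
  # so evaluation does not first have to download the tarball.
  # /home/user/src/NUR is a placeholder path, not a real location.
  nur-no-pkgs = import /home/user/src/NUR {};
in
{
  imports = [ nur-no-pkgs.repos.tox.modules.tox-node ];
}
```

Whether this also reduces peak memory during evaluation would need to be measured; it mainly removes the network fetch from the evaluation step.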