Install Custom Kernels Into Jupyter Lab

After playing around with Jupyter in a couple of different configurations (using tools like jupyterwith, custom environments, poetry, etc.), I’ve realized that my ideal setup is a user service that starts when I log in, into which I can plug kernels that I build externally to my jupyter configuration.

In a non-nix world, this is pretty easy: you create a virtualenv, install its kernel into your jupyterlab instance, and then create a new notebook that connects to that kernel.

How does one go about adding kernels to jupyterlab with nix? Here’s my home-manager config:

{ config, pkgs, lib, ... }:

{
  home.packages = with pkgs; [
    jupyter-all
  ];

  systemd.user.services.jupyter = {
    Unit = {
      Description = "Jupyter Lab Development Environment";
    };
    Install = {
      WantedBy = [ "graphical-session.target" ];
    };
    Service = {
      type = "Simple";
      ExecStart = "${pkgs.jupyter-all}/bin/jupyter-lab";
      Restart = "always";
      RestartSec = 3;
    };
  };
}

After reading the nixpkgs source for jupyter, I feel like I should just be able to override it:

home.packages = with pkgs; [
    pkgs.jupyter.override { definitions = { clojure = pkgs.clojupyter.definition; };}
  ];

And in fact, I can build that with nix repl, but when I try to put that in my home-manager config I get the following error:

warning: Git tree '/home/douglas/src/nix-config' is dirty
error:
       … while calling the 'derivationStrict' builtin

         at /builtin/derivation.nix:9:12: (source not available)

       … while evaluating derivation 'home-manager-generation'
         whose name attribute is located at /nix/store/jnk1mfi594y0dsmzla4nmkbr4rnrqa71-source/pkgs/stdenv/generic/make-derivation.nix:348:7

       … while evaluating attribute 'buildCommand' of derivation 'home-manager-generation'

         at /nix/store/jnk1mfi594y0dsmzla4nmkbr4rnrqa71-source/pkgs/build-support/trivial-builders/default.nix:87:14:

           86|       enableParallelBuilding = true;
           87|       inherit buildCommand name;
             |              ^
           88|       passAsFile = [ "buildCommand" ]

       (stack trace truncated; use '--show-trace' to show the full trace)

       error: A definition for option `home.packages."[definition 4-entry 2]"' is not of type `package'. Definition values:
       - In `/nix/store/n8x4nh1v6r2cbfrc6y4h3gi6lnq8isgb-source/home-profiles/jupyterlab.nix': <function, args: {definitions?, jupyter-kernel, python3}>


I have it working now with the following incantation:

  (pkgs.jupyter.override { definitions = { clojure = pkgs.clojupyter.definition; };}).out

Note that the result of this is a python3 environment that happens to include jupyter, so it will collide with any other python3 you have installed in your list of packages.
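For example (a hypothetical fragment, not from my actual config), putting both into home.packages would typically make the profile build fail with a file collision on paths like bin/python:

home.packages = with pkgs; [
  python3
  # The overridden jupyter is itself a full python3 environment,
  # so it also provides bin/python and friends.
  ((jupyter.override { definitions = { clojure = clojupyter.definition; }; }).out)
];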

I’m going to update my config to look something like this:

let
  myJupyter = (pkgs.jupyter.override { }).out;
in
{
  systemd.user.services.jupyter = {
    # ...
    Service = {
      ExecStart = "${myJupyter}/bin/jupyter-lab";
    };
  };
}

That should give me the custom jupyter environment without leaking a new python into my user environment.

For posterity, here’s the config that works for me: it creates a user-specific jupyterlab and installs a kernel from a dev environment on my system.

In this case, that dev environment is a devShell I enter with nix-direnv, and I’ve copied the path to its python interpreter into the kernel definition.

This will also work for manual virtualenvs on your machine (just activate the virtualenv and run which python to find the interpreter).
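As a sketch (the kernel name and interpreter path here are hypothetical; substitute whatever which python reports, and make sure ipykernel is installed in that virtualenv), the corresponding entry under definitions would look like:

# Hypothetical kernel entry for a manually managed virtualenv.
my_venv = {
  displayName = "My Virtualenv";
  language = "python";
  logo32 = "${pkgs.jupyter.sitePackages}/ipykernel/resources/logo-32x32.png";
  logo64 = "${pkgs.jupyter.sitePackages}/ipykernel/resources/logo-64x64.png";
  argv = [
    # Example path only; use the interpreter from the activated virtualenv.
    "/home/douglas/.virtualenvs/example/bin/python"
    "-m"
    "ipykernel_launcher"
    "-f"
    "{connection_file}"
  ];
};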

In the future, I want to try to figure out how to tie the list of kernels to actual devShells I have in my flake, but that will have to come later.

Here’s the file:

{ config, pkgs, lib, ... }:

let
  myJupyter = pkgs.jupyter.override {
    definitions = {
      clojure = pkgs.clojupyter.definition;
      custom = {
        displayName = "Python MSDoc";
        language = "python";
        logo32 = "${pkgs.jupyter.sitePackages}/ipykernel/resources/logo-32x32.png";
        logo64 = "${pkgs.jupyter.sitePackages}/ipykernel/resources/logo-64x64.png";
        argv = [
          # Actualized in devShell, need to figure out how to link this to that actual devShell
          "/nix/store/2864ar7vhhw1fn04jwl5cdj9c39ffswi-python3-3.11.6-env/bin/python"
          "-m"
          "ipykernel_launcher"
          "-f"
          "{connection_file}"
        ];
      };
    };
  };
in
{
  systemd.user.services.jupyter = {
    Unit = {
      Description = "Jupyter Lab Development Environment";
    };
    Install = {
      WantedBy = [ "graphical-session.target" ];
    };
    Service = {
      type = "Simple";
      ExecStart = "${myJupyter}/bin/jupyter-lab";
      Restart = "always";
      RestartSec = 3;
    };
  };
}

Finally, I figured out how to bring devShells from my flake into this config. This way I’m free to create devShells for developing software, and if I want to bring one of their kernels into my jupyterlab instance I can do so without having to manage multiple jupyter instances:

{ pkgs, devShells, ... }:

let
  myJupyter = pkgs.jupyter.override {
    definitions = {
      python_msdoc = let python_msdoc = (builtins.head devShells.x86_64-linux.python_msdoc.nativeBuildInputs); in {
        displayName = "Python MSDoc";
        language = "python";
        logo32 = "${pkgs.jupyter.sitePackages}/ipykernel/resources/logo-32x32.png";
        logo64 = "${pkgs.jupyter.sitePackages}/ipykernel/resources/logo-64x64.png";
        argv = [
          "${python_msdoc}/bin/python"
          "-m"
          "ipykernel_launcher"
          "-f"
          "{connection_file}"
        ];
      };
      python_torch = let python_torch = (builtins.head devShells.x86_64-linux.python_torch.nativeBuildInputs); in {
        displayName = "PyTorch";
        language = "python";
        logo32 = "${pkgs.jupyter.sitePackages}/ipykernel/resources/logo-32x32.png";
        logo64 = "${pkgs.jupyter.sitePackages}/ipykernel/resources/logo-64x64.png";
        argv = [
          "${python_torch}/bin/python"
          "-m"
          "ipykernel_launcher"
          "-f"
          "{connection_file}"
        ];
      };
    };
  };
in
{
  systemd.user.services.jupyter = {
    Unit = {
      Description = "Jupyter Lab Development Environment";
    };
    Install = {
      WantedBy = [ "graphical-session.target" ];
    };
    Service = {
      type = "Simple";
      ExecStart = "${myJupyter}/bin/jupyter-lab";
      Restart = "always";
      RestartSec = 3;
    };
  };
}
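The module above takes devShells as an extra argument; how it gets there depends on your flake, but one way to pass it (a sketch assuming the standard homeManagerConfiguration entry point, with names and paths that are only illustrative) is via extraSpecialArgs:

# Sketch: assumes a flake written as
# `outputs = { self, nixpkgs, home-manager, ... }:` whose devShells
# live on the same flake (`self.devShells`).
homeConfigurations."douglas" = home-manager.lib.homeManagerConfiguration {
  pkgs = nixpkgs.legacyPackages.x86_64-linux;
  # Expose the flake's devShells as the `devShells` module argument
  # that the jupyterlab module destructures.
  extraSpecialArgs = { devShells = self.devShells; };
  modules = [ ./home-profiles/jupyterlab.nix ];
};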

In order to do this, your devShell must include ipykernel as part of the python environment.
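For example, a minimal devShell along these lines should satisfy that (a sketch, not one of the actual shells from my flake). Note that the python environment comes first in nativeBuildInputs, since the kernel definitions above take builtins.head of that list:

# Hypothetical devShell whose python environment provides ipykernel.
devShells.x86_64-linux.python_msdoc = pkgs.mkShell {
  nativeBuildInputs = [
    # Listed first so `builtins.head ...nativeBuildInputs` picks it up.
    (pkgs.python3.withPackages (ps: with ps; [
      ipykernel
      # ...plus whatever the project itself needs
    ]))
  ];
};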


Thanks, this was a big help, and I’m now using some of these ideas in a flake: GitHub - AlexChalk/ml_env at 3fb4d915e2ffac3d340b6b406defcf7753a587ad.


@douglas I too wanted to play around with custom kernels in the past, but I didn’t have the competence to dive into Nixpkgs like that. Do you think you could share your insights in the wiki?

https://wiki.nixos.org/wiki/NixOS_Wiki