We are currently building an HPC benchmark to evaluate how several programs behave under different MPI implementations. We have some custom packages, such as fftw and vtk, that are dependencies of the programs we want to test; they declare a generic mpi dependency, which is then manually set to the implementation we want to use:
#mpi = mpich;
#mpi = openmpi;
mpi = intel-mpi;
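For context, the pattern above might look roughly like this in a top-level default.nix (a sketch; the callPackage paths and attribute names are assumptions on my part):

```
# Sketch: a package set where the MPI implementation is picked once
# and threaded into every dependent package as a generic `mpi` argument.
{ pkgs ? import <nixpkgs> { } }:
rec {
  #mpi = pkgs.mpich;
  #mpi = pkgs.openmpi;
  mpi = pkgs.intel-mpi;  # assumed attribute name, as in the snippet above

  # Dependents receive the chosen MPI via their generic `mpi` argument.
  fftw = pkgs.callPackage ./fftw { inherit mpi; };
  vtk  = pkgs.callPackage ./vtk  { inherit mpi; };
}
```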
However, we are wondering how to achieve this without modifying the default.nix source.
Ideally I would like to have some function that can generate a new package set, like this:
genPkgs = pkgs: mpi: <magicallyReplaceAllMpi>
I'm not sure how to do it with overlays, since the overlay needs to know which MPI implementation to use, so it can't follow the plain self: super: convention.
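One way around that constraint (a sketch; the names genPkgs and mpiOverlay are my own, and I assume `mpi` is the attribute your packages depend on) is to make the overlay a function of the chosen MPI, and only then hand it to Nixpkgs:

```
# Sketch: an overlay parameterized over the MPI derivation, so the
# self: super: function is produced after the choice has been made.
let
  nixpkgs = import <nixpkgs>;
  mpiOverlay = mpi: self: super: { mpi = mpi; };
  genPkgs = mpi: nixpkgs { overlays = [ (mpiOverlay mpi) ]; };
  pkgs = nixpkgs { };
in genPkgs pkgs.mpich
```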
Unfortunately there is no default mpi in Nixpkgs, although it seems most (if not all) packages use openmpi by default.
I suggest calling the Nixpkgs package set several times, each time with an overlay setting openmpi to your mpi implementation of choice. Basically the same as what is done in QChem.
Note there is a nixpkgsFun function that allows you to call Nixpkgs from within itself, but you don't really need it here.
let
  nixpkgs = import (fetchTarball "channel:nixos-20.03");
  pkgs = nixpkgs {};
  mpis = with pkgs; {
    inherit mpich openmpi;
  };
  # Create an overlay with our MPI of choice, and import Nixpkgs with it.
  nixpkgsFun = name: drv: nixpkgs {
    overlays = [
      (self: super: {
        openmpi = drv;
      })
    ];
  };
  mpiSets = with pkgs.lib; mapAttrs nixpkgsFun mpis;
in mpiSets
The mpich package set now has mpich as its default MPI: nix-build -A mpich.openmpi actually builds mpich. Similarly, nix-build -A openmpi.openmpi builds stock openmpi. All dependents will use the new default.
Long-term I think we should have a default mpi, like we have blas.
All packages in Nixpkgs use openmpi (as far as I am aware), as it is the de facto open-source standard MPI. I implemented the override solution in QChem out of curiosity but rarely make use of it. I agree with @FRidh that something like what we now have for blas would be better (implemented directly in Nixpkgs).
A potential problem is that some packages may not build with other MPI implementations.
Thanks for the reply. With nixpkgsFun I was able to re-evaluate Nixpkgs with a custom overlay from <nixpkgs>. However, I was looking for a way to do this after Nixpkgs has already been composed with the user's overlays, and then add my custom overlay on top. Something like:
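For what it's worth, a composed package set exposes extend (and appendOverlays) for exactly this, so something along these lines might work (a sketch; extendWithMpi is my own name):

```
# Sketch: add an MPI overlay on top of an already-composed package set.
# pkgs.extend re-evaluates the fixpoint with one extra overlay appended,
# so the user's existing overlays are preserved.
let
  pkgs = import <nixpkgs> { };  # may already include the user's overlays
  extendWithMpi = pkgs: mpi: pkgs.extend (self: super: { openmpi = mpi; });
in extendWithMpi pkgs pkgs.mpich
```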