I’ve been trying to replace the processing backend of NumPy with CuPy to accelerate my project. When I try to enter the nix-shell after adding CuPy as a package, my CPU usage maxes out at 100% (Intel® Core™ i7-14650HX × 24) during the compilation process. After a while, all applications, including the terminal, crash. I’ve faced the same issue with the pytorch package before as well. (This issue seems to be present in other Python packages with GPU-acceleration functionality; it might be because I’m using an NVIDIA GPU.) You can see the shell.nix file below:
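For reference, a minimal shell.nix along these lines might look like the sketch below. This is illustrative only, not the file from the post; it assumes the `cupy` attribute from nixpkgs’ `python3Packages` set:

```nix
# Illustrative sketch only -- not the original shell.nix from this post.
# Assumes nixpkgs provides python3Packages.cupy (unfree CUDA dependencies).
{ pkgs ? import <nixpkgs> { config.allowUnfree = true; } }:

pkgs.mkShell {
  packages = [
    (pkgs.python3.withPackages (ps: with ps; [
      numpy
      cupy   # pulls in the CUDA toolchain; may be compiled from source
    ]))
  ];
}
```

If no binary-cache hit exists for this cupy build, `nix-shell` will compile it locally, which is the CPU-heavy step described above.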
CUDA is unfortunately completely unrelated to VA-API (in fact, nvidia-vaapi-driver needs CUDA to function), so nothing in that thread will help here.
Indeed, it seems like Nix is trying to build the cupy package for a device other than my dGPU. However, I’ve been trying to build the package in discrete graphics mode (the iGPU isn’t recognized) to begin with, so it would seem that Nix doesn’t recognize the dGPU. Here’s the output of nvidia-smi during the build process:
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 570.124.04             Driver Version: 570.124.04     CUDA Version: 12.8     |
|-----------------------------------------+------------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id          Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |           Memory-Usage | GPU-Util  Compute M. |
|                                         |                        |               MIG M. |
|=========================================+========================+======================|
|   0  NVIDIA GeForce RTX 4060 ...    Off |   00000000:01:00.0  On |                  N/A |
| N/A   44C    P4             13W /  55W  |    1371MiB /   8188MiB |      0%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+
+-----------------------------------------------------------------------------------------+
| Processes:                                                                              |
|  GPU   GI   CI              PID   Type   Process name                        GPU Memory |
|        ID   ID                                                               Usage      |
|=========================================================================================|
|    0   N/A  N/A            9278      G   ...me-shell-47.4/bin/gnome-shell        785MiB |
|    0   N/A  N/A            9726      G   ...-xwayland-24.1.6/bin/Xwayland          2MiB |
|    0   N/A  N/A           10197      G   ...r-user/nodev/bin/zen-twilight        285MiB |
|    0   N/A  N/A           11673      G   ...Ptr --variations-seed-version         91MiB |
|    0   N/A  N/A           41133      G   .../per-user/nodev/bin/alacritty         24MiB |
|    0   N/A  N/A           43902      G   .../per-user/nodev/bin/alacritty         24MiB |
+-----------------------------------------------------------------------------------------+
I’m unsure how to make nix-shell recognize the dGPU, though. Is there a specific environment variable that needs to be explicitly set beforehand?
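For context, the knobs usually involved here are nixpkgs-level settings rather than environment variables, roughly along these lines. This is a sketch, not a verified config; the attribute names assume current nixpkgs conventions, and `"8.9"` is the compute capability of Ada-generation cards like the RTX 4060:

```nix
# Sketch under stated assumptions -- nixpkgs config, not runtime env vars.
{ pkgs ? import <nixpkgs> {
    config = {
      allowUnfree = true;
      cudaSupport = true;             # build packages with CUDA enabled
      cudaCapabilities = [ "8.9" ];   # restrict nvcc targets to the RTX 4060's arch
    };
  }
}:

pkgs.mkShell {
  packages = [ (pkgs.python3.withPackages (ps: [ ps.cupy ])) ];
}
```

Note that the compilation itself always runs on the CPU inside the Nix build sandbox, which never sees the GPU at build time; `cores` and `max-jobs` in nix.conf can cap the CPU load, and a community binary cache for CUDA packages (if one covers this build) can avoid the local compile entirely.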