I have a ThinkPad T480 with an MX150, running nixos-unstable.
lspci | grep -E 'VGA|3D'
00:02.0 VGA compatible controller: Intel Corporation UHD Graphics 620 (rev 07)
01:00.0 3D controller: NVIDIA Corporation GP108M [GeForce MX150] (rev a1)
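For reference, those bus IDs are what I translated into the prime options in my configuration below (assuming I converted lspci's hex notation correctly):

hardware.nvidia.prime = {
  intelBusId = "PCI:0:2:0";   # from 00:02.0 above
  nvidiaBusId = "PCI:1:0:0";  # from 01:00.0 above
};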
xrandr --listproviders
Providers: number : 2
Provider 0: id: 0x47 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 3 outputs: 5 associated providers: 0 name:modesetting
Provider 1: id: 0xf7 cap: 0x0 crtcs: 0 outputs: 0 associated providers: 0 name:modesetting
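Note that provider 1 shows up as a second modesetting provider with no capabilities; from what I've read, a working offload setup should list an NVIDIA-G0 provider here.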
nvidia-smi
Sat Apr 18 22:59:23 2020
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 440.82 Driver Version: 440.82 CUDA Version: 10.2 |
|-------------------------------+----------------------+----------------------+
| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |
|===============================+======================+======================|
| 0 GeForce MX150 Off | 00000000:01:00.0 Off | N/A |
| N/A 49C P8 N/A / N/A | 0MiB / 2002MiB | 0% Default |
+-------------------------------+----------------------+----------------------+
+-----------------------------------------------------------------------------+
| Processes: GPU Memory |
| GPU PID Type Process name Usage |
|=============================================================================|
| No running processes found |
+-----------------------------------------------------------------------------+
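So the driver itself loads and can talk to the card.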
nix-shell -p nix-info --run "nix-info -m"
- system: `"x86_64-linux"`
- host os: `Linux 5.4.32, NixOS, 20.09pre221706.b61999e4ad6 (Nightingale)`
- multi-user?: `yes`
- sandbox: `yes`
- version: `nix-env (Nix) 2.3.4`
- channels(root): `"nixos-20.09pre221706.b61999e4ad6"`
- channels(edi): `""`
- nixpkgs: `/nix/var/nix/profiles/per-user/root/channels/nixos`
configuration.nix
{ config, pkgs, ... }: {
imports = [ ./hardware-configuration.nix ];
nixpkgs.config.allowUnfree = true;
services = {
xserver = {
enable = true;
exportConfiguration = true;
videoDrivers = [ "modesetting" "nvidia" ];
displayManager.gdm = {
enable = true;
wayland = false;
};
desktopManager.gnome3.enable = true;
};
};
hardware = {
nvidia = {
modesetting.enable = true;
prime = {
offload.enable = true;
intelBusId = "PCI:0:2:0";
nvidiaBusId = "PCI:1:0:0";
};
};
opengl = {
driSupport = true;
driSupport32Bit = true;
};
};
system.stateVersion = "20.09";
}
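I haven't added the nvidia-offload wrapper that the NixOS manual suggests for offload mode; instead I set the same variables by hand below, which I understand should be equivalent. For completeness, the wrapper would look roughly like this (note the manual uses NVIDIA-G0 as the provider name):

environment.systemPackages = [
  (pkgs.writeShellScriptBin "nvidia-offload" ''
    # Environment the NixOS manual recommends for PRIME render offload
    export __NV_PRIME_RENDER_OFFLOAD=1
    export __NV_PRIME_RENDER_OFFLOAD_PROVIDER=NVIDIA-G0
    export __GLX_VENDOR_LIBRARY_NAME=nvidia
    export __VK_LAYER_NV_optimus=NVIDIA_only
    exec "$@"
  '')
];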
glxinfo | grep "OpenGL renderer"
OpenGL renderer string: Mesa Intel(R) UHD Graphics 620 (KBL GT2)
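So the default renderer is the Intel GPU, which I believe is expected with offload mode. Forcing the NVIDIA provider is where it breaks: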
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia __VK_LAYER_NV_optimus=NVIDIA_only __NV_PRIME_RENDER_OFFLOAD_PROVIDER=NVIDIA-G1 glxinfo | grep "OpenGL renderer"
name of display: :1
X Error of failed request: BadValue (integer parameter out of range for operation)
Major opcode of failed request: 152 (GLX)
Minor opcode of failed request: 24 (X_GLXCreateNewContext)
Value in failed request: 0x0
Serial number of failed request: 39
Current serial number in output stream: 40
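In case offload mode itself is the culprit: my understanding is that the sync-mode alternative would replace prime.offload.enable with something like the following, but I'd prefer to get offload working:

hardware.nvidia.prime = {
  sync.enable = true;  # render everything on the NVIDIA GPU instead of offloading
  intelBusId = "PCI:0:2:0";
  nvidiaBusId = "PCI:1:0:0";
};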
I had a working PRIME setup back when I ran Arch Linux, so what am I missing here?