Power Management with NVIDIA GPU

I have a Lenovo IdeaPad laptop with a GTX 1650. I got it to work in offload mode. Now the X11 server is automatically running on the dedicated GPU. I’d like to have more control over that. How can I configure which processes run on which GPU?

{ pkgs, ... }:

let
  nvidia-offload = pkgs.writeShellScriptBin "nvidia-offload" ''
    export __NV_PRIME_RENDER_OFFLOAD=1
    export __NV_PRIME_RENDER_OFFLOAD_PROVIDER=NVIDIA-G0
    export __GLX_VENDOR_LIBRARY_NAME=nvidia
    export __VK_LAYER_NV_optimus=NVIDIA_only
    exec "$@"
  '';
in {
  environment.systemPackages = [ nvidia-offload ];

  services.xserver.videoDrivers = [ "nvidia" ];
  hardware.nvidia.prime = {
    offload.enable = true;

    # Bus ID of the AMD iGPU. You can find it using lspci, either under 3D or VGA
    amdgpuBusId = "PCI:5:00:0";

    # Bus ID of the NVIDIA GPU. You can find it using lspci, either under 3D or VGA
    nvidiaBusId = "PCI:1:00:0";
  };
}

This is my nvidia.nix config file.
Thanks a lot


The configuration you’re showing should run it on the iGPU, I think? At least that’s what offload mode normally does, the idea being that the dGPU is only used for intensive applications that need the extra power.

Any process that has the environment variables from that nvidia-offload script set will be offloaded, so either set them by some other means or use the nvidia-offload script.
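To make that concrete, here is what the wrapper effectively does, reproduced by hand in a shell (same variable values as the nvidia-offload script above):

```shell
# Reproduce by hand what the nvidia-offload wrapper does: export the PRIME
# render-offload variables, then run the target program in that environment.
export __NV_PRIME_RENDER_OFFLOAD=1
export __NV_PRIME_RENDER_OFFLOAD_PROVIDER=NVIDIA-G0
export __GLX_VENDOR_LIBRARY_NAME=nvidia
export __VK_LAYER_NV_optimus=NVIDIA_only

# Any program started from this shell now inherits the variables and will be
# rendered on the dGPU, e.g.:
#   glxgears
env | grep '^__NV_PRIME_RENDER_OFFLOAD='   # prints __NV_PRIME_RENDER_OFFLOAD=1
```

Conversely, a process launched without these variables (the default) stays on the iGPU, which is what makes it possible to keep the dGPU idle.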

What I’m looking for is a way to turn the NVIDIA card completely off when I don’t need it, e.g. when I am running on battery.
And I know that the Xorg server is running on the dGPU because it shows up in nvidia-smi

+-----------------------------------------------------------------------------+
| NVIDIA-SMI 520.56.06    Driver Version: 520.56.06    CUDA Version: 11.8     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA GeForce ...  Off  | 00000000:01:00.0 Off |                  N/A |
| N/A   37C    P8     1W /  N/A |      5MiB /  4096MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A      1168      G   ...xorg-server-1.20.14/bin/X        4MiB |
+-----------------------------------------------------------------------------+

There are two options available for PM:

*        hardware.nvidia.powerManagement.enable
             Experimental power management through systemd. For more information, see the NVIDIA docs, on
             Chapter 21. Configuring Power Management Support.

             Type: boolean

             Default: false

             Declared by:
                 <nixpkgs/nixos/modules/hardware/video/nvidia.nix>

*        hardware.nvidia.powerManagement.finegrained
             Experimental power management of PRIME offload. For more information, see the NVIDIA docs, chapter
             22. PCI-Express runtime power management.

             Type: boolean

             Default: false

             Declared by:
                 <nixpkgs/nixos/modules/hardware/video/nvidia.nix>

However, the finegrained option doesn’t work with older cards. It does do what you want, though, so you’ll have to figure out whether it works with your card.
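For reference, enabling both in the configuration above would look roughly like this (a sketch, not tested on this machine; note that finegrained also requires PRIME offload mode, which the config at the top already enables, and a Turing-or-newer card, which the GTX 1650 is):

```nix
hardware.nvidia.powerManagement = {
  enable = true;       # systemd-based suspend/resume support
  finegrained = true;  # runtime power-down of the idle dGPU (Turing or newer)
};
```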


Yep, for that to work you want your iGPU to run the X server. Then, when on battery, you simply don’t launch other applications with the dGPU, and you can - in theory - run with the dGPU entirely off.

In practice this is only supported on newer GPUs, though I think the 1650 is supported. You can read more about it in the nvidia powermanagement docs. I think the settings @nrdxp suggests set the powermanagement bits on the nvidia device to enable this.

If your iGPU is not running the X server, I’m pretty sure something has gone horribly wrong, that entirely defies the purpose of offload rendering.

Edit: Sorry, to be clear here, the nvidia GPU should turn off automatically if you’re not running anything on it if - and only if - the fine-grained power management is turned on. That is, right now, X is preventing your GPU from shutting down.

So, once your X server is fixed and runs on the right GPU, if you want to keep the nvidia GPU off, just don’t ask it to render anything by making sure no processes have those environment variables.
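One way to check whether the card really is asleep is the kernel’s runtime-PM state in sysfs; a small sketch (the helper name is made up here, and 0000:01:00.0 corresponds to the nvidiaBusId PCI:1:00:0 from the config at the top):

```shell
# Report the kernel runtime power-management state of a PCI device.
# Prints "suspended" when the device is powered down, "active" when it is
# awake, or "unknown" on machines where the sysfs node does not exist.
gpu_pm_status() {
  local node="/sys/bus/pci/devices/$1/power/runtime_status"
  if [ -r "$node" ]; then
    cat "$node"
  else
    echo "unknown"
  fi
}

gpu_pm_status 0000:01:00.0
```

With fine-grained power management working and no offloaded processes running, this should eventually report suspended; as long as X (or anything else) holds the GPU, it stays active.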

To use that, do I have to add the module file into my /etc/nixos/ dir and include it in my configuration.nix?

No, it is included in the list of default modules so you just have to set it.

I added that to the config, rebuilt, and booted without problems, but the nvidia-smi output is still the same as above: it shows that X is running on the dGPU.

Here’s my nvidia-smi with X definitely running on the Nvidia card, since that’s the only GPU in my system:

+-----------------------------------------------------------------------------+
| NVIDIA-SMI 470.161.03   Driver Version: 470.161.03   CUDA Version: 11.4     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA GeForce ...  Off  | 00000000:01:00.0 N/A |                  N/A |
| N/A   24C    P0    N/A /  N/A |    343MiB /  2001MiB |     N/A      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|  No running processes found                                                 |
+-----------------------------------------------------------------------------+

No mention of X as a running process, and note higher memory usage than yours is showing.

I’m not sure how to interpret the difference.

That doesn’t explain my case.
I just played around with some other ways that are mentioned in the wiki and such, but nothing really worked. I will continue reading the docs tomorrow (it is getting late for me).
Even if no processes are on the dGPU, it is still on: I can measure activity and a major difference in power consumption.

That’s… odd! I’m also on a just-nvidia system, albeit with wayland/xwayland, and using alacritty which also directly uses the GPU:

Tue May  9 00:04:54 2023
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 530.41.03              Driver Version: 530.41.03    CUDA Version: 12.1     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                  Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf            Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                                         |                      |               MIG M. |
|=========================================+======================+======================|
|   0  NVIDIA GeForce RTX 2070 S...    Off| 00000000:07:00.0  On |                  N/A |
| 39%   32C    P8               20W / 215W|    186MiB /  8192MiB |      5%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+

+---------------------------------------------------------------------------------------+
| Processes:                                                                            |
|  GPU   GI   CI        PID   Type   Process name                            GPU Memory |
|        ID   ID                                                             Usage      |
|=======================================================================================|
|    0   N/A  N/A      2544      G   Hyprland                                     89MiB |
|    0   N/A  N/A      2598      G   Xwayland                                     14MiB |
|    0   N/A  N/A     65273      G   alacritty                                    49MiB |
+---------------------------------------------------------------------------------------+

I’m curious whether you’re actually rendering with the nvidia driver. glxinfo -B can give more info on what is driving OpenGL; if you’re accidentally running nouveau, that might explain it. (I do notice your driver version looks pretty old, and you’re not getting any fan readout, so something seems fishy... nah, ignore that: it’s a legacy driver, and @lovirent lacks a fan readout too.) I’m guessing you’re on a laptop if it lacks a dedicated fan; are you sure it’s your only GPU?

^ That might also be useful for @lovirent, though I don’t doubt it will show nvidia. The question is why the iGPU isn’t used.

name of display: :0
display: :0  screen: 0
direct rendering: Yes
Extended renderer info (GLX_MESA_query_renderer):
    Vendor: AMD (0x1002)
    Device: RENOIR (renoir, LLVM 14.0.6, DRM 3.42, 5.15.109) (0x1636)
    Version: 22.2.5
    Accelerated: yes
    Video memory: 512MB
    Unified memory: no
    Preferred profile: core (0x1)
    Max core profile version: 4.6
    Max compat profile version: 4.6
    Max GLES1 profile version: 1.1
    Max GLES[23] profile version: 3.2
Memory info (GL_ATI_meminfo):
    VBO free memory - total: 153 MB, largest block: 153 MB
    VBO free aux. memory - total: 3006 MB, largest block: 3006 MB
    Texture free memory - total: 153 MB, largest block: 153 MB
    Texture free aux. memory - total: 3006 MB, largest block: 3006 MB
    Renderbuffer free memory - total: 153 MB, largest block: 153 MB
    Renderbuffer free aux. memory - total: 3006 MB, largest block: 3006 MB
Memory info (GL_NVX_gpu_memory_info):
    Dedicated video memory: 512 MB
    Total available memory: 3584 MB
    Currently available dedicated video memory: 153 MB
OpenGL vendor string: AMD
OpenGL renderer string: RENOIR (renoir, LLVM 14.0.6, DRM 3.42, 5.15.109)
OpenGL core profile version string: 4.6 (Core Profile) Mesa 22.2.5
OpenGL core profile shading language version string: 4.60
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile

OpenGL version string: 4.6 (Compatibility Profile) Mesa 22.2.5
OpenGL shading language version string: 4.60
OpenGL context flags: (none)
OpenGL profile mask: compatibility profile

OpenGL ES profile version string: OpenGL ES 3.2 Mesa 22.2.5
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20

How can I control the driver version? You can see my driver config up at the top.

And yes, I am sure that I have an iGPU and a dGPU.
I’ve used them both in my previous setup, and they both show up in neofetch, for example.
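As for controlling the driver version: on NixOS the branch is selected declaratively via hardware.nvidia.package; a hedged sketch (the attribute names below exist in current nixpkgs, but check the module docs for your channel):

```nix
{ config, ... }:

{
  # Select the driver branch; other attributes include .stable, .beta,
  # and .legacy_470. Defaults to the stable branch if unset.
  hardware.nvidia.package = config.boot.kernelPackages.nvidiaPackages.production;
}
```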

Yep, your output looks fine, that was directed to @ahen since their nvidia-smi isn’t showing anything. Your driver version is only a bit behind mine, you’ve presumably just not updated as recently.

That’s interesting, that probably means that your processes will run on the AMD GPU. Any chance you could run nvtop while running something that uses the GPU besides X? Maybe even two applications, with one intentionally on the nvidia GPU.

glxgears from mesa-demos is a good test.

If both of them are in fact used, and X renders on the dGPU, I’m not sure what’s wrong. Do you use multiple monitors?


As you can see, everything is on the iGPU except the X server.

That’s… Odd. I’m personally at a bit of a loss here, though I haven’t used prime personally, so maybe someone who has it working properly could chime in.

Just out of curiosity and to make helping easier for others, a bit more information might be useful:

  1. Can you confirm you can actually deliberately offload something with e.g. nvidia-offload glxgears?
  2. Can you share your X server logs? They will contain some info on drivers as well. You’ll typically find them in /var/log/, or if you’ve enabled logToJournal you can get them with journalctl --boot -t xsession

I’ve heard that sometimes laptops with multi-GPU setups will run things on an external monitor through the dGPU, since the port runs off it. This is not the problem in your case, is it?

glxgears works as expected. I can see the gears spinning at 60 fps (monitor refresh rate).
Here is my X11 log: (not all of it, tell me if you need more)

[    14.124] 
X.Org X Server 1.20.14
X Protocol Version 11, Revision 0
[    14.124] Build Operating System: Nix 
[    14.124] Current Operating System: Linux compooter 5.15.109 #1-NixOS SMP Wed Apr 26 11:51:56 UTC 2023 x86_64
[    14.124] Kernel command line: initrd=\efi\nixos\z9yxmnmdblcylza8kna4i8r5zbazw3rf-initrd-linux-5.15.109-initrd.efi init=/nix/store/ay25r3ymlhyd6zivhxwq5g56z8iwgyx6-nixos-system-compooter-22.11.4037.cc45a3f8c98/init loglevel=4 nvidia-drm.modeset=1 nvidia.NVreg_PreserveVideoMemoryAllocations=1
[    14.124] Build Date: 15 December 2021  07:01:53PM
[    14.124]  
[    14.124] Current version of pixman: 0.42.2
[    14.124] 	Before reporting problems, check http://wiki.x.org
	to make sure that you have the latest version.
[    14.124] Markers: (--) probed, (**) from config file, (==) default setting,
	(++) from command line, (!!) notice, (II) informational,
	(WW) warning, (EE) error, (NI) not implemented, (??) unknown.
[    14.124] (++) Log file: "/var/log/X.0.log", Time: Thu May 11 16:12:31 2023
[    14.126] (++) Using config file: "/nix/store/qvz3sywrkbm3rzmlkl88qmjv06jf6ab9-xserver.conf"
[    14.126] (==) Using config directory: "/etc/X11/xorg.conf.d"
[    14.126] (==) Using system config directory "/nix/store/sf8jb4zb1ac75c2y5w1bbk2xgq9hl503-xorg-server-1.20.14/share/X11/xorg.conf.d"
[    14.127] (==) ServerLayout "Layout[all]"
[    14.127] (**) |-->Screen "Screen-amdgpu[0]" (0)
[    14.127] (**) |   |-->Monitor "<default monitor>"
[    14.127] (**) |   |-->Device "Device-amdgpu[0]"
[    14.127] (**) |   |-->GPUDevice "Device-nvidia[0]"
[    14.127] (==) No monitor specified for screen "Screen-amdgpu[0]".
	Using a default monitor configuration.
[    14.127] (**) Option "DontZap" "on"
[    14.127] (**) Option "AllowMouseOpenFail" "on"
[    14.127] (==) Automatically adding devices
[    14.127] (==) Automatically enabling devices
[    14.127] (==) Automatically adding GPU devices
[    14.127] (==) Max clients allowed: 256, resource mask: 0x1fffff
[    14.131] (**) FontPath set to:
	/nix/store/kpqvrm9svih2qm6kwxi448m446xzwcdd-unifont-15.0.01/share/fonts,
	/nix/store/xwfvxlrbihm6w28hgc1r48f5yaivx5a6-font-cursor-misc-1.0.3/lib/X11/fonts/misc,
	/nix/store/zl866jqf27w5g0ihpfk9fvbjg5fpicxj-font-misc-misc-1.1.2/lib/X11/fonts/misc,
	/nix/store/aa6k95giq9qhfv8zaamjnkjyx0xb3zv5-font-adobe-100dpi-1.0.3/lib/X11/fonts/100dpi,
	/nix/store/fn7b9xhqc9x4f7f7nhv7lx66pkg7iwyj-font-adobe-75dpi-1.0.3/lib/X11/fonts/75dpi
[    14.131] (**) ModulePath set to "/nix/store/6jiixjif9rcz4gyvjbyaqfnp5m5dbb1n-xf86-video-amdgpu-21.0.0/lib/xorg/modules/drivers,/nix/store/ysdm9z04jndf2drjvqpa1vz7qzsmkq50-nvidia-x11-520.56.06-5.15.109-bin/lib/xorg/modules/extensions,/nix/store/ysdm9z04jndf2drjvqpa1vz7qzsmkq50-nvidia-x11-520.56.06-5.15.109-bin/lib/xorg/modules/drivers,/nix/store/sf8jb4zb1ac75c2y5w1bbk2xgq9hl503-xorg-server-1.20.14/lib/xorg/modules,/nix/store/sf8jb4zb1ac75c2y5w1bbk2xgq9hl503-xorg-server-1.20.14/lib/xorg/modules/extensions,/nix/store/sf8jb4zb1ac75c2y5w1bbk2xgq9hl503-xorg-server-1.20.14/lib/xorg/modules/drivers,/nix/store/7hdjf8sqab7gbwnj71gxixzv6w5656vn-xf86-input-evdev-2.10.6/lib/xorg/modules/input,/nix/store/0v4q06gri0188w6yrzwp7cxphvmj5hd1-xf86-input-libinput-1.2.0/lib/xorg/modules/input"
[    14.131] (II) The server relies on udev to provide the list of input devices.
	If no devices become available, reconfigure udev or disable AutoAddDevices.
[    14.131] (II) Loader magic: 0x62bd40
[    14.131] (II) Module ABI versions:
[    14.131] 	X.Org ANSI C Emulation: 0.4
[    14.131] 	X.Org Video Driver: 24.1
[    14.131] 	X.Org XInput driver : 24.1
[    14.131] 	X.Org Server Extension : 10.0
[    14.132] (++) using VT number 7

[    14.132] (II) systemd-logind: logind integration requires -keeptty and -keeptty was not provided, disabling logind integration
[    14.133] (II) xfree86: Adding drm device (/dev/dri/card0)
[    14.150] (II) xfree86: Adding drm device (/dev/dri/card1)
[    14.164] (--) PCI: (1@0:0:0) 10de:1f99:17aa:3a45 rev 161, Mem @ 0xd0000000/16777216, 0xfcc0000000/268435456, 0xfcd0000000/33554432, I/O @ 0x00003000/128, BIOS @ 0x????????/524288
[    14.164] (--) PCI:*(5@0:0:0) 1002:1636:17aa:3a45 rev 199, Mem @ 0xfce0000000/268435456, 0xfcf0000000/2097152, 0xd1700000/524288, I/O @ 0x00001000/256
[    14.164] (II) Open ACPI successful (/var/run/acpid.socket)
[    14.164] (II) "glx" will be loaded by default.
[    14.164] (II) LoadModule: "glx"
[    14.166] (II) Loading /nix/store/sf8jb4zb1ac75c2y5w1bbk2xgq9hl503-xorg-server-1.20.14/lib/xorg/modules/extensions/libglx.so
[    14.168] (II) Module glx: vendor="X.Org Foundation"
[    14.168] 	compiled for 1.20.14, module version = 1.0.0
[    14.168] 	ABI class: X.Org Server Extension, version 10.0
[    14.168] (II) LoadModule: "amdgpu"
[    14.168] (II) Loading /nix/store/6jiixjif9rcz4gyvjbyaqfnp5m5dbb1n-xf86-video-amdgpu-21.0.0/lib/xorg/modules/drivers/amdgpu_drv.so
[    14.172] (II) Module amdgpu: vendor="X.Org Foundation"
[    14.172] 	compiled for 1.20.14, module version = 21.0.0
[    14.172] 	Module class: X.Org Video Driver
[    14.172] 	ABI class: X.Org Video Driver, version 24.1
[    14.172] (II) LoadModule: "nvidia"
[    14.172] (II) Loading /nix/store/ysdm9z04jndf2drjvqpa1vz7qzsmkq50-nvidia-x11-520.56.06-5.15.109-bin/lib/xorg/modules/drivers/nvidia_drv.so
[    14.177] (II) Module nvidia: vendor="NVIDIA Corporation"
[    14.177] 	compiled for 1.6.99.901, module version = 1.0.0
[    14.177] 	Module class: X.Org Video Driver
[    14.178] (II) AMDGPU: Driver for AMD Radeon:
	All GPUs supported by the amdgpu kernel driver
[    14.178] (II) NVIDIA dlloader X Driver  520.56.06  Thu Oct  6 21:29:26 UTC 2022
[    14.178] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
[    14.201] (II) Loading sub module "fb"
[    14.201] (II) LoadModule: "fb"
[    14.201] (II) Loading /nix/store/sf8jb4zb1ac75c2y5w1bbk2xgq9hl503-xorg-server-1.20.14/lib/xorg/modules/libfb.so
[    14.202] (II) Module fb: vendor="X.Org Foundation"
[    14.202] 	compiled for 1.20.14, module version = 1.0.0
[    14.202] 	ABI class: X.Org ANSI C Emulation, version 0.4
[    14.202] (II) Loading sub module "wfb"
[    14.202] (II) LoadModule: "wfb"
[    14.202] (II) Loading /nix/store/sf8jb4zb1ac75c2y5w1bbk2xgq9hl503-xorg-server-1.20.14/lib/xorg/modules/libwfb.so
[    14.203] (II) Module wfb: vendor="X.Org Foundation"
[    14.203] 	compiled for 1.20.14, module version = 1.0.0
[    14.203] 	ABI class: X.Org ANSI C Emulation, version 0.4
[    14.203] (II) Loading sub module "ramdac"
[    14.203] (II) LoadModule: "ramdac"
[    14.203] (II) Module "ramdac" already built-in
[    14.204] (II) AMDGPU(0): Creating default Display subsection in Screen section
	"Screen-amdgpu[0]" for depth/fbbpp 24/32
[    14.204] (==) AMDGPU(0): Depth 24, (--) framebuffer bpp 32
[    14.204] (II) AMDGPU(0): Pixel depth = 24 bits stored in 4 bytes (32 bpp pixmaps)
[    14.204] (==) AMDGPU(0): Default visual is TrueColor
[    14.204] (==) AMDGPU(0): RGB weight 888
[    14.204] (II) AMDGPU(0): Using 8 bits per RGB (8 bit DAC)
[    14.204] (--) AMDGPU(0): Chipset: "Unknown AMD Radeon GPU" (ChipID = 0x1636)
[    14.204] (II) Loading sub module "fb"
[    14.204] (II) LoadModule: "fb"
[    14.204] (II) Loading /nix/store/sf8jb4zb1ac75c2y5w1bbk2xgq9hl503-xorg-server-1.20.14/lib/xorg/modules/libfb.so
[    14.204] (II) Module fb: vendor="X.Org Foundation"
[    14.204] 	compiled for 1.20.14, module version = 1.0.0
[    14.204] 	ABI class: X.Org ANSI C Emulation, version 0.4
[    14.204] (II) Loading sub module "dri2"
[    14.204] (II) LoadModule: "dri2"
[    14.204] (II) Module "dri2" already built-in
[    14.425] (II) Loading sub module "glamoregl"
[    14.425] (II) LoadModule: "glamoregl"
[    14.425] (II) Loading /nix/store/sf8jb4zb1ac75c2y5w1bbk2xgq9hl503-xorg-server-1.20.14/lib/xorg/modules/libglamoregl.so
[    14.431] (II) Module glamoregl: vendor="X.Org Foundation"
[    14.431] 	compiled for 1.20.14, module version = 1.0.1
[    14.431] 	ABI class: X.Org ANSI C Emulation, version 0.4
[    14.485] (II) AMDGPU(0): glamor X acceleration enabled on RENOIR (renoir, LLVM 14.0.6, DRM 3.42, 5.15.109)
[    14.485] (II) AMDGPU(0): glamor detected, initialising EGL layer.
[    14.485] (==) AMDGPU(0): TearFree property default: auto
[    14.485] (==) AMDGPU(0): VariableRefresh: disabled
[    14.485] (II) AMDGPU(0): KMS Pageflipping: enabled
[    14.486] (II) AMDGPU(0): Output eDP has no monitor section
[    14.496] (II) AMDGPU(0): EDID for output eDP
[    14.496] (II) AMDGPU(0): Manufacturer: CMN  Model: 15e7  Serial#: 0
[    14.496] (II) AMDGPU(0): Year: 2020  Week: 44
[    14.496] (II) AMDGPU(0): EDID Version: 1.4
[    14.496] (II) AMDGPU(0): Digital Display Input
[    14.496] (II) AMDGPU(0): 8 bits per channel
[    14.496] (II) AMDGPU(0): Digital interface is DisplayPort
[    14.496] (II) AMDGPU(0): Max Image Size [cm]: horiz.: 34  vert.: 19
[    14.496] (II) AMDGPU(0): Gamma: 2.20
[    14.496] (II) AMDGPU(0): No DPMS capabilities specified
[    14.496] (II) AMDGPU(0): Supported color encodings: RGB 4:4:4 
[    14.496] (II) AMDGPU(0): First detailed timing is preferred mode
[    14.496] (II) AMDGPU(0): Preferred mode is native pixel format and refresh rate
[    14.496] (II) AMDGPU(0): redX: 0.590 redY: 0.350   greenX: 0.330 greenY: 0.555
[    14.496] (II) AMDGPU(0): blueX: 0.153 blueY: 0.119   whiteX: 0.313 whiteY: 0.329
[    14.496] (II) AMDGPU(0): Manufacturer's mask: 0
[    14.496] (II) AMDGPU(0): Supported detailed timing:
[    14.496] (II) AMDGPU(0): clock: 138.8 MHz   Image Size:  344 x 193 mm
[    14.496] (II) AMDGPU(0): h_active: 1920  h_sync: 1968  h_sync_end 2000 h_blank_end 2080 h_border: 0
[    14.496] (II) AMDGPU(0): v_active: 1080  v_sync: 1083  v_sync_end 1088 v_blanking: 1112 v_border: 0
[    14.496] (II) AMDGPU(0):  N156HCA-EAB
[    14.496] (II) AMDGPU(0):  CMN
[    14.496] (II) AMDGPU(0):  N156HCA-EAB
[    14.496] (II) AMDGPU(0): EDID (in hex):
[    14.496] (II) AMDGPU(0): 	00ffffffffffff000daee71500000000
[    14.496] (II) AMDGPU(0): 	2c1e0104a52213780228659759548e27
[    14.496] (II) AMDGPU(0): 	1e505400000001010101010101010101
[    14.496] (II) AMDGPU(0): 	010101010101363680a0703820403020
[    14.496] (II) AMDGPU(0): 	350058c110000018000000fe004e3135
[    14.496] (II) AMDGPU(0): 	364843412d4541420a20000000fe0043
[    14.496] (II) AMDGPU(0): 	4d4e0a202020202020202020000000fe
[    14.496] (II) AMDGPU(0): 	004e3135364843412d4541420a20004e
[    14.496] (II) AMDGPU(0): Printing probed modes for output eDP
[    14.496] (II) AMDGPU(0): Modeline "1920x1080"x60.0  138.78  1920 1968 2000 2080  1080 1083 1088 1112 -hsync -vsync (66.7 kHz eP)
[    14.496] (II) AMDGPU(0): Modeline "1680x1050"x60.0  138.78  1680 1968 2000 2080  1050 1083 1088 1112 -hsync -vsync (66.7 kHz e)
[    14.496] (II) AMDGPU(0): Modeline "1280x1024"x60.0  138.78  1280 1968 2000 2080  1024 1083 1088 1112 -hsync -vsync (66.7 kHz e)
[    14.496] (II) AMDGPU(0): Modeline "1440x900"x60.0  138.78  1440 1968 2000 2080  900 1083 1088 1112 -hsync -vsync (66.7 kHz e)
[    14.496] (II) AMDGPU(0): Modeline "1280x800"x60.0  138.78  1280 1968 2000 2080  800 1083 1088 1112 -hsync -vsync (66.7 kHz e)
[    14.496] (II) AMDGPU(0): Modeline "1280x720"x60.0  138.78  1280 1968 2000 2080  720 1083 1088 1112 -hsync -vsync (66.7 kHz e)
[    14.496] (II) AMDGPU(0): Modeline "1024x768"x60.0  138.78  1024 1968 2000 2080  768 1083 1088 1112 -hsync -vsync (66.7 kHz e)
[    14.496] (II) AMDGPU(0): Modeline "800x600"x60.0  138.78  800 1968 2000 2080  600 1083 1088 1112 -hsync -vsync (66.7 kHz e)
[    14.496] (II) AMDGPU(0): Modeline "640x480"x60.0  138.78  640 1968 2000 2080  480 1083 1088 1112 -hsync -vsync (66.7 kHz e)
[    14.496] (II) AMDGPU(0): Output eDP connected
[    14.496] (II) AMDGPU(0): Using exact sizes for initial modes
[    14.496] (II) AMDGPU(0): Output eDP using initial mode 1920x1080 +0+0
[    14.496] (II) AMDGPU(0): mem size init: gart size :bf6ea000 vram size: s:1df2d000 visible:1df2d000
[    14.496] (==) AMDGPU(0): DPI set to (96, 96)
[    14.496] (==) AMDGPU(0): Using gamma correction (1.0, 1.0, 1.0)
[    14.496] (II) Loading sub module "ramdac"
[    14.496] (II) LoadModule: "ramdac"
[    14.496] (II) Module "ramdac" already built-in
[    14.497] (==) NVIDIA(G0): Depth 24, (==) framebuffer bpp 32
[    14.497] (==) NVIDIA(G0): RGB weight 888
[    14.497] (==) NVIDIA(G0): Default visual is TrueColor
[    14.497] (==) NVIDIA(G0): Using gamma correction (1.0, 1.0, 1.0)
[    14.497] (**) Option "AllowNVIDIAGpuScreens"
[    14.497] (**) NVIDIA(G0): Enabling 2D acceleration
[    14.497] (II) Loading sub module "glxserver_nvidia"
[    14.497] (II) LoadModule: "glxserver_nvidia"
[    14.497] (II) Loading /nix/store/ysdm9z04jndf2drjvqpa1vz7qzsmkq50-nvidia-x11-520.56.06-5.15.109-bin/lib/xorg/modules/extensions/libglxserver_nvidia.so
[    14.538] (II) Module glxserver_nvidia: vendor="NVIDIA Corporation"
[    14.538] 	compiled for 1.6.99.901, module version = 1.0.0
[    14.538] 	Module class: X.Org Server Extension
[    14.538] (II) NVIDIA GLX Module  520.56.06  Thu Oct  6 21:26:26 UTC 2022
[    14.539] (II) NVIDIA: The X server supports PRIME Render Offload.
[    14.539] (--) NVIDIA(0): Valid display device(s) on GPU-0 at PCI:1:0:0
[    14.539] (--) NVIDIA(0):     DFP-0
[    14.540] (II) NVIDIA(G0): NVIDIA GPU NVIDIA GeForce GTX 1650 (TU117-A) at PCI:1:0:0
[    14.540] (II) NVIDIA(G0):     (GPU-0)
[    14.540] (--) NVIDIA(G0): Memory: 4194304 kBytes
[    14.540] (--) NVIDIA(G0): VideoBIOS: 90.17.72.00.50
[    14.540] (II) NVIDIA(G0): Detected PCI Express Link width: 16X
[    14.541] (--) NVIDIA(GPU-0): DFP-0: disconnected
[    14.541] (--) NVIDIA(GPU-0): DFP-0: Internal TMDS
[    14.541] (--) NVIDIA(GPU-0): DFP-0: 165.0 MHz maximum pixel clock
[    14.541] (--) NVIDIA(GPU-0): 
[    14.541] (II) NVIDIA(G0): Validated MetaModes:
[    14.541] (II) NVIDIA(G0):     "NULL"
[    14.541] (II) NVIDIA(G0): Virtual screen size determined to be 640 x 480
[    14.541] (WW) NVIDIA(G0): Unable to get display device for DPI computation.
[    14.541] (==) NVIDIA(G0): DPI set to (75, 75); computed from built-in default
[    14.542] (II) AMDGPU(0): [DRI2] Setup complete
[    14.542] (II) AMDGPU(0): [DRI2]   DRI driver: radeonsi
[    14.542] (II) AMDGPU(0): [DRI2]   VDPAU driver: radeonsi
[    14.593] (II) AMDGPU(0): Front buffer pitch: 7680 bytes
[    14.593] (II) AMDGPU(0): SYNC extension fences enabled
[    14.594] (II) AMDGPU(0): Present extension enabled
[    14.594] (==) AMDGPU(0): DRI3 enabled
[    14.594] (==) AMDGPU(0): Backing store enabled
[    14.594] (II) AMDGPU(0): Direct rendering enabled
[    14.604] (II) AMDGPU(0): Use GLAMOR acceleration.
[    14.604] (II) AMDGPU(0): Acceleration enabled
[    14.604] (==) AMDGPU(0): DPMS enabled
[    14.604] (==) AMDGPU(0): Silken mouse enabled
[    14.605] (II) AMDGPU(0): Set up textured video (glamor)
[    14.612] (II) NVIDIA: Reserving 24576.00 MB of virtual memory for indirect memory
[    14.612] (II) NVIDIA:     access.
[    14.634] (II) NVIDIA(G0): Setting mode "NULL"
[    14.642] (==) NVIDIA(G0): Disabling shared memory pixmaps
[    14.642] (==) NVIDIA(G0): Backing store enabled
[    14.642] (==) NVIDIA(G0): Silken mouse enabled
[    14.642] (==) NVIDIA(G0): DPMS enabled
[    14.643] (II) Loading sub module "dri2"
[    14.643] (II) LoadModule: "dri2"
[    14.643] (II) Module "dri2" already built-in
[    14.643] (II) NVIDIA(G0): [DRI2] Setup complete
[    14.643] (II) NVIDIA(G0): [DRI2]   VDPAU driver: nvidia
[    14.643] (II) Initializing extension Generic Event Extension
[    14.643] (II) Initializing extension SHAPE
[    14.643] (II) Initializing extension MIT-SHM
[    14.643] (II) Initializing extension XInputExtension
[    14.643] (II) Initializing extension XTEST
[    14.643] (II) Initializing extension BIG-REQUESTS
[    14.644] (II) Initializing extension SYNC
[    14.644] (II) Initializing extension XKEYBOARD
[    14.644] (II) Initializing extension XC-MISC
[    14.644] (II) Initializing extension SECURITY
[    14.644] (II) Initializing extension XFIXES
[    14.644] (II) Initializing extension RENDER
[    14.644] (II) Initializing extension RANDR
[    14.644] (II) Initializing extension COMPOSITE
[    14.644] (II) Initializing extension DAMAGE
[    14.644] (II) Initializing extension MIT-SCREEN-SAVER
[    14.644] (II) Initializing extension DOUBLE-BUFFER
[    14.644] (II) Initializing extension RECORD
[    14.644] (II) Initializing extension DPMS
[    14.644] (II) Initializing extension Present
[    14.645] (II) Initializing extension DRI3
[    14.645] (II) Initializing extension X-Resource
[    14.645] (II) Initializing extension XVideo
[    14.645] (II) Initializing extension XVideo-MotionCompensation
[    14.645] (II) Initializing extension GLX
[    14.645] (II) Initializing extension GLX
[    14.645] (II) Indirect GLX disabled.
[    14.649] (II) AIGLX: Loaded and initialized radeonsi
[    14.649] (II) GLX: Initialized DRI2 GL provider for screen 0
[    14.649] (II) Initializing extension XFree86-VidModeExtension
[    14.650] (II) Initializing extension XFree86-DGA
[    14.650] (II) Initializing extension XFree86-DRI
[    14.650] (II) Initializing extension DRI2
[    14.650] (II) Initializing extension NV-GLX
[    14.650] (II) Initializing extension NV-CONTROL
[    14.650] (II) AMDGPU(0): Setting screen physical size to 508 x 285
[    14.728] (II) config/udev: Adding input device Power Button (/dev/input/event4)
[    14.728] (**) Power Button: Applying InputClass "Keyboard catchall"
[    14.728] (**) Power Button: Applying InputClass "evdev keyboard catchall"
[    14.728] (**) Power Button: Applying InputClass "libinput keyboard catchall"
[    14.728] (II) LoadModule: "libinput"
[    14.730] (II) Loading /nix/store/0v4q06gri0188w6yrzwp7cxphvmj5hd1-xf86-input-libinput-1.2.0/lib/xorg/modules/input/libinput_drv.so
[    14.734] (II) Module libinput: vendor="X.Org Foundation"
[    14.734] 	compiled for 1.20.14, module version = 1.2.0
[    14.734] 	Module class: X.Org XInput Driver
[    14.734] 	ABI class: X.Org XInput driver, version 24.1

I only have one monitor, if that is what you mean.