Getting NVIDIA to work, avoiding screen tearing

Hey, folks!

I’ve got some problems with screen tearing and NVIDIA and was hoping I
could find some help here.

This is concerning a Dell XPS 9570 with a GeForce GTX 1050 Ti card.

In short: I recently started experiencing screen tearing again after
having not had it for about a year. It (probably by coincidence)
seemed to show back up after I started Slack for the first time in
ages. I took trying to fix this as an opportunity to see whether the
NVIDIA drivers would work for my laptop yet (they didn’t previously),
and they did. However, the tearing persists, and I suspect that’s
because I don’t know how to address it properly.

I’ve followed the NixOS Wiki’s Nvidia entry and had a look around
forum threads and the Arch wiki, but I feel like I’m still missing
some important fundamental knowledge. I have approximately zero
knowledge of how this works or how to test it, so please bear with me.

I think the crux of this is that I don’t know how to activate the
card or how to tell whether it’s being used. nvtop lists only a
single process (the X server), and I can’t seem to get more processes
to show up there. When I run things through the nvidia-offload
script, however, glxinfo at least shows that it’s using the GPU.

Following the Arch wiki on avoiding screen tearing, I tried running

$ nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"

But whatever I do with nvidia-settings, I either get completely blank
output (when I’m trying to query things) or, when I assign things
(as in the command above), all I get is:

ERROR: Error resolving target specification '' (No targets match target specification), specified in assignment 'CurrentMetaMode=nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }'.

I have tried searching the internet for more information on this, but
can’t seem to find out what this means.

I also seem to be missing some UI elements from the nvidia-settings
GUI application. Both the Arch wiki and other sources reference items
(such as an “Advanced” button) that I can’t find.

Here is some system info (based on what was listed in this question):

$ nvidia-smi

Wed Dec  9 09:02:39 2020       
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 455.38       Driver Version: 455.38       CUDA Version: 11.1     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  GeForce GTX 105...  Off  | 00000000:01:00.0 Off |                  N/A |
| N/A   38C    P8    N/A /  N/A |      4MiB /  4042MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A      1138      G   ...-xorg-server-1.20.8/bin/X        4MiB |
+-----------------------------------------------------------------------------+
$ lspci -k | grep -E "(VGA|3D)"
00:02.0 VGA compatible controller: Intel Corporation UHD Graphics 630 (Mobile)
01:00.0 3D controller: NVIDIA Corporation GP107M [GeForce GTX 1050 Ti Mobile] (rev a1)
$ xrandr --listproviders
Providers: number : 2
Provider 0: id: 0x46 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 3 outputs: 4 associated providers: 0 name:modesetting
Provider 1: id: 0x26d cap: 0x0 crtcs: 0 outputs: 0 associated providers: 0 name:NVIDIA-G0
$ nix-shell -p nix-info --run "nix-info -m"
 - system: `"x86_64-linux"`
 - host os: `Linux 5.9.12, NixOS, 20.09.2152.e34208e1003 (Nightingale)`
 - multi-user?: `yes`
 - sandbox: `yes`
 - version: `nix-env (Nix) 2.3.9`
 - channels(thomas): `"dhall, nixpkgs-21.03pre251123.b839d4a8557"`
 - channels(root): `"nixos-20.09.2152.e34208e1003, nixpkgs-21.03pre251181.dd1b7e377f6"`
 - nixpkgs: `/home/thomas/.nix-defexpr/channels/nixpkgs`
$ glxinfo | grep "OpenGL renderer"
OpenGL renderer string: Mesa Intel(R) UHD Graphics 630 (CFL GT2)

$ nvidia-offload glxinfo | grep "OpenGL renderer"
OpenGL renderer string: GeForce GTX 1050 Ti with Max-Q Design/PCIe/SSE2

configuration.nix (relevant sections)

{ config, pkgs, ... }:

let

  nvidia-offload = pkgs.writeShellScriptBin "nvidia-offload" ''
    export __NV_PRIME_RENDER_OFFLOAD=1
    export __NV_PRIME_RENDER_OFFLOAD_PROVIDER=NVIDIA-G0
    export __GLX_VENDOR_LIBRARY_NAME=nvidia
    export __VK_LAYER_NV_optimus=NVIDIA_only
    exec -a "$0" "$@"
  '';


in {
  
  boot = {
    kernelParams =
      [ "acpi_rev_override" "mem_sleep_default=deep" "intel_iommu=igfx_off" "nvidia-drm.modeset=1" ];
    kernelPackages = pkgs.linuxPackages_latest;
    extraModulePackages = [ config.boot.kernelPackages.nvidia_x11 ];
  };

  hardware.nvidia.prime = {
    offload.enable = true;
    nvidiaBusId = "PCI:1:0:0";
    intelBusId = "PCI:0:2:0";
  };

  services.xserver = {
    enable = true;

    videoDrivers = [ "nvidia" ];

    windowManager.exwm = {
      enable = true;
    };

    screenSection = ''
      Option         "metamodes" "nvidia-auto-select +0+0 {ForceFullCompositionPipeline=On}"
      Option         "AllowIndirectGLXProtocol" "off"
      Option         "TripleBuffer" "on"
    '';
  };
}
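Not shown above, since I’ve only pasted the relevant sections: the nvidia-offload script also needs to be installed somewhere on the path to be callable from a shell. Roughly:

```nix
# Assumption: nvidia-offload from the let-binding above is installed
# via systemPackages so it can be invoked as a command.
environment.systemPackages = [ nvidia-offload ];
```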

Thanks for any input you might have, and if you need to know anything
else, please don’t hesitate to ask!

Cheers.


I can’t look at this right now, but the quality of your issue report is A++++++. I wish all requests for assistance looked like this off the bat: relevant bits, background, history… much better than

‘My video card is BROKEN… pls fix’.

Hopefully someone might be able to assist you… however… it could be a serious case of NFS.

I have the same model as you. Are you using a compositor?

Hah, thanks! And yeah, when asking someone to help me out of the kindness of their hearts, I figure it’s better if you try and make things as easy on them as possible.

It doesn’t exactly seem like people are flocking to this issue, so if you happen to find some time to take a look at it later, that’d be great and much appreciated. I’ll keep fiddling and see what I find anyway.

I did try using Compton/Picom previously, but it didn’t seem to make any difference (in fact, it seemed to make the tearing worse, but that might be a coincidence). I tried some recommended settings for it, but can’t quite remember where I found them. I’ll report back if I can find them.

How about you? Have you managed to make it work? Do you use a compositor? Got any other tips?

Update:
I think I tried following the tips in this article (and some other reddit thread that I can’t find), and set backend to glx and vSync to true.
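In NixOS terms, those two settings would look roughly like this (a sketch, assuming the services.picom module rather than a hand-written picom.conf, which is what I actually used):

```nix
# Sketch, not my exact config: picom with the two settings mentioned above.
services.picom = {
  enable = true;
  backend = "glx"; # render through OpenGL
  vSync = true;    # sync buffer swaps to the display to reduce tearing
};
```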

I don’t experience any screen tearing. I don’t think I do anything special regarding my graphics.

nvidia-offload can’t offload every single app; e.g., Chromium has issues with its GPU selection, though you can more or less force the NVIDIA GPU to be used.

If the screen tearing doesn’t happen with any other app, I would look into Slack and its Electron version to see if it’s just the app’s problem (I seem to recall there were issues in the past).

Ah, I think there’s been a misunderstanding here. Sorry if I haven’t been clear. The screen tearing occurs in every app I run (save for glxgears). These apps are mostly Emacs, Alacritty, and Firefox. I run Emacs as my window manager through EXWM. The reason I mentioned Slack is that it hadn’t worked for a long time (it would error on startup), but it suddenly worked and that seemed to coincide with the screen tearing appearing.

If you have the same model, do you get an error on running this command (or any other nvidia-settings --assign ... command)?

nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"

Also, would you mind showing me what your configuration.nix is like? It might be that I have some other settings that cause it.

Thanks!

Interestingly, when setting

environment = {
    variables."__NV_PRIME_RENDER_OFFLOAD" = "1";
    variables."__NV_PRIME_RENDER_OFFLOAD_PROVIDER" = "NVIDIA-G0";
    variables."__GLX_VENDOR_LIBRARY_NAME" = "nvidia";
    variables."__VK_LAYER_NV_optimus" = "NVIDIA_only";
};

(the same as the nvidia-offload script) in my configuration.nix, nvtop lists all applications that I start. When just using the nvidia-offload script, it wouldn’t. However, this still doesn’t fix the tearing.


@eadwu Sorry to pick this up again, but you seemed to have some
insight and I was wondering whether I could ask you for a little more
of your time.

I checked out your config on GitHub, but it seems to be put together
from a large number of files, so I wasn’t quite able to figure out
what final config you end up with. I tried to apply a number of the
settings that I found in your config, but to no avail.

As a preliminary step, could you tell me what the output is if you run
this in a shell?

nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"

Specifically, do you get an error or does it (appear to) do something?
As mentioned previously, I get this response:

ERROR: Unable to load info from any available system


ERROR: Unable to load info from any available system

If that is different to what you get, then that might suggest that
there is something downright wrong with how things are configured on
my system.

Thanks for your time and input!

I don’t modify anything through nvidia-settings. I don’t think the screen tearing is a problem with NVIDIA, as it isn’t directly driving the display.

Huh. Actually, that might be a good point. Thanks! Do you have any ideas as to what the actual issue might be or any ways to mitigate it?

None past what I’ve already said. There are a lot of things that can contribute to screen tearing in your case.

From your initial post, where you experienced it again after a period without it, it could be that in the midst of some configuration changes you accidentally changed something that broke the configuration.

Not sure if this’ll help, but I get by with no screen tearing on an Nvidia 1060, xmonad, and picom. You can check out my picom config. It’s fairly minimal, but without it, tearing is pretty bad.

@eadwu Alright. Appreciate your thoughts. I’ll look more into it. I might be able to roll back to a config around that time and see whether it works. I’ll check it out.

@nrdxp Cheers! I tried it out, but it didn’t help. In fact, it seemed to make things worse, for whatever reason. I also tried activating vSync, but no difference there either.

Hi, this is a bit of a late answer, but I just switched my Arch Linux setup to NixOS, and here’s what I do in my setup to remove tearing. Bear in mind this is from a desktop computer running a 2080 Ti, and your mileage may vary:

services.xserver = {
  videoDrivers = [ "nvidia" ];

  config = ''
    Section "Screen"
        Identifier     "Screen0"
        Device         "Device0"
        Monitor        "Monitor0"
        DefaultDepth   24
        Option         "Stereo" "0"
        Option         "nvidiaXineramaInfoOrder" "DFP-5"
        Option         "metamodes" "nvidia-auto-select +0+0 {ForceCompositionPipeline=On, ForceFullCompositionPipeline=On}"
        Option         "SLI" "Off"
        Option         "MultiGPU" "Off"
        Option         "BaseMosaic" "off"
        SubSection     "Display"
        Depth          24
        EndSubSection
    EndSection
  '';
};

My setup otherwise: i3, picom and xorg. It doesn’t work as well as Intel/Wayland or AMD/Wayland does, but it does its job until the prices of new AMD cards come down.

Edit: needed some extra picom settings too to get everything butter smooth. The whole config here: GitHub - pimeys/nixos: NixOS Configuration


Hello, I hope you are all well.

I am new to NixOS, but I had vsync working on a laptop with a hybrid Intel and Nvidia GPU.

Thanks to this thread, and my previous config from Arch Linux, I managed to craft a NixOS config that works for me.

In nvidia-settings (screenshot omitted), vsync is on. No tearing. Sync Mode, not Offload Mode.

Here is the config:

boot = {
    kernelParams =
      [ "acpi_rev_override" "mem_sleep_default=deep" "intel_iommu=igfx_off" "nvidia-drm.modeset=1" ];
    kernelPackages = pkgs.linuxPackages_5_4;
    extraModulePackages = [ config.boot.kernelPackages.nvidia_x11 ];
  };
  
# Enable the X11 windowing system.
  services.xserver.enable = true;

  # Enable the KDE Desktop Environment.
  services.xserver.displayManager.sddm.enable = true;
  services.xserver.desktopManager.plasma5.enable = true;

  ## NVIDIA 
  services.xserver = {
    videoDrivers = [ "nvidia" ];

    config = ''
      Section "Device"
          Identifier  "Intel Graphics"
          Driver      "intel"
          #Option      "AccelMethod"  "sna" # default
          #Option      "AccelMethod"  "uxa" # fallback
          Option      "TearFree"        "true"
          Option      "SwapbuffersWait" "true"
          BusID       "PCI:0:2:0"
          #Option      "DRI" "2"             # DRI3 is now default
      EndSection

      Section "Device"
          Identifier "nvidia"
          Driver "nvidia"
          BusID "PCI:1:0:0"
          Option "AllowEmptyInitialConfiguration"
      EndSection
    '';
    screenSection = ''
      Option         "metamodes" "nvidia-auto-select +0+0 {ForceFullCompositionPipeline=On}"
      Option         "AllowIndirectGLXProtocol" "off"
      Option         "TripleBuffer" "on"
    '';
  };

  hardware.nvidia.prime = {
    # Sync Mode
    sync.enable = true;
    # Offload Mode
    #offload.enable = true;

    # Bus ID of the NVIDIA GPU. You can find it using lspci, either under 3D or VGA
    nvidiaBusId = "PCI:1:0:0";

    # Bus ID of the Intel GPU. You can find it using lspci, either under 3D or VGA
    intelBusId = "PCI:0:2:0";
  };

Pick what you need; some settings may not be optimal.

Enjoy


Sorry for taking so long to respond! I’ve been busy, and I’m always a little scared of going down the nvidia rabbit hole again :sweat_smile:

@oh-lawd Thanks for the input! I tried it out, but the X server doesn’t seem to want to start using that config. Might be worth mentioning that I put the config under the screenSection option:

    videoDrivers = [ "nvidia" ];

    screenSection = ''
      Identifier     "Screen0"
      Device         "Device0"
      Monitor        "Monitor0"
      DefaultDepth   24
      Option         "Stereo" "0"
      Option         "nvidiaXineramaInfoOrder" "DFP-5"
      Option         "metamodes" "nvidia-auto-select +0+0 {ForceCompositionPipeline=On, ForceFullCompositionPipeline=On}"
      Option         "SLI" "Off"
      Option         "MultiGPU" "Off"
      Option         "BaseMosaic" "off"
      SubSection     "Display"
      Depth          24
      EndSubSection
    '';

@3cola Thanks to you too! Using your provided config, it seems I’ve managed to make nvidia-settings work for the first time, at least :smile: I’ve never had any options available in that menu before (screenshot omitted).

However, it says “synchronization: off”, and I’m still getting a lot of tearing.

I’m fairly confident that I built the system with sync.enable = true, so I’m not sure what’s going on here. Do you know if there’s anything else that might impact this? I’ll rebuild again just to double check, though.

Update

Compared to @3cola’s advice, it seems I was missing the nvidia-drm.modeset=1 kernel param at least and had a slightly different value for one of the options. When I went back and fixed that, synchronization is now turned on :tada:

I do not have desktopManager and displayManager configured in my config (I use EXWM as a window manager and whatever is the default for those options).

As such, I have marked @3cola’s response as the answer to my question.

Thank you to everyone who has helped me out with this. I hope it can help someone else in the future too.
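For future readers, here is a condensed sketch of the combination that ended up working for me (option names as discussed in this thread; bus IDs come from lspci and will differ per machine):

```nix
# Sketch: PRIME sync mode plus the nvidia-drm.modeset=1 kernel parameter,
# which together turned synchronization on for me. Not a drop-in config.
{
  boot.kernelParams = [ "nvidia-drm.modeset=1" ];

  services.xserver = {
    enable = true;
    videoDrivers = [ "nvidia" ];
    screenSection = ''
      Option "metamodes" "nvidia-auto-select +0+0 {ForceFullCompositionPipeline=On}"
    '';
  };

  hardware.nvidia.prime = {
    sync.enable = true;         # NVIDIA drives the display, no offloading
    nvidiaBusId = "PCI:1:0:0";  # from lspci
    intelBusId = "PCI:0:2:0";   # from lspci
  };
}
```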
