The struggle to get a Thunderbolt eGPU to output to its display at all in NixOS

I’ve been at this for months now, and tonight alone for hours and counting.

The hardware involved: an LG Gram 17 laptop with Intel UHD 620 integrated graphics. The eGPU is a Razer Core X enclosure with an AMD RX 570. The external display is a 4K TV connected via the GPU’s DisplayPort.

I think the problem may lie in the X configuration. I’ve been messing with the services.xserver.config option, adding new sections and whatnot, but at this point I’m stumped. If anyone can help me get this display on, it’d be extremely appreciated.


This is how far I’ve gotten:

With hardware.bolt.enable set, the GPU is authorized and shows up fine under lspci -k:


(this is after booting with Thunderbolt 3 cable already plugged in)
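
For reference, the NixOS side of this is just the boltd service; these are the commands I used to verify it (the 06:00.0 bus address is my machine’s):

  # configuration.nix: run boltd so Thunderbolt devices can be authorized
  hardware.bolt.enable = true;

  # verify: the enclosure should read "authorized", and the kernel
  # amdgpu driver should claim the card
  boltctl list
  lspci -k -s 06:00.0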

However, no displays show up besides the internal laptop display:

Let’s check out xrandr --prop. There’s the internal display on eDP-1:


(other disconnected outputs include DP-1, HDMI-1, DP-2, HDMI-2)

and xrandr --listproviders shows 2 available providers. The former is the default integrated graphics, but it appears the latter isn’t using the correct amdgpu driver; it’s still using modesetting:

Now, at this point, if I just ignore that problem and point the 2nd provider’s outputs at the 1st with
xrandr --setprovideroutputsource 0x841 0x47
…it DOES actually make the connected TV finally show up in system settings:


(under the new output DP-1-3, for some odd reason…)
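
Side note: those provider IDs change between boots, so the same command by index is handier; a sketch, assuming provider 1 is the eGPU and 0 the iGPU:

  xrandr --listproviders
  # scan out the iGPU's rendered images on the eGPU's outputs
  xrandr --setprovideroutputsource 1 0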

However, when I check the “Enable” box and Apply, nothing happens. If I close and reopen Display Settings, the Enable box is just frustratingly unchecked again. And if I log out or reboot, everything resets back to the beginning, of course.
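
For what it’s worth, the same enable step can be attempted from a terminal, which at least prints an error instead of silently reverting; a sketch, using the output name from above:

  # try to light up the TV directly, bypassing the settings GUI
  xrandr --output DP-1-3 --mode 3840x2160 --right-of eDP-1
  # or let X pick the mode:
  xrandr --output DP-1-3 --auto --right-of eDP-1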

I had some success manually adding the PCI bus address of the GPU to the Device and Screen sections of the Xorg config, as well as changing Screen to Screen 0 under the ServerLayout section (per the ArchWiki’s instructions). After a reboot, that miraculously made the 2nd provider show the correct name, “Radeon RX 570 Series @ pci:0000:06:00.0”, and id 0xee instead of 0x841. But without the ServerLayout stuff, which I currently have commented out as you’ll see below, it’s back to showing both providers as modesetting. Furthermore, even while it was showing the correct provider, enabling the display in system settings still didn’t work; it did the same nothing-at-all behavior. And I think I still had to run xrandr --setprovideroutputsource to get the display to show up in settings then as well.

Meanwhile, the Ubuntu 20.10 live ISO outputs to the TV through the eGPU straight out of the box. I’m completely stumped; can anyone give me an idea of where I might be going wrong here?


What does the Xorg log file have to say? I think it is in /var/log/Xorg.0.log or something like that (I can’t be more precise because I’ve switched to Wayland).

Good question. I enabled the logfile; here’s the pastebin:

It seems that it doesn’t find any monitor:

[    21.341] (II) modeset(G0): glamor X acceleration enabled on Radeon RX 570 Series (POLARIS10, DRM 3.41.0, 5.13.5, LLVM 12.0.1)
[    21.341] (II) modeset(G0): glamor initialized
[    21.343] (II) modeset(G0): Output DP-1-3 has no monitor section
[    21.343] (II) modeset(G0): Output HDMI-1-3 has no monitor section
[    21.343] (II) modeset(G0): Output DVI-D-1-1 has no monitor section
[    21.343] (II) modeset(G0): Output DVI-D-1-2 has no monitor section

I know that bolt has some kind of “authorization system” built in; are you sure that’s all OK there (boltctl has an authorize subcommand)?
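
Something along these lines, where the device UUID comes from boltctl list:

  boltctl list              # status should read "authorized"
  boltctl authorize <uuid>  # one-off authorization
  boltctl enroll <uuid>     # store the device so it's authorized automatically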

Surely not digital rights management? Or have I misunderstood “authorization system” in this context?

What is it authorizing? And who is giving permission?

boltctl is the command-line interface given to “admins” to interact with the boltd service.

Thunderbolt devices are really PCI devices. From what I understand, without an authorization system they could use DMA against the host upon connection. See, for example, “Thunderclap and Linux” by Christian Kellner.


I see, very interesting…

So, as long as you keep an eye on your machine or turn it off, and you don’t wake up to find an unusual PCI device plugged into it with the words ‘national skiing association’ written in crayon on the side, you’re good to go.

In all seriousness, thanks for the link, will review…

The new external PCIe 4.0 bus spec is very interesting; allowing extension of high-speed buses outside the chassis of a machine/laptop is pretty amazing.

Maybe one day one of my funding bids will come off, or someone will take pity on me and give me a job, so I can afford to play with this: https://www.youtube.com/watch?v=NeFRc13ILPc . I’d probably like to get NixOS running on it, so this thread is probably quite important.

🙂

🙂 The guy just wants to use a more powerful video card

Not exactly; they want to use a more powerful video card on a laptop.

That is science fiction as far as I’m concerned…

How old are you, azazel75? Did you grow up with punch cards, or smartphones more powerful than the first desktop computer you had?

@nixinator, were you around in the punch-card era? Lol, it is pretty crazy the capabilities we have now, such as with laptops like this LG Gram + Thunderbolt 3, but rest assured it’s not science fiction; it’s very possible. It’s actually extremely convenient for me, since this laptop is 17" and under 3 lbs; the only problem is that it runs an i7-8565U mobile CPU, so the integrated GPU is severely lacking. With the external GPU enclosure, I can stick a full-size graphics card in it and dock the laptop to it over Thunderbolt 3, which runs at around 40 Gbps of bandwidth, giving me basically full desktop graphics power when I’m at my desk, as well as fully battery-optimized mobile capability when I’m away.

In any case, it works seamlessly on Windows, hotplugging and everything, and like I said, everything works nearly as well with the latest Ubuntu ISO on a live-booted USB stick. So I know for a fact it’s indeed possible even on Linux; it just remains a question of what I’m missing in my NixOS configuration to make it match Ubuntu’s so that it’ll work similarly.

@azazel75 it’s definitely all authorized and ready to go. I’m traveling at the moment, so I can’t post a screenshot of Plasma’s Thunderbolt section of System Settings or the output of boltctl list to prove it, but I can guarantee that part at least is not the problem; that’s all taken care of.

Good catch on the monitor, though; I was wondering if the issue was maybe the need for a Monitor section. Would manually adding that to the Xorg config allow it to make the connection? It did seem strange to me that it picked up the provider, and I think it even grabbed the EDID from the TV, as long as I manually switched the provider over with xrandr. But it would make sense if Xorg itself, as the middleman without a Monitor section to route things through, got tripped up there. I guess I’ll have to try messing with that next week or so when I’m home again and see if that does it.

Assuming that is part of the problem, the only other thing I’m still concerned about is that those messages still seem to be coming through the modesetting driver. Shouldn’t those be output from amdgpu? Or would modesetting act as a “mask” for the amdgpu driver and still successfully drive the card and display somehow? I was reading about how the various drivers work and couldn’t really tell whether that’s something modesetting is supposed to be able to do… it feels like something is still preventing amdgpu from taking over as it should. I thought modesetting usually only kicked in for Intel integrated graphics.
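
For anyone wanting to check the same thing, this is how I’ve been telling the two layers apart (the path assumes the default Xorg log location; adjust for services.xserver.logFile):

  # kernel side: which driver owns the card (should say amdgpu)
  lspci -k -s 06:00.0
  # X side: "AMDGPU(...)" lines mean the amdgpu DDX claimed the GPU,
  # "modeset(G0)" lines mean X fell back to the generic modesetting DDX
  grep -E 'AMDGPU\(|modeset\(' /var/log/Xorg.0.log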


I was around well before punched tape, actually, back when data was stored on stone tablets.

In all seriousness, one could probably diff Ubuntu and NixOS and see what changes are required from Ubuntu to get this working. However, I’d need the hardware, and that’s the problem with hardware integration testing: if I can’t get access to it, I won’t be able to help.
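
A rough sketch of what I mean, assuming the default Xorg log path on both systems:

  # run on the Ubuntu live session, then again on NixOS
  xrandr --listproviders > providers.txt
  sudo cp /var/log/Xorg.0.log xorg.log
  # then compare the two copies to see where they diverge
  diff ubuntu/xorg.log nixos/xorg.log | less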

Unless someone here wants to send me the equipment, and I very much doubt that; it sounds e-x-p-e-n-s-i-v-e :-).

Indeed, it would be cool to get this working, so sci-fi can become sci-fact.


I just wanted to post an update: I finally got home from vacation and tried adding the Monitor section I thought might work… it didn’t. Like, no change at all.

For reference, here’s what I’ve added to my configuration.nix:

  services.xserver.config = lib.mkAfter ''
#Section "ServerLayout"
#       Identifier "Layout[all]"
# Reference the Screen sections for each driver.  This will
# cause the X server to try each in turn.
#       Screen 0 "Screen-amdgpu[0]"
#       Screen "Screen-radeon[0]"
#       Screen "Screen-nouveau[0]"
#       Screen "Screen-modesetting[0]"
#       Screen "Screen-fbdev[0]"
#EndSection

Section "Monitor"
    Identifier "Monitor[1]"
#    HorizSync
#    VertRefresh
    Modeline "3840x2160_60.00" 593.41 3840 4016 4104 4400 2160 2168 2178 2250 +hsync +vsync
EndSection

Section "Device"
    Identifier "Device-amdgpu[0]"
    Driver      "amdgpu"
    BusID       "PCI:06:00.0"   # Edit according to lspci, translate from hex to decimal
EndSection

Section "Screen"
    Identifier "Screen-amdgpu[0]"
    Device "Device-amdgpu[0]"
    Monitor "Monitor[1]"
EndSection
  '';

(the commented-out stuff is what I tried earlier for kicks, but that didn’t work either)

I have no clue where to go from here and nothing seems to work. Think I gotta abandon this venture yet again unless an answer basically falls from the sky at this point.

Kinda sucks; I know it should be possible to do this, but for whatever reason I just cannot find a way. With all the effort that’s been put in and nothing to show for it, I think I might give Ubuntu or a similar mainstream distro a shot again, since eGPU support works straight out of the gate there… I can live without my external display, and have for a while, but I’d really like to be able to use it, and NixOS unfortunately doesn’t seem to like it.

Could you post your Xorg.log again? Does it find the monitor?

Reviving this thread because I’ve encountered this exact issue on a nearly identical setup. Does anyone happen to have some ideas or suggestions? Even adding a Monitor section as @jDally987 did results in the “No monitor specified” error.

Here is the entirety of my xserver config in Nix:

{ lib, config, pkgs, ... }:

with lib;
let
  cfg = config.sys;
in {
  services.xserver = mkIf (cfg.desktop.enable && cfg.desktop.displayProtocol == "xserver") {
    enable = true;
    displayManager.startx.enable = true;
    # FIXME: eventually check if a laptop
    config = mkAfter ''
      Section "Monitor"
        Identifier "Monitor[1]"
        Modeline "1920x1080x144.0" 325.08 1920 1944 1976 2056 1080 1083 1088 1098 +hsync +vsync
      EndSection

      Section "Device"
        Identifier "Device-amdgpu[0]"
        Driver     "amdgpu" 
        BusID      "PCI:130:0:0"
        Option     "AllowExternalGpus" "True"
        Option     "AllowEmptyInitialConfiguration"
      EndSection

      Section "Screen"
        Identifier "Screen-amdgpu[0]"
        Device "Device-amdgpu[0]"
        Monitor "Monitor[1]"
      EndSection
    '';
    videoDrivers = [ "amdgpu" "modesetting" ];
    libinput = {
      enable = true;
      touchpad = mkIf (cfg.laptop.model != "none") {
        tapping = false;
        naturalScrolling = true;
        # left-click = 1 finger click
        # right-click = 2 finger click
        # middle-click = 3 finger click
        clickMethod = "clickfinger";
        disableWhileTyping = true;
      };
      mouse = {
        tapping = false;
        naturalScrolling = false;
        middleEmulation = false;
        disableWhileTyping = false;
      };
    };
  };
}

Here is the Xorg configuration that is generated by Nix:

Here is my log file.

I was finally able to get something to display using the following configuration. I’m not quite sure what is wrong with the NixOS-generated Xorg configuration, so this solution is definitely not the cleanest right now, but I’ll post updates as I iterate.

{ lib, config, pkgs, ... }:

with lib;
let
  cfg = config.sys;
in {
  services.xserver = mkIf (cfg.desktop.enable && cfg.desktop.displayProtocol == "xserver") {
    enable = true;
#    exportConfiguration = false;
    displayManager.startx.enable = true;
    displayManager.xserverArgs = mkAfter [ "-config" "/etc/X11/xorg2.conf" ];
    # FIXME: eventually check if a laptop
    config = mkAfter ''
      Section "Monitor"
        Identifier "Monitor[1]"
        Modeline "1920x1080x144.0" 325.08 1920 1944 1976 2056 1080 1083 1088 1098 +hsync +vsync
      EndSection

      Section "Device"
        Identifier "Device-amdgpu[1]"
        Driver     "amdgpu" 
        BusID      "PCI:130:0:0"
        Option     "AllowExternalGpus" "True"
        Option     "AllowEmptyInitialConfiguration"
      EndSection

      Section "Screen"
        Identifier "Screen-amdgpu[1]"
        Device "Device-amdgpu[1]"
        Monitor "Monitor[1]"
      EndSection
    '';
    videoDrivers = [ "amdgpu" "modesetting" ];
    libinput = {
      enable = true;
      touchpad = mkIf (cfg.laptop.model != "none") {
        tapping = false;
        naturalScrolling = true;
        # left-click = 1 finger click
        # right-click = 2 finger click
        # middle-click = 3 finger click
        clickMethod = "clickfinger";
        disableWhileTyping = true;
      };
      mouse = {
        tapping = false;
        naturalScrolling = false;
        middleEmulation = false;
        disableWhileTyping = false;
      };
    };
  };
}

With this Nix configuration in place, you copy an existing Xorg configuration to /etc/X11/xorg2.conf, which then serves as your X server configuration.
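
Concretely, the copy step is something like this (assuming services.xserver.exportConfiguration = true, so the generated config is symlinked at /etc/X11/xorg.conf):

  # snapshot the generated config as a starting point
  sudo cp /etc/X11/xorg.conf /etc/X11/xorg2.conf
  # edit xorg2.conf (see below), then rebuild so X picks up the -config flag
  sudo nixos-rebuild switch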

The xorg2.conf that I copied to /etc/X11/ is really just a slight modification of the xorg.conf generated by NixOS (look here for a comparison). Basically, the only thing I did was remove the extraneous definitions of “Screen-amdgpu[0]” and “Device-amdgpu[0]”, because for some reason they were not being overridden.
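
For illustration, the stock sections I deleted from the copy look roughly like this (the identifiers are the ones the NixOS module generates; note there is no BusID and no AllowExternalGpus, which is presumably why they matched the wrong GPU):

  Section "Device"
      Identifier "Device-amdgpu[0]"   # stock section from the generated xorg.conf
      Driver     "amdgpu"
  EndSection

  Section "Screen"
      Identifier "Screen-amdgpu[0]"
      Device     "Device-amdgpu[0]"
  EndSection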