Has anyone compiled ALVR on NixOS 23.11 recently?

I managed to get pretty far with:

nix-shell -p rustup yasm libdrm pkg-config vulkan-headers vulkan-loader vulkan-tools libva openssl libjack2 jack2 alsa-lib gtk3 unzip x264 ffmpeg 

Now I am getting clang errors, though. I have found some examples of adding a clang environment to a shell.nix file, but I am missing something in my understanding.

It seems that ALVR used to have a shell.nix bundled with it, but it has since been deleted, probably because they created a set of distrobox builder scripts. Unfortunately all of those scripts hardcode a #!/bin/bash shebang, which does not resolve on NixOS.
I suppose an easy way to get what I want would be to just fix their script headers, install distrobox with nix-env -iA, and try to get it compiled, but I worry that building it that way will produce something that has to be run inside of distrobox as well.
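On NixOS only /bin/sh and /usr/bin/env exist at their traditional paths, so rewriting the shebangs to go through env is the usual quick fix. A minimal sketch (the function name and file paths here are illustrative, not part of the ALVR scripts):

```shell
# Rewrite a hardcoded "#!/bin/bash" shebang to "#!/usr/bin/env bash",
# which resolves bash from PATH instead of a fixed path.
# Usage: fix_shebangs path/to/script.sh [more scripts...]
fix_shebangs() {
  for script in "$@"; do
    sed -i '1s|^#!/bin/bash$|#!/usr/bin/env bash|' "$script"
  done
}
```

Pointing it at the distrobox scripts in the ALVR checkout should at least let them start under NixOS, though they may still make other FHS assumptions.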

I finally found the old shell.nix and updated it with some of my findings. I am fairly new at this and there are probably a thousand things that could be done better, but this shell (on 23.11) allows for compilation:

{ pkgs ? import <nixpkgs> { } }:

with pkgs;

let
  # lunarg = pkgs.vulkan-tools-lunarg.overrideAttrs (oldAttrs: rec {
  #   patches = [
  #     (fetchurl {
  #       url =
  #         "https://gist.githubusercontent.com/ckiee/038809f55f658595107b2da41acff298/raw/6d8d0a91bfd335a25e88cc76eec5c22bf1ece611/vulkantools-log.patch";
  #       sha256 = "14gji272r53pykaadkh6rswlzwhh9iqsy1y4q0gdp8ai4ycqd129";
  #     })
  #   ];
  # });
in (mkShell.override { stdenv = clangStdenv; }) {
  # Note: `stdenv = clangStdenv;` as a plain mkShell attribute would only
  # end up as an environment variable; overriding mkShell itself is what
  # actually switches the compiler.
  nativeBuildInputs = [ cmake pkg-config ];
  buildInputs = [
    binutils-unwrapped
    alsa-lib # the alsaLib attribute is a deprecated alias for alsa-lib
    openssl
    glib
    # (ffmpeg-full.override { samba = null; })
    cairo
    pango
    atk
    gdk-pixbuf
    gtk3
    clang
    # lunarg
    vulkan-headers
    vulkan-loader
    vulkan-validation-layers
    # xorg.libX11
    # xorg.libXrandr
    libunwind
    python3 # for the xcb crate
    libxkbcommon
    jack2
    bear

    rustup
    yasm
    libdrm
    # pkg-config
    # vulkan-headers
    # vulkan-loader
    vulkan-tools
    libva
    # openssl
    # libjack2
    # jack2
    # alsa-lib
    # gtk3
    unzip
    x264
    ffmpeg 
  ];

  # VK_LAYER_PATH = "${lunarg}/etc/vulkan/explicit_layer.d:${vulkan-validation-layers}";
  LIBCLANG_PATH = "${llvmPackages.libclang.lib}/lib";
  RUST_ANDROID_GRADLE_PYTHON_COMMAND = "${pkgs.python3Minimal}/bin/python3";
  shellHook = ''
    export PATH=$(pwd)/android:$PATH
  '';
  LD_LIBRARY_PATH = lib.makeLibraryPath [
    libGL
    libxkbcommon
    wayland
    # xorg.libX11
    # xorg.libXcursor
    # xorg.libXi
    # xorg.libXrandr
  ];

}

Inside nix-shell, compile with:

cargo xtask prepare-deps --platform linux --no-nvidia
cargo xtask build-streamer --release

Unfortunately running still causes this error:

[nix-shell:~/Projects/ALVR/build/alvr_streamer_linux]$ bin/alvr_dashboard 
[00:41:41.158497478 ERROR eframe::native::run] Exiting because of error: Found no glutin configs matching the template: ConfigTemplate { color_buffer_type: Rgb { r_size: 8, g_size: 8, b_size: 8 }, alpha_size: 8, depth_size: 0, stencil_size: 0, num_samples: None, min_swap_interval: None, max_swap_interval: None, config_surface_types: WINDOW, api: None, transparency: false, single_buffering: false, stereoscopy: None, float_pixels: false, max_pbuffer_width: None, hardware_accelerated: None, max_pbuffer_height: None, native_window: None }. Error: Error { raw_code: None, raw_os_message: None, kind: NotSupported("provided display handle is not supported") } during event Resumed
thread 'main' panicked at 'called `Result::unwrap()` on an `Err` value: NoGlutinConfigs(ConfigTemplate { color_buffer_type: Rgb { r_size: 8, g_size: 8, b_size: 8 }, alpha_size: 8, depth_size: 0, stencil_size: 0, num_samples: None, min_swap_interval: None, max_swap_interval: None, config_surface_types: WINDOW, api: None, transparency: false, single_buffering: false, stereoscopy: None, float_pixels: false, max_pbuffer_width: None, hardware_accelerated: None, max_pbuffer_height: None, native_window: None }, Error { raw_code: None, raw_os_message: None, kind: NotSupported("provided display handle is not supported") })', alvr/dashboard/src/main.rs:113:6
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
warning: queue 0x55bf14f63a80 destroyed while proxies still attached:
  wl_output@15 still attached
  wl_output@14 still attached
  xdg_activation_v1@13 still attached
  zwp_text_input_manager_v3@12 still attached
  zwp_pointer_constraints_v1@11 still attached
  zwp_relative_pointer_manager_v1@10 still attached
  wl_seat@9 still attached
  wp_fractional_scale_manager_v1@8 still attached
  wp_viewporter@7 still attached
  wl_subcompositor@6 still attached
  wl_shm@5 still attached
  wl_compositor@4 still attached
  wl_registry@2 still attached

I get the gist that glutin is a Rust library for creating OpenGL contexts, but I am still not familiar enough to know what to do here. I am guessing there is still a missing dependency or a version mismatch?
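When glutin fails with NotSupported("provided display handle is not supported"), it often means the GL/EGL/Wayland libraries it needs did not resolve at runtime. A first sanity check, just a sketch (the helper name is made up, and the binary path is the one from the log above):

```shell
# Print a binary's shared libraries that fail to resolve; on NixOS a
# "not found" entry here is a common cause of launch failures like the
# glutin error above. Usage: missing_libs bin/alvr_dashboard
missing_libs() {
  ldd "$1" | awk '/not found/ { print $1 }'
}
```

If this prints anything for bin/alvr_dashboard, adding the corresponding package to LD_LIBRARY_PATH in the shell.nix above may help.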

I never got my own build to work, but the “Portable” .tar.gz works on NixOS if the interpreter is replaced and the executable is run through steam-run.

Unpacking and patching interpreter:

tar -xvzf alvr_streamer_linux.tar.gz
cd alvr_streamer_linux/bin
nix run nixpkgs#patchelf -- --set-interpreter "$(nix eval nixpkgs#stdenv.cc.bintools.dynamicLinker --raw)" ./alvr-dashboard

Executing:

steam-run ./alvr-dashboard

In this configuration ALVR runs, but after it launches SteamVR it cannot communicate with the SteamVR instance it just launched. Since this sounds more like an ALVR bug at this point, I posted it to the ALVR GitHub.

I wanted to update this thread to point there and just in case anyone here had a thought which may help me get this working.

For anyone else like me who didn’t know about this: appimage-run works with the ALVR AppImage.
See: STeamVR Disconnected · alvr-org/ALVR · Discussion #1953 · GitHub

Mhhm, it seems to work, but the firewall rules cannot be added by the installer (obviously) and have to be added to your NixOS configuration, which makes the whole thing a little disjointed: adding the config, downloading the AppImage, and running it.
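For reference, the firewall part can be sketched as a configuration.nix fragment. This assumes 9943 and 9944 are the ports your ALVR version uses for discovery and streaming; verify them against the ALVR docs before copying:

```nix
# configuration.nix fragment -- 9943/9944 are assumed ALVR defaults;
# check the ports your ALVR version actually uses.
networking.firewall = {
  allowedTCPPorts = [ 9943 9944 ];
  allowedUDPPorts = [ 9943 9944 ];
};
```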

I agree that a proper nixpkgs package would be better, but I didn’t get very far down that line before I realized that it was beyond my current nixability.