Derivation variables ignored, installing from tarballs.nixos.org instead?


#1

I’m trying to update the Cura Lulzbot derivation to version 3.6.3.

I’ve tried changing the version number: nothing changes. I then changed the download URLs to deliberately broken ones, and still nothing changed. Putting in bogus version numbers didn’t break anything either.

Each time, the output of nix-env -f ~/.nixpkgs.default.nix -i cura-lulzbot was the same:

replacing old 'cura-lulzbot-3.6.3'
installing 'cura-lulzbot-1234567890'
these derivations will be built:
  /nix/store/1vl2i2rk5wsjp3lhgzhsaidxlij76g0n-cura-lulzbot_1234567890_amd64.deb.drv
  /nix/store/qp7979j00yvy9c7cygdw9j8qw7yqpr11-cura-lulzbot-1234567890.drv
building '/nix/store/1vl2i2rk5wsjp3lhgzhsaidxlij76g0n-cura-lulzbot_1234567890_amd64.deb.drv'...

trying http://tarballs.nixos.org/sha256/1gsfidg3gim5pjbl82vkh0cw4ya253m4p7nirm8nr6yjrsirkzxg
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100 8642k  100 8642k    0     0  7244k      0  0:00:01  0:00:01 --:--:-- 7821k
building '/nix/store/qp7979j00yvy9c7cygdw9j8qw7yqpr11-cura-lulzbot-1234567890.drv'...
unpacking sources
installing
building '/nix/store/zqrf2mh58zzwypw567zc2lm6vsnb8yk7-user-environment.drv'...
created 4805 symlinks in user environment

While the version numbers change with the variables, the tarball hash does not.

What’s going on? Is this some kind of forced caching? If so, how do I disable it?


#2

If you request a file (or store path) with a given hash, you get it, regardless of how the “recipe” has changed.


#3

Thank you!

Just to clarify for any potential future readers:
Nix checks the hash before checking the URL: if it already has a file with the desired hash, Nix will ignore the URL and just use the cached file.

In other words, if you want to change the output of fetchurl and friends, make sure you change the hash.
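As a concrete sketch (a hypothetical fragment, not the actual cura-lulzbot expression; the URL and hash below are placeholders), bumping a fetchurl source means changing both fields together:

```nix
src = fetchurl {
  # Bumping the version or URL alone is not enough: if the old sha256
  # stays, Nix reuses the cached file with that hash and never fetches.
  url = "https://example.com/cura-lulzbot_3.6.3_amd64.deb";  # hypothetical URL
  sha256 = "0000000000000000000000000000000000000000000000000000";  # new hash goes here
};
```

One common workflow is to run nix-prefetch-url on the new URL to get the correct sha256, or to put in an obviously wrong hash and copy the correct one from the resulting hash-mismatch error.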


#4

I should add that one of the main motivations (from many years ago) was that switching to a different mirror shouldn’t cause any rebuild (or even any network activity). Unfortunately, this property seems to be a relatively common stumbling point for newcomers, for some reason…


#5

Also, this has already led to at least one upgrade being pushed to master that forgot to bump the hash, leading to a chromium version that people thought was upgraded but actually was not.

As this behavior is both pretty surprising and a source of potential security issues (a PR that appears to bump the version of a package can actually be downgrading it, and that is relatively hard to notice), I’m more and more inclined to think we should revisit this choice.

Especially as I don’t think we actually switch mirrors often without also changing the version: though it’s a nice property in theory, in practice it feels to me like it brings more harm than good.


#6

We could print a warning message when a file is reused based on its hash rather than fetched from the URL.

Or even better, perhaps we could combine the file hash with the URL (e.g. Hash(url, Hash(file))) and have Nix cache based on that combined hash. I think this would only be inefficient when switching mirrors, which as far as I know is uncommon.
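A rough sketch of that combined-hash idea (my own illustration, not Nix’s actual store-path algorithm): hash the URL together with the content hash, so a mirror change yields a new cache key even when the file itself is identical.

```python
import hashlib

def content_hash(data: bytes) -> str:
    """Hash of the file contents alone (what fetchurl keys on today)."""
    return hashlib.sha256(data).hexdigest()

def combined_hash(url: str, data: bytes) -> str:
    """Hypothetical cache key: Hash(url, Hash(file))."""
    inner = content_hash(data)
    return hashlib.sha256(url.encode() + inner.encode()).hexdigest()

data = b"pretend this is a tarball"
# The same file fetched from two mirrors has the same content hash...
assert content_hash(data) == content_hash(data)
# ...but the combined key differs, so each URL gets its own cache entry,
# and a changed URL forces a fresh download.
key_a = combined_hash("https://mirror-a.example/foo.tar.gz", data)
key_b = combined_hash("https://mirror-b.example/foo.tar.gz", data)
assert key_a != key_b
```

This also shows the cost mentioned above: switching mirrors invalidates the cache entry even though the bytes are unchanged.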


#7

We have a long detailed thread suggesting related changes somewhere on GitHub.

Upstream downloads going dead is not a rare thing, in my observation (for the “smaller” projects at least; we have a really large number of packages in Nixpkgs). And tarballs.nixos.org tends to be significantly faster on average.


#8

@Ekleog I’d advocate for addressing this specifically.

Why not have @GrahamcOfBorg fetch the URLs and make sure the hash matches?

If it doesn’t match, the bot would throw an error, both reminding the PR author that the hash must be changed, and also unmasking potentially malicious attempts to exploit fetchurl’s behavior.
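The core of such a check could look something like this (a hypothetical sketch; ofborg does not currently do this, and a real bot would first download the derivation’s declared URL):

```python
import hashlib

def verify_source(data: bytes, declared_sha256: str) -> bool:
    """Return True iff the fetched bytes match the hash declared in the PR.

    In a real bot, `data` would be the bytes downloaded from the
    expression's URL; here we pass them in directly to keep the
    sketch self-contained.
    """
    actual = hashlib.sha256(data).hexdigest()
    return actual == declared_sha256.lower()

payload = b"pretend this is the release tarball"
good = hashlib.sha256(payload).hexdigest()
assert verify_source(payload, good)          # hash matches: PR is consistent
assert not verify_source(payload, "0" * 64)  # stale hash: flag the PR
```

The interesting part is the failure case: a version bump that keeps the old hash would trip the second check and get flagged before merge.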


#9

Vladimír Čunát via Nix community nixos1@discoursemail.com writes:

We have a long detailed thread suggesting related changes somewhere on GitHub.

Upstream downloads going dead is not a rare thing, in my observation (for the “smaller” projects at least; we have a really large number of packages in Nixpkgs). And tarballs.nixos.org tends to be significantly faster on average.

I’ve just run git log -p and looked for url = changes without a sha256 change. The most recent one I could find is godef: Use fetchFromGitHub (6c26419f5ed2d56316ecd2293c11751c9446afea) from Feb 19, which switched from fetchgit to fetchFromGitHub. (Disclaimer: I haven’t checked more than that.)

Is one rebuild of one leaf package per month enough to warrant this pitfall? My current gut feeling is that it would be better to just include the URL in the hash, though for the time being it’s only a gut feeling. We have many more mass rebuilds, after all.


#10

generic-specialty via Nix community nixos1@discoursemail.com writes:

Why not have @GrahamcOfBorg fetch the URLs and make sure the hash matches?

If it doesn’t match, the bot would throw an error, both reminding the PR author that the hash must be changed, and also unmasking potentially malicious attempts to exploit fetchurl’s behavior.

I think this would be pretty hard to do, as it would require ofborg to run some kind of recursive check on the derivation it’s building, which would end up rebuilding all the dependencies every time.


#11

Actually, I think including the URL in the hash would have the same effect: it would cause builders to download the new file whenever the URL changes, since the expression’s hash would no longer be in the cache.

Plus, it sounds like a much cleaner solution than manually configuring ofborg to evaluate fetchurl in an unusual way.