/nix/store/.links storage space doubled?

Hi,

Why did the directory “/nix/store/.links” suddenly double in size? How can I make it smaller?


I already tried:

nix-store --optimise
nix-collect-garbage -d

… but it doesn’t help.

Thanks

After reading through the code: when you run nix-store --optimise, it populates the /nix/store/.links directory with hard links to all files in the store. This is what it uses to detect duplicate files (in order to link them together). nix-collect-garbage then deletes anything from /nix/store/.links that has a link count of 1 (meaning no file in the store references it any more).
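If you want to watch that mechanism yourself, here is a small read-only sketch (assuming GNU findutils; it only counts, it deletes nothing):

$ # Entries in .links with link count 1 have no other copy in the store;
$ # these are exactly what the GC’s “deleting unused links...” step removes:
$ find /nix/store/.links -type f -links 1 | wc -l

$ # Entries with a higher link count are shared by that many store files:
$ find /nix/store/.links -type f -links +1 | wc -l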

So basically, after nix-store --optimise the /nix/store/.links dir should be measured as containing the entire disk usage of /nix/store.

How was that graph being generated? I wonder if it spiked after you did nix-store --optimise because whatever tool you’re using doesn’t know to deduplicate hard links.


@lilyball
Thanks for your input. I’ll see what happens over the next few weeks.

Telegraf.

That could be possible, but I never changed anything manually.

Thanks @lilyball, I got it.

I removed the option nix.optimise.automatic = true; from my config. Now the reported disk usage has dropped by about 40%.
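For reference, this is what the relevant part of my configuration.nix looked like; a sketch (disabling the option only stops future optimisation runs, it does not undo existing hard links; the next nix-collect-garbage removes the unused ones):

# configuration.nix
nix.optimise.automatic = false;  # or delete the line; the default is false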

FWIW, this is almost certainly a bug in your measuring tool; the whole point of the hard links is to reduce disk usage, but if your measuring tool doesn’t understand hard links, it will misreport this as increased usage.


I use “Telegraf” and the value is the same as shown by df -h.

That’s interesting. df -h should be correct.

If you want to investigate this further (it may end up being a bug, but most probably it is a misunderstanding of how optimisation works; see the bottom of this post), you could read stackoverflow.com/du-counting-hardlinks-towards-filesize to get an idea of the issue described by @lilyball.
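You can also reproduce the effect in miniature, away from the store (a throwaway example under /tmp, assuming GNU coreutils):

$ # A 100M file plus a second hard link to the same data:
$ dd if=/dev/zero of=/tmp/a bs=1M count=100 status=none
$ ln /tmp/a /tmp/b

$ # By default du charges the shared data once (total: 100M):
$ du -shc /tmp/a /tmp/b

$ # With -l every hard link is counted separately (total: 200M),
$ # which is effectively what a naive monitoring tool does:
$ du -shcl /tmp/a /tmp/b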

On my system (which has had store optimisation enabled since forever), I get:

$ du /nix/store/.links /nix/store -sh
69G	/nix/store/.links
1,7G	/nix/store

By default, du does not count hard links twice. So the output above means that most of the content is reached first through .links, and the rest of /nix/store consists mostly of hard links back into .links.
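Note that du charges each shared file to whichever argument it scans first, so the split between the two lines depends only on the argument order; the total stays the same:

$ # With the arguments reversed, the bulk is attributed to /nix/store
$ # and .links appears almost empty:
$ du -sh /nix/store /nix/store/.links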

du can also count hard links as separate files; on my system, it then shows:

$ du /nix/store/.links /nix/store -shl
69G	/nix/store/.links
168G	/nix/store

This shows that about 30G are saved by hard links (168G naive total, minus 69G for .links itself, minus the 69G of unique data that would be used anyway, leaves ~30G deduplicated).

After a garbage collect:

$ nix-collect-garbage -d
[...]
deleting unused links...
note: currently hard linking saves 19383.33 MiB
4712 store paths deleted, 5864.73 MiB freed

$ du /nix/store/.links /nix/store -sh 
63G	/nix/store/.links
1,3G	/nix/store

$ du /nix/store/.links /nix/store -shl
63G	/nix/store/.links
148G	/nix/store

So about 22G is saved by hard linking (148G - 63G - 63G), which is quite close to the amount reported by Nix (19G), up to rounding and the size of the directories themselves, which cannot be hard-linked (1.3G).

So you have to pay a small price for this deduplication. If you do not have multiple similar packages and keep only one version of your system (which prevents you from performing rollbacks), then there may be little gain in optimising the store, and you may even observe an increase of a few GiB for the duplicated folder structure, especially if you have a lot of small files.
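If you want to gauge that overhead on your own machine, two quick read-only checks (the numbers depend entirely on what your store contains):

$ # How many deduplicated files .links tracks:
$ find /nix/store/.links -type f | wc -l

$ # Inode usage of the filesystem holding the store:
$ df -i /nix/store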


I have observed this “problem” further and have probably found the solution. Here is an excerpt of the disk usage (graph omitted):

What has changed? A simple reboot of the server!

Eww, that sounds like an “imperfection” in the filesystem’s accounting of free space.
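If someone wants to pin that down, a generic first check (not specific to any filesystem) is to compare the kernel’s free-space accounting with an actual scan of the tree:

$ # Flush dirty data, then ask the filesystem itself:
$ sync
$ df -h /nix/store

$ # ...and compare with a real scan (hard links counted once):
$ sudo du -sh /nix/store

If df moves after a reboot while du does not, the discrepancy is in the filesystem’s accounting, not in the store.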

Would you mind sharing your insights, and the potential solution?