For Manual (i.e. documentation) changes, one has to edit the Manual text in a Markdown file and then also run the above script, which builds an XML artifact.
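Roughly, the loop I mean is something like this (file names made up, and the script path is from memory, so treat the details as assumptions):

```sh
# Rough sketch of the current docs workflow in a nixpkgs checkout.
cd nixos/doc/manual

# 1. Edit the Markdown source.
$EDITOR configuration/some-chapter.section.md   # hypothetical file name

# 2. Regenerate the committed DocBook XML from the Markdown.
./md-to-db.sh

# 3. Commit both the Markdown change and the regenerated XML.
git add .
git commit -m "nixos/doc: update some chapter"
```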
I opened an issue complaining about this; I figure that if CI can check whether I have built the XML correctly, it could just build and install it instead. I am told this is no bueno.
Why is this? Does it have something to do with the fact that a ref in the repo does not simply represent instructions to build a Thing, but is itself the Thing? (In other Linux environments I have used tarballs or whatever in S3, labelled with the git sha, such that “deploying” a git ref means unpacking the corresponding tarball.)
Can we use power tools in CI for Documentation (e.g. Markdown → XML)?
If we can’t, is it because of requirements to safely build the software in the repo?
Can you please Explain It To Me Like I’m 5, and pretend I asked “why” 5 times?
The current toolchain for converting Markdown files into DocBook has many dependencies: pandoc is written in Haskell, so GHC is needed to build it.
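For reference, the core conversion step is essentially a pandoc invocation along these lines (a simplified illustration, not the exact command md-to-db.sh runs, and the file names are made up):

```sh
# Simplified illustration of the Markdown -> DocBook step; the real script
# passes more options (filters, section handling, etc.).
pandoc --from markdown --to docbook5 some-chapter.section.md --output some-chapter.section.xml
```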
Most NixOS systems will want to have NixOS documentation available locally.
If the docs are not cached for some reason (for example, the system overrides some dependency of the toolchain), they would need to be built locally, and that would require downloading or building the toolchain and its relevant dependencies. That might be unreasonably big for many systems.
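One way to get a feel for how heavy that is (a rough sketch using the classic Nix CLI, with pandoc standing in for the whole toolchain):

```sh
# Instantiate the pandoc derivation and list everything needed to *build* it
# (GHC and friends). With an overridden dependency, much of this would have to
# be rebuilt locally instead of being fetched from the binary cache.
drv=$(nix-instantiate '<nixpkgs>' -A pandoc)
nix-store --query --requisites "$drv" | wc -l
```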
By committing the generated files into the repo, we move the burden of having the toolchain from users onto the people modifying the docs.
Now, it would be possible to offload running md-to-db.sh and committing the generated files to CI, but then we would need a separate build command for testing, which would be confusing. Plus, having CI push commits into branches would lead to a mess as well.
We will hopefully move to a toolchain with a lighter dependency closure soon, which will allow us to drop this measure.