A lot of rebuilds after a library update

Hi. I needed a newer version of a package, so I updated it along with some of the libraries it depends on, and opened a PR for those updates. After running nixpkgs-review, I realized the change causes roughly 250 rebuilds, including desktop-environment components and other large packages. Since I also need my laptop for other things, I capped nix-daemon at 8 GB via systemd. After running for 10+ hours, it had built about 12% of the packages (30 packages), with 5 build failures plus a lot of packages marked as failed because of broken dependencies.
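For reference, the memory cap I mentioned is a systemd override along these lines (a sketch; the 8G value is just what I chose, and the unit name assumes a systemd-managed nix-daemon):

```shell
# Create a drop-in override for the nix-daemon service (opens $EDITOR):
sudo systemctl edit nix-daemon.service
# Override contents:
#   [Service]
#   MemoryMax=8G

# Or apply the same cap non-persistently at runtime:
sudo systemctl set-property --runtime nix-daemon.service MemoryMax=8G
```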

I read about mass rebuilds and the staging branch, but that workflow seems to be aimed at changes causing 500+ rebuilds.
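For context, this is roughly how I ran the review (the `<PR-number>` is a placeholder; if I understand the docs correctly, `--build-args` can pass extra options to the underlying nix build to keep the laptop usable):

```shell
# Build everything the PR rebuilds, in a sandboxed review shell:
nixpkgs-review pr <PR-number>

# Same, but limit parallelism so the machine stays responsive:
nixpkgs-review pr --build-args "--max-jobs 2 --cores 4" <PR-number>
```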

So before I dive deeper into this rabbit hole, I'd like to know the correct approach to this kind of situation. Specifically:

  1. I assume I'll have to fix the broken packages by updating them, but that may trigger even more rebuilds, more failures, and so on. Where should I draw the line?
  2. What if, while fixing/updating packages, I hit a package for which no released version supports the updated library?
  3. Is it normal for 30 packages to take 10+ hours to build with 8 GB of RAM and at most 8 parallel jobs? If so, I'd probably need to set up a separate build server, and even then it could take up to a week.
  4. Or am I just doing everything wrong? :) This is the first time I've run into a mass rebuild.
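Regarding question 3, the knobs I've been tuning (as I understand them; the values below are just examples) are `max-jobs` and `cores` in nix.conf. Peak memory scales roughly with max-jobs × per-build memory, so lowering max-jobs seems to matter more than cores when RAM is the bottleneck:

```shell
# Inspect the current settings:
nix show-config | grep -E '^(max-jobs|cores)'

# Example /etc/nix/nix.conf values:
#   max-jobs = 8   # how many derivations build in parallel
#   cores = 4      # exported to each build as NIX_BUILD_CORES (e.g. make -j)
```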

PR in question: