Moderation actions need to be transparent on discourse

I’ve seen a staggering number of replies in the recent threads get hidden without any explanation. To me this is utterly unacceptable. There needs to be absolute, total transparency regarding moderation actions (a detailed log of actions and their corresponding explanations) to ensure accountability of those with power.

Since I can’t reply I’ll just edit this post: I’m not calling for mods of mods, I’m calling for accountability.

Edit2: I’ll add some more motivation
Recent events have made me lose confidence in the moderators of the NixOS project.


Post hiding is kind of a weird animal on Discourse, from a technical perspective.

Any user can flag any post for moderation review. We’ll periodically work through the review queue and either affirm flags (which has the effect of hiding the post and sending a message to the poster about why their message was flagged) or reject them (which is invisible).

Additionally, Discourse keeps a score of how ‘good’ users are at flagging posts (something like their ratio of affirmed/total flags; I don’t actually know the exact formula), and if the total score of flaggers is high enough, Discourse will pre-emptively hide posts on that basis alone, before we get to look at it. The posts are still in our queue, and we’ll still manually review eventually.
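To make the mechanism concrete, here is a toy sketch of roughly how I understand it to behave. The scoring function and threshold below are made up for illustration; this is not Discourse’s real formula, which I don’t know.

```python
# Toy sketch of community flag hiding. Not Discourse's real formula;
# the scoring function and the threshold are invented for illustration.

def flagger_score(affirmed: int, total: int) -> float:
    """A user's flagging accuracy: affirmed flags / total flags."""
    return affirmed / total if total else 0.0

def should_auto_hide(flaggers: list[tuple[int, int]], threshold: float = 1.5) -> bool:
    """Pre-emptively hide a post if the combined score of everyone who
    flagged it crosses a threshold, before any moderator has reviewed it."""
    return sum(flagger_score(a, t) for a, t in flaggers) >= threshold

# Two historically accurate flaggers are enough to hide a post...
print(should_auto_hide([(9, 10), (8, 10)]))  # True (0.9 + 0.8 >= 1.5)
# ...but one inaccurate flagger is not.
print(should_auto_hide([(1, 10)]))           # False
```

Either way, the flagged post stays in the moderation queue for an eventual manual decision.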

So there’s going to be a little bit of chaos in the system always, during the period between a ‘democratic’ hiding and mods deciding if the hiding is appropriate or not.

Most of the time, I think the flags we affirm are fairly obvious calls, and it would be disruptive to the conversation to publicly announce each one (not to mention time consuming—there have been 98 flags here in the last seven days). Other times, particularly when we mods are flagging and hiding things ourselves, we do make a public post about what we’re doing and why.

Open to constructive suggestions that are not total time sinks.


There needs to be absolute, total transparency (a detailed log of moderation actions and their corresponding explanations) regarding moderation actions

No, there doesn’t need to be absolute, total transparency. The mods are the mods. We don’t need mods of mods, or mods of mods of mods, etc. Posters should learn to read the room and submit high-quality posts.


I agree with @nat-418, the mods are the mods. Total transparency is something we should demand from algorithms, not human beings. With total transparency comes total scrutiny, and it’s, in my opinion, an unreasonable demand to make of anyone, voluntarily or not.

I can, however, see @Poscat’s position as well. In times of turmoil, our established procedures are tested and we need to adjust them. Parts of the community have lost trust in the way the moderators conduct their responsibilities, and although I am not part of that group, I believe that this is a problem that needs addressing. Moderation, and by extension the feeling of safety within the community, is a matter of trust, after all. So instead of scrutinizing the moderators over how precisely they execute their responsibilities, I’d propose we work on ways to restore trust in the system.

I could, for example, see benefits in having moderators voted into their positions.
Their term would have to be sufficiently long, e.g. a year, maybe two after the first successful one-year term, to prevent too much pressure from resting on each individual action. After each term, a mod applies for re-election, which anyone can challenge by running themselves. If the community feels a mod is doing too bad a job and their actions are not transparent enough, they’ll get voted out.
This would, in my opinion, add a sufficient amount of accountability to the position.

This is just off the top of my head; feel free to scrutinize it.


I’ve already mentioned this in the Matrix moderation room: the flag feature is clearly being abused by some to try to silence people they don’t agree with.
It’s not such a big deal, because it seems you can choose to unhide the comments, but it makes following a discussion slightly annoying.

Would it be possible to turn this off? (i.e. not hiding a flagged comment until it has been reviewed)


I think it would be possible (the Discourse admins are a different group than the mods, and that’d be for them to do). I do like having it, though. Flags do get abused by some people, but because of the scoring system, more often than not the things that the system actually hides are flags we’d affirm, and crowdsourcing the first tier of moderation is a little easier on us than always being available to respond to flags ASAP.

Would you PM us some cases where you think a community-hidden post was due to abuse?


Maybe some people should be “silenced”: without the muting of low-quality posts, the signal-to-noise ratio in here will get even worse. Not every post is good, not every opinion is meritorious, not every rant should be heard. You claim the flag is being abused, but where is the evidence for that?


The mods are under immense pressure these days. Adding even more pressure by dialing up public scrutiny in order to relitigate decisions seems like an excellent recipe for a bomb, resulting in the community having no moderation right when it is needed most.

From outside that team, it looks like it would be extremely helpful if those who have lost faith in the team would, in good faith, volunteer to help, or recommend that others do so.


I’d consider this post an example of abusive flagging.

Edit: It turns out that this might not have been a good example after all.


I’d like to take this opportunity to say that although I personally have some grievances, I am very grateful for the immense patience of the moderation team. This has been one hell of a week.

The “hidden posts” issue is not a one-sided thing (for example, I’ve had a couple of posts of my own hidden, but posts replying to and disagreeing with me have also been hidden), and with the knowledge that it is indeed automatic and not opaque moderator action, I hope we can drop a bit of the tension. The moderators have not been trying to silence people. I actually feel like at least here on Discourse the moderation has been unusually careful, which I appreciate.

Please remember that moderation is a thankless background task at best even when it is done right.

I don’t want to get into tone policing, but I can understand why one would flag that post. It is a bit flippant, whether you see this as justified or not. (I didn’t/wouldn’t flag the post myself, though.)


I saw that post and was on the fence about it. ‘Don’t be such a drama queen’ is not great for presenting this community as welcoming, even if it’s in the service of a good point. Another community member flagged it and another mod affirmed the flag. Given these three points of triangulation, I wouldn’t consider that one abuse.

‘Tone policing’ is a tricky issue. I don’t want to have a whole debate about it here. But my understanding is that one component of tone policing is the implicit assumption of social authority in a conversation—one participant asserts the right to establish the terms of the conversation, which is a new unfairness for the other participant on top of whatever grievance they were already expressing. Mods have explicit authority to establish the terms of public conversation here—that’s the job. That doesn’t mean, IMO, total moral freedom for mods to make arbitrary choices about who can complain and where, when, and how; but we do have the responsibility to make reasonable choices about those things, in contexts where people without that responsibility making those choices would be tone policing.

Remember that hiding a post sends a message to the author. If they want to edit their post and choose words that are less inflammatory, they are encouraged to do so and that unhides the post. Close calls like this one would be easy to correct.


Indeed, I was on the fence about this post too, and ended up approving the flag because, regardless of the rest of the post’s content, I consider “drama queen” to be sexist and unacceptable.

In better times, I’d have given the author feedback about it, asked for an edit, and then unhidden the post after the correction. But when everything is on fire and the review queue has half a dozen more flags again, finding the energy for that kind of feedback is tricky.


No, there doesn’t need to be absolute, total transparency. The mods are the mods. We don’t need mods of mods, or mods of mods of mods, etc. Posters should learn to read the room and submit high-quality posts.

This is a pretty good take IMHO. If we accept that it is our shared responsibility to

  • Try as much as possible to avoid and defuse conflict.
  • Prefer inquiring “why did you do this?” over inferring “well, one must come to the conclusion …”
  • Present or point to real evidence when discussing the action of some person, otherwise, inquire with them about why they did it as step one.
  • Don’t just demand things from the people who volunteer their time to work on this project. Inquire about why they are doing it. Create constructive suggestions or proposed solutions, and discuss those in good faith with the people who have to live with the solution (community members, project teams etc)
  • Be willing to compromise with people you disagree with. Don’t turn every RFC into another epic attempt to recreate a constitutional convention. Trying to future-proof RFCs has paralyzed the Nix community’s ability to keep up with creating and implementing rules and structure. There are many plausible alternative futures that could emerge, and many different ways to solve a problem. It is absolutely better to timebox an RFC, try to make it fundamentally reasonable, and have people who can support consolidating feedback and evolving what emerges from it. Think → Talk → Act instead of Think → Talk → Stall.

For €1,958.94 / month we should not be getting weird animals. For that kind of money we should have animal control on call and our own personal zookeeper. Maybe even dancing llamas.

That is a surprisingly large amount of money for rented forum software; maybe that price would make sense if “because that’s how their software works” were something we never heard, but we’ve been hearing it quite a lot.

Edit: I saw that number under the heading “Significant Expenses” and assumed that it was an expense. The paragraph above it explains that it is not. The financial summary probably ought to have a separate subheading for these things that are not expenses but hypothetically could be someday.


Not every post is good, not every opinion is meritorious, not every rant should be heard.

I think users can decide for themselves after having read the comment. IMHO, hiding posts should be reserved for spam, off-topic, or abusive content.

You claim the flag is being abused, but where is the evidence for that?

Just look at the top threads in the meta category: there are dozens of comments hidden by default in each. I would maybe have flagged a couple of them.

To @rhendric:

more often than not the things that the system actually hides are flags we’d affirm, and crowdsourcing first tier of moderation is a little easier on us

Fair enough. We’re probably just disagreeing on what comments should be hidden, then. I’ll send you a couple of examples.


Sorry, I edited my post to replace that bit.


I work for Flying Circus. The actual cost of Discourse is zero: it’s Free Software, and Flying Circus is hosting it for free.

It’s just the fictional price a company would have to pay for the service level (including 24/7 emergency support) we are sponsoring.


It would be interesting to analyze the “trust levels” of the people doing the flagging that leads to the post hiding, and compare them to the model that the Discourse project created and believes to be reflective of how a community can collectively decide about content.

We could just declare that every reader decides for themselves, which is optimal for the freedom of each individual. However, the byproduct of that has proven to be destructive to:

  • The people who volunteer their time to moderate
  • People who come to the community looking for interaction, information, and knowledge about Nix, nixpkgs, NixOS, and even how our org operates

It costs these people large amounts of time and energy when every comment stands, and then discussions devolve into tit-for-tat power struggles as we have seen recently.

I think it’s worth asking whether the model above, created by Discourse, works: whether people who have earned “trust” in that model, in this forum, are qualified to co-govern the discussion content of their peers here. Or is it not a good fit, and why?

Are the hidden comments an effective “voice of the community”? Or are they just contingents of people doing battle via the forum feature? And if the latter, why does or doesn’t the “trust level” work to help assure that, the majority of the time when posts are flagged, most people acting in good faith and with experience in the community would agree that there is a rational reason why the post should be hidden or moderated? Who is doing what to whom, and why, and how? The data is actually present in Discourse, and we should take a look IMHO (if we are not already doing so).

I do not blame/criticize the moderators at all for any of the above, and I support your decision making around interactions happening here. The above is meant as a suggestion for how we might analyze the hidden posts issue, and have some insight on it.

I think the model used by discourse works extremely well. At the end of the day, you have to have some people who ultimately make the call about what stays and what goes, or else the community will just have endless proxy wars. People will start to game the trust system if the trust system gives them too much power.

As of now, the trust system gives people the power to be more helpful to the community and do things like change post titles, refile posts into places where they’ll fit better semantically, and add tags so that things are more searchable. Generally their flags hide stuff faster, but I don’t think that power can extend past that. Trust in Discourse is a function of time and participation.

I’ve been moderating another community on Discourse for almost 10 years. It is by far the best forum software that exists, is generally well thought out, and makes things as easy as they can be, I think.


I read every hidden comment here. Fwiw, one of the things I noticed is that they do tend to resemble the comments that were leading to endless tit-for-tat battles, some of which even spilled over into github, matrix, etc.

Personally, I agree; it does seem to be beneficial for the reasons you state.