Moderation actions need to be transparent on Discourse

The mods are under immense pressure these days. Adding even more pressure by dialing up public scrutiny in order to relitigate decisions seems like an excellent recipe for a bomb, resulting in the community having no moderation right when it is needed the most.

From outside that team, it looks like it would be extremely helpful if those who have lost faith in it would, in good faith, volunteer to help or recommend that others do so.


I’d consider this post an example of abusive flagging.

Edit: It turns out that this might not have been a good example after all.


I’d like to take this opportunity to say that although I personally have some grievances, I am very grateful for the immense patience of the moderation team. This has been one hell of a week.

The “hidden posts” issue is not a one-sided thing (for example, I’ve had a couple posts of my own hidden, but posts replying and disagreeing with me have also been hidden) and with the knowledge that it is indeed automatic and not opaque moderator action, I hope we can drop a bit of the tension. The moderators have not been trying to silence people. I actually feel like at least here on Discourse the moderation has felt like it has been unusually careful, which I appreciate.

Please remember that moderation is a thankless background task at best even when it is done right.

I don’t want to get into tone policing, but I can understand why one would flag that post. It is a bit flippant, whether you see that as justified or not. (I didn’t/wouldn’t flag the post myself, though.)


I saw that post and was on the fence about it. ‘Don’t be such a drama queen’ is not great for presenting this community as welcoming, even if it’s in the service of a good point. Another community member flagged it and another mod affirmed the flag. Given these three points of triangulation, I wouldn’t consider that one abuse.

‘Tone policing’ is a tricky issue. I don’t want to have a whole debate about it here. But my understanding is that one component of tone policing is the implicit assumption of social authority in a conversation—one participant asserts the right to establish the terms of the conversation, which is a new unfairness for the other participant on top of whatever grievance they were already expressing. Mods have explicit authority to establish the terms of public conversation here—that’s the job. That doesn’t mean, IMO, total moral freedom for mods to make arbitrary choices about who can complain and where, when, and how; but we do have the responsibility to make reasonable choices about those things, in contexts where people without that responsibility making those choices would be tone policing.

Remember that hiding a post sends a message to the author. If they want to edit their post and choose words that are less inflammatory, they are encouraged to do so and that unhides the post. Close calls like this one would be easy to correct.


Indeed, I was on the fence about this post too, and then ended up approving the flag because, regardless of the rest of the post’s content, I consider “drama queen” to be sexist and unacceptable.

In better times, I’d have given the author feedback about it, asked for an edit, and then unhidden the post after the correction. But when everything is on fire and the review queue has half a dozen more flags again, finding the energy for giving that kind of feedback is tricky.


No, there doesn’t need to be absolute, total transparency. The mods are the mods. We don’t need mods of mods, or mods of mods of mods, etc. Posters should learn to read the room and submit high-quality posts.

This is a pretty good take IMHO. If we accept that it is our shared responsibility to

  • Try as much as possible to avoid and defuse conflict.
  • Inquire (“why did you do this action?”) rather than infer (“well, one must come to the conclusion …”).
  • Present or point to real evidence when discussing the action of some person, otherwise, inquire with them about why they did it as step one.
  • Don’t just demand things from the people who volunteer their time to work on this project. Inquire about why they are doing it. Create constructive suggestions or proposed solutions, and discuss those in good faith with the people who have to live with the solution (community members, project teams, etc.).
  • Be willing to compromise with people you disagree with. Don’t turn every RFC into another epic attempt to recreate a constitutional convention. Trying to future-proof RFCs has paralyzed the Nix community’s ability to keep up with creating and implementing rules and structure. There are many plausible alternative futures that could emerge, and many different ways to solve a problem. It is absolutely better to timebox an RFC, try to make it fundamentally reasonable, and have people who can support consolidating feedback and evolving what emerges from it. Think → Talk → Act instead of Think → Talk → Stall.

For €1,958.94 / month we should not be getting weird animals. For that kind of money we should have animal control on call and our own personal zookeeper. Maybe even dancing llamas.

That is a surprisingly large amount of money for rented forum software; maybe that price would make sense if “because that’s how their software works” was something we never heard, but we’ve been hearing it quite a lot.

Edit: I saw that number under the heading “Significant Expenses” and assumed that it was an expense. The paragraph above it explains that it is not. The financial summary probably ought to have a separate subheading for these things that are not expenses but hypothetically could be someday.


Not every post is good, not every opinion is meritorious, not every rant should be heard.

I think users can decide for themselves after having read the comment. IMHO hiding posts should be reserved for spam, off-topic or abusive content.

You claim the flag is being abused, but where is the evidence for that?

Just look at the top threads in the meta category, there are dozens of comments hidden by default in each. I would have maybe flagged a couple.

To @rhendric:

more often than not the things that the system actually hides are flags we’d affirm, and crowdsourcing first tier of moderation is a little easier on us

Fair enough. We’re probably just disagreeing on which comments should be hidden, then. I’ll send you a couple of examples.


Sorry, I edited my post to replace that bit.


I work for Flying Circus. The actual cost for Discourse is €0: it’s Free Software, and Flying Circus is hosting it for free.

It’s just the fictional price a company would have to pay for the service level (including 24/7 emergency support) that we are sponsoring.


It would be interesting to analyze the “trust levels” of the people doing the flagging that leads to posts being hidden, and compare that to the model that the Discourse project created and believes to be reflective of how a community can collectively decide about content.

We could just declare that every reader decides for themselves, which is optimal for the freedom of each individual. However, the byproduct of that has proven to be destructive to:

  • The people who volunteer their time to moderate
  • People who come to the community looking for interaction, information, and knowledge about Nix, nixpkgs, NixOS, and even how our org operates

It costs these people large amounts of time and energy when every comment stands, and then discussions devolve into tit-for-tat power struggles as we have seen recently.

I think it’s worth asking whether we think the model above, created by Discourse, works; whether people who have earned “trust” in that model, in this forum, are qualified to co-govern the discussion content of their peers here. Or is it not a good fit, and why?

Are the hidden comments an effective “voice of the community”? Or are they just contingents of people doing battle via the forum feature? And if the latter, why does or doesn’t the “trust level” work to help assure that the majority of the time when posts are flagged, most people acting in good faith, and with experience in the community, would agree that there is a rational reason why the post should be hidden or moderated? Who is doing what to whom, and why, and how? The data is actually present in Discourse, and we should take a look IMHO (if we are not already doing so).

I do not blame/criticize the moderators at all for any of the above, and I support your decision making around interactions happening here. The above is meant as a suggestion for how we might analyze the hidden posts issue, and have some insight on it.
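The kind of analysis proposed above could start very simply: export the flag events and tally them by the flagger’s trust level, then compare flag volume and affirm rate across levels. A minimal sketch in Python, assuming a hypothetical export of (trust level, was the flag affirmed) records; the sample data and field layout here are illustrative assumptions, not Discourse’s actual schema:

```python
from collections import Counter

# Hypothetical flag export: (flagger_trust_level, flag_was_affirmed).
# Illustrative records only, not real Discourse data.
flags = [
    (1, True), (2, True), (1, False), (3, True),
    (0, False), (2, True), (1, True), (4, True),
]

def flags_by_trust_level(records):
    """Return {trust_level: (flag_count, affirm_rate)} per trust level."""
    total = Counter()
    affirmed = Counter()
    for level, was_affirmed in records:
        total[level] += 1
        if was_affirmed:
            affirmed[level] += 1
    return {
        level: (total[level], affirmed[level] / total[level])
        for level in sorted(total)
    }

summary = flags_by_trust_level(flags)
for level, (count, rate) in summary.items():
    print(f"TL{level}: {count} flags, {rate:.0%} affirmed")
```

If low-trust accounts were generating most of the flags, or high-trust flags were rarely affirmed, that would say something about whether the trust model is doing its job here.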

I think the model used by Discourse works extremely well. At the end of the day, you have to have some people who ultimately make the call about what stays and what goes, or else the community will just have endless proxy wars. People will start to game the trust system if the trust system gives them too much power.

As of now, the trust system gives people the power to be more helpful to the community and do things like change post titles, refile posts into places where they’ll fit better semantically, and add tags so that things are more searchable. Generally their flags hide stuff faster, but I don’t think that power can extend past that. Trust in Discourse is a function of time and participation.

I’ve been moderating another community on Discourse for almost 10 years. It is by far the best forum software that exists, is generally well thought out, and makes things as easy as they can be, I think.


I read every hidden comment here. FWIW, one of the things I noticed is that they do tend to resemble the comments that were leading to endless tit-for-tat battles, some of which even spilled over into GitHub, Matrix, etc.

I agree personally it does seem to be beneficial for the reasons you state.


Honestly I do the same most days - even the ones which were hidden because I blocked someone. Then on other days I am very happy I can skip them and manage my energy levels a bit better.


I’m glad to hear it helps. Sometimes watching ‘likes’ pile on to hidden posts, I’ve wondered if there’s any value in bothering with making these calls at all.


Yes, there absolutely is! There is no possible way that Discourse or any other community platform would survive without moderation. I think what the average, non-moderator person cannot see is how nice things are with moderation versus not having it at all.


I think that even if everybody reads them, marking them as hidden still communicates that a post is “not okay” in some way. To me, it also indicates that replying to it is not necessary/desired.


Someone initiated an RFC on this, leading to a discussion on Reddit. While I personally disagree with the RFC, I appreciate the intent behind it.

I think users can decide for themselves after having read the comment.

That’s the point of the flag system. We users read posts, we see they are bad, and we flag them as bad.
You might not agree they are bad. That’s too bad. That doesn’t make the flagging system bad.


I support the flagging, and what @nat-418 says here.

Personally, I have been trying an alternative to peer flagging, where I instead respond and try to offer alternatives to what the person is doing that could be more constructive, and explain why the comment they left might end up being a problem.

However, not everyone has the time to do this. And within the last two months, the pace of contentious comments sometimes blew up to the point that no one could moderate them all.

See also

And the responses to my inquiry on it here Moderation actions need to be transparent on discourse - #18 by samrose