Hi everyone, this message is written on behalf of the community teams’ representatives.
In the months since NixCon 2023 there have been discussions on how to establish an authoritative collection of “official” projects in the Nix ecosystem. The main goal of such a distinction is to help contributors and users navigate the tooling landscape, and allow them to judge more easily the support status and maturity level of the various components. The hope is to make better use of our most valuable collective resources: attention and effort.
In the April 2024 community teams’ representatives meeting, the attendees discussed the latest proposal for how to approach that, and agreed that individuals should drive the process of curating that collection, as well as refining the selection criteria. We would like at least one person reasonably experienced with the ecosystem to take that responsibility, involve and communicate with all relevant stakeholders, and implement transitions or other changes where needed.
This is an important endeavor that will shape the face of the ecosystem, and the teams will provide support and guidance to anyone who picks up any of the various (largely non-technical) tasks. Activities could involve asking power users and beginners what they expect from the ecosystem, researching or experimenting with existing solutions, talking with project authors or maintainers about their goals and capacities, proposing changes to repository descriptions or documentation, finding places for the relevant pieces of information, etc.
If you’re interested in helping out, please answer here or write me a private message on Discourse or Matrix to get onboarded.
It may be too early to say, but I think it is more important to assign projects an authoritative grade with well-defined rules, and to sort and categorize the landscape, than to make things “official”. Or is the idea to move projects from community to official status, so that they get official support from the NixOS organization (whatever that means)?
There are a few concrete candidates that have been discussed in the documents linked from the past meeting notes, such as Hydra, NixOps, Home Manager, npins, … The idea is to try things out and refine the criteria in the process.
What is that view?
Nice! Synthesizing or maybe even simply picking some of that would be great. We just have to make sure the requirements we choose can actually be met in reality.
I don’t think this is a problem. The criteria are not primarily about popularity but fitness for purpose. Generally the Nix ecosystem does not seem to suffer a lack of popularity, but a lack of high-quality code and documentation.
Only awesome is awesome
Research whether the stuff you’re including is actually awesome. Only put things on the list that you or another contributor can personally recommend. It’s better to leave something out than to include too much.
At the risk of making things unnecessarily complicated and derailing this topic, I think an “authoritative collection” is going to be more than a simple document; we are going to need something more like a database that we can use to generate index documents or diagrams like the technology radar.
Gathering information on many implementations has always been a chore: various lists and comparisons exist for XMPP clients, servers, and libraries, but these are often out of date, inaccurate, incomplete, or generally unmaintained.
This specification aims to solve that problem by shifting the work of publishing this information and keeping it up to date onto the maintainers of the software. Given that many of them already maintain this kind of list, the inconvenience should be minimal.
The information listed SHOULD include, but isn’t limited to, the project name, homepage, description, logo, screenshots if relevant, specifications supported (RFCs and XEPs). A full list of supported properties is described in RDF format at http://usefulinc.com/ns/doap#.
A central point should be defined to gather the list of implementations publishing their information; this specification proposes xmpp.org for that purpose.
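For a rough idea of what such a maintainer-published file looks like, here is a minimal DOAP fragment. The property names come from the DOAP vocabulary linked above; the project itself is invented for illustration:

```xml
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:doap="http://usefulinc.com/ns/doap#">
  <doap:Project>
    <!-- "ExampleClient" is a made-up project, used only to show the shape of the data -->
    <doap:name>ExampleClient</doap:name>
    <doap:homepage rdf:resource="https://example.org/exampleclient"/>
    <doap:shortdesc xml:lang="en">A hypothetical XMPP client.</doap:shortdesc>
    <doap:description xml:lang="en">A longer description of the hypothetical client would go here.</doap:description>
    <doap:programming-language>Rust</doap:programming-language>
  </doap:Project>
</rdf:RDF>
```

Because the file lives in the project’s own repository, the maintainers update it as part of normal development, and the central index only needs to know where to fetch it from.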
I was hoping for something semi-curated: an index that people can submit their projects into, and in which a team can mark projects with tags like official, mature, or best-practices. If there were a “technology radar”, this would be a curated map of projects surrounded by an un-curated periphery.
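As a sketch of what such a semi-curated index could look like in code, here is a minimal Python model. The schema, tag names, and project entries are all assumptions for illustration, not an agreed design:

```python
from dataclasses import dataclass, field

@dataclass
class ProjectEntry:
    """Hypothetical schema for one entry in a machine-readable project index."""
    name: str
    homepage: str
    description: str
    # Tags like "official" or "mature" would be assigned by a curation team;
    # everything else is open to submission.
    tags: set[str] = field(default_factory=set)

# Invented example entries — anyone can submit, only the tags are curated.
index = [
    ProjectEntry("example-tool", "https://example.org", "A made-up project.", {"mature"}),
    ProjectEntry("another-tool", "https://example.org/other", "Another made-up project."),
]

def with_tag(entries: list[ProjectEntry], tag: str) -> list[ProjectEntry]:
    """Filter the index down to the curated subset carrying a given tag."""
    return [e for e in entries if tag in e.tags]

print([e.name for e in with_tag(index, "mature")])  # prints ['example-tool']
```

From a structure like this, both the fully curated “radar” view and the un-curated periphery could be generated as index documents.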
Probably best to just make the damn official list and not worry about ontologies, submission, and document generation.
I think it’s difficult to make a judgement about which projects are “best”, but one thing you can do is to assess which projects are “healthy” or “mature”. This is how affiliation with software foundations often goes: you have to demonstrate that your project has good documentation, a decent body of contributors, good contribution information, etc. in order to become affiliated. They don’t judge whether the project is any good at what it does… but projects don’t tend to have a healthy body of contributors unless a bunch of people like them!
So I wonder if that’s a better direction to go in. Look at Apache, or Hyperledger, or the OpenSSF Best Practices. Make a list of requirements, use it to certify some projects as “mature”.
As a user, this is super useful information. For example, NixOps often seems to be considered official, but in practice it is essentially unmaintained. The second fact seems much more important to me.
Separately, having an “awesome” list seems cool. A big list of possibly interesting projects for people to browse also seems useful.