When I talk to people who have tried to install GrimoireLab, I get one consistent answer: it is difficult, and our GrimoireLab tutorial is too complicated. I believe this status quo is hurting the adoption of the GrimoireLab software and CHAOSS metrics. This blog post is about how to make it easier for anyone to start using GrimoireLab.
A contributed article on The New Stack details how the CHAOSS D&I Working Group got started and evolved. Georg Link and Sarah Conway take a close look at the working group’s history and capture insights that are only known to those who were part of the evolution. The article outline is:
Back in October, my colleague John Hawley and I reflected on our visit to last year’s U.S. CHAOSScon, where we gave a talk on “The Pains and Tribulations of Finding Data.” At the end of that post, we mentioned learning more at the conference about GrimoireLab’s Perceval tool for tracking data from multiple open source projects on a single dashboard. That opportunity helped me develop the work that was the subject of a talk I gave at the recent CHAOSScon Europe 2019.
Currently, GrimoireLab can produce analytics with data extracted from more than 30 tools related to contributing to open source development, such as version control systems, issue trackers, and forums. Despite the large set of metrics available in GrimoireLab, none of them relies on information extracted from source code, which limits end users’ ability to benefit from a wider spectrum of software development data.
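To make the data-collection side of this concrete, here is a minimal sketch of retrieving commit metadata with Perceval, the GrimoireLab component that fetches data from those tools. The repository URL and local clone path are illustrative placeholders, and the exact item fields may vary between Perceval versions.

```python
# Minimal sketch: retrieving commit metadata with Perceval's Git backend.
# The repository URL and local clone path below are illustrative placeholders.
from perceval.backends.core.git import Git

repo = Git(uri='https://github.com/chaoss/grimoirelab-perceval',
           gitpath='/tmp/perceval.git')

# fetch() yields one dictionary per commit; commit metadata lives under 'data'.
for item in repo.fetch():
    commit = item['data']
    print(commit['commit'], commit['Author'])
```

Each item Perceval yields is plain JSON-like data, which is what the rest of the GrimoireLab toolchain enriches and turns into dashboards; note, however, that this only covers repository metadata, not the source code itself, which is the gap described above.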
The GMD Working Group is one of the CHAOSS working groups, tasked with defining useful metrics for analyzing software development projects from the point of view of GMD (growth-maturity-decline). It also works in the areas of risk and value. For all of them, we intend to follow the same process to produce metrics, similar to what other CHAOSS working groups are doing. This post describes that process, which we have recently completed for the first metric (many others should follow over the coming weeks).
Community managers take a variety of perspectives, depending on where their communities are in the lifecycle of growth, maturity, and decline. This is an evolving report of what we are learning from community managers, some of whom we are working with on live experiments with a CHAOSS project prototype software tool called Augur (http://www.github.com/CHAOSS/augur). At this point, we are paying particular attention to how community managers consume metrics and how the presentation of open source software health and sustainability metrics could make them more, and in some cases less, useful for doing their jobs.
Open communities lack a shared language for talking about metrics and sharing best practices. Metrics are aggregate information that summarise raw data into a single number, stripping away the data’s context. Pedagogical metric displays are an idea for metrics that include an explanation and educate the user on how to interpret the metric. Metrics are inherently biased and can lead to discrimination. Many of the problems brought up during the MozFest session are being worked on in the CHAOSS project.
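As a toy illustration of that point about aggregation (hypothetical numbers, not taken from the original post), two projects with very different activity patterns can collapse to the same single metric once the context is stripped away:

```python
# Toy illustration (hypothetical data): two very different activity patterns
# collapse to the same aggregate metric once the weekly context is stripped away.
steady_project = [10, 10, 10, 10]   # consistent activity every week
bursty_project = [40, 0, 0, 0]      # one burst of activity, then silence

total_steady = sum(steady_project)
total_bursty = sum(bursty_project)

# Both report "40 commits this month", hiding very different health signals.
print(total_steady == total_bursty)  # True
```

A pedagogical metric display would pair that single number with an explanation of what it does and does not capture, so the reader knows how to interpret it.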
Previously, we’ve explored the challenge of measuring progress in open source projects and looked forward to the recent CHAOSScon meeting, held right before the North American Open Source Summit (OSS). CHAOSS, for those who may not know, is the Community Health Analytics Open Source Software project. August’s CHAOSScon marked the first time that the project had held its own, independent pre-OSS event.
My colleague Matt Germonprez recently hit me and around 50 other people at CHAOSScon North America (2018) with this observation:
“A lot of times we get really great answers to the wrong questions.”
Matt explained this phenomenon as “type III error”, an allusion to the better-known statistical phenomena of type I and type II errors. If you are trying to solve a problem or improve a situation, sometimes great answers to the wrong questions can still be useful, because in all likelihood somebody is looking for the answer to that question! Or maybe it answers another curiosity you were not even thinking about. I think we should call this (Erdelez, 1997). There’s an old adage:
“Even a blind squirrel finds a nut every once in a while.”
The CHAOSS project aims to develop metrics and software for measuring open source projects. One group of people who care about this are community managers. Every year, Jono Bacon, a CHAOSS Governing Board member who professionalized community management with his book “The Art of Community”, invites community managers to his Community Leadership Summit. (In his book, Jono dedicated the entire chapter 7 to measuring communities.) Judging by the reactions on Twitter and engagement with other conference participants, metrics were a popular topic at the conference. It is no surprise that members of the CHAOSS project would naturally be at this conference. This blog post summarizes the presence of CHAOSS at the Community Leadership Summit and highlights some takeaways and insights.