Academic Open Source Project Impact
Question: What is the impact of open source projects created by an academician or a team of academicians, considered as part of a university reappointment, tenure, and promotion process?
Overview
This metric measures the impact of new open source projects that academicians create as part of their academic role, with the purpose of supporting reappointment, tenure, and promotion (RPT) cases. It focuses on contributions released as open source by a researcher or team to showcase scholarly impact and engagement. The metric helps gauge the reach, influence, and longevity of these open source projects by examining aspects such as community growth, citation counts, and downstream dependencies.

This metric highlights the importance of open source software as a scholarly output, potentially on par with traditional publications. It also promotes greater recognition of open source projects, which can contribute to sustainable, community-driven research software and to transparency in academia.
In terms of DEI, this metric encourages inclusive access to research through open source, making scholarly contributions accessible to a broader, global audience regardless of institutional affiliations or financial constraints.
Data Collection Strategies
Data points to consider:
- Publication in open access journals like the Journal of Open Source Software
- Number of downstream dependencies of the software
- Standardized citations via the CiteAs API (see the collection sketch after this list)
- Number of downloads or stars (e.g., GitHub)
- Number of contributors from outside the research team
- Frequency and recency of project updates
- Number of publications that cite the project or related software
- Lines of code contributed over time
- Number of preprints or journal articles referencing the software
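Several of the data points above (stars, contributors from outside the team, update recency, and standardized citations) can be pulled programmatically. Below is a minimal Python sketch that queries the GitHub REST API and the CiteAs API for a single repository. The repository name, team roster, and contact email are placeholders, and the CiteAs response field names ("citations", "citation") are assumptions that may need adjusting against the live API.

```python
# Minimal sketch: collect a few of the data points above for one repository.
# Uses the GitHub REST API (stars, contributors, recency) and the CiteAs API
# (standardized citation). All names below are illustrative placeholders.
from __future__ import annotations

import requests

GITHUB_API = "https://api.github.com"
CITEAS_API = "https://api.citeas.org/product"


def github_stats(owner: str, repo: str, team_logins: set[str], token: str | None = None) -> dict:
    """Return star count, external-contributor count, and last push date."""
    headers = {"Accept": "application/vnd.github+json"}
    if token:
        headers["Authorization"] = f"Bearer {token}"

    repo_info = requests.get(f"{GITHUB_API}/repos/{owner}/{repo}", headers=headers).json()

    # First page of contributors only; paginate for larger projects.
    contributors = requests.get(
        f"{GITHUB_API}/repos/{owner}/{repo}/contributors",
        headers=headers,
        params={"per_page": 100},
    ).json()
    external = [c for c in contributors if c.get("login") not in team_logins]

    return {
        "stars": repo_info.get("stargazers_count", 0),
        "external_contributors": len(external),
        "last_push": repo_info.get("pushed_at"),  # frequency/recency of updates
    }


def citeas_citation(project_url: str, email: str) -> str | None:
    """Ask CiteAs for a standardized citation string for the project."""
    resp = requests.get(f"{CITEAS_API}/{project_url}", params={"email": email}).json()
    # Field names assumed from the public CiteAs API; verify against its docs.
    citations = resp.get("citations", [])
    return citations[0].get("citation") if citations else None


if __name__ == "__main__":
    # Hypothetical project and research-team handles, for illustration only.
    team = {"pi-handle", "postdoc-handle"}
    print(github_stats("example-lab", "example-project", team_logins=team))
    print(citeas_citation("https://github.com/example-lab/example-project",
                          email="you@university.edu"))
```

Download counts and downstream-dependency counts typically come from ecosystem-specific services (package registries or dependency indexes) rather than from the repository endpoint shown here.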
Filters
Potential filters include:
- Project type (e.g., software, library, dataset)
- Academic discipline or field
- Platform (e.g., GitHub, arXiv, JOSS)
- Contribution types (e.g., internal vs. external contributors)
References
- GitHub Citation Guidelines for Software
- arXiv.org code
- ResearchStory
- ACM Artifact Review and Badging
- Altmetric
- Related Metric: Project Popularity
- CiteAs
- Zhao, R., & Wei, M. (2017). Impact evaluation of open source software: An Altmetrics perspective. Scientometrics, 110(2), 1017–1033. https://doi.org/10.1007/s11192-016-2204-y
- Moral-Muñoz, J. A., Herrera-Viedma, E., Santisteban-Espejo, A., & Cobo, M. J. (2020). Software tools for conducting bibliometric analysis in science: An up-to-date review. El Profesional de la Información, 29(1). https://doi.org/10.3145/epi.2020.ene.03
- Searles, A., Doran, C., Attia, J., Knight, D., Wiggers, J., Deeming, S., Mattes, J., Webb, B., Hannan, S., Ling, R., Edmunds, K., Reeves, P., & Nilsson, M. (2016). An approach to measuring and encouraging research translation and research impact. Health Research Policy and Systems, 14(1), 60. https://doi.org/10.1186/s12961-016-0131-2
Contributors
- Stephen Jacobs
- Vinod Ahuja
- Elizabeth Barron
- Matt Germonprez
- Kevin Lumbard
- Georg Link
- Peculiar C. Umeh
- Sean P Goggins
- Johan Linaker
- Yigakpoa L. Ikpae
Additional Information
To edit this metric, please submit a Change Request here.
To reference this metric in software or publications, please use this stable URL: https://chaoss.community/?p=3583
The usage and dissemination of health metrics may lead to privacy violations, and organizations may be exposed to risks. These risks may flow from compliance with the GDPR in the EU, with state law in the US, or with other laws; there may also be contractual risks flowing from the terms of service of data providers such as GitHub and GitLab. The usage of metrics must therefore be examined for risk and potential data ethics problems. Please see the CHAOSS Data Ethics document for additional guidance.