The Journal of Open Source Software
The Journal of Open Source Software (JOSS) is a developer-friendly journal for research software packages.
What exactly do you mean by 'journal'?
The Journal of Open Source Software (JOSS) is an academic journal (ISSN 2475-9066) with a formal
peer review process that is designed to improve the quality of the software submitted.
Upon acceptance into JOSS, a CrossRef DOI is minted and we list your paper on the JOSS website.
Don't we have enough journals already?
Perhaps, and in a perfect world we'd rather papers about software weren't necessary, but we
recognize that for most researchers papers, not software, are the currency of academic research,
and that citations are required for a good career.
We built this journal because we believe that after you've done the hard work of writing great
software, it shouldn't take weeks and months to write a paper about your work.
You said 'developer friendly'. What do you mean?
We have a simple submission workflow and extensive documentation to help you prepare your
submission. If your software is already well documented then paper preparation should take no
more than an hour.
You can read more about our motivations to build JOSS in our announcement blog post.
Code of Conduct
Although spaces may feel informal at times, we want to remind authors and reviewers (and anyone else) that this is a professional space. As such, the JOSS community adheres to a code of conduct adapted from the Contributor Covenant code of conduct.
Authors and reviewers will be required to confirm they have read our code of conduct, and are expected to adhere to it in all JOSS spaces and associated interactions.
Open Source Initiative
JOSS is a proud affiliate of the
Open Source Initiative.
therein. As such, we are committed to public support for open source software and the role the
OSI plays therein. You can read more about the OSI's affiliate program.
The Journal of Open Source Software is a NumFOCUS-sponsored project.
Firstly, thank you so much for agreeing to review for the Journal of Open Source Software (JOSS),
we're delighted to have your help. This document is designed to outline our editorial guidelines
and help you understand our requirements for accepting a submission into JOSS. Our review
process is based on a tried-and-tested approach.
Some guiding principles for you, the reviewer
We like to think of JOSS as a 'developer friendly' journal. That is, if the submitting authors
have followed best practices (have documentation, tests, continuous integration, and a license)
then their review should be extremely rapid.
For those authors who don't quite meet the bar, please try to give clear feedback on how they
could improve their submission. A key goal of JOSS is to raise the quality of research software
generally, and you (the experienced reviewer) are well placed to give this feedback.
We encourage reviewers to file issues against the
submitted repository's issue tracker. Include in your review
links to any new issues that you, the reviewer, believe to be impeding the acceptance of the
repository. (If the submitted repository is a GitHub repository, mentioning the review issue URL
in the submitted repository's issue tracker will create a mention in the review issue's history.)
The JOSS paper
The JOSS paper (the PDF associated with this submission) should only include:
- A list of the authors of the software
- Author affiliations
- A short summary describing the high-level functionality of the software
- A list of key references including a link to the software archive
Note the paper should not include software documentation such as API (Application
Programming Interface) functionality, as this should be outlined in the software documentation.
There should be an
OSI approved license
included in the repository. Common licenses such as those listed on
http://choosealicense.com are preferred.
Note there should be an actual license file present in the repository, not just a reference to the license.
Acceptable: A plain-text LICENSE file with the contents of an OSI approved license
Not acceptable: A phrase such as 'MIT license' in a README file
There should be sufficient documentation for you, the reviewer, to understand the core
functionality of the software under review. A high-level overview of this documentation should be
included in a README file (or equivalent). There should be:
A statement of need
The authors should clearly state what problems the software is designed to solve and who the
target audience is.
There should be a clearly-stated list of dependencies. Ideally these should be handled with an
automated package management solution.
Good: A package management file such as a requirements.txt or package.json
OK: A list of dependencies to install
Bad (not acceptable): Reliance on other software not listed by the authors
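As one illustrative sketch of the 'Good' case for a Python package (the package name, version pins, and dependency choices below are hypothetical, not taken from any real submission), dependencies can be declared in a setuptools `setup.py` so that an automated tool installs them for the reviewer:

```python
# setup.py -- minimal example of handing dependencies to an automated
# package management solution (Python's setuptools/pip).
# The package name and version pins below are purely illustrative.
from setuptools import setup

setup(
    name="example-research-package",  # hypothetical package name
    version="0.1.0",
    packages=[],
    install_requires=[
        "numpy>=1.15",  # minimum versions let pip resolve and
        "scipy>=1.1",   # install the dependency tree automatically
    ],
)
```

With a file like this in place, `pip install .` resolves and installs everything listed, which is the kind of automated dependency handling the guideline asks for. Other ecosystems have equivalents (a Gemfile, a package.json, and so on).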
The authors should include examples of how to use the software (ideally to solve real-world problems).
Reviewers should check that the software API is documented to a suitable level. This decision is
left largely to the discretion of the reviewer and their experience of evaluating the software.
Good: All functions/methods are documented including example inputs and outputs
OK: Core API functionality is documented
Bad (not acceptable): API is undocumented
Authors are strongly encouraged to include an automated test suite covering the core functionality
of their software.
Good: An automated test suite hooked up to an external continuous integration service such as Travis CI
OK: Documented manual steps that can be followed to check the expected functionality of the
software (e.g. a sample input file to assert behaviour)
Bad (not acceptable): No way for you, the reviewer, to check whether the software works
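To make the 'Good' case concrete, here is a minimal sketch of an automated test, assuming a hypothetical package whose core functionality is a `compute_mean` function (every name here is illustrative, not from any real submission):

```python
# test_core.py -- a minimal automated test of core functionality.
# Run with `pytest test_core.py` (or plain `python test_core.py`),
# and hook it into a CI service so it runs on every push.

def compute_mean(values):
    """Hypothetical core function: arithmetic mean of a non-empty sequence."""
    if not values:
        raise ValueError("values must be non-empty")
    return sum(values) / len(values)

def test_compute_mean_basic():
    # Core behaviour: the mean of 1, 2, 3 is 2.
    assert compute_mean([1, 2, 3]) == 2

def test_compute_mean_rejects_empty_input():
    # Edge case: empty input should raise, not silently return garbage.
    try:
        compute_mean([])
    except ValueError:
        pass  # expected
    else:
        raise AssertionError("expected ValueError for empty input")

if __name__ == "__main__":
    test_compute_mean_basic()
    test_compute_mean_rejects_empty_input()
    print("all tests passed")
```

A suite like this, run automatically by a CI service on every commit, gives you (and future users) a one-command check that the software still works.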
There should be clear guidelines for third-parties wishing to:
- Contribute to the software
- Report issues or problems with the software
- Seek support
Reviewers are expected to install the software they are reviewing and to verify the core
functionality of the software.
An important note about 'novel' software
Submissions that implement solutions already solved in other software packages are accepted into
JOSS provided that they meet the criteria listed above and cite prior similar work.
What happens if the software I'm reviewing doesn't meet the JOSS criteria?
We ask that reviewers grade submissions in one of three categories: 1) Accept, 2) Minor Revisions,
and 3) Major Revisions. Unlike some journals, we do not outright reject submissions requiring major
revisions; we're more than happy to give the author as long as they need to make these changes.
What about submissions that rely upon proprietary languages/development environments?
As outlined in our author guidelines, submissions that rely upon a proprietary/closed source
language or development environment are acceptable provided that they meet the other submission
requirements and that you, the reviewer, are able to install the software and verify the
functionality of the submission as required by our reviewer guidelines.
If an open source or free variant of the programming language exists, feel free to encourage the
submitting author to consider making their software compatible with the open source/free variant.
A lapsed academic with a passion for new models of scientific collaboration, he's used big telescopes
to study dust in space, built
sequencing pipelines in Cambridge and
engaged millions of people in online citizen science by co-founding the
Lorena A. Barba (@labarba): Computational Science and Engineering, High Performance Computing
Associate Professor of Mechanical and Aerospace Engineering at the George Washington University,
leading a research group in computational fluid dynamics, computational physics and
high-performance computing. Member of the Board for NumFOCUS, a non-profit in support of
open-source scientific software.
Associate Professor, Librarian, and Head of Archival Informatics and Special Collections at
Montana State University (MSU) Library, specializing in software development, metadata and data
modeling, linked and structured data, search engine optimization, and interface design. You can
find him on ORCID at
and as @jaclark on Twitter.
Bioinformatician and researcher at the
KEMRI-Wellcome Trust Research Programme, one
of the major Wellcome Trust overseas programmes. George works with the
Virus Epidemiology and Control group and develops
bioinformatics methods for understanding virus transmission patterns and evolution. He undertook
his education in Kenya and is one of East Africa's open source software developers, with a keen
interest in bioinformatics and reproducible research.
Assistant professor in Computer Science and Engineering and Medicine at UC San Diego with a
research background in population genetics and bioinformatics. Interested in best practices for
reproducible and open computational science and in how to take advantage of online media to
change the face of scientific publishing.
Kathryn Huff (@katyhuff): Nuclear Engineering, Energy Engineering
Kathryn Huff is an Assistant Professor in Nuclear, Plasma, and Radiological Engineering at
the University of Illinois at Urbana-Champaign. Her research
focuses on modeling and simulation of advanced nuclear reactors and fuel cycles. She also
advocates for best practices in open, reproducible scientific computing.
A survey and experimental methodologist currently working as Associate Professor in Political
Behaviour at the London School of Economics and
Political Science. His research focuses on the effects of information on public opinion, as
well as techniques and tools for analyzing quantitative survey and experimental data. He has
published more than thirty R packages on CRAN, and has authored and contributed to numerous
other open source projects.
Assistant Professor in Psychology at the University of Nottingham.
Studying human memory and decision making using cognitive psychology and neuroimaging
approaches, and developing novel computational methods along the way.
Lead Developer at the Mozilla Science Lab. Abby has
led development on various open source projects for science including Contributorship Badges for
Science and WormBase. With a background in bioinformatics and computer science, she builds tools
that use the web to move science forward.
Biomechanical and design engineer. Program manager for mechanical interfaces in the
Biomechatronics group at the MIT Media Lab. Developing computational methods for prosthetic
device design. GIBBON code developer.
Ariel Rokem (@arokem): Neuroscience, Machine Learning, Computational Social Science
Trained in cognitive neuroscience (PhD: UC Berkeley, 2010) and computational neuroimaging
(Postdoc, Stanford, 2011-2015), Ariel Rokem is now a data scientist at the University of
Washington eScience Institute, where he continues to develop software for the analysis of human
neuroimaging data, develops tools for reproducible and open research practices, and collaborates
with researchers from a variety of fields to advance data-intensive research.
Executive Director of Data Carpentry and Adjunct Professor in the BEACON Center for the Study of
Evolution in Action at Michigan State University. Her research background is in microbial
metagenomics and bioinformatics, and she has been a developer and contributor to several open
source bioinformatics projects. She also focuses on best practices in data analysis software development.
Research software engineer working at the UMCCR in
Melbourne, Australia. Likes to tap into many fields of science and computing, including
deployable and reproducible scientific software in both HPC and cloud computing environments for
scientific workflows and data analysis. In previous roles he launched
NeuroStars, a Q&A site for the growing neuroscience
community, and mentored students via the Google Summer of Code program. More
recently, he has taught himself embedded systems design and RF engineering, among other hobbies.
Cost and Sustainability Model
The Journal of Open Source Software is an open access journal committed to running at minimal costs, with zero publication fees (article processing charges) or subscription fees.
Under the NumFOCUS nonprofit umbrella, JOSS is now eligible to seek grants for sustaining its future. With an entirely volunteer team, JOSS is seeking to sustain its operations via donations and grants, keeping its low cost of operation and free service for authors.
In the spirit of transparency, below is an outline of our current running costs:
- Annual Crossref membership: $275 / year
- JOSS paper DOIs: $1 / accepted paper
- JOSS website hosting (Heroku): $19 / month
Assuming a publication rate of 200 papers per year, this works out at roughly $3.50 per paper: ((19 × 12) + 200 + 275) / 200 ≈ $3.52.
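The per-paper figure can be checked directly from the running costs listed above:

```python
# Per-paper cost estimate built from the running costs listed above.
hosting = 19 * 12      # Heroku hosting: $19/month
crossref = 275         # annual Crossref membership
papers = 200           # assumed publication rate per year
doi_fees = 1 * papers  # $1 DOI fee per accepted paper

cost_per_paper = (hosting + crossref + doi_fees) / papers
print(cost_per_paper)  # 3.515, i.e. roughly $3.50 per paper
```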
Copyright of JOSS papers is retained by submitting authors and accepted papers are subject to a Creative Commons Attribution 4.0 International License.
Any code snippets included in JOSS papers are subject to the MIT license regardless of the license of the submitted software package under review.