The Journal of Open Source Software

The Journal of Open Source Software (JOSS) is a developer friendly journal for research software packages.

What exactly do you mean by 'journal'?

The Journal of Open Source Software (JOSS) is an academic journal with a formal peer review process that is designed to improve the quality of the software submitted. Upon acceptance into JOSS, a CrossRef DOI is minted and we list your paper on the JOSS website.

Don't we have enough journals already?

Perhaps, and in a perfect world we'd rather papers about software weren't necessary. But we recognize that for most researchers, papers, not software, are the currency of academic research, and that citations are required for a good career.

We built this journal because we believe that after you've done the hard work of writing great software, it shouldn't take weeks and months to write a paper about your work.

You said 'developer friendly'. What do you mean?

We have a simple submission workflow and extensive documentation to help you prepare your submission. If your software is already well documented then paper preparation should take no more than an hour.

You can read more about our motivations to build JOSS in our announcement blog post.

Code of Conduct

Although JOSS spaces may feel informal at times, we want to remind authors and reviewers (and anyone else) that this is a professional space. As such, the JOSS community adheres to a code of conduct adapted from the Contributor Covenant code of conduct.

Authors and reviewers will be required to confirm they have read our code of conduct, and are expected to adhere to it in all JOSS spaces and associated interactions.

Open Source Initiative


The Journal of Open Source Software is a proud affiliate of the Open Source Initiative. As such, we are committed to public support for open source software and the role the OSI plays therein. You can read more about the OSI's affiliate program here.

Author Guidelines

If you've already licensed your code and have good documentation then we expect that it should take less than an hour to prepare and submit your paper to JOSS.

What does a typical submission flow look like?

Before you submit we need you to:

  • Make your software available in an open repository (GitHub, Bitbucket, etc.) and include an OSI approved open source license
  • Author a short Markdown paper (paper.md) with a title, summary, author names, affiliations, and key references. See an example here
  • Ideally, we would also like you to create a metadata file describing your software and include it in your repository (a hypothetical sketch follows below). This script automates the generation of this metadata.
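
The exact format generated by that script is not reproduced here. As a purely hypothetical illustration (the field names below are assumptions for this sketch, not a prescribed schema), such a metadata file might record something like:

  # Hypothetical sketch only: field names are illustrative,
  # not the format generated by the JOSS metadata script.
  name: fidgit
  version: 0.1.0
  description: An ungodly union of GitHub and figshare
  license: MIT
  authors:
    - name: Arfon M Smith
      orcid: 0000-0000-0000-1234
  repository: https://github.com/example/fidgit  # placeholder URL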

What should my paper contain?

JOSS papers should contain the following:

  • A list of the authors of the software
  • Author affiliations
  • A short summary describing the high-level functionality of the software
  • A list of key references including a link to the software archive

As this list shows, JOSS papers may only contain a limited set of metadata (see the header in the example below), a Summary section, and a References section. You can see an example paper here. Given this paper format, it is not permitted to write a "full length" paper, i.e. one that includes software documentation such as API (Application Programming Interface) functionality; this should instead be outlined in the software documentation.

  ---
  title: 'Fidgit: An ungodly union of GitHub and figshare'
  tags:
    - example
    - tags
    - for the paper
  authors:
   - name: Arfon M Smith
     orcid: 0000-0000-0000-1234
     affiliation: 1
   - name: Mickey Mouse
     orcid: 0000-0000-0000-1234
     affiliation: 2
  affiliations:
   - name: Space Telescope Science Institute
     index: 1
   - name: Disney Inc.
     index: 2
  date: 14 February 2016
  bibliography: paper.bib
  ---

  # Summary

  This is a proof of concept integration between a GitHub [@GitHub] repo and figshare
  [@figshare] in an effort to get a DOI for a GitHub repository. When a repository is
  tagged for release on GitHub, Fidgit [@Fidgit] will import the release into figshare
  thus giving the code bundle a DOI. In a somewhat meta fashion, Fidgit is publishing
  itself to figshare with DOI 'http://dx.doi.org/10.6084/m9.figshare.828487'
  [@figshare_archive].

  ![Fidgit deposited in figshare.](figshare_article.png)

  # References
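
The bibliography: paper.bib line in the header above points to a BibTeX file that defines the citation keys used in the Summary (for example [@GitHub] and [@figshare_archive]). As a rough sketch (the entry contents below are illustrative), a matching paper.bib could look like:

  @misc{GitHub,
    title = {GitHub},
    howpublished = {https://github.com}
  }

  @misc{figshare,
    title = {figshare},
    howpublished = {https://figshare.com}
  }

  @misc{Fidgit,
    title = {Fidgit: An ungodly union of GitHub and figshare},
    howpublished = {GitHub repository}
  }

  @misc{figshare_archive,
    title = {Fidgit deposited in figshare},
    howpublished = {http://dx.doi.org/10.6084/m9.figshare.828487}
  }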
  

Submission then is as simple as:

  • Filling in the short submission form
  • Waiting for reviewers to be assigned over in the JOSS reviews repository

What are your requirements for submission?

  • Your software should be open source as per the OSI definition
  • Your software should have a research application
  • You should be a major contributor to the software you are submitting
  • Your software should be a significant contribution to the available open source software that either enables some new research challenges to be addressed or makes addressing research challenges significantly better (e.g., faster, easier, simpler)
  • Your software should be feature complete (no half-baked solutions)

What about submissions that rely upon proprietary languages/development environments?

We strongly prefer software that doesn't rely upon proprietary (paid for) development environments/programming languages. However, provided your submission meets our submission requirements (including having a valid open source license) then we will consider your submission for review. Should your submission be accepted for review, we may ask you, the submitting author, to help us find reviewers who already have the required development environment installed.


What does a typical review process look like?

We encourage you to familiarize yourself with our reviewer guidelines as this will help you understand what our reviewers will be looking for. Broadly speaking though, provided you have followed our pre-submission steps and meet our submission requirements then you should expect a rapid review (typically less than two weeks).

After submission:

  • One or more JOSS reviewers are assigned and the review is carried out in the JOSS reviews repository
  • Authors respond to reviewer-raised issues (if any are raised) on the submitted repository's issue tracker. Reviewer contributions, like any other contributions, should be acknowledged in the repository.
  • Upon successful completion of the review, deposit a copy of your (updated) software repository with a data-archiving service such as Zenodo or figshare, issue a DOI for the software archive, and update the review issue thread with your DOI.
  • After assignment of a DOI, your paper metadata is deposited in CrossRef and listed on the JOSS website.
  • And that's it.

Reviewer Guidelines

Firstly, thank you so much for agreeing to review for the Journal of Open Source Software (JOSS); we're delighted to have your help. This document is designed to outline our editorial guidelines and help you understand our requirements for accepting a submission into JOSS. Our review process is based on the tried-and-tested approach of the ROpenSci collaboration.

Some guiding principles for you, the reviewer

We like to think of JOSS as a 'developer friendly' journal. That is, if the submitting authors have followed best practices (have documentation, tests, continuous integration, and a license) then their review should be extremely rapid.

For those authors that don't quite meet the bar, please try to give clear feedback on how they could improve their submission. A key goal of JOSS is to raise the quality of research software generally and you (the experienced reviewer) are well placed to give this feedback.

We encourage reviewers to file issues against the submitted repository's issue tracker. Include in your review links to any new issues that you the reviewer believe to be impeding the acceptance of the repository. (If the submitted repository is a GitHub repository, mentioning the review issue URL in the submitted repository's issue tracker will create a mention in the review issue's history.)

The JOSS paper

The JOSS paper (the PDF associated with this submission) should only include:

  • A list of the authors of the software
  • Author affiliations
  • A short summary describing the high-level functionality of the software
  • A list of key references including a link to the software archive

Note the paper should not include software documentation such as API (Application Programming Interface) functionality, as this should be outlined in the software documentation.

Software license

There should be an OSI approved license included in the repository. Common licenses such as those listed on http://choosealicense.com are preferred. Note there should be an actual license file present in the repository, not just a reference to the license.

Acceptable: A plain-text LICENSE file with the contents of an OSI approved license
Not acceptable: A phrase such as 'MIT license' in a README file

Documentation

There should be sufficient documentation for you, the reviewer, to understand the core functionality of the software under review. A high-level overview of this documentation should be included in a README file (or equivalent). There should be:

A statement of need

The authors should clearly state what problems the software is designed to solve and who the target audience is.

Installation instructions

There should be a clearly-stated list of dependencies. Ideally these should be handled with an automated package management solution.

Good: A package management file such as a Gemfile or package.json or equivalent
OK: A list of dependencies to install
Bad (not acceptable): Reliance on other software not listed by the authors
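
As a concrete (and hypothetical) illustration of the 'Good' option above, a minimal package.json for a Node.js package might declare its dependencies and test command like this (the name and versions are made up):

  {
    "name": "fidgit",
    "version": "0.1.0",
    "description": "An ungodly union of GitHub and figshare",
    "license": "MIT",
    "dependencies": {
      "request": "^2.0.0"
    },
    "scripts": {
      "test": "mocha"
    }
  }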

Example usage

The authors should include examples of how to use the software (ideally to solve real-world analysis problems).

API documentation

Reviewers should check that the software API is documented to a suitable level. This decision is left largely to the discretion of the reviewer and their experience of evaluating the software.

Good: All functions/methods are documented including example inputs and outputs
OK: Core API functionality is documented
Bad (not acceptable): API is undocumented
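
As a sketch of what the 'Good' level might look like, a documentation entry for a single (hypothetical) function could state its parameters, return value, and an example input/output:

  ### to_celsius(temperature_f)

  Convert a temperature from degrees Fahrenheit to degrees Celsius.

  Parameters:
    temperature_f: a number giving the temperature in degrees Fahrenheit.

  Returns: the temperature in degrees Celsius.

  Example:

      >>> to_celsius(212.0)
      100.0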

Tests

Authors are strongly encouraged to include an automated test suite covering the core functionality of their software.

Good: An automated test suite hooked up to an external service such as Travis-CI or similar
OK: Documented manual steps that can be followed to check the expected functionality of the software (e.g. a sample input file to assert behaviour)
Bad (not acceptable): No way for you the reviewer to check whether the software works
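
For example, the 'Good' case above could be met with a minimal Travis-CI configuration; the sketch below assumes a hypothetical Python package tested with pytest:

  # .travis.yml for a hypothetical Python package
  language: python
  python:
    - "3.6"
  install:
    - pip install -r requirements.txt
    - pip install pytest
  script:
    - pytest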

Community guidelines

There should be clear guidelines for third-parties wishing to:

  • Contribute to the software
  • Report issues or problems with the software
  • Seek support
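
These guidelines are often collected in a CONTRIBUTING file referenced from the README; a minimal, hypothetical outline might look like:

  # Contributing

  ## Reporting problems
  Open an issue on the issue tracker, including the version you are using and a
  minimal example that reproduces the problem.

  ## Contributing code
  Fork the repository, create a feature branch, add tests for any new behaviour,
  and open a pull request describing your changes.

  ## Getting support
  Questions are welcome on the issue tracker; please label them "question".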

Examples

Include here some examples of well-documented software for people to review.

Functionality

Reviewers are expected to install the software they are reviewing and to verify the core functionality of the software.

Other considerations

An important note about 'novel' software

Submissions that implement solutions already solved in other software packages are accepted into JOSS provided that they meet the criteria listed above and cite prior similar work.

What happens if the software I'm reviewing doesn't meet the JOSS criteria?

We ask that reviewers grade submissions in one of three categories: 1) Accept, 2) Minor Revisions, 3) Major Revisions. Unlike some journals, we do not outright reject submissions requiring major revisions; we're more than happy to give authors as long as they need to make these modifications/improvements.

What about submissions that rely upon proprietary languages/development environments?

As outlined in our author guidelines, submissions that rely upon a proprietary/closed source language or development environment are acceptable provided that they meet the other submission requirements and that you, the reviewer, are able to install the software & verify the functionality of the submission as required by our reviewer guidelines.

If an open source or free variant of the programming language exists, feel free to encourage the submitting author to consider making their software compatible with the open source/free variant.

Editorial Board


Arfon Smith (@arfon), Editor-in-Chief

A lapsed academic with a passion for new models of scientific collaboration, he's used big telescopes to study dust in space, built sequencing pipelines in Cambridge and engaged millions of people in online citizen science by co-founding the Zooniverse.

Topic Editors


Lorena A Barba (@labarba), Editor: Computational Science and Engineering & High-performance Computing

Associate Professor of Mechanical and Aerospace Engineering at the George Washington University, leading a research group in computational fluid dynamics, computational physics and high-performance computing. Member of the Board for NumFOCUS, a non-profit in support of open-source scientific software.


George Githinji (@biorelated), Editor: Bioinformatics

Bioinformatician and researcher at the KEMRI-Wellcome Trust Research Programme, one of the major Wellcome Trust Overseas Programmes. George works with the Virus Epidemiology and Control group and develops bioinformatics methods for understanding virus transmission patterns and evolution. He undertook his education in Kenya and is one of East Africa's open source software developers, with a keen interest in bioinformatics and reproducible research.


Melissa Gymrek (@mgymrek), Editor: Bioinformatics

Assistant professor in Computer Science and Engineering and Medicine at UC San Diego with a research background in population genetics and bioinformatics. Interested in best practices for reproducible and open computational science and in how to take advantage of online media to change the face of scientific publishing.


Kathryn Huff (@katyhuff), Editor: Nuclear Engineering, Energy Engineering

Kathryn Huff is an Assistant Professor in Nuclear, Plasma, and Radiological Engineering at the University of Illinois at Urbana-Champaign. Her research focuses on modeling and simulation of advanced nuclear reactors and fuel cycles. She also advocates for best practices in open, reproducible scientific computing.


Daniel S. Katz (@danielskatz), Editor: Computer Science

Works on computer, computational, and data research at NCSA, GSLIS, and ECE at the University of Illinois at Urbana-Champaign, and has a strong interest in studying common elements of how research is done by people using software and data.


Christopher R. Madan (@cMadan), Editor: Psychology and Neuroimaging

Cognitive psychologist and neuroscientist. Postdoctoral research fellow in the Department of Psychology at Boston College. Studying human memory and decision making using cognitive psychology and neuroimaging approaches, and developing novel computational methods along the way.


Abigail Cabunoc Mayes (@acabunoc), Editor: Open Science

Lead Developer at the Mozilla Science Lab. Abby has led development on various open source projects for science including Contributorship Badges for Science and WormBase. With a background in bioinformatics and computer science, she builds tools that use the web to move science forward.


Kevin M. Moerman (@Kevin-Mattheus-Moerman), Editor: Image-based bioengineering and soft tissue biomechanics

Biomechanical and design engineer. Program manager for mechanical interfaces at the MIT Media Lab department of Biomechatronics. Developing computational methods for prosthetic device design. GIBBON code developer.


Kyle Niemeyer (@kyleniemeyer), Editor: Computational Combustion and Fluid Dynamics

Mechanical engineer in the School of Mechanical, Industrial, and Manufacturing Engineering at Oregon State University. Computational researcher in combustion, fluid dynamics, and chemical kinetics, with an interest in numerical methods and GPU computing strategies.


Pjotr Prins (@pjotrp), Editor: Bioinformatics, Reproducible Research, Software Deployment and High Performance Computing

Bioinformatician at large, director of Genenetwork.org, and a visiting research fellow of The University of Tennessee Health Science Center and of the Department of Human Genetics (personal genomics and bioinformatics) at the University Medical Centre Utrecht. Writing software is Pjotr's core business in academia. He loves programming languages and is involved in a wide range of free and open source software projects. He guides students in writing software and, every year, he is a mentor and organisation administrator for the Google Summer of Code.


Karthik Ram (@karthik), Editor: Biodiversity Informatics and Data Science

A quantitative ecologist and data scientist at UC Berkeley's Institute for Data Science, his research focuses on food web dynamics, open science, open data, and reproducible research.


Ariel Rokem (@arokem), Editor: Neuroscience, machine learning, computational social science

Trained in cognitive neuroscience (PhD: UC Berkeley, 2010) and computational neuroimaging (Postdoc, Stanford, 2011-2015), Ariel Rokem is now a data scientist at the University of Washington eScience Institute, where he continues to develop software for the analysis of human neuroimaging data, develops tools for reproducible and open research practices, and collaborates with researchers from a variety of fields to advance data-intensive research.


Tracy Teal (@tracykteal), Editor: Bioinformatics

Executive Director of Data Carpentry and Adjunct Professor in the BEACON Center for the Study of Evolution in Action at Michigan State University. Her research background is in microbial metagenomics and bioinformatics, and she has been a developer of and contributor to several open source bioinformatics projects. She also focuses on best practices in data analysis software development.


Roman Valls Guimera (@brainstorm), Editor: Bioinformatics, Computer Science

Research software engineer working at the UMCCR in Melbourne, Australia. Likes to tap into many fields of science and computing, including deployable and reproducible scientific software in both HPC and cloud computing environments for scientific workflows and data analysis. In previous gigs he launched NeuroStars, a Q&A site for the growing neuroscience community, and mentored students via the Google Summer of Code program. More recently, he has taught himself embedded systems design and RF engineering, among other hobbies.


Jake Vanderplas (@jakevdp), Editor: Astronomy and Machine Learning

Astronomer exploring the role of data science in academia at University of Washington's eScience Institute. Core contributor to scikit-learn and other science-focused Python packages.

Business Model

The Journal of Open Source Software is an open access journal committed to running at minimal cost, with zero publication fees (article processing charges) or subscription fees. With volunteer effort from our editorial board and community reviewers, donations, and minimal infrastructure costs, we believe JOSS can remain a free community service.

In the spirit of transparency, below is an outline of our current running costs:

  • Annual Crossref membership: $275 / year
  • JOSS paper DOIs: $1 / accepted paper
  • JOSS website hosting (Heroku): $19 / month

Assuming a publication rate of 200 papers per year, this works out at roughly $3.50 per paper: ((19 × 12) + 200 + 275) / 200 = 703 / 200 ≈ $3.52.

Content Licensing

Copyright of JOSS papers is retained by submitting authors, and accepted papers are subject to a Creative Commons Attribution 4.0 International License.

Any code snippets included in JOSS papers are subject to the MIT license regardless of the license of the submitted software package under review.