DIME Analytics offers computational reproducibility checks and pre-publication code reviews for any working paper or publication. The reproducibility check verifies that the tables and figures that appear in your paper can be exactly reproduced using the same data and code files. The code review provides detailed feedback, identifies potential errors and inefficiencies, and suggests improvements to make before publication.
How it works
- Researchers provide DIME Analytics with a reproducibility package. The package should include all items listed in the Reproducibility Checklist.
- DIME Analytics confirms that the reproducibility package runs after changing only the top-level directory specified in the master script. If the code runs, you will receive confirmation of the timeline for the reproducibility report (typically 2 weeks). If the code does not run, you will receive a request for resubmission.
- DIME Analytics runs the package three times and assesses whether all outputs (graphs and tables) are stable and exactly match those included in the final paper.
- For teams that request a code review in addition to the reproducibility check, DIME Analytics carefully reviews the code to identify potential errors and inefficiencies and suggests improvements to the code to increase efficiency and readability.
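The single-entry-point requirement above (a master script in which only the top-level directory ever needs to change) can be sketched as follows. This is a minimal illustration with hypothetical file and directory names, not DIME's actual template; DIME projects often use Stata, but the pattern is language-agnostic:

```python
from pathlib import Path

# Hypothetical package layout; the ONLY line a reviewer should need to edit
# is the top-level project directory below.
PROJECT_ROOT = Path("example_package")

# Every other path is derived from the root -- never hard-coded elsewhere.
DATA_DIR = PROJECT_ROOT / "data"
CODE_DIR = PROJECT_ROOT / "code"
OUTPUT_DIR = PROJECT_ROOT / "outputs"

def run_pipeline() -> list[Path]:
    """Run each analysis script in order so one call rebuilds every output."""
    scripts = ["01_clean.py", "02_analysis.py", "03_tables_figures.py"]
    planned = [CODE_DIR / s for s in scripts]
    for script in planned:
        # A real master script would execute each step here, e.g.
        # subprocess.run([sys.executable, str(script)], check=True)
        print(f"would run: {script}")
    return planned

if __name__ == "__main__":
    run_pipeline()
```

Because all paths flow from `PROJECT_ROOT`, a reviewer can reproduce the package on any machine by editing that one line, which is exactly what the first-run check verifies.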
What you get
- A completed checklist indicating the reproducibility standards your paper meets, for inclusion in the public reproducibility package
- A detailed report on the reproducibility of each output, with suggestions for improvement
- Technical assistance creating a public release package for your paper on GitHub, if desired
What it costs
- The typical cost is 2 days of GE staff time for the reproducibility check and 2–4 days of GE staff time for the code review (depending on the length of the code). We will confirm the exact fee once the first run of the code is complete; a charge code must be provided at that time.
- Add to the acknowledgements in the paper: "Computational reproducibility verified by DIME Analytics."