Over the past two years, the Summer of Reproducibility (SoR) program has funded more than 40 diverse student projects addressing key challenges in computational reproducibility. These projects not only advance the state of reproducibility in computer science research but also provide excellent learning opportunities for students working alongside experienced mentors.
For those interested in proposing future projects or learning about successful initiatives, we’ve categorized recent SoR projects by their primary impact areas:
1. Artifact Evaluation Support
Projects in this category enhance methodologies for evaluating and verifying research artifacts, addressing the significant challenge of standardizing the artifact review processes that take place at CS conferences such as SC and FAST [1].
2. Reproducibility in Education
These projects integrate reproducibility principles into computer science curricula, responding to the recognized gap in reproducibility education identified by Fund [3].
3. Reproducibility Methodology
Projects focusing on methodology examine and formalize approaches to improve reproducibility, particularly addressing the challenge of experiment packaging identified in reproducibility studies [5].
4. Reproducibility Tools
This category encompasses software tools that automate or simplify reproducibility tasks, addressing the need for specialized tooling identified by both ACM and the National Academies [7].
5. Artifact Packaging
Projects in this category advance techniques for comprehensive packaging of research artifacts, addressing challenges in environment contextualization and experimental setup [9].
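To make "environment contextualization" concrete: at its simplest, packaging an artifact means recording the context an experiment ran in alongside the artifact itself. The sketch below is purely illustrative and not drawn from any SoR project; real packaging tools (such as the RO-Crate integrations mentioned in the references) capture far richer metadata, but the idea is the same.

```python
# Minimal, hypothetical sketch of environment capture for artifact
# packaging -- illustrative only, not any specific SoR project's tooling.
import json
import platform
import sys


def capture_environment() -> dict:
    """Record basic facts about the interpreter and machine so an
    experiment's context can be archived alongside its artifact."""
    return {
        "python_version": platform.python_version(),
        "implementation": platform.python_implementation(),
        "os": platform.system(),
        "machine": platform.machine(),
        "argv": sys.argv,
    }


if __name__ == "__main__":
    # Write the snapshot to stdout (or a file next to the artifact)
    # so a later reviewer can see the original run's context.
    print(json.dumps(capture_environment(), indent=2))
```

A real artifact-packaging tool would extend this snapshot with pinned dependency versions, hardware details, and dataset checksums, which is exactly the kind of automation these projects pursue.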
These categories and exemplars from the SoR program demonstrate how targeted student projects can systematically address key challenges in computational reproducibility. By producing concrete artifacts, methodologies, and educational resources, these initiatives contribute to the broader scientific goal of improving reproducibility in computer science research while training the next generation of researchers in reproducible practices.
We hope these examples inspire community members seeking new projects to continue the tremendous progress toward practical reproducibility in computer science.
References
1. https://dl.acm.org/doi/abs/10.1145/3456287.3465479
2. K. Kraßnitzer, "AutoAppendix: Towards One-Click Reproducibility of High-Performance Computing Experiments," 2024.
3. F. Fund, "We Need More Reproducibility Content Across the Computer Science Curriculum," Proceedings of the 2023 ACM Conference on Reproducibility and Replicability, 2023.
4. M. Saeed, "Using Reproducibility in Machine Learning Education," 2023.
5. arxiv.org/html/2503.07080v2, arxiv.org/html/2312.11028v1
6. X. Ren, "Measuring Open-source Database Systems under TPC-C Benchmark with Unreported Settings," 2023.
7. https://www.nationalacademies.org/our-work/reproducibility-and-replicability-in-science
8. A. Dabral, "Automatic Reproducibility of COMPSs Experiments through the Integration of RO-Crate in Chameleon," 2024.
9. https://pmc.ncbi.nlm.nih.gov/articles/PMC8067906/
10. J. Shin and M. Irawan, "FlashNet: Towards Reproducible Continual Learning for Storage System," 2023.