Bibliography – Reproducibility in Data Science and Digital Humanities

Selected DH research and resources bearing on, or utilized by, the WE1S project.


Heaven, Will Douglas. “AI Is Wrestling with a Replication Crisis.” MIT Technology Review, 2020. https://www.technologyreview.com/2020/11/12/1011944/artificial-intelligence-replication-crisis-science-big-tech-google-deepmind-facebook-openai/.
Koenzen, Andreas, Neil Ernst, and Margaret-Anne Storey. “Code Duplication and Reuse in Jupyter Notebooks.” arXiv:2005.13709 [cs], 2020. http://arxiv.org/abs/2005.13709.
Colavizza, Giovanni. “Are We Breaking the Social Contract?” Journal of Cultural Analytics, 2020. https://doi.org/10.22148/001c.11828.
National Academies of Sciences, Engineering, and Medicine. Reproducibility and Replicability in Science. Washington, DC: National Academies Press, 2019. https://nap.nationalacademies.org/catalog/25303/reproducibility-and-replicability-in-science.
Pimentel, João Felipe, Leonardo Murta, Vanessa Braganholo, and Juliana Freire. “A Large-Scale Study about Quality and Reproducibility of Jupyter Notebooks.” In Proceedings of the 16th International Conference on Mining Software Repositories, 507–17. MSR ’19. Montreal, Quebec, Canada: IEEE Press, 2019. https://doi.org/10.1109/MSR.2019.00077.
Rule, Adam, Amanda Birmingham, Cristal Zuniga, Ilkay Altintas, Shih-Cheng Huang, Rob Knight, Niema Moshiri, et al. “Ten Simple Rules for Writing and Sharing Computational Analyses in Jupyter Notebooks.” PLOS Computational Biology 15, no. 7 (2019): e1007007. https://doi.org/10.1371/journal.pcbi.1007007.
Mendez, Kevin M., Leighton Pritchard, Stacey N. Reinke, and David I. Broadhurst. “Toward Collaborative Open Data Science in Metabolomics Using Jupyter Notebooks and Cloud Computing.” Metabolomics 15, no. 10 (2019): 125. https://doi.org/10.1007/s11306-019-1588-0.
Liu, Alan, Scott Kleinman, Jeremy Douglass, Lindsay Thomas, Ashley Champagne, and Jamal Russell. “Open, Shareable, Reproducible Workflows for the Digital Humanities: The Case of the 4Humanities.org ‘WhatEvery1Says’ Project.” In Digital Humanities 2017 Conference Abstracts. Montreal: Alliance of Digital Humanities Organizations (ADHO), 2017.
Kluyver, Thomas, Benjamin Ragan-Kelley, Fernando Pérez, Brian Granger, Matthias Bussonnier, Jonathan Frederic, Kyle Kelley, et al. “Jupyter Notebooks – A Publishing Format for Reproducible Computational Workflows.” In Positioning and Power in Academic Publishing: Players, Agents and Agendas, 87–90, 2016. https://doi.org/10.3233/978-1-61499-649-1-87.
Peng, Roger D. “Reproducible Research in Computational Science.” Science 334, no. 6060 (2011): 1226–27. https://doi.org/10.1126/science.1213847.