
Homework 1 - Reproducibility checkup on an existing paper

As this 2016 Nature survey points out, more than 70% of researchers have tried and failed to reproduce another scientist’s experiments, and more than half have failed to reproduce their own experiments. Do you believe in every paper you read?

For this assignment we will ask you to identify a paper you find interesting and evaluate how reproducible it is, from a computational perspective. We will not focus on questions regarding experimental replication, lab protocols, etc.

This could include (but is not limited to) a paper from your area of research or interest, which potentially introduces new methodology, runs interesting experiments, or simply draws conclusions from previous work. Don’t worry if you don’t have an active research area yet: below we provide a starting list of potential papers; pick one that looks interesting to you.

As a proof of concept, try to pick a paper that in principle looks reproducible. Evaluate the paper in the following categories. For each category, we give you a series of questions you can check, but we also encourage you to consider other aspects if you feel inclined.

For each of these four categories, you must write a response of at least 200 words. For further guidance on reproducibility checkpoints, you can also consult Joelle Pineau’s Reproducibility Checklist, used in the NeurIPS Reproducibility Challenge.

You will also have to write a response to the authors as if you were a referee, acting as a reproducibility editor, who has just received this paper for peer review. That is, you do not need to discuss the paper’s scientific merits or domain-specific ideas; focus only on questions of reproducibility.

We encourage you to find papers in your area of interest and/or research, but you can also start by looking at the following list of papers:

Deliverables

  1. Write out your answers by filling out the provided markdown file, according to the description above.

  2. Convert the markdown file to PDF using the pandoc program (which is installed on the hub). To do this, run `pandoc hw01-response.md -o hw01-response.pdf` in your terminal, and the PDF file should appear. Also commit this PDF to your repository (see the example commands after this list).

  3. Fill out the Google Form (you can copy and paste from your full written response in Markdown). For each of the evaluation categories above, grade how reproducible the paper is, from poor (1) to good (5).
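
If the conversion and commit steps in item 2 are unfamiliar, here is a minimal sketch of the terminal workflow, assuming your response file is named `hw01-response.md` and your repository is already cloned; the commit message is illustrative, not required.

```bash
# Convert the Markdown response to PDF with pandoc (installed on the hub)
pandoc hw01-response.md -o hw01-response.pdf

# Stage and commit both the Markdown source and the generated PDF,
# then push them to your repository (commit message is just an example)
git add hw01-response.md hw01-response.pdf
git commit -m "Add homework 1 reproducibility response and PDF"
git push
```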