Berkeley Initiative for Transparency in the Social Sciences


Tag Archives: Reproducibility

TIER Faculty Fellowships 2015-16

Richard Ball, Associate Professor of Economics, and Norm Medeiros, Associate Librarian, of Haverford College, are co-principal investigators of Project TIER. They are seeking the first class of TIER Fellows to promote and extend teaching of transparent and reproducible empirical research methods.


Project TIER (Teaching Integrity in Empirical Research) is an initiative that promotes training in open and transparent methods of quantitative research in the undergraduate and graduate curricula across all the social sciences.

The Project anticipates awarding three or four TIER Faculty Fellowships for the 2015-16 academic year. Fellows will collaborate with TIER leadership and work independently to develop and disseminate transparent research methods that are suitable for adoption by students writing theses, dissertations or other supervised papers, or that can be incorporated into classes in which students conduct quantitative research.

The period of the Fellowships will be from June 1, 2015 through June 30, 2016. Each Fellow will receive a stipend of $5,000.

Applications are due April 19, 2015. Early inquiries and expressions of interest are encouraged. To learn more and apply, visit http://www.haverford.edu/TIER/opportunities/fellowships_2015-16.php.

Join an Open Call on Reproducibility Tomorrow at 11 am (ET)

BITSS Project Scientist Garret Christensen will be participating in a discussion with the Mozilla Science Lab on reproducibility in research tomorrow at 11 am ET. The call is open to the public. For those interested in joining, more information can be found here.

This Monday at AEA2015: Transparency and Integrity in Economic Research Panel

Monday, January 5th at 10:15 am, at the American Economic Association Annual Meeting in Boston, MA (Sheraton Hotel, Commonwealth Room).


Session: Promoting New Norms for Transparency and Integrity in Economic Research

Presiding: Edward Miguel (UC Berkeley)

Panelists:

  • Brian Nosek (University of Virginia): “Scientific Utopia: Improving Openness and Reproducibility in Scientific Research”
  • Richard Ball (Haverford College): “Replicability of Empirical Research: Classroom Instruction and Professional Practice”
  • Eva Vivalt (New York University): “Bias and Research Method: Evidence from 600 Studies”

Discussants:

  • Aprajit Mahajan (UC Berkeley)
  • Justin Wolfers (University of Michigan)
  • Kate Casey (Stanford University)

More info here. Plus don’t miss the BITSS/COS Exhibition Booth at the John B. Hynes Convention Center (Level 2, Exhibition Hall D).

Scientists Have a Sharing Problem

On December 15th, Maggie Puniewska published an article in The Atlantic summarizing the obstacles that keep researchers from sharing their data.


The article asks: if “science has traditionally been a field that prizes collaboration […] then why [are] so many scientists stingy with their information”?

Puniewska outlines the most commonly cited reasons scientists refrain from sharing their data.

The culture of innovation breeds fierce competition, and those on the brink of making a groundbreaking discovery want to be the first to publish their results and receive credit for their ideas.

[I]f sharing data paves the way for an expert to build upon or dispute other scientists’ results in a revolutionary way, it’s easy to see why some might choose to withhold.

(more…)

Scientific Irreproducibility and the Prospects of Meta-Research

A recent article from The Economist features John Ioannidis’ Meta-Research Innovation Center (METRICS), whose work to advance the credibility of research will be presented next week at the BITSS Annual Meeting.


“Why most published research findings are false” is not, as the title of an academic paper, likely to win friends in the ivory tower. But it has certainly influenced people (including journalists at The Economist). The paper it introduced was published in 2005 by John Ioannidis, an epidemiologist who was then at the University of Ioannina, in Greece, and is now at Stanford. It exposed the ways, most notably the overinterpreting of statistical significance in studies with small sample sizes, that scientific findings can end up being irreproducible—or, as a layman might put it, wrong […] Dr Ioannidis has been waging war on sloppy science ever since, helping to develop a discipline called meta-research (ie, research about research).

METRICS’ mission is to “identify and minimize persistent threats to medical-research quality.” These include irreproducibility of research findings (the inability of external researchers to reproduce someone else’s work, most often because research data are not shared or data manipulations are not correctly detailed), funding inefficiencies (supporting flawed research), and publication bias (not all studies that are conducted get published, and the ones that do tend to show significant results, leaving a skewed impression of the evidence).

(more…)

Scientific consensus has gotten a bad reputation—and it doesn’t deserve it

In a recent post, John Timmer, senior science editor at Ars Technica, defends the importance of consensus.


Timmer opens with the following quote from author Michael Crichton:

Let’s be clear: the work of science has nothing whatever to do with consensus. Consensus is the business of politics. Science, on the contrary, requires only one investigator who happens to be right, which means that he or she has results that are verifiable by reference to the real world. In science consensus is irrelevant. What is relevant is reproducible results.

Timmer counters by pointing out:

Reproducible results are absolutely relevant. What Crichton is missing is how we decide that those results are significant and how one investigator goes about convincing everyone that he or she happens to be right. This comes down to what the scientific community as a whole accepts as evidence.

(more…)

Teaching Integrity in Empirical Research

In a recent interview on The Signal, a Library of Congress blog, Richard Ball (Professor of Economics at Haverford College and presenter at the 2014 BITSS Summer Institute) and Norm Medeiros (Associate Librarian at Haverford College) discussed Project TIER (Teaching Integrity in Empirical Research) and their experience teaching students how to document their empirical analyses.


What is Project TIER?

For close to a decade, we have been teaching our students how to assemble comprehensive documentation of the data management and analysis they do in the course of writing an original empirical research paper. Project TIER is an effort to reach out to instructors of undergraduate and graduate statistical methods classes in all the social sciences to share with them lessons we have learned from this experience.

What is the TIER documentation protocol?

We gradually developed detailed instructions describing all the components that should be included in the documentation and how they should be formatted and organized. We now refer to these instructions as the TIER documentation protocol. The protocol specifies a set of electronic files (including data, computer code and supporting information) that would be sufficient to allow an independent researcher to reproduce, easily and exactly, all the statistical results reported in the paper.
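As an illustration only (the folder and file names below are hypothetical, not the official TIER specification, which is available on the Project TIER website), a replication package along these lines might be organized as:

```
my-thesis-replication/
├── original-data/     # raw data exactly as obtained, never modified
├── command-files/     # scripts that clean the data and produce every result
├── analysis-data/     # processed data generated by the command files
├── metadata/          # data sources, variable definitions, read-me instructions
└── paper/             # the final paper whose results the scripts reproduce
```

The test of such a package is that an independent researcher, starting from only the original data and the command files, can regenerate the analysis data and every statistical result reported in the paper.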

(more…)