Berkeley Initiative for Transparency in the Social Sciences


Tag Archives: Replication

Emerging Researcher Perspectives: Replication as a Credible Pre-Analysis Plan

Guest post by Raphael Calel, Ciriacy-Wantrup Postdoctoral Fellow at the Department of Agricultural and Resource Economics at the University of California, Berkeley.


One of the most important tools for enhancing the credibility of research is the pre-analysis plan, or the PAP. Simply put, we feel more confident in someone’s inferences if we can verify that they weren’t data mining, engaging in motivated reasoning, or otherwise manipulating their results, knowingly or unknowingly. By publishing a PAP before collecting data, and then closely following that plan, researchers can credibly demonstrate to us skeptics that their analyses were not manipulated in light of the data they collected.
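What gives the plan its bite is the commitment mechanism: a record, fixed before any data exist, that skeptics can later check against the published analysis. As a purely illustrative sketch (in practice a public registry such as the AEA RCT Registry or OSF keeps the record, and the file name here is hypothetical), the commitment amounts to something like this:

```python
# A minimal, purely illustrative sketch of the commitment behind a PAP.
# In practice a public registry (e.g. the AEA RCT Registry or OSF) records
# the plan and the date; "pre_analysis_plan.txt" is a hypothetical file.
import hashlib
from datetime import datetime, timezone

def fingerprint_plan(path: str) -> str:
    """Hash the plan so any later edit to it would be detectable."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    # A trusted third party (the registry) attests to the date in practice;
    # a local clock is used here only to keep the sketch self-contained.
    stamp = datetime.now(timezone.utc).isoformat()
    return f"{stamp}  sha256:{digest}"

# Posting this string publicly, before data collection, is the commitment:
# anyone can later re-hash the plan and verify it was never altered.
print(fingerprint_plan("pre_analysis_plan.txt"))
```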

Still, PAPs are credible only when the researcher can anticipate and wait for the collection of new data. The vast majority of social science research, however, does not satisfy these conditions. For instance, while it is perfectly reasonable to test new hypotheses about the causes of the recent financial crisis, it is unreasonable to expect researchers to have pre-specified their analyses before the crisis hit. To give another example, no one analysing a time series of more than a couple of years can reasonably be expected to publish a PAP and then wait for years or decades before implementing the study. Most observational studies face this problem in one form or another.


Advisory Board Established for Project TIER

Guest post by Richard Ball and Norm Medeiros, co-principal investigators of Project TIER at Haverford College.


Project TIER (Teaching Integrity in Empirical Economics) is pleased to announce its newly-established Advisory Board. The advisors – George Alter (ICPSR), J. Scott Long (Indiana University), Victoria Stodden (University of Illinois at Urbana-Champaign), and Justin Wolfers (Peterson Institute/University of Michigan) – will help project directors Richard Ball and Norm Medeiros consider ways of developing and promoting the TIER protocol for documenting empirical research.

The guiding principle behind the protocol is that the documentation (data, code, and supplementary information) should be complete and transparent enough to allow an interested third party to easily and exactly reproduce all the steps of data management and analysis that led from the original data files to the results reported in the paper. The ultimate goal of Project TIER is to foster development of a national network of educators committed to integrating methods of empirical research documentation, guided by the principle of transparency, into the curricula of the social sciences.
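In practice, that principle usually cashes out as a single "master" script that re-runs everything from the original data files to the reported results in one command. Below is a minimal sketch in Python with hypothetical script names; the TIER protocol itself prescribes the exact folder structure and documentation:

```python
# run_all.py: an illustrative master script in the spirit of the TIER
# protocol. Script and folder names are hypothetical; TIER's own
# documentation prescribes the exact structure and required metadata.
import subprocess
import sys

STEPS = [
    "scripts/01_import_raw_data.py",  # read-only originals -> working copies
    "scripts/02_clean_data.py",       # all merging, recoding, variable construction
    "scripts/03_analysis.py",         # regenerates every table and figure in the paper
]

for step in STEPS:
    print(f"running {step} ...")
    if subprocess.run([sys.executable, step]).returncode != 0:
        sys.exit(f"{step} failed; the reproduction is incomplete.")

print("All reported results rebuilt from the original data files.")
```

The point of the single entry script is that "easily and exactly reproduce" stops being a promise and becomes a test: a third party either gets the paper's results back or sees precisely which step breaks.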


Influential Paper on Gay Marriage Might Be Marred by Fraudulent Data

Harsh scrutiny of an influential political science experiment highlights the importance of transparency in research.


The paper, from UCLA graduate student Michael LaCour and Columbia University Professor Donald Green, was published in Science in December 2014. It asserted that short conversations with gay canvassers could not only change people’s minds on a divisive social issue like same-sex marriage, but could also have a contagious effect on the relatives of those in contact with the canvassers. The paper received wide attention in the press.

Yet three days ago, two graduate students from UC Berkeley, David Broockman and Joshua Kalla, published a response to the study, pointing to a number of statistical oddities and discrepancies between how the experiment was reported and how the authors said it was conducted. Earlier in the year, impressed by the paper’s findings, Broockman and Kalla had attempted to conduct an extension of the study, building on the original data set. That is when they became aware of irregularities in the study’s methodology and decided to notify Green.

Reviewing the comments from Broockman and Kalla, Green, who was not involved in the original data collection, quickly became convinced that something was wrong – and on Tuesday, he submitted a letter to Science requesting the retraction of the paper. Green shared his view on the controversy in a recent interview, reflecting on what it meant for the broader practice of social science and highlighting the importance of integrity in research.


Three Transparency Working Papers You Need to Read

Garret Christensen, BITSS Project Scientist


Several great working papers on transparency and replication in economics have been released in the last few months. Two of them, both about pre-analysis plans, are intended for a symposium in The Journal of Economic Perspectives that I am very much looking forward to. The first, by Muriel Niederle and Lucas Coffman, doesn’t pull any punches with its title: “Pre-Analysis Plans are not the Solution, Replication Might Be.” Niederle and Coffman claim that PAPs don’t reduce the rate of false positives enough to be worth the effort, and that replication may be a better way to get at the truth. Some of their concern about PAPs stems from the assumption “[t]hat one published paper is the result of one pre-registered hypothesis, and that one pre-registered hypothesis corresponds to one experimental protocol. Neither can be guaranteed.” They’re also not crazy about design-based publications (or “registered reports”). Instead they offer a proposal to get replication to take off, calling for the establishment of a Journal of Replication Studies and for researchers to cite replications, both positive and negative, whenever they cite an original work. If these changes were made, they argue, researchers would come to expect replications, and the value of writing and publishing them would rise.

Another working paper on PAPs in economics, titled simply “Pre-Analysis Plans in Economics,” was recently released by Ben Olken. Olken gives a lot of useful background on the origins of PAPs and discusses in detail what should go into them. A reference I found particularly informative is “E9 Statistical Principles for Clinical Trials,” the FDA’s official guidance for trials, especially section V on Data Analysis Considerations. Many of the transparency practices we’re trying to adopt in economics and the social sciences come from medicine, so it’s nice to see the original source. Olken weighs the benefits of PAPs (increasing confidence in results, making full use of statistical power, and improving relationships with partners such as governments or corporations that may have vested interests in the outcomes of trials) against the costs (the complexity of having to write all possible papers in advance, a push toward simple, less interesting papers with less nuance, and a reduced ability to learn about your data ex post). Citing Brodeur et al. to argue that the problem of false positives isn’t all that large, he concludes that, except in trials involving parties with vested interests, the costs outweigh the benefits.


Registered Reports to the Rescue?

After writing an article for The Upshot, Brendan Nyhan (Assistant Professor at Dartmouth) was interviewed by The Washington Post.


The original Upshot article advocates for a new publishing structure called Registered Reports (RRs):

A research publishing format in which protocols and analysis plans are peer reviewed and registered prior to data collection, then published regardless of the outcome.

In the Washington Post interview that follows, Nyhan explains in greater detail why RRs are more effective than other tools at preventing publication bias and data mining, beginning with the limitations of preregistration on its own.

As I argued in a white paper, […] it is still too easy for publication bias to creep in to decisions by authors to submit papers to journals as well as evaluations by reviewers and editors after results are known. We’ve seen this problem with clinical trials, where selective and inaccurate reporting persists even though preregistration is mandatory.
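The worry in that last quote has a simple statistical logic: if only significant results get published, a literature can fill up with false positives even when every underlying effect is zero. A toy simulation, written here in Python purely as an illustration of the argument:

```python
# Toy simulation of publication bias: 1,000 studies of a true null effect,
# where only "significant" results (two-sided p < .05) get published.
import math
import random

random.seed(42)
N, STUDIES = 100, 1000        # sample size per study, number of studies
published = []

for _ in range(STUDIES):
    sample = [random.gauss(0.0, 1.0) for _ in range(N)]   # true effect: zero
    estimate = sum(sample) / N
    z = estimate / (1.0 / math.sqrt(N))                   # known unit variance
    if abs(z) > 1.96:                                     # "significant"
        published.append(estimate)

if published:
    mean_abs = sum(abs(e) for e in published) / len(published)
    print(f"published: {len(published)} of {STUDIES} studies (~5% by chance)")
    print(f"mean |published estimate|: {mean_abs:.3f}  (true effect is 0)")
# Every published estimate here is a false positive, and each overstates the
# true (zero) effect: exactly the failure mode RRs are designed to eliminate.
```

Because RRs accept a study before results exist, the decision to publish cannot depend on which side of 1.96 the test statistic lands, and the filter driving the simulation above disappears.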


This Monday at AEA2015: Transparency and Integrity in Economic Research Panel

Monday, January 5th, 10:15am, at the American Economic Association Annual Meeting in Boston, MA (Sheraton Hotel, Commonwealth Room).


Session: Promoting New Norms for Transparency and Integrity in Economic Research

Presiding: Edward Miguel (UC Berkeley)

Panelists:

  • Brian Nosek (University of Virginia): “Scientific Utopia: Improving Openness and Reproducibility in Scientific Research”
  • Richard Ball (Haverford College): “Replicability of Empirical Research: Classroom Instruction and Professional Practice”
  • Eva Vivalt (New York University): “Bias and Research Method: Evidence from 600 Studies”

Discussants:

  • Aprajit Mahajan (UC Berkeley)
  • Justin Wolfers (University of Michigan)
  • Kate Casey (Stanford University)

More info here. Plus don’t miss the BITSS/COS Exhibition Booth at the John B. Hynes Convention Center (Level 2, Exhibition Hall D).

Scientists Have a Sharing Problem

On December 15th, Maggie Puniewska published an article in The Atlantic summarizing the obstacles that keep researchers from sharing their data.


The article asks: if “science has traditionally been a field that prizes collaboration […] then why [are] so many scientists stingy with their information”?

Puniewska outlines the most commonly cited reasons scientists refrain from sharing their data.

The culture of innovation breeds fierce competition, and those on the brink of making a groundbreaking discovery want to be the first to publish their results and receive credit for their ideas.

[I]f sharing data paves the way for an expert to build upon or dispute other scientists’ results in a revolutionary way, it’s easy to see why some might choose to withhold.


Reflections on Two Years Promoting Transparency in Research

By Guillaume Kroll (CEGA)


Two years ago, in December 2012, a handful of researchers convened in Berkeley to discuss emerging strategies to increase openness and transparency in social science research. The group’s concerns followed a number of high-profile cases of scientific misconduct and unethical practices, particularly in psychology (1,2). As researchers started to question the legitimacy of the “publish or perish” structure governing academia, many decided to replicate influential findings in their field to differentiate the rigorous from the untrustworthy… only to find that a large majority of studies could not be reproduced.

This observation triggered an unprecedented number of bottom-up innovations to restore the credibility of scientific evidence across social science disciplines, including the use of study registries, pre-analysis plans, data sharing, and result-blind peer review. The 2012 meeting resulted in the creation of BITSS, with the goal of fostering the adoption of more transparent research practices among the scientific community.

Today, BITSS has more than 150 affiliated researchers and partner institutions committed to improving the standards of rigor and integrity across the social sciences. Last week’s third annual BITSS meeting was a good opportunity to reflect on the progress achieved.


Former BITSS Institute Participant Advocates for Replication in Brazil

Dalson Britto Figueiredo Filho, Adjunct Professor of Political Science at the Federal University of Pernambuco in Recife, Brazil, who attended the BITSS Summer Institute in June 2014, recently published a paper on the importance of replications in Revista Política Hoje.

“The BITSS experience really changed my mind on how to do good science”, said Figueiredo Filho. “Now I am working to diffuse both replication and transparency as default procedures of scientific inquiry among Brazilian undergraduate and graduate students. I am very thankful to the 2014 BITSS workshop for a unique opportunity to become a better scientist.”

The paper, written in Portuguese, is available here. Below is an abstract in English.


Creating Standards for Reproducible Research: Overview of COS Meeting

By Garret Christensen (BITSS)


Representatives from BITSS (CEGA Faculty Director Ted Miguel, CEGA Executive Director Temina Madon, and BITSS Assistant Project Scientist Garret Christensen–that’s me) spent Monday and Tuesday of this week at a very interesting workshop at the Center for Open Science aimed at creating standards for promoting reproducible research in the social-behavioral sciences. Perhaps the workshop could have used a catchier name or acronym for wider awareness, but we seemed to accomplish a great deal. Representatives from across disciplines (economics, political science, psychology, sociology, medicine), from funders (NIH, NSF, Laura and John Arnold Foundation, Sloan Foundation), publishers (Science/AAAS, APA, Nature Publishing Group), editors (American Political Science Review, Psychological Science, Perspectives on Psychological Science, Science), data archivists (ICPSR), and researchers from over 40 leading institutions (UC Berkeley, MIT, University of Michigan, University of British Columbia, UVA, UPenn, Northwestern, among many others) came together to push forward on specific action items researchers and publishers can take to promote transparent and reproducible research.

The work was divided into five subcommittees:

1) Reporting standards in research design

2) Reporting standards in analysis

3) Replications

4) Pre-Registration/Registered Reports

5) Sharing data, code, and materials
