Berkeley Initiative for Transparency in the Social Sciences


Tag Archives: Disclosure

Influential Paper on Gay Marriage Might Be Marred by Fraudulent Data

Harsh scrutiny of an influential political science experiment highlights the importance of transparency in research.


The paper, from UCLA graduate student Michael LaCour and Columbia University Professor Donald Green, was published in Science in December 2014. It asserted that short conversations with gay canvassers could not only change people’s minds on a divisive social issue like same-sex marriage, but could also have a contagious effect on the relatives of those in contact with the canvassers. The paper received wide attention in the press.

Yet three days ago, two graduate students from UC Berkeley, David Broockman and Joshua Kalla, published a response to the study, pointing to a number of statistical oddities and to discrepancies between how the experiment was reported and how it appears to have actually been conducted. Earlier in the year, impressed by the paper’s findings, Broockman and Kalla had attempted to conduct an extension of the study, building on the original data set. This is when they became aware of irregularities in the study’s methodology and decided to notify Green.

Reviewing the comments from Broockman and Kalla, Green, who was not involved in the original data collection, quickly became convinced that something was wrong – and on Tuesday, he submitted a letter to Science requesting the retraction of the paper. Green shared his view on the controversy in a recent interview, reflecting on what it meant for the broader practice of social science and highlighting the importance of integrity in research.

(more…)

A Rough Guide to Spotting Bad Science

Twelve points that will help separate the science from the pseudoscience, written by Compound Interest (see here).


[Infographic: A Rough Guide to Spotting Bad Science, by Compound Interest]


The Disturbing Influence of Flawed Research on Your Living Habits

Last year, we featured a story on our blog about the so-called cardiovascular benefits of fish oil, a claim resting largely on a seminal research study that had more to do with hearsay than with actual science. After your diet, flawed research is now trying to meddle with your exercise habits.


A Danish study published in the Journal of the American College of Cardiology recently made headlines for suggesting that too much jogging could have a negative impact on life expectancy. In a recent New York Times piece, economist Justin Wolfers carefully analyzes the study and provides a brilliant response discrediting this overly confident claim:

The researchers asked Danish runners about the speed, frequency and duration of their workouts, categorizing 878 of them as light, moderate or strenuous joggers. Ten years later, the researchers checked government records to see how many of them had died […] Happily, only 17 had. While this was good news for the surviving runners, it was bad news for the researchers, because 17 was clearly too few deaths to discern whether the risk of death was related to running intensity.

Nonetheless, the study claimed that too much jogging was associated with a higher mortality rate […] The evidentiary basis for this claim is weak. It is based on 40 people who were categorized as “strenuous joggers” — among whom only two died. That’s right: The conclusions that received so much attention were based on a grand total of two deaths among strenuous joggers. As Alex Hutchinson of Runner’s World wrote, “Thank goodness a third person didn’t die, or public health authorities would be banning jogging.”

Because the sample size was so small, this difference is not statistically significant. You may have heard the related phrase “absence of evidence does not equal evidence of absence,” and it is particularly relevant here […] In fact, the main thing the study shows is that small samples yield unreliable estimates that cannot be reliably discerned from the effects of chance.
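
To see just how little can be learned from two deaths among 40 strenuous joggers, here is a quick back-of-the-envelope check (a minimal sketch in Python; the 2-out-of-40 figure comes from the quote above, but the confidence-interval exercise is our own illustration, not part of the study or of Wolfers’s article):

```python
# Exact (Clopper-Pearson) 95% confidence interval for a mortality rate
# estimated from 2 deaths among 40 strenuous joggers.
from scipy.stats import beta

deaths, n = 2, 40
lower = beta.ppf(0.025, deaths, n - deaths + 1)
upper = beta.ppf(0.975, deaths + 1, n - deaths)

print(f"observed rate: {deaths / n:.1%}")
print(f"95% CI: {lower:.1%} to {upper:.1%}")
# Prints an interval of roughly 0.6% to 17%: the data are consistent with
# anything from a negligible risk to a very large one, which is exactly
# the "small samples yield unreliable estimates" point Wolfers makes.
```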

Wolfers goes on to highlight other weaknesses in the Danish study. This latest case of an unreliable research finding receiving wide media coverage brings out a couple of important points that are central to our work at BITSS:

(more…)

Facilitating Radical Change in Publication Standards: Overview of COS Meeting Part II

Originally posted on the Open Science Collaboration by Denny Borsboom


This train won’t stop anytime soon.

That’s what I kept thinking during the two-day sessions in Charlottesville, where a diverse array of scientific stakeholders worked hard to reach agreement on new journal standards for open and transparent scientific reporting. The proposed standards are intended to specify practices for authors, reviewers, and editors to follow in order to achieve higher levels of openness than currently exist. The leading idea is that a journal, funding agency, or professional organization could take these standards off the shelf and adopt them in its policy. That way, when, say, The Journal for Previously Hard To Get Data decides to move to a more open data practice, it doesn’t have to puzzle over how to implement the change, but can simply copy the data-sharing guideline from the new standards and post it on its website.

The organizers of the sessions, which were presided over by Brian Nosek of the Center for Open Science, had approached methodologists, funding agencies, journal editors, and representatives of professional organizations to achieve a broad set of perspectives on what open science means and how it should be institutionalized. As a result, the meeting felt almost like a political summit. It included high officials from professional organizations like the American Psychological Association (APA) and the Association for Psychological Science (APS), programme directors from the National Institutes of Health (NIH) and the National Science Foundation (NSF), editors of a wide variety of psychological, political, economic, and general science journals (including Science and Nature), and a loose collection of open science enthusiasts and methodologists (that would be me).

(more…)

Creating Standards for Reproducible Research: Overview of COS Meeting

By Garret Christensen (BITSS)


Representatives from BITSS (CEGA Faculty Director Ted Miguel, CEGA Executive Director Temina Madon, and BITSS Assistant Project Scientist Garret Christensen–that’s me) spent Monday and Tuesday of this week at a very interesting workshop at the Center for Open Science aimed at creating standards for promoting reproducible research in the social-behavioral sciences. Perhaps the workshop could have used a catchier name or acronym for wider awareness, but we seemed to accomplish a great deal. Representatives from across disciplines (economics, political science, psychology, sociology, medicine), funders (NIH, NSF, Laura and John Arnold Foundation, Sloan Foundation), publishers (Science/AAAS, APA, Nature Publishing Group), editors (American Political Science Review, Psychological Science, Perspectives on Psychological Science, Science), data archivists (ICPSR), and researchers from over 40 leading institutions (UC Berkeley, MIT, University of Michigan, University of British Columbia, UVA, UPenn, Northwestern, among many others) came together to push forward on specific actions researchers and publishers can take to promote transparent and reproducible research.

The work was divided into five subcommittees:

1) Reporting standards in research design

2) Reporting standards in analysis

3) Replications

4) Pre-Registration/Registered Reports

5) Sharing data, code, and materials

(more…)

The 10 Things Every Grad Student Should Do

In a recent post on the Data Pub blog, Carly Strasser provides a useful transparency guide for newcomers to the world of empirical research. Below is an adapted version of that post. 


1. Learn to code in some language. Any language.

Strasser begins her list by urging students to learn a programming language. As the limitations of statistical packages such as Stata, SAS, and SPSS become increasingly apparent, empirical social scientists are beginning to learn languages such as MATLAB, R, and Python. Strasser comments:

Growing amounts and diversity of data, more interdisciplinary collaborators, and increasing complexity of analyses mean that no longer can black-box models, software, and applications be used in research.

Start learning to code now so you are not behind the curve later!
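
As a taste of what scripted, non-black-box research looks like, here is a minimal sketch in Python (the file name, column names, and regression are hypothetical placeholders, not from Strasser’s post): every step from raw data to final estimate is written down and can be re-run by anyone.

```python
# A minimal, fully scripted analysis: no point-and-click steps, so the
# entire pipeline is documented and reproducible from the raw file.
# (File, column, and model names are hypothetical placeholders.)
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey_raw.csv")         # raw data, never edited by hand
df = df[df["income"] > 0].copy()           # documented sample restriction
df["log_income"] = np.log(df["income"])    # documented transformation

model = smf.ols("log_income ~ years_edu + age", data=df).fit()
print(model.summary())                     # re-running the script reproduces every number
```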

2. Stop using Excel. Or at least stop ONLY using Excel.

In Excel, data can be modified without leaving a trace. This makes it harder to document the changes made to a dataset and prevents researchers who rely on Excel alone from producing fully replicable research. Read “Potentially Problematic Excel Features” to learn more about the pitfalls of Excel.
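
By contrast, a scripted clean-up leaves a complete record of every modification. Here is a hypothetical sketch using Python and pandas (the file names, columns, and corrections are invented for illustration):

```python
# Record every change to the data as code: the raw file is never
# overwritten, and each correction is documented and re-runnable.
# (File names, columns, and corrections are hypothetical.)
import pandas as pd

raw = pd.read_csv("responses_raw.csv")     # original data, treated as read-only
clean = raw.copy()

# Fix a documented data-entry error: ages recorded as negative numbers
# in one survey wave were sign-flipped at entry.
mask = (clean["wave"] == 2) & (clean["age"] < 0)
clean.loc[mask, "age"] = -clean.loc[mask, "age"]

# Drop observations flagged as duplicates, and say why.
clean = clean[~clean["duplicate_flag"]]

clean.to_csv("responses_clean.csv", index=False)
# Re-running this script regenerates the cleaned dataset exactly.
```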

(more…)

Call for Papers on Research Transparency

BITSS will be holding its 3rd annual conference at UC Berkeley on December 11-12, 2014. The goal of the meeting is to bring together leaders from academia, scholarly publishing, and policy to strengthen the standards of openness and integrity across social science disciplines.

This Call for Papers focuses on work that develops new tools and strategies to increase the transparency and reproducibility of research. A committee of reviewers will select a limited number of papers to be presented and discussed. Topics for papers include, but are not limited to:

  • Pre-registration and the use of pre-analysis plans;
  • Disclosure and transparent reporting;
  • Replicability and reproducibility;
  • Data sharing;
  • Methods for detecting and reducing publication bias or data mining.

Papers or long abstracts must be submitted by Friday, October 10th (midnight Pacific time) through CEGA’s Submission Platform. Travel funds may be provided for presenters. Eligible submissions include completed papers or works in progress.

The 2014 BITSS Conference is organized by the Center for Effective Global Action and co-sponsored by the Alfred P. Sloan Foundation and the Laura and John Arnold Foundation.