Berkeley Initiative for Transparency in the Social Sciences

SSMART Grants!

Garret Christensen, BITSS Project Scientist

Remember how a couple months ago, BITSS announced that we had launched a prize for especially transparent research and teaching? (Read this if you missed it.) Well, today we’re announcing that we’re also looking to fund research on transparency. Of course, you have to do the funded research transparently, meaning that you’ll be doing transparent research on transparency, which I think earns you double bonus points. To make this even more meta, we’re also looking to fund meta-analysis.

To be specific, BITSS is launching our 2015 Grants for Social Science Meta-Analysis and Research Transparency (SSMART). Full details and the RFP are here. Briefly, we are looking to fund research in three categories:

Emerging Researcher Perspectives: Replication as a Credible Pre-Analysis Plan

Guest post by Raphael Calel, Ciriacy-Wantrup Postdoctoral Fellow at the Department of Agricultural and Resource Economics at the University of California, Berkeley.

One of the most important tools for enhancing the credibility of research is the pre-analysis plan, or the PAP. Simply put, we feel more confident in someone’s inferences if we can verify that they weren’t data mining, engaging in motivated reasoning, or otherwise manipulating their results, knowingly or unknowingly. By publishing a PAP before collecting data, and then closely following that plan, researchers can credibly demonstrate to us skeptics that their analyses were not manipulated in light of the data they collected.

Still, PAPs are credible only when the researcher can both anticipate and wait for the collection of new data. The vast majority of social science research, however, does not satisfy these conditions. For instance, while it is perfectly reasonable to test new hypotheses about the causes of the recent financial crisis, it is unreasonable to expect researchers to have pre-specified their analyses before the crisis hit. To give another example, no one analysing a time series of more than a couple of years can reasonably be expected to publish a PAP and then wait for years or decades before implementing the study. Most observational studies face this problem in one form or another.


New York Times Covers TOP Guidelines

Yesterday in Science, the Transparency and Openness Promotion (TOP) Committee published the TOP Guidelines, referred to by the New York Times as “the most comprehensive guidelines for the publication of studies in basic science to date” (see here). The guidelines are the output of a November 2014 meeting at the Center for Open Science (COS), co-hosted with BITSS and Science Magazine.

Transparency and Openness Promotion Guidelines

By Garret Christensen (BITSS)

BITSS is proud to announce the publication of the Transparency and Openness Promotion Guidelines in Science. The Guidelines are a set of standards in eight areas of research publication:

  • Citation Standards
  • Data Transparency
  • Analytic Methods (Code) Transparency
  • Research Materials Transparency
  • Design and Analysis Transparency
  • Preregistration of Studies
  • Preregistration of Analysis Plans
  • Replication


Emerging Researcher Perspectives: Get it Right the First Time!

Guest post by Olivia D’Aoust, Ph.D. in Economics from Université libre de Bruxelles, and former Fulbright Visiting Ph.D. student at the University of California, Berkeley.

As a Fulbright Ph.D. student in development economics from Brussels, my experience this past year on the Berkeley campus has been eye-opening. In particular, I discovered a new movement toward improving the standards of openness and integrity in economics, political science, psychology, and related disciplines, led by the Berkeley Initiative for Transparency in the Social Sciences (BITSS).

When I first discovered BITSS, it struck me how little I knew about research on research in the social sciences, the pervasiveness of fraud in science in general (from data cleaning and specification searching to faking data altogether), and the basic lack of consensus on the right and wrong ways to do research. These issues are essential, yet too often they are left by the wayside. Transparency, reproducibility, replicability, and integrity are the building blocks of scientific research.


Advisory Board Established for Project TIER

Guest post by Richard Ball and Norm Medeiros, co-principal investigators of Project TIER at Haverford College.

Project TIER (Teaching Integrity in Empirical Economics) is pleased to announce its newly-established Advisory Board. The advisors – George Alter (ICPSR), J. Scott Long (Indiana University), Victoria Stodden (University of Illinois at Urbana-Champaign), and Justin Wolfers (Peterson Institute/University of Michigan) – will help project directors Richard Ball and Norm Medeiros consider ways of developing and promoting the TIER protocol for documenting empirical research.

The guiding principle behind the protocol is that the documentation (data, code, and supplementary information) should be complete and transparent enough to allow an interested third party to easily and exactly reproduce all the steps of data management and analysis that led from the original data files to the results reported in the paper. The ultimate goal of Project TIER is to foster development of a national network of educators committed to integrating methods of empirical research documentation, guided by the principle of transparency, into the curricula of the social sciences.


New Advisory Board Member: Paul Romer

BITSS is delighted to announce that we’ve added a new member to our advisory board: economist Paul Romer. Paul is a prominent economic theorist who has made major contributions to our understanding of economic growth, technological change, and urbanization. Paul is currently Professor of Economics at NYU, director of the Marron Institute of Urban Management, and director of the Urbanization Project at the Leonard N. Stern School of Business. He has previously taught at Stanford University’s Graduate School of Business, the University of California Berkeley, the University of Chicago, and the University of Rochester. You can learn more about him and the other advisory board members here, or you can view Paul’s own website.

As for work related to transparency, Paul recently wrote a paper in the Papers & Proceedings issue of The American Economic Review, as well as several related blog posts, about “mathiness” in economic theory models. The paper and blog posts have elicited significant interest and sparked a fascinating debate among economic theorists.


All the latest on research transparency

Here you can find information about the Berkeley Initiative for Transparency in the Social Sciences (BITSS), read and comment on opinion blog posts, learn about our annual meeting, find useful tools and resources, and contribute to the discussion by adding your voice.