Berkeley Initiative for Transparency in the Social Sciences


Tag Archives: Center for Open Science

Transparency and Openness Promotion Guidelines

By Garret Christensen (BITSS)


BITSS is proud to announce the publication of the Transparency and Openness Promotion (TOP) Guidelines in Science. The Guidelines are a set of standards in eight areas of research publication:

  • Citation Standards
  • Data Transparency
  • Analytic Methods (Code) Transparency
  • Research Materials Transparency
  • Design and Analysis Transparency
  • Preregistration of Studies
  • Preregistration of Analysis Plans
  • Replication


Recap of Research Integrity in Economics Session at ASSA 2015

By Garret Christensen (BITSS)


BITSS just got back from the ASSA conference, the major annual gathering of economists. The conference largely serves to help new PhD economists find jobs, but it also features sessions of research presentations, a media presence, and sometimes big names like the Chair of the Federal Reserve in attendance. BITSS faculty member Ted Miguel presided over a session on research transparency, which featured presentations by Eva Vivalt (NYU), Brian Nosek (UVA), and Richard Ball (Haverford College).

Vivalt presented part of her job market paper, which shows that, at least in development economics, randomized trials seem to result in less publication bias and/or specification searching than other types of evaluations.
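
A common way to check for this kind of selective reporting is a caliper test: if significant results are favored, reported test statistics pile up just above the |z| = 1.96 threshold. The sketch below illustrates that generic diagnostic (it is not Vivalt’s actual method), and the z-statistics in it are hypothetical placeholders.

```python
import numpy as np
from scipy.stats import binomtest

def caliper_test(z_stats, threshold=1.96, caliper=0.20):
    """Count reported |z| just above vs. just below the significance
    threshold. Absent selective reporting, the split in a narrow
    window should be close to 50/50."""
    z = np.abs(np.asarray(z_stats, dtype=float))
    above = int(np.sum((z >= threshold) & (z < threshold + caliper)))
    below = int(np.sum((z < threshold) & (z > threshold - caliper)))
    test = binomtest(above, above + below, p=0.5)
    return above, below, test.pvalue

# Hypothetical z-statistics "harvested" from published tables.
z_reported = [2.01, 1.97, 2.10, 1.99, 2.05, 1.88, 2.03, 1.92, 2.15, 2.00]
above, below, p = caliper_test(z_reported)
print(f"just above threshold: {above}, just below: {below}, binomial p = {p:.3f}")
```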

Nosek’s presentation covered a broad range of transparency topics, from his perspective as a psychologist. His discussant, economist Justin Wolfers, concurred completely and focused on how Nosek’s lessons could apply to economics.

As an economist myself, I thought a few of Wolfers’s points were interesting:

  1. The Quarterly Journal of Economics should really have a data-sharing requirement.
  2. Economists don’t do enough meta-analysis (Ashenfelter et al.’s paper on estimates of the returns to education is a great example of the work we could and should be doing; a minimal pooling sketch follows this list).
  3. Somewhat tongue-in-cheek (I think), Wolfers discussed the fool/cheat paradox: whenever anyone is caught with a mistake in their research, they can either admit to having made an honest mistake or admit to having cheated. Most choose the “fool” option, but there’s not much one can do to change one’s own intelligence. So why does nobody cop to having cheated? Admitting to cheating would make it far easier to argue you’ve mended your ways.
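
To make the meta-analysis point concrete, here is a minimal sketch of fixed-effect (inverse-variance) pooling, the simplest meta-analytic estimator; the estimates and standard errors are hypothetical placeholders, not Ashenfelter et al.’s data.

```python
import numpy as np

def inverse_variance_pool(estimates, std_errors):
    """Fixed-effect meta-analysis: weight each study's estimate by
    1/SE^2, so more precise studies get more weight."""
    est = np.asarray(estimates, dtype=float)
    se = np.asarray(std_errors, dtype=float)
    w = 1.0 / se**2
    pooled = np.sum(w * est) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    return pooled, pooled_se

# Hypothetical returns-to-education estimates (proportional wage gain
# per extra year of schooling) and their standard errors.
estimates = [0.065, 0.090, 0.075, 0.110, 0.080]
std_errors = [0.010, 0.020, 0.015, 0.030, 0.012]
pooled, se = inverse_variance_pool(estimates, std_errors)
print(f"pooled estimate: {pooled:.3f} (SE {se:.3f})")
```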


Come Learn More About Research Transparency at ASSA/AEA

If you’re at the ASSA meetings in Boston this weekend and are interested in learning more about research transparency, please stop by booth 127 in the exhibition hall to speak with BITSS and Center for Open Science representatives. Or attend our session Monday morning at 10:15 a.m.: “Promoting New Norms for Transparency and Integrity in Economic Research.”

Psychology’s Credibility Crisis

In a recent interview in Discover Magazine, Brian Nosek, co-founder of the Center for Open Science and a speaker at the upcoming BITSS Annual Meeting, discusses the credibility crisis in psychology.


According to the article, psychology has lost much of its credibility after a series of published papers were revealed to be fraudulent and many other study results were found to be irreproducible.

Fortunately, “psychologists, spurred by a growing crisis of faith, are tackling it [the credibility crisis] head-on. Psychologist Brian Nosek at the University of Virginia is at the forefront of the fight.”  Below are excerpts from Nosek’s interview with Discover Magazine discussing what he and others are doing to increase the rigor of research.

What are you doing about the crisis?

BN: In 2011, colleagues and I launched the Reproducibility Project, in which a team of about 200 scientists are carrying out experiments that were published in three psychology journals in 2008. We want to see how many can reproduce the original result, and what factors affect reproducibility. That will tell us if the problem of false-positive results in the psychology journals is big, small or non-existent…

[W]e built the Open Science Framework (OSF), a web application where collaborating researchers can put all their data and research materials so anyone can easily see them. We also offer incentives in the form of “badges” for good practices, like making raw data available. So the more open you are, the more opportunities you have for building your reputation.
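
For researchers who want to work with the OSF programmatically rather than through the web application, the platform also exposes a public REST API. Below is a minimal sketch, assuming the OSF v2 API at api.osf.io and its JSON:API response layout; the node id “abc12” is a hypothetical placeholder for a real public project’s id.

```python
import requests

OSF_API = "https://api.osf.io/v2"

def fetch_project(node_id):
    """Fetch a public OSF project's metadata via the v2 REST API."""
    resp = requests.get(f"{OSF_API}/nodes/{node_id}/", timeout=30)
    resp.raise_for_status()
    attrs = resp.json()["data"]["attributes"]
    return attrs["title"], attrs.get("description"), attrs["public"]

# "abc12" is a placeholder; substitute any public project's node id.
title, description, is_public = fetch_project("abc12")
print(title, "| public:", is_public)
```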


Facilitating Radical Change in Publication Standards: Overview of COS Meeting Part II

Originally posted on the Open Science Collaboration by Denny Borsboom


This train won’t stop anytime soon.

That’s what I kept thinking during the two-day sessions in Charlottesville, where a diverse array of scientific stakeholders worked hard to reach agreement on new journal standards for open and transparent scientific reporting. The proposed standards are intended to specify practices for authors, reviewers, and editors to follow in order to achieve higher levels of openness than currently exist. The leading idea is that a journal, funding agency, or professional organization could take these standards off the shelf and adopt them in its policy, so that when, say, The Journal for Previously Hard To Get Data starts to move toward a more open data practice, it doesn’t have to puzzle over how to implement this, but can instead just copy the data-sharing guideline out of the new standards and post it on its website.

The organizers of the sessions, which were presided over by Brian Nosek of the Center for Open Science, had approached methodologists, funding agencies, journal editors, and representatives of professional organizations to achieve a broad set of perspectives on what open science means and how it should be institutionalized. As a result, the meeting felt almost like a political summit. It included high officials from professional organizations like the American Psychological Association (APA) and the Association for Psychological Science (APS), programme directors from the National Institutes of Health (NIH) and the National Science Foundation (NSF), editors of a wide variety of psychological, political, economic, and general science journals (including Science and Nature), and a loose collection of open science enthusiasts and methodologists (that would be me).


Creating Standards for Reproducible Research: Overview of COS Meeting

By Garret Christensen (BITSS)


Representatives from BITSS (CEGA Faculty Director Ted Miguel, CEGA Executive Director Temina Madon, and BITSS Assistant Project Scientist Garret Christensen–that’s me) spent Monday and Tuesday of this week at a very interesting workshop at the Center for Open Science aimed at creating standards for promoting reproducible research in the social-behavioral sciences. Perhaps the workshop could have used a catchier name or acronym for wider awareness, but we seemed to accomplish a great deal. Representatives from across disciplines (economics, political science, psychology, sociology, medicine), from funders (NIH, NSF, Laura and John Arnold Foundation, Sloan Foundation), from publishers (Science/AAAS, APA, Nature Publishing Group), from editors (American Political Science Review, Psychological Science, Perspectives on Psychological Science, Science), from data archivists (ICPSR), and from researchers at over 40 leading institutions (UC Berkeley, MIT, University of Michigan, University of British Columbia, UVA, UPenn, Northwestern, among many others) came together to push forward on specific actions researchers and publishers can take to promote transparent and reproducible research.

The work was divided into five subcommittees:

  1. Reporting standards in research design
  2. Reporting standards in analysis
  3. Replications
  4. Pre-registration/Registered Reports
  5. Sharing data, code, and materials


COS Now Offering Free Consulting Services

A close partner of BITSS, the Center for Open Science (COS), has launched a free consulting service for anyone seeking help with “statistical and methodological questions related to reproducible practices, research design, data analysis, and data management.”

The Center is dedicated to increasing the “openness, integrity, and reproducibility of scientific research” and is looking to advance its mission through a more hands-on approach. Those with methodological questions can email stats-consulting@cos.io for free assistance from computer and data scientists trained in reproducibility and advanced research methods. If a question is too complicated to be answered via email, researchers can schedule a Google Hangout with a COS consultant to have their questions answered in real time. Visit the COS Google Calendar for availability.

The Center also offers online and on-site workshops for those seeking a deeper understanding of open research topics and tools. For more information on COS’s services, visit their Statistical & Methodological Consulting Services page.