Berkeley Initiative for Transparency in the Social Sciences


Tag Archives: OSF

Psychology’s Credibility Crisis

In a recent interview in Discover Magazine, Brian Nosek, co-founder of the Center for Open Science and a speaker at the upcoming BITSS Annual Meeting, discusses the credibility crisis in psychology.


According to the article, psychology has lost much of its credibility after a series of published papers were revealed to be fraudulent and many other study results were found to be irreproducible.

Fortunately, “psychologists, spurred by a growing crisis of faith, are tackling it [the credibility crisis] head-on. Psychologist Brian Nosek at the University of Virginia is at the forefront of the fight.” Below are excerpts from Nosek’s interview with Discover Magazine discussing what he and others are doing to increase the rigor of research.

What are you doing about the crisis?

BN: In 2011, colleagues and I launched the Reproducibility Project, in which a team of about 200 scientists are carrying out experiments that were published in three psychology journals in 2008. We want to see how many can reproduce the original result, and what factors affect reproducibility. That will tell us if the problem of false-positive results in the psychology journals is big, small or non-existent…

[W]e built the Open Science Framework (OSF), a web application where collaborating researchers can put all their data and research materials so anyone can easily see them. We also create incentives by offering “badges” for good practices, like making raw data available. So the more open you are, the more opportunities you have for building your reputation.
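To give a concrete sense of the openness the OSF enables, here is a minimal sketch, written for this post rather than taken from the interview or from official OSF code, that retrieves a public project’s metadata through the OSF REST API (v2) using Python’s requests library. The project ID shown is a hypothetical placeholder.

```python
# Minimal sketch: fetch a public OSF project's metadata via the OSF REST API (v2).
import requests

OSF_API = "https://api.osf.io/v2"
PROJECT_ID = "abc12"  # hypothetical placeholder; replace with a real OSF project GUID

resp = requests.get(f"{OSF_API}/nodes/{PROJECT_ID}/", timeout=30)
resp.raise_for_status()

# The API follows the JSON:API convention: fields live under data.attributes.
attrs = resp.json()["data"]["attributes"]
print("Title:      ", attrs["title"])
print("Public:     ", attrs["public"])
print("Description:", attrs.get("description"))
```

Because projects, files, and registrations are all addressable this way, anyone can inspect the materials behind a study without emailing the authors.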


Tools for Research Transparency: a Preview of Upcoming BITSS Training

By Garret Christensen (BITSS)


What are the tools you use to make your research more transparent and reproducible? A lot of my time at BITSS has been spent working on a manual of best practices, which has required me to familiarize myself with computing tools and resources that make transparent work easier. I’ll be sharing a draft of the manual at the BITSS Annual Meeting, but for now here are a few of the resources I’ve found most useful. If you’d like to learn more about these tools, there are a ton of helpful resources on their respective websites, or for a hands-on learning experience you can sign up for a collaborative training (December 11, 9:00 AM–12:00 PM) that BITSS is organizing with the D-Lab.
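As a taste of the kind of practice such a manual covers, consider one of the simplest habits of transparent work: scripting every analysis step, including the random seed, so that anyone can regenerate exactly the same numbers. The sketch below is our own illustrative example with made-up data, not an excerpt from the manual.

```python
# Illustrative sketch of a fully scripted, seeded analysis step, so the
# reported result is bit-for-bit reproducible by anyone who reruns the script.
import numpy as np

rng = np.random.default_rng(seed=20141211)  # fixed seed: reruns give identical draws

# Hypothetical example: simulate treatment/control outcomes and report the difference.
treatment = rng.normal(loc=1.2, scale=1.0, size=500)
control = rng.normal(loc=1.0, scale=1.0, size=500)

effect = treatment.mean() - control.mean()
print(f"Estimated treatment effect: {effect:.3f}")
```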


COS Now Offering Free Consulting Services

A close partner of BITSS, the Center for Open Science (COS) has launched a free consulting service for anyone seeking help with “statistical and methodological questions related to reproducible practices, research design, data analysis, and data management.”

The Center is dedicated to increasing the “openness, integrity, and reproducibility of scientific research” and is looking to advance its mission through a more hands-on approach. Those with methodological questions can email stats-consulting@cos.io for free assistance from computer and data scientists trained in reproducibility and advanced research methods. If a question is too complicated to be answered via email, researchers can schedule a Google Hangout with a COS consultant to have their questions answered in real time. Visit the COS Google Calendar for availability.

The Center also offers online and on-site workshops for those seeking a deeper understanding of open research topics and tools. For more information on COS’s services, visit their Statistical & Methodological Consulting Services page.

“Creating tools is not the biggest challenge. The biggest challenge is getting people to use them”

Brian Nosek, Perspective from Psychology

This is the third post in a video series in which we ask leading social science academics and experts to discuss research transparency in their discipline. The interview was recorded on December 13, 2013 at the University of California, Berkeley.


BITSS is hiring!

Interested in improving the standards of rigor in empirical scientific research? Eager to collaborate with leading social science researchers to promote research transparency? Wishing to stay abreast of new advances in empirical research methods and transparency software development?

BITSS is looking for a Program Associate to support the initiative’s evaluation and outreach efforts. The candidate is expected to engage actively with social science researchers at scientific conferences to raise awareness of new and emerging tools for research transparency. This includes organizing satellite sessions, trainings, seminars, and other activities to share information and resources.

The candidate will also contribute directly to the refinement of the Open Science Framework, a platform created by the Center for Open Science that helps researchers document and archive their study designs, materials, and data.

Sound like fun? Apply now!

BITSS Affiliates Advocate for Higher Transparency Standards in Science Magazine

In the January 3, 2014 edition of Science Magazine, an interdisciplinary group of 19 BITSS affiliates reviews recent efforts to promote transparency in the social sciences and makes the case for more stringent norms and practices to help boost the quality and credibility of research findings.

The authors, led by UC Berkeley economist Ted Miguel, deplore a dysfunctional reward structure in which statistically significant, novel, and theoretically tidy results get published more easily than null, complicated, or replication outcomes. This misalignment between scholarly incentives and scholarly values, the authors argue, spurs researchers to present their data in a way that is more “publishable,” at the expense of accuracy.

Coupled with limited accountability for researchers’ errors, this problem has produced a somewhat distorted body of evidence that exaggerates the effectiveness of social and economic programs. The stakes are high because policy decisions based on flawed research affect millions of people’s lives.


New “Reviewer Statement” Initiative Aims to (Further) Improve Community Norms Toward Disclosure

By Etienne LeBel

An Open Science Collaboration (made up of Uri Simonsohn, Etienne LeBel, Don Moore, Leif D. Nelson, Brian Nosek, and Joe Simmons) is glad to announce a new initiative aiming to improve community norms toward the disclosure of basic methodological information during the peer-review process. Endorsed by the Center for Open Science, the initiative involves a standard reviewer statement that any peer reviewer can include in their review, requesting that authors add a statement to the paper confirming that they have disclosed all data exclusions, experimental conditions, assessed measures, and how they determined their sample sizes.
