Berkeley Initiative for Transparency in the Social Sciences


Research Transparency Meeting with CGD

By Garret Christensen (BITSS)

Though BITSS hopes to increase research transparency across the social sciences, several of us, myself included, have a background in development economics. So we were happy to take part in a meeting last week at the Center for Global Development (CGD) in Washington, DC. In addition to BITSS and CGD, representatives from the International Initiative for Impact Evaluation (3ie), Inter-American Development Bank, InterAction, Innovations for Poverty Action (IPA), Millennium Challenge Corporation (MCC), World Bank research group, United States Agency for International Development (USAID), and the US Treasury were present.

I was impressed by how much agreement there was, and how interested these large, sometimes slow-moving, organizations seemed to be, but I should probably temper my enthusiasm a bit: the people in the room were not randomly selected from their respective agencies, and even if they had been, we may still be far from actual policy changes and wider adoption. Regardless, we had a fruitful discussion about some of the roadblocks on the way to increased transparency.

Much of our discussion centered on a few recurring themes, most of them obstacles to increased transparency.


MCC’s First Open Data Challenge

The U.S. Government’s Millennium Challenge Corporation (MCC) wants to hear your new and innovative ideas on how to maximize the use of data that MCC finances for its independent evaluations.

Jennifer Sturdy and Jack Molyneaux of MCC’s Department of Policy and Evaluation, keynote speakers at this year’s BITSS Research Transparency Forum, along with Kathy Farley and Kristin Penn of the Department of Compact Operations, outlined the details of the challenge in a recent post on the MCC Poverty Reduction Blog.

Why issue the challenge?

The release of this data is intended to facilitate its broader use, above and beyond the scope of the independent evaluations that produced it. Since the challenge was announced at the end of August, one question for MCC has been: what type of additional learning is the agency interested in?

Who can accept the challenge?

MCC has just announced its first Open Data Challenge – a call to action for master’s and PhD students in economics, public policy, international development, and other related fields who are interested in exploring how to use publicly available MCC-financed primary data for policy-relevant analysis.

Africa’s Data Revolution – Amanda Glassman

Interview originally posted on the Global Poverty Wonkcast:


Is the revolution upon us? When it comes to data, the development world seems to be saying yes, Yes, YES! To look beyond the hype, I invited Amanda Glassman, a CGD senior fellow and director of our global health policy program, to join me on the show to discuss a new report from the Data for African Development working group that looks at Africa’s statistical capacity, warts and all. It turns out that the revolution may not be all it’s cracked up to be, and that well-intentioned outsiders—donors especially—are too often part of the problem.

I ask Amanda if big data is going to solve these problems. Is there hope that Africa will simply be swept up in a big data tsunami?


White House Calls for Comments on Reproducible Research

The White House’s Office of Science and Technology Policy (OSTP) has released a request for information on improving the reproducibility of federally funded scientific research.

Given recent evidence of the irreproducibility of a surprising number of published scientific findings, how can the Federal Government leverage its role as a significant funder of scientific research to most effectively address the problem?


A similar request for comments posted by the OSTP on open-access research resulted in policies mandating that federally funded research be made publicly accessible.

Comments can be submitted by email through September 23rd.

A Discussion with Victoria Stodden (April 3 — Washington, DC)

The AAAS FIRE and Big Data Affinity groups are co-hosting an informal discussion with Victoria Stodden (Columbia University, Department of Statistics) on Thursday, April 3, at 6pm in the Kogod Courtyard of the National Portrait Gallery in Washington, DC. The discussion will focus on the role of data and software in scientific research and funding. What does access mean? How are current policies helping or hindering access? What are the costs? Who benefits? Victoria will also introduce her forthcoming book, Privacy, Big Data, and the Public Good.

Click here to register.

Twenty Tips For Interpreting Scientific Claims

A useful list of 20 concepts to help decision-makers parse how evidence can contribute to a decision, and potentially avoid undue influence by those with vested interests.


Calls for the closer integration of science in political decision-making have been commonplace for decades. However, there are serious problems in the application of science to policy — from energy to health and environment to education […] We suggest that the immediate priority is to improve policy-makers’ understanding of the imperfect nature of science. The essential skills are to be able to intelligently interrogate experts and advisers, and to understand the quality, limitations and biases of evidence. We term these interpretive scientific skills. These skills are more accessible than those required to understand the fundamental science itself, and can form part of the broad skill set of most politicians […] The harder part — the social acceptability of different policies — remains in the hands of politicians and the broader political process.



BITSS Affiliates Advocate for Higher Transparency Standards in Science Magazine

In the January 3, 2014 edition of Science Magazine, an interdisciplinary group of 19 BITSS affiliates reviews recent efforts to promote transparency in the social sciences and makes the case for more stringent norms and practices to help boost the quality and credibility of research findings.

The authors, led by UC Berkeley economist Ted Miguel, deplore a dysfunctional reward structure in which statistically significant, novel, and theoretically tidy results are published more easily than null, complicated, or replication outcomes. This misalignment between scholarly incentives and scholarly values, the authors argue, spurs researchers to present their data in a way that is more “publishable” – at the expense of accuracy.

Coupled with limited accountability for researchers’ errors, this problem has produced a somewhat distorted body of evidence that exaggerates the effectiveness of social and economic programs. The stakes are high: policy decisions based on flawed research affect millions of people’s lives.