Category Archives: Guest Post
Guest post by Raphael Calel, Ciriacy-Wantrup Postdoctoral Fellow at the Department of Agricultural and Resource Economics at the University of California, Berkeley.
One of the most important tools for enhancing the credibility of research is the pre-analysis plan, or the PAP. Simply put, we feel more confident in someone’s inferences if we can verify that they weren’t data mining, engaging in motivated reasoning, or otherwise manipulating their results, knowingly or unknowingly. By publishing a PAP before collecting data, and then closely following that plan, researchers can credibly demonstrate to us skeptics that their analyses were not manipulated in light of the data they collected.
Still, PAPs are credible only when the researcher can anticipate and wait for the collection of new data. The vast majority of social science research, however, does not satisfy these conditions. For instance, while it is perfectly reasonable to test new hypotheses about the causes of the recent financial crisis, it is unreasonable to expect researchers to have pre-specified their analyses before the crisis hit. To give another example, no one analysing a time series of more than a couple of years can reasonably be expected to publish a PAP and then wait for years or decades before implementing the study. Most observational studies face this problem in one form or another.
Guest post by Olivia D’Aoust, Ph.D. in Economics from Université libre de Bruxelles, and former Fulbright Visiting Ph.D. student at the University of California, Berkeley.
As a Fulbright PhD student in development economics from Brussels, my experience this past year on the Berkeley campus has been eye-opening. In particular, I discovered a new movement toward improving the standards of openness and integrity in economics, political science, psychology, and related disciplines, led by the Berkeley Initiative for Transparency in the Social Sciences (BITSS).
When I first discovered BITSS, it struck me how little I knew about research on research in the social sciences, the pervasiveness of fraud in science in general (from data cleaning and specification searching to faking data altogether), and the basic lack of consensus on what is the right and wrong way to do research. These issues are essential, yet too often they are left by the wayside. Transparency, reproducibility, replicability, and integrity are the building blocks of scientific research.
Project TIER (Teaching Integrity in Empirical Economics) is pleased to announce its newly-established Advisory Board. The advisors – George Alter (ICPSR), J. Scott Long (Indiana University), Victoria Stodden (University of Illinois at Urbana-Champaign), and Justin Wolfers (Peterson Institute/University of Michigan) – will help project directors Richard Ball and Norm Medeiros consider ways of developing and promoting the TIER protocol for documenting empirical research.
The guiding principle behind the protocol is that the documentation (data, code, and supplementary information) should be complete and transparent enough to allow an interested third party to easily and exactly reproduce all the steps of data management and analysis that led from the original data files to the results reported in the paper. The ultimate goal of Project TIER is to foster development of a national network of educators committed to integrating methods of empirical research documentation, guided by the principle of transparency, into the curricula of the social sciences.
Project TIER (Teaching Integrity in Empirical Research) is an initiative that promotes training in open and transparent methods of quantitative research in the undergraduate and graduate curricula across all the social sciences.
The Project anticipates awarding three or four TIER Faculty Fellowships for the 2015-16 academic year. Fellows will collaborate with TIER leadership and work independently to develop and disseminate transparent research methods that are suitable for adoption by students writing theses, dissertations or other supervised papers, or that can be incorporated into classes in which students conduct quantitative research.
The period of the Fellowships will be from June 1, 2015 through June 30, 2016. Each Fellow will receive a stipend of $5,000.
Applications are due April 19, 2015. Early inquiries and expressions of interest are encouraged. To learn more and apply, visit http://www.haverford.edu/TIER/opportunities/fellowships_2015-16.php.
To combat the practice of p-hacking, the editors of Basic and Applied Social Psychology (BASP) will no longer publish p-values in articles submitted to the journal. The unprecedented move by the journal’s editorial board signals that publishing norms may be changing faster than previously believed, but it also raises certain issues. In a recent article published by Routledge, the editors of BASP, David Trafimow and Michael Marks, address three key questions associated with the banning of the null hypothesis significance testing procedure (NHSTP).
Question 1: Will manuscripts with p-values be desk rejected automatically?
Answer 1: No […] But prior to publication, authors will have to remove all vestiges of the NHSTP (p-values, t-values, F-values, statements about ‘‘significant’’ differences or lack thereof, and so on).
Question 2: What about other types of inferential statistics such as confidence intervals or Bayesian methods?
Answer 2: Analogous to how the NHSTP fails to provide the probability of the null hypothesis, […] confidence intervals do not provide a strong case for concluding that the population parameter of interest is likely to be within the stated interval. Therefore, confidence intervals also are banned from BASP.
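The mechanics of the p-hacking problem that motivated the ban are easy to demonstrate. The following sketch (an illustration, not taken from the BASP article; the choice of 20 specifications and a difference-in-means test is an assumption for exposition) simulates a researcher who tests many noise-only specifications and reports only the best one:

```python
import numpy as np
from math import erf

rng = np.random.default_rng(0)

def p_value(x, y):
    """Two-sided p-value for a difference in means (normal approximation)."""
    se = np.sqrt(x.var(ddof=1) / len(x) + y.var(ddof=1) / len(y))
    z = (x.mean() - y.mean()) / se
    # Two-sided tail probability of the standard normal
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / np.sqrt(2))))

n_sims, n_specs, n = 2000, 20, 50  # hypothetical simulation parameters
false_positives = 0
for _ in range(n_sims):
    # Pure noise: there is no true effect in any specification
    treatment = rng.normal(size=(n_specs, n))
    control = rng.normal(size=(n_specs, n))
    # "Specification search": keep only the smallest p-value across 20 tries
    best_p = min(p_value(treatment[k], control[k]) for k in range(n_specs))
    false_positives += best_p < 0.05

rate = false_positives / n_sims
print(f"False-positive rate after searching 20 specifications: {rate:.2f}")
```

Although each individual test has a nominal 5% false-positive rate, searching over 20 independent specifications and reporting the best pushes the effective rate toward 1 − 0.95²⁰ ≈ 0.64, which is exactly the behavior that pre-analysis plans are meant to rule out.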
Guest Post by Anja Tolonen (University of Gothenburg, Sweden)
Seventeen excited graduate students in Economics met at the University of Gothenburg on a Monday in September to initiate an ongoing discussion about transparency practices in Economics. The students came from all over the world: from Kenya, Romania, Hong Kong, Australia, and, of course, Sweden. The initiative itself came from across an ocean too: Berkeley, California. The students had different interests within Economics: many of us focus on Environmental or Development Economics, but there were also Financial Economists and Macroeconomists present.
The teaching material, which was mostly based on material from the first Summer Institute, organized by BITSS in June 2014, quickly prompted many questions. “Is it feasible to pre-register analysis on survey data?”, “Are graduate students more at risk of P-hacking than their senior peers?”, “Are some problems intrinsic to the publishing industry?” and “Does this really relate to my field?” several students asked. Some students think yes: