Berkeley Initiative for Transparency in the Social Sciences


Tag Archives: Research Bias

A Rough Guide to Spotting Bad Science

Twelve points that will help separate the science from the pseudoscience, written by Compound Interest (see here).

Compound Interest

What to Do If You Are Accused of P-Hacking

In a recent post on Data Colada, University of Pennsylvania Professor Uri Simonsohn discusses what to do in the event you (a researcher) are accused of having selectively analyzed your data to increase statistical significance.

Simonsohn states:

It has become more common to publicly speculate, upon noticing a paper with unusual analyses, that a reported finding was obtained via p-hacking.

For example, “a post by Andrew Gelman suspected p-hacking in a paper that collected data on 10 colors of clothing, but analyzed red & pink as a single color” [.html] (see the authors’ response to the accusation .html), and “a statistics blog suspected p-hacking after noticing that a paper studying the number of hurricane deaths relied on the somewhat unusual Negative-Binomial Regression” [.html].

Instinctively, Simonsohn says, a researcher may react to accusations of p-hacking by attempting to justify the specifics of his or her research design, but if that justification is ex post, the explanation will not be good enough. In fact:

P-hacked findings are by definition justifiable. Unjustifiable research practices involve incompetence or fraud, not p-hacking.
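The mechanics are easy to see in a toy simulation (not from Simonsohn’s post; all names and numbers below are illustrative). Under a true null effect, an analyst who tries several arbitrary category groupings, in the spirit of the red-vs-pink example, and keeps only the smallest p-value will see far more “significant” results than the nominal 5%. For simplicity this sketch draws each grouping as an independent sample, whereas a real p-hacker re-pools the same data; the inflation mechanism (taking the minimum of several p-values) is the same.

```python
import math
import random

random.seed(0)

def z_test_p(xs, ys):
    """Two-sided z-test for equal means, assuming known unit variance."""
    nx, ny = len(xs), len(ys)
    z = (sum(xs) / nx - sum(ys) / ny) / math.sqrt(1 / nx + 1 / ny)
    return math.erfc(abs(z) / math.sqrt(2))  # two-sided normal p-value

def one_study(n=30, n_groupings=5):
    """Simulate one null study: no true effect anywhere, but the analyst
    tries several category groupings and reports the best p-value."""
    control = [random.gauss(0, 1) for _ in range(n)]
    pvals = []
    for _ in range(n_groupings):
        treated = [random.gauss(0, 1) for _ in range(n)]  # one "grouping"
        pvals.append(z_test_p(treated, control))
    return min(pvals)

trials = 2000
fp_single = sum(one_study(n_groupings=1) < 0.05 for _ in range(trials)) / trials
fp_picked = sum(one_study(n_groupings=5) < 0.05 for _ in range(trials)) / trials
print(f"false-positive rate, one pre-specified test: {fp_single:.2f}")  # close to 0.05
print(f"false-positive rate, best of 5 groupings:    {fp_picked:.2f}")  # well above 0.05
```

Note that every individual grouping is a perfectly “justifiable” analysis on its own, which is exactly Simonsohn’s point: the problem is not any single choice but the unreported search over choices.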


Reproducible Research: True or False?

Keynote speaker at the upcoming BITSS annual meeting, John Ioannidis (Professor of Health Research and Policy at the Stanford School of Medicine and Co-Director of the Meta-Research Innovation Center), speaks at Google about efforts to improve research design standards and reproducibility in science. Ioannidis is the author of the highly influential 2005 paper Why Most Published Research Findings Are False, the most downloaded technical paper from the open-access library PLOS.

The annual BITSS meeting will be held at UC Berkeley on December 11-12. You can find the agenda on the event page and register here.

Science Establishes New Statistics Review Board

The journal Science is adding an additional step of statistical checks to its peer-review process in an effort to strengthen confidence in published study findings.

From the July 4th edition of Science:

[…] Science has established, effective 1 July 2014, a Statistical Board of Reviewing Editors (SBoRE), consisting of experts in various aspects of statistics and data analysis, to provide better oversight of the interpretation of observational data. Members of the SBoRE will receive manuscripts that have been identified […] as needing additional scrutiny of the data analysis or statistical treatment. The SBoRE member assesses what the issue is that requires screening and suggests experts from the statistics community to provide it.

So why is Science taking this additional step? Readers must have confidence in the conclusions published in our journal. We want to continue to take reasonable measures to verify the accuracy of those results. We believe that establishing the SBoRE will help avoid honest mistakes and raise the standards for data analysis, particularly when sophisticated approaches are needed.

[…] I have been amazed at how many scientists have never considered that their data might be presented with bias. There are fundamental truths that may be missed when bias is unintentionally overlooked, or worse yet, when data are “massaged.” Especially as we enter an era of “big data,” we should raise the bar ever higher in scrutinizing the analyses that take us from observations to understanding.


Peer Review of Social Science Research in Global Health

A new working paper by Victoria Fan, Rachel Silverman, David Roodman, and William Savedoff at the Center for Global Development.


In recent years, the interdisciplinary nature of global health has blurred the lines between medicine and social science. As medical journals publish non-experimental research articles on social policies or macro-level interventions, controversies have arisen when social scientists have criticized the rigor and quality of medical journal articles. These disputes raise general questions about the frequency and characteristics of methodological problems, and about the prevalence and severity of research bias and error.

Published correspondence letters can be used to identify common areas of dispute within interdisciplinary global health research and seek strategies to address them. To some extent, these letters can be seen as a “crowd-sourced” (but editor-gated) approach to public peer review of published articles, from which some characteristics of bias and error can be gleaned.

In December 2012, we used the online version of The Lancet to systematically identify relevant correspondence in each issue published between 2008 and 2012. We summarize and categorize common areas of dispute raised in these letters.


Flawed Research On Your Plate

You might want to reconsider paying a premium for those fish oil supplements. A new study says most of the research literature on the cardiovascular benefits of omega-3 fatty acids is flawed.

In the early 1970s, two Danish researchers started to investigate the diet of Greenland’s Inuit populations, who were believed to live longer than their Caucasian counterparts. The study concluded that the Inuit’s large intake of seal and whale blubber, which came to be labelled the “Eskimo Diet,” helped reduce the risk of heart disease and increase life expectancy.

A new study says fish oil capsules have no effect on heart disease

This resulted in the proliferation of studies on the cardioprotective effects of fish oil, and the boom of what has become a global billion-dollar industry.

However, according to a recent study by the University of Ottawa’s George Fodor and his team, fish oil capsules simply don’t do anything to help prevent heart disease. “Most researchers never read the original 1970s paper”, said Fodor. “They just took it at face value that what was said was true […] We reviewed the original paper and it turns out that the Danish researchers never measured the frequency of heart disease.”