Tag Archives: Peer Review
Last year, we featured a story on our blog about the so-called cardiovascular benefits of fish oil, largely based on a widely publicized research study that had more to do with hearsay than with actual science. After your diet, flawed research is now trying to meddle with your exercise routine.
A Danish study published in the Journal of the American College of Cardiology recently made headlines for suggesting that too much jogging could have a negative impact on life expectancy. In a recent piece for The New York Times, economist Justin Wolfers carefully analyses the study and provides a brilliant response discrediting this overconfident claim:
The researchers asked Danish runners about the speed, frequency and duration of their workouts, categorizing 878 of them as light, moderate or strenuous joggers. Ten years later, the researchers checked government records to see how many of them had died […] Happily, only 17 had. While this was good news for the surviving runners, it was bad news for the researchers, because 17 was clearly too few deaths to discern whether the risk of death was related to running intensity.
Nonetheless, the study claimed that too much jogging was associated with a higher mortality rate […] The evidentiary basis for this claim is weak. It is based on 40 people who were categorized as “strenuous joggers” — among whom only two died. That’s right: The conclusions that received so much attention were based on a grand total of two deaths among strenuous joggers. As Alex Hutchinson of Runner’s World wrote, “Thank goodness a third person didn’t die, or public health authorities would be banning jogging.”
Because the sample size was so small, this difference is not statistically significant. You may have heard the related phrase “absence of evidence does not equal evidence of absence,” and it is particularly relevant here […] In fact, the main thing the study shows is that small samples yield unreliable estimates that cannot be reliably discerned from the effects of chance.
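As a back-of-the-envelope illustration of Wolfers' point (our own sketch, not a calculation from the study), a 95% Wilson score confidence interval around the study's figure of 2 deaths among 40 strenuous joggers shows just how little such a small sample pins down:

```python
import math

def wilson_interval(events, n, z=1.96):
    """95% Wilson score confidence interval for a proportion."""
    p = events / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

# 2 deaths observed among 40 "strenuous joggers"
lo, hi = wilson_interval(2, 40)
print(f"Observed mortality: {2/40:.1%}; 95% CI: {lo:.1%} to {hi:.1%}")
```

The interval runs from roughly 1% to 17% — a range spanning more than a factor of ten, consistent with anything from a mortality rate well below the other groups' to one far above it. That is what "cannot be reliably discerned from the effects of chance" looks like in numbers.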
Wolfers goes on to highlight other weaknesses in the Danish study. This latest case of an unreliable research finding receiving wide media coverage brings out a couple of important points that are central to our work at BITSS:
Guest Post by Liz Allen (ScienceOpen)
For the 3rd annual conference of The Berkeley Initiative for Transparency in the Social Sciences (BITSS), ScienceOpen, the new Open Access (OA) research + publishing network, would like prospective and registered attendees to consider the role that Post-Publication Peer Review (PPPR) can play in increasing the transparency of research.
When we launched earlier this year, we interviewed Advisory Board Member Peter Suber. One of the original founders of the Open Access movement, Peter is currently director of the Harvard Office for Scholarly Communication and the Harvard Open Access Project. His latest book, “Open Access” (MIT Press, 2012), is an important starting point for anyone new to the topic. We asked Peter various questions including “How important is it that OA penetrates research disciplines beyond science?” Here’s what he said:
“It is very important in my opinion. I have been arguing since 2004 that OA brings the same benefits in every field, even if some fields present more obstacles or fewer opportunities. For example, the natural sciences are better funded than the humanities, which means they have more money to pay for OA. In particular, there is more public funding for the sciences than the humanities, which means that the compelling taxpayer argument for OA gets more traction in the sciences than the humanities. In addition, books are at least as important as journal articles for humanities scholars, if not more important, and OA for books, while growing quickly, is objectively harder than OA for journal articles. The good news is that OA in the humanities is growing – not faster than OA in the sciences, but faster than in the past. More humanities scholars understand the benefits and opportunities for OA, and are answering the objections and misunderstandings raised against it”.
The journal Science is adding a step of statistical checks to its peer-review process in an effort to strengthen confidence in published study findings.
From the July 4th edition of Science:
[…] Science has established, effective 1 July 2014, a Statistical Board of Reviewing Editors (SBoRE), consisting of experts in various aspects of statistics and data analysis, to provide better oversight of the interpretation of observational data. Members of the SBoRE will receive manuscripts that have been identified […] as needing additional scrutiny of the data analysis or statistical treatment. The SBoRE member assesses what the issue is that requires screening and suggests experts from the statistics community to provide it.
So why is Science taking this additional step? Readers must have confidence in the conclusions published in our journal. We want to continue to take reasonable measures to verify the accuracy of those results. We believe that establishing the SBoRE will help avoid honest mistakes and raise the standards for data analysis, particularly when sophisticated approaches are needed.
[…] I have been amazed at how many scientists have never considered that their data might be presented with bias. There are fundamental truths that may be missed when bias is unintentionally overlooked, or worse yet, when data are “massaged.” Especially as we enter an era of “big data,” we should raise the bar ever higher in scrutinizing the analyses that take us from observations to understanding.
Another scandal of peer review abuse should urge academic journals to reconsider their publication requirements.
This one comes from the Journal of Vibration and Control (JVC), a highly technical outlet in the field of acoustics, which just retracted 60 papers at once.
The mass retraction followed the revelation of a “peer review ring” in which one or more researchers fabricated identities to review their own papers and get them published.
This is not the first time such a scandal has surfaced, but the number of papers involved and the apparent ease with which the shady researchers operated should ring alarm bells.
JVC is part of SAGE Publications, a leading international publisher of scholarly and educational products, many of them in the social sciences.
Considering the significant impact that social science studies can have on the design of social and economic policies, which affect all of us, isn't it time the academic community realigned its professional incentives with scholarly values?
In recent years, the interdisciplinary nature of global health has blurred the lines between medicine and social science. As medical journals publish non-experimental research articles on social policies or macro-level interventions, controversies have arisen when social scientists have criticized the rigor and quality of medical journal articles, raising general questions about the frequency and characteristics of methodological problems and the prevalence and severity of research bias and error.
Published correspondence letters can be used to identify common areas of dispute within interdisciplinary global health research and seek strategies to address them. To some extent, these letters can be seen as a “crowd-sourced” (but editor-gated) approach to public peer review of published articles, from which some characteristics of bias and error can be gleaned.
In December 2012, we used the online version of The Lancet to systematically identify relevant correspondence in each issue published between 2008 and 2012. We summarized and categorized common areas of dispute raised in these letters.
From Jerry Adler in the Pacific Standard—on the credibility crisis in social science research, publication bias, data manipulation, and non-replicability. Featuring BITSS aficionados Brian Nosek, Joe Simmons, Uri Simonsohn and Leif Nelson.
Something unprecedented has occurred in the last couple of decades in the social sciences. Overlaid on the usual academic incentives of tenure, advancement, grants, and prizes are the glittering rewards of celebrity, best-selling books, magazine profiles, TED talks, and TV appearances. A whole industry has grown up around marketing the surprising-yet-oddly-intuitive findings of social psychology, behavioral economics, and related fields. The success of authors who popularize academic work—Malcolm Gladwell, the Freakonomics guys, and the now-disgraced Jonah Lehrer—has stoked an enormous appetite for usable wisdom from the social sciences. And the whole ecosystem feeds on new, dramatic findings from the lab. “We are living in an age that glorifies the single study,” says Nina Strohminger, a Duke post-doc in social psychology. “It’s a folly perpetuated not just by scientists, but by academic journals, the media, granting agencies—we’re all complicit in this hunger for fast, definitive answers.”
Read the full article here.