Tag Archives: Statistical Analysis
Interview originally posted on the Global Poverty Wonkcast:
Is the revolution upon us? When it comes to data, the development world seems to be saying yes, Yes, YES! To look beyond the hype, I invited Amanda Glassman, a CGD senior fellow and director of our global health policy program, to join me on the show to discuss a new report from the Data for African Development working group that looks at Africa’s statistical capacity, warts and all. It turns out that the revolution may not be all it’s cracked up to be, and that well-intentioned outsiders—donors especially—are too often part of the problem.
I ask Amanda if big data is going to solve these problems. Is there hope that Africa will simply be swept up in a big data tsunami?
Hosted by the NYU Center for Urban Science on July 16, the book-launch event included several panels featuring the book’s editors and a number of its authors.
Overview of the book:
Massive amounts of new data about people, their movements, and their activities can now be accessed and analyzed as never before. Numerous privacy concerns have been raised by the use – or misuse – of such data in commercial and national security arenas. Yet we are motivated by the potential for “big data” to be harnessed to serve the public good: scientists can use new forms of data to do research that improves people’s lives; federal, state, and local governments can use data to improve the delivery of services to citizens; and non-profit organizations can use the information to advance the public good.
Access to big data raises many unanswered questions related to privacy and confidentiality: What are the ethical and legal requirements for scientists and government officials seeking to serve the public good without harming individual citizens? What are the rules of engagement? What are the best ways to provide access while protecting confidentiality? Are there reasonable mechanisms to compensate citizens for privacy loss?
Published by Cambridge University Press, the book is an accessible summary of the important legal, economic, and statistical thinking that frames the many privacy issues associated with the use of big data – along with practical suggestions for protecting privacy and confidentiality that can help to guide practitioners.
The journal Science is adding a layer of statistical checks to its peer-review process in an effort to strengthen confidence in published study findings.
From the July 4th edition of Science:
[…] Science has established, effective 1 July 2014, a Statistical Board of Reviewing Editors (SBoRE), consisting of experts in various aspects of statistics and data analysis, to provide better oversight of the interpretation of observational data. Members of the SBoRE will receive manuscripts that have been identified […] as needing additional scrutiny of the data analysis or statistical treatment. The SBoRE member assesses what the issue is that requires screening and suggests experts from the statistics community to provide it.
So why is Science taking this additional step? Readers must have confidence in the conclusions published in our journal. We want to continue to take reasonable measures to verify the accuracy of those results. We believe that establishing the SBoRE will help avoid honest mistakes and raise the standards for data analysis, particularly when sophisticated approaches are needed.
[…] I have been amazed at how many scientists have never considered that their data might be presented with bias. There are fundamental truths that may be missed when bias is unintentionally overlooked, or worse yet, when data are “massaged.” Especially as we enter an era of “big data,” we should raise the bar ever higher in scrutinizing the analyses that take us from observations to understanding.
An eight-step strategy to increase the integrity and credibility of social science research using the new statistics, by Geoff Cumming.
We need to make substantial changes to how we conduct research. First, in response to heightened concern that our published research literature is incomplete and untrustworthy, we need new requirements to ensure research integrity. These include prespecification of studies whenever possible, avoidance of selection and other inappropriate data-analytic practices, complete reporting, and encouragement of replication. Second, in response to renewed recognition of the severe flaws of null-hypothesis significance testing (NHST), we need to shift from reliance on NHST to estimation and other preferred techniques. The new statistics refers to recommended practices, including estimation based on effect sizes, confidence intervals, and meta-analysis. The techniques are not new, but adopting them widely would be new for many researchers, as well as highly beneficial.
The full article is available here.
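To make Cumming’s recommended shift concrete, here is a minimal sketch of “new statistics” style reporting: a standardized effect size (Cohen’s d) with a confidence interval for the mean difference, rather than a bare p-value. The helper functions and the two sample groups below are invented for illustration; the CI uses a simple normal approximation rather than the t-based intervals a full analysis would use.

```python
import math
import statistics as stats

def cohens_d(a, b):
    """Standardized mean difference using the pooled sample standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * stats.variance(a) + (nb - 1) * stats.variance(b)) / (na + nb - 2)
    return (stats.mean(a) - stats.mean(b)) / math.sqrt(pooled_var)

def mean_diff_ci(a, b, z=1.96):
    """Approximate 95% CI for the difference in means (normal approximation)."""
    diff = stats.mean(a) - stats.mean(b)
    se = math.sqrt(stats.variance(a) / len(a) + stats.variance(b) / len(b))
    return diff - z * se, diff + z * se

# Hypothetical measurements from two conditions (purely illustrative).
group_a = [5.1, 4.8, 6.2, 5.5, 5.9, 4.7, 5.3, 6.0]
group_b = [4.2, 4.6, 4.9, 4.1, 5.0, 4.4, 4.8, 4.3]

d = cohens_d(group_a, group_b)
lo, hi = mean_diff_ci(group_a, group_b)
print(f"Cohen's d = {d:.2f}, 95% CI for mean difference = [{lo:.2f}, {hi:.2f}]")
```

Reporting the interval [lo, hi] alongside d communicates both the size of the effect and the precision of the estimate, and intervals from multiple studies can feed directly into a meta-analysis, which is the point of the estimation-based practices Cumming advocates.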
Psychological Science, the flagship journal of the Association for Psychological Science (APS), is introducing new guidelines for authors as part of an effort to strengthen the reporting and analysis of findings in psychological research.
Starting January 1, 2014, submitting authors will be required to state that they have disclosed all important methodological details, including excluded variables and additional manipulations and measures, as a way of encouraging methodological transparency […] Psychological Science will also serve as a launch vehicle for a program to promote open communication within the research community by recognizing authors who have made data, materials, and/or preregistered design and analysis plans publicly available with specific “badges.”