Tag Archives: Research Methods
Project TIER (Teaching Integrity in Empirical Research) is an initiative that promotes training in open and transparent methods of quantitative research in the undergraduate and graduate curricula across all the social sciences.
The Project anticipates awarding three or four TIER Faculty Fellowships for the 2015-16 academic year. Fellows will collaborate with TIER leadership and work independently to develop and disseminate transparent research methods that are suitable for adoption by students writing theses, dissertations, or other supervised papers, or that can be incorporated into classes in which students conduct quantitative research.
The period of the Fellowships will be from June 1, 2015 through June 30, 2016. Each Fellow will receive a stipend of $5,000.
Applications are due April 19, 2015. Early inquiries and expressions of interest are encouraged. To learn more and apply, visit http://www.haverford.edu/TIER/opportunities/fellowships_2015-16.php.
BITSS is pleased to announce that it is now accepting applications to attend its 2015 Summer Institute.
This year’s workshop, entitled “Transparency and Reproducibility Methods for Social Science Research,” will be held in Berkeley, June 10-12. The intensive course will provide participants with a thorough overview of best practices for open, reproducible research, keeping them at the vanguard of new scientific frontiers.
Topics covered include:
- Ethics in Experimental Research
- False-positives, P-hacking, P-curve, Power Analysis
- Data Management & Statistical Analysis in R
- Theory and Implementation of Pre-analysis Plans
- Approaches to the Replication of Research
- Meta-analyses: New Tools & Techniques
- Next Steps in Changing Scientific Research Practices
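Of these topics, power analysis lends itself to a short illustration. Below is a minimal sketch in Python (my own example, not BITSS course material) that computes the approximate per-group sample size for a two-sample comparison using the standard normal approximation; an underpowered study is one common source of the false positives the course addresses.

```python
from math import ceil
from statistics import NormalDist

def per_group_n(effect_size: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate per-group sample size for a two-sample z-test.

    Normal approximation: n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2,
    where d is the standardized effect size (Cohen's d).
    """
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)   # critical value for a two-sided test
    z_power = z(power)           # quantile matching the desired power
    return ceil(2 * ((z_alpha + z_power) / effect_size) ** 2)

# A "medium" effect (d = 0.5) at the conventional 5% level with 80% power
# requires roughly 63 participants per group under this approximation.
print(per_group_n(0.5))
```

The exact t-test calculation gives a slightly larger answer, but the normal approximation conveys the key intuition: halving the effect size quadruples the required sample.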
Join us January 5th at 10:15 a.m. at the American Economic Association Annual Meeting in Boston, MA (Sheraton Hotel, Commonwealth Room).
Session: Promoting New Norms for Transparency and Integrity in Economic Research
Presiding: Edward Miguel (UC Berkeley)
- Brian Nosek (University of Virginia): “Scientific Utopia: Improving Openness and Reproducibility in Scientific Research”
- Richard Ball (Haverford College): “Replicability of Empirical Research: Classroom Instruction and Professional Practice”
- Eva Vivalt (New York University): “Bias and Research Method: Evidence from 600 Studies”
- Aprajit Mahajan (UC Berkeley)
- Justin Wolfers (University of Michigan)
- Kate Casey (Stanford University)
More info here. Plus don’t miss the BITSS/COS Exhibition Booth at the John B. Hynes Convention Center (Level 2, Exhibition Hall D).
In a recent interview on The Signal, a Library of Congress blog, Richard Ball (Economics Professor at Haverford College and presenter at the 2014 BITSS Summer Institute) and Norm Medeiros (Associate Librarian at Haverford College) discussed Project TIER (Teaching Integrity in Empirical Research) and their experience teaching students how to document their empirical analyses.
What is Project TIER?
For close to a decade, we have been teaching our students how to assemble comprehensive documentation of the data management and analysis they do in the course of writing an original empirical research paper. Project TIER is an effort to reach out to instructors of undergraduate and graduate statistical methods classes in all the social sciences to share with them lessons we have learned from this experience.
What is the TIER documentation protocol?
We gradually developed detailed instructions describing all the components that should be included in the documentation and how they should be formatted and organized. We now refer to these instructions as the TIER documentation protocol. The protocol specifies a set of electronic files (including data, computer code and supporting information) that would be sufficient to allow an independent researcher to reproduce, easily and exactly, all the statistical results reported in the paper.
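One appealing property of such a protocol is that completeness can be checked mechanically. The sketch below, in Python, illustrates the idea with a hypothetical folder layout; the component names here are my own illustration, not the actual TIER specification.

```python
import os

# Hypothetical components of a replication package, loosely inspired by
# the idea behind the TIER protocol (the real protocol's layout differs).
REQUIRED = [
    "readme.txt",        # how to run everything, start to finish
    "original-data",     # raw data files, exactly as first obtained
    "command-files",     # scripts that clean the data and run the analysis
    "metadata",          # data sources, variable definitions, access dates
]

def missing_components(package_dir: str) -> list:
    """Return the required components absent from a replication package."""
    return [name for name in REQUIRED
            if not os.path.exists(os.path.join(package_dir, name))]

def is_complete_package(package_dir: str) -> bool:
    """A package is complete when every required component is present."""
    return not missing_components(package_dir)
```

An instructor, or a journal, could run such a check before accepting a submission; the substantive guarantee, of course, comes from the scripts actually reproducing the reported results.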
BITSS is pleased to announce its 3rd annual meeting (December 11-12 – Berkeley, CA).
This year’s research transparency meeting will be the first open to the public and is anticipated to be the largest BITSS event to date. The event will update the academic community on the growing movement for greater openness in research and serve as a forum for discussing a variety of transparency-related issues, such as changing journal practices, novel evidence of publication bias, and burgeoning related initiatives.
The gathering will cater to a range of guests, from seasoned transparency experts to new supporters of the transparency movement. As such, the event will consist of a variety of activities, including a collaborative training session geared towards disseminating new tools, a public conference with presentations from transparency leaders, and a research seminar showcasing the latest developments in the world of research transparency.
Confirmed speakers include Edward Miguel (CEGA’s Faculty Director), John Ioannidis from Stanford School of Medicine, Victoria Stodden from the University of Illinois and Brian Nosek from the Center for Open Science.
An eight-step strategy to increase the integrity and credibility of social science research using the new statistics, by Geoff Cumming.
We need to make substantial changes to how we conduct research. First, in response to heightened concern that our published research literature is incomplete and untrustworthy, we need new requirements to ensure research integrity. These include prespecification of studies whenever possible, avoidance of selection and other inappropriate data-analytic practices, complete reporting, and encouragement of replication. Second, in response to renewed recognition of the severe flaws of null-hypothesis significance testing (NHST), we need to shift from reliance on NHST to estimation and other preferred techniques. The new statistics refers to recommended practices, including estimation based on effect sizes, confidence intervals, and meta-analysis. The techniques are not new, but adopting them widely would be new for many researchers, as well as highly beneficial.
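To make the shift from NHST to estimation concrete, here is a minimal Python sketch of one of the recommended techniques, a fixed-effect meta-analysis: each study contributes an effect estimate and a standard error, and inverse-variance weighting pools them into a single estimate with a confidence interval. The study numbers are invented for illustration; they are not from Cumming's article.

```python
from math import sqrt
from statistics import NormalDist

def fixed_effect_meta(effects, ses, conf=0.95):
    """Inverse-variance-weighted pooled effect with a confidence interval.

    effects: per-study effect estimates (e.g. standardized mean differences)
    ses:     matching per-study standard errors
    """
    weights = [1 / se ** 2 for se in ses]          # precision weights
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se_pooled = sqrt(1 / sum(weights))
    z = NormalDist().inv_cdf(0.5 + conf / 2)       # e.g. 1.96 for a 95% CI
    return pooled, (pooled - z * se_pooled, pooled + z * se_pooled)

# Three hypothetical studies: individually noisy estimates pool into
# one estimate with a tighter interval than any single study's.
estimate, (low, high) = fixed_effect_meta([0.30, 0.10, 0.25], [0.15, 0.20, 0.10])
print(f"pooled effect {estimate:.3f}, 95% CI ({low:.3f}, {high:.3f})")
```

The output carries what a bare p-value does not: the magnitude of the effect and the precision with which it is estimated, which is exactly the reporting style the "new statistics" advocates.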
The full article is available here.
Psychological Science, the flagship journal of the Association for Psychological Science (APS), is introducing new guidelines for authors as part of an effort to strengthen the reporting and analysis of findings in psychological research.
Starting January 1, 2014, submitting authors will be required to state that they have disclosed all important methodological details, including excluded variables and additional manipulations and measures, as a way of encouraging methodological transparency […] Psychological Science will also serve as a launch vehicle for a program to promote open communication within the research community by recognizing authors who have made data, materials, and/or preregistered design and analysis plans publicly available with specific “badges.”