Research Practices for Scientific Rigor: A Resource for Discussion, Training, and Practice
Among human endeavors, science is unique because it yields progress: science advances our understanding of nature, yields new technologies, and improves human health. Progress depends on the consistent application of the highest standards in research methodologies, including experimental design, statistical analysis and reporting, and scientific communication, which we collectively refer to as "scientific rigor."
Recent scientific and public reports (e.g., 1–6) have raised concerns about lower-than-expected rates of replication, especially in preclinical research, and some have raised the serious question of whether the self-correcting nature of the scientific enterprise is being undercut. Funding agencies, scientific journals, professional organizations, and institutions have begun to examine the factors that underlie such concerns. Increasingly, the broad scientific community has become focused on the need to maintain and enhance rigor as part of our collective responsibility to the integrity of the scientific mission.
The 黑料社 (SfN), like many professional societies, is evaluating scientific rigor within our own field. The Society is committed to helping ensure that neuroscientists are well trained in best research practices, that those practices are consistently adhered to, and that methods and design issues are reported in such a way so as to permit appropriate evaluation of results and facilitate attempts at replication. The document that follows is intended to support SfN members in our shared commitment to adhere strongly to principles of scientific rigor.
Resources for Neuroscientists
The SfN Scientific Rigor Working Group has developed the following set of research practices to serve as a foundation for ongoing field discussion. Additionally, the working group and other SfN leaders are encouraging awareness and discussion of scientific rigor in neuroscience through many SfN scientific venues and training forums, ranging from the SfN annual meeting and our journals to professional development and training programs.
The Research Practices section below is not an exhaustive list; it is intended as a foundation that trainees and experienced scientists alike can reference and use as the basis for conversation, training, and practice. Some of these practices are already established or straightforward, whereas others are more complicated or unresolved. It may not always be possible to adhere strictly to every guideline, and the resulting research can still be rigorous. Given the complexity of these issues as they relate to individual research questions, SfN encourages full transparency, consideration, and communication regarding the implications of the choices made during research.
Policy Considerations
The Working Group stresses that the rigorous conduct of science can be influenced by other factors beyond the actual conduct of science. Factors that warrant ongoing discussion and action by the field include:
- support for publication of negative results or results deemed insufficiently "exciting" or "novel"7–10;
- avoidance of "rushing" findings into publication without full investigation and proper self-replication6,9;
- increasing incentives to retract incorrect or unreproducible findings9;
- providing incentives and/or funding to perform replications1,11–13;
- consideration of the proper balance between increasing numbers of animals for replication and the goals of "replacement, reduction, and refinement" in animal research14;
- minimization of incentives that drive research conducted for reasons other than pursuit of truth (academic promotions, "publish or perish")8; and
- consideration of ways to counter the emerging trend in the peer review process in which additional experiments are requested on abbreviated timelines and authors are pressured to interpret results in ways that conform to previously reached conclusions16.
Research Practices
The Working Group recommends consistent attention and discussion regarding the following practices:
Experimental Design includes subject selection, use of controls, and other methodological concerns.
| Topic | Possible Approaches |
| --- | --- |
| Unbiased sampling and data collection | |
| Experimental approach | |
| Thorough characterization of experimental effect | |
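For topics such as unbiased sampling, randomized allocation and blinding can be made explicit and auditable in an analysis script. The sketch below is purely illustrative (the function and naming scheme are our own, not drawn from any SfN resource): it assigns subjects to balanced groups and issues coded labels so that experimenters collecting data remain blind to group identity.

```python
import random

def randomize_and_blind(subject_ids, groups, seed=None):
    """Randomly assign subjects to balanced groups and return
    (allocation, blinding_key). Experimenters record data against the
    coded label only; the key is held back until analysis is complete."""
    rng = random.Random(seed)  # fixed seed makes the allocation auditable
    ids = list(subject_ids)
    rng.shuffle(ids)
    # Balanced assignment: cycle through groups over the shuffled order.
    allocation = {sid: groups[i % len(groups)] for i, sid in enumerate(ids)}
    # Blinded codes, shuffled so code order reveals nothing about groups.
    codes = [f"S{i:03d}" for i in range(len(ids))]
    rng.shuffle(codes)
    blinding_key = dict(zip(ids, codes))
    return allocation, blinding_key

alloc, key = randomize_and_blind(range(12), ["control", "treatment"], seed=1)
```

Recording the seed alongside the allocation lets a third party reproduce and verify the randomization after the fact.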
Data Analysis includes correct collection and analysis of data, and use of appropriate statistics and sample sizes7.
| Topic | Possible Approaches |
| --- | --- |
| Where possible, analyses should be pre-planned25 | |
| Post-experiment data analyses | |
| Statistical design18 | |
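Pre-planned statistical design often begins with an a priori sample-size calculation. As one illustration (not a prescribed SfN method), the sketch below uses the common normal-approximation formula for a two-sided, two-sample comparison of means, n ≈ 2((z₁₋α/₂ + z_power)/d)², using only the Python standard library:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sided, two-sample
    comparison of means, via the standard normal approximation."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value for two-sided test
    z_beta = z.inv_cdf(power)           # quantile corresponding to power
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# A "large" standardized effect (Cohen's d = 0.8) at conventional thresholds:
print(n_per_group(0.8))  # 25 per group under this approximation
```

Exact t-distribution calculations give slightly larger numbers; the point of the sketch is that the target sample size is fixed, and documented, before data collection rather than after.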
Transparency includes reporting, publishing, or providing access to specific data, methods, or analyses7.
| Topic | Possible Approaches |
| --- | --- |
| Data preservation | |
| Full transparency in data and methods reporting | |
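One concrete step toward data preservation is recording a checksum for each raw data file at the time of collection, so archived data can later be verified as unmodified. A minimal sketch (the file names and manifest layout are hypothetical, chosen only for illustration):

```python
import hashlib
import json

def make_manifest(files):
    """Return a JSON manifest mapping each file name to the SHA-256
    checksum of its contents. `files` maps a file name to its raw bytes;
    in practice the bytes would be read from disk at archiving time."""
    manifest = {name: hashlib.sha256(data).hexdigest()
                for name, data in files.items()}
    return json.dumps(manifest, indent=2, sort_keys=True)

manifest = make_manifest({"session1.csv": b"trial,response\n1,0.42\n"})
```

Storing the manifest alongside the archived data lets anyone, including the original authors years later, confirm that the files analyzed are byte-for-byte the files that were collected.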
Sources Cited
1. Wadman, M. Nature 500, 14–16 (2013).
2. Trouble at the lab. The Economist (2013).
3. Hiltzik, M. Science has lost its way, at a big cost to humanity. Los Angeles Times (2013).
4. Couzin-Frankel, J. Science 342, 68–69 (2013).
5. Collins, F. S. & Tabak, L. A. Nature 505, 612–613 (2014).
6. Zimmer, C. Rise in Scientific Journal Retractions Prompts Calls for Reform. The New York Times (2012).
7. Landis, S. C. et al. Nature 490, 187–191 (2012).
8. Begley, C. G. & Ellis, L. M. Nature 483, 531–533 (2012).
9. Fanelli, D. PLoS Med 10, e1001563 (2013).
10. Franco, A., Malhotra, N. & Simonovits, G. Science (2014). doi:10.1126/science.1255484
11. Steward, O., Popovich, P. G., Dietrich, W. D. & Kleitman, N. Exp. Neurol. 233, 597–605 (2012).
12. Button, K. S. et al. Nat. Rev. Neurosci. 14, 365–376 (2013).
13. Funder, D. C. et al. Personal. Soc. Psychol. Rev. 18, 3–12 (2014).
14. Fitts, D. A. J. Am. Assoc. Lab. Anim. Sci. 50, 445–453 (2011).
15. Ploegh, H. Nat. News 472, 391 (2011).
16. Rockman, H. A. J. Clin. Invest. 124, 463 (2014).
17. Ioannidis, J. P. A. PLoS Med 2, e124 (2005).
18. Begley, C. G. Nature 497, 433–434 (2013).
19. Nestler, E. J. & Hyman, S. E. Nat. Neurosci. 13, 1161–1169 (2010).
20. Wahlsten, D. et al. J. Neurobiol. 54, 283–311 (2003).
21. Willner, P. & Mitchell, P. J. Behav. Pharmacol. 13, 169–188 (2002).
22. Schulz, K. F., Altman, D. G. & Moher, D. BMC Med. 8, 18 (2010).
23. Lapchak, P. A. J. Neurol. Neurophysiol. 3 (2012).
24. De Souza, N. Nat. Methods 10, 288 (2013).
25. Ruxton, G. D. & Beauchamp, G. Behav. Ecol. 19, 690–693 (2008).
26. Simmons, J. P., Nelson, L. D. & Simonsohn, U. Psychol. Sci. 22, 1359–1366 (2011).
27. Cortina, J. M. & Landis, R. S. Organ. Res. Methods 14, 332–349 (2011).
28. Costafreda, S. G. Front. Neuroinformatics 3 (2009).
29. Demets, D. L. Stat. Med. 6, 341–348 (1987).
30. Machlis, L., Dodd, P. W. D. & Fentress, J. C. Z. Für Tierpsychol. 68, 201–214 (1985).
31. Kendziorski, C., Irizarry, R. A., Chen, K.-S., Haag, J. D. & Gould, M. N. Proc. Natl. Acad. Sci. U. S. A. 102, 4252–4257 (2005).
32. Imbeaud, S. & Auffray, C. Drug Discov. Today 10, 1175–1182 (2005).
33. Schafer, J. L. & Graham, J. W. Psychol. Methods 7, 147–177 (2002).
34. Graham, J. W., Cumsille, P. E. & Elek-Fisk, E. in Handbook of Psychology (John Wiley & Sons, Inc., 2003).
35. Twisk, J. & de Vente, W. J. Clin. Epidemiol. 55, 329–337 (2002).
36. Tenopir, C. et al. PLoS ONE 6, e21101 (2011).
37. Simard, J. M. & Gerzanich, V. Exp. Neurol. 233, 623–624 (2012).
38. Viswanathan, M. et al. J. Clin. Epidemiol. (2014). doi:10.1016/j.jclinepi.2014.02.023