Education Resources
Jump to a section: Research Quality Essentials · Sample Size & Power · Statistics · Preregistration · Bias & QRPs · Reporting & Open Materials · Open Data & Sharing · Sport Science Specific · Teaching · James Steele’s Reading List
Research Quality Essentials
Foundational resources for understanding why research quality, replication, and meta-science matter. Start here if you are new to open science.
Key Resources
Improving Your Statistical Inferences – Daniel Lakens. A free, regularly updated open textbook covering p-values, effect sizes, confidence intervals, Bayesian statistics, equivalence testing, and meta-analysis. Includes interactive exercises. The gold-standard starting point for any researcher wanting to improve their statistical reasoning. Also available as a free Coursera course. https://lakens.github.io/statistical_inferences/
A comprehensive, community-curated catalogue of open science educational resources, organised by topic. An excellent discovery tool for finding tutorials, readings, and teaching materials across all aspects of open research practice.
Caldwell et al. (2020) — Moving sport and exercise science forward: a call for the adoption of more transparent research practices. Sports Medicine, 50(3), 449–459. A field-specific call to action arguing for preregistration, open data, and registered reports in sport science. Essential reading for understanding the context of this centre’s work. https://doi.org/10.1007/s40279-019-01227-1
Halperin et al. (2018) — Strengthening the practice of exercise and sport-science research. International Journal of Sports Physiology and Performance, 13, 127–134. Identifies the specific methodological weaknesses prevalent in sport science (small samples, underpowered designs, publication bias) and proposes practical solutions. https://doi.org/10.1123/ijspp.2017-0322
Nosek & Errington (2020) — What is replication? PLoS Biology, 18(3), e3000691. A conceptually clear and concise paper defining what replication is, what it can and cannot tell us, and why it is the cornerstone of scientific progress. https://doi.org/10.1371/journal.pbio.3000691
Heneghan et al. (2012) — Forty years of sports performance research and little insight gained. BMJ, 345, e4797. An early and influential paper documenting the poor methodological quality of sports science research across four decades. https://doi.org/10.1136/bmj.e4797
Sports Metaresearch: An Emerging Discipline — Warmenhoven et al. (2025). Sports Medicine, 55, 845–856. Introduces metaresearch — research on research — as a formal sub-discipline of sport science, with a framework for evaluating and improving research quality in the field. https://doi.org/10.1007/s40279-025-02181-x
Sample Size, Power, and Precision
Tools and explanations for planning informative studies and avoiding underpowered research. Underpowered studies are one of the most pervasive problems in sports science.
Key Resources
Understanding Statistical Power – Kristoffer Magnusson (R Psychologist). An interactive visualisation tool that lets you explore how sample size, effect size, and significance threshold relate to statistical power. Ideal for building intuition for why small studies are so often misleading. The full suite of interactive visualisations is available at https://rpsychologist.com.
Sample Size Justification — Daniel Lakens (2022). Collabra: Psychology, 8(1). A practical paper covering how to justify sample size using power analysis, precision, or other criteria. A must-read before designing any study. https://doi.org/10.1525/collabra.33267
G*Power — Free power analysis software for a wide range of statistical tests. The most widely used standalone tool for a priori power calculations. Available for Windows and Mac. https://www.psychologie.hhu.de/arbeitsgruppen/allgemeine-psychologie-und-arbeitspsychologie/gpower
Superpower — Aaron Caldwell & Daniel Lakens. R package for power analysis in factorial designs. Particularly useful for sport science designs with multiple conditions or outcome variables. Includes tutorials and a Shiny app. https://aaroncaldwell.us/Superpower/
pwr — R package for basic power analysis. Covers t-tests, ANOVA, correlation, and proportion tests. Straightforward and well-documented. https://cran.r-project.org/package=pwr
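For a quick sanity check on the output of any of the tools above, the standard normal-approximation formula for a two-sided, two-sample comparison can be computed directly. The sketch below is illustrative only (it is not code from G*Power, Superpower, or pwr, and the function name is invented for this example); it typically lands one or two participants below what t-based tools report, because it omits the small-sample correction.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sided, two-sample
    comparison using the normal approximation: n = 2 * ((z_a + z_b) / d)^2.
    Dedicated tools (e.g. G*Power) add a small-sample t correction,
    so expect their answer to be slightly larger."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)   # critical value for two-sided alpha
    z_beta = z(power)            # quantile corresponding to desired power
    return ceil(2 * ((z_alpha + z_beta) / d) ** 2)

# A "medium" effect (d = 0.5) at alpha = .05 and 80% power
# works out to roughly 63-64 participants per group.
print(n_per_group(0.5))
```

Running the same numbers through G*Power or pwr is a good exercise: the small discrepancy you will see is exactly the t correction the approximation leaves out.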
Statistics You Can Actually Interpret
Visual and conceptual explanations of core statistical ideas that go beyond p-values. Understanding what your results actually mean is the foundation of good science.
Key Resources
Guide to Effect Sizes and Confidence Intervals – Jané et al. A free, openly accessible guide covering how to compute, interpret, and report effect sizes and confidence intervals across a wide range of study designs. Includes R code. An excellent practical companion to statistical textbooks.
Greenland et al. (2016) — Statistical tests, P values, confidence intervals, and power: a guide to misinterpretations. European Journal of Epidemiology, 31, 337–350. A landmark paper enumerating 25 common misinterpretations of statistical concepts. Read this to identify and correct errors in your own thinking and in the published literature. https://doi.org/10.1007/s10654-016-0149-3
Understanding Confidence Intervals — Chapter from Lakens’ textbook. https://lakens.github.io/statistical_inferences/07-CI.html
ASA Statement on p-values — Wasserstein & Lazar (2016). The American Statistician, 70(2), 129–133. The American Statistical Association’s official position statement on the proper use and interpretation of p-values. Short and essential. https://doi.org/10.1080/00031305.2016.1154108
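To make the effect-size-plus-interval habit concrete, here is a minimal sketch (illustrative only; not taken from any of the resources above, and the function name is invented for this example) that computes Cohen's d for two independent groups together with a large-sample confidence interval, using the common Hedges & Olkin approximation for the standard error of d.

```python
from math import sqrt
from statistics import mean, stdev, NormalDist

def cohens_d_ci(group1, group2, alpha=0.05):
    """Cohen's d for two independent groups, with an approximate
    (1 - alpha) confidence interval based on the large-sample
    standard error of d (Hedges & Olkin approximation)."""
    n1, n2 = len(group1), len(group2)
    # Pooled standard deviation across the two groups
    sp = sqrt(((n1 - 1) * stdev(group1) ** 2 +
               (n2 - 1) * stdev(group2) ** 2) / (n1 + n2 - 2))
    d = (mean(group1) - mean(group2)) / sp
    # Large-sample standard error of d
    se = sqrt((n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2)))
    z = NormalDist().inv_cdf(1 - alpha / 2)
    return d, (d - z * se, d + z * se)

# Toy data: report the interval, not just the point estimate.
d, ci = cohens_d_ci([5, 6, 7, 8, 9], [3, 4, 5, 6, 7])
```

Note how wide the interval is with only five observations per group: small samples leave the effect size almost uninformative, which connects interpretation back to the sample-size planning material above.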
Preregistration & Registered Reports
Separating hypothesis-generation from hypothesis-testing is one of the most powerful tools for improving the credibility of research findings.
Key Resources
Registered Reports – Center for Open Science. Registered Reports are a publication format in which the study design and analysis plan are peer-reviewed before data collection. Acceptance is based on the quality of the question and methods, not the outcome. Over 300 journals now offer this format; the COS page explains it and lists participating journals. https://www.cos.io/initiatives/registered-reports
Open Science Framework – Preregistration Guide. Step-by-step instructions for preregistering your study on the OSF. Includes templates for different study types. https://help.osf.io/article/330-welcome-to-registrations
AsPredicted — A simple, structured preregistration template widely used in psychology and increasingly in sport science. https://aspredicted.org
Choosing the Right Preregistration Template — COS Blog. Guidance on which preregistration template is most appropriate for your study design (RCT, observational, qualitative, etc.). https://www.cos.io/blog/choosing-preregistration-template-guide-for-researchers
Preregistration: A Plan, Not a Prison — DeHaven (2017). Center for Open Science Blog. A brief explainer addressing the common misconception that preregistration prevents exploratory analysis. https://www.cos.io/blog/preregistration-plan-not-prison
Bias, QRPs, and Researcher Degrees of Freedom
Understanding how bias arises — even in well-intentioned research — is critical for interpreting the existing literature and for designing better studies.
Key Resources
Simmons, Nelson & Simonsohn (2011) – False-Positive Psychology. Psychological Science, 22(11), 1359–1366. The paper that launched a generation of reform. Demonstrates with simulations how common "researcher degrees of freedom" — flexible stopping rules, covariate selection, outcome switching — can produce almost any desired p-value. Required reading for all researchers. https://doi.org/10.1177/0956797611417632
Big Little Lies: A Compendium and Simulation of p-Hacking Strategies — Stefan & Schönbrodt (2023). Royal Society Open Science, 10(2), 220346. Catalogues 12 specific p-hacking strategies and simulates their effect on false-positive rates. Valuable for understanding exactly how bias enters the literature. https://doi.org/10.1098/rsos.220346
The Garden of Forking Paths — Gelman & Loken (2013). Explains how researchers make many implicit analysis decisions that collectively inflate false-positive rates, even without any intention to deceive. http://www.stat.columbia.edu/~gelman/research/unpublished/p_hacking.pdf
Wicherts et al. (2016) — Degrees of Freedom in Planning, Running, Analyzing, and Reporting Psychological Studies: A Checklist to Avoid p-Hacking. Frontiers in Psychology, 7, 1832. A systematic taxonomy of 34 researcher degrees of freedom across all stages of a study. A useful checklist for identifying potential sources of bias in your own work. https://doi.org/10.3389/fpsyg.2016.01832
Why Most Published Research Findings Are False — Ioannidis (2005). PLoS Medicine, 2(8), e124. A foundational paper using probability theory to show how combinations of low power, multiple testing, and publication bias can make most published results unreliable. Provocative but important. https://doi.org/10.1371/journal.pmed.0020124
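The kind of simulation these papers run is easy to reproduce. The sketch below is an illustrative toy (not code from any of the cited papers; the function names are invented for this example) simulating one common researcher degree of freedom, optional stopping: test at n = 20, and if the result is not significant, collect 20 more observations and test again. Every dataset is pure noise, yet the realised false-positive rate climbs well above the nominal 5%, to roughly 8%.

```python
import random
from math import sqrt
from statistics import NormalDist, mean

def p_value(sample):
    """Two-sided one-sample z-test of mu = 0 (known sigma = 1)."""
    z = mean(sample) * sqrt(len(sample))
    return 2 * (1 - NormalDist().cdf(abs(z)))

def false_positive_rate(n_sims=20000, seed=1):
    """Simulate optional stopping under the null hypothesis:
    peek at n = 20, and if p >= .05, add 20 observations and re-test.
    Counts a 'finding' if either look is significant."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        data = [rng.gauss(0, 1) for _ in range(20)]   # pure noise
        if p_value(data) < 0.05:                      # first look
            hits += 1
            continue
        data += [rng.gauss(0, 1) for _ in range(20)]  # "just a few more"
        if p_value(data) < 0.05:                      # second look
            hits += 1
    return hits / n_sims
```

Adding more looks, more outcome variables, or flexible covariates inflates the rate further, which is precisely the point Simmons and colleagues make with their simulations.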
Transparent Reporting & Open Materials
Following established reporting guidelines and sharing your materials openly increases the trustworthiness and reusability of your work.
Key Resources
EQUATOR Network – The definitive home for reporting guidelines across health research. Includes CONSORT (randomised trials), STROBE (observational studies), PRISMA (systematic reviews), and many more. Use the guideline library to find the appropriate checklist for your study design. https://www.equator-network.org
CONSORT 2010 — Reporting guideline for randomised controlled trials. https://www.consort-statement.org
STROBE — Reporting guideline for observational studies in epidemiology. https://www.strobe-statement.org
PRISMA 2020 — Updated reporting guideline for systematic reviews and meta-analyses. The 2020 revision includes new guidance on risk of bias assessment and evidence synthesis methods. https://www.prisma-statement.org/
PERSiST Guidance — Implementing PRISMA 2020 in sport, exercise, and rehabilitation research. A sport-science-specific adaptation of PRISMA 2020, with examples from the field. Essential for anyone writing a systematic review in sport science. https://pmc.ncbi.nlm.nih.gov/articles/PMC8862073/
Open Science Framework – Materials Sharing. Guidance from the OSF help centre on sharing study materials and other project files. https://help.osf.io
Open Data, Code & Sharing
Sharing your data, code, and materials enables verification, reuse, and cumulative science. It is increasingly required by funders and journals.
Key Resources
Open Science Framework (OSF) – A free, open platform for managing your entire research workflow — from preregistration through data collection to publishing outputs. Supports collaboration, version control, and integration with GitHub, Dropbox, and other tools. The OSF provides persistent DOIs for all shared materials. https://osf.io
Using OSF to Share Data: A Step-by-Step Guide — Soderberg (2018). Advances in Methods and Practices in Psychological Science, 1(1), 115–120. A clear, practical walkthrough of how to share data on the OSF, including how to structure files, write README documents, and assign licences. https://doi.org/10.1177/2515245918757689
Zenodo — A free, general-purpose open repository from CERN and OpenAIRE. Accepts any file format and research output type. All deposits receive a DOI. Particularly useful for large datasets or software. https://zenodo.org
figshare — Another widely used platform for sharing data, figures, and supplementary materials. https://figshare.com
Open Science Interventions to Improve Reproducibility — Dudda et al. (2025). Royal Society Open Science, 12, 242057. A scoping review of which open science practices have empirical evidence of actually improving reproducibility. Useful for prioritising your efforts. https://doi.org/10.1098/rsos.242057
Sport Science Specific Resources
Organisations, journals, and initiatives focused specifically on open and transparent science within sport, exercise, and kinesiology.
Key Resources
Society for Transparency, Openness and Replication in Kinesiology (STORK) – The leading organisation for open science in sport, exercise, and kinesiology. STORK runs open-access journals (Communications in Kinesiology; Reports in Sport and Exercise), supports preregistration, and provides a community for researchers committed to improving research quality in the field. https://storkinesiology.org
SportRxiv — The open-access preprint server for sport, exercise, performance, and health research. Post your manuscripts for free public access before (or after) journal publication. Increases visibility and enables rapid dissemination. https://sportrxiv.org
Communications in Kinesiology (CiK) — The flagship open-access journal of STORK. Publishes methodologically rigorous, transparent research across all kinesiology disciplines, including tutorials and registered reports among its article types. Free to read and publish. https://storkjournals.org/index.php/cik
Replication Concerns in Sports and Exercise Science — Mesquida, Murphy, Lakens & Warne (2022). Royal Society Open Science, 9, 220946. A narrative review of the specific methodological problems that reduce replicability in sport science — including low power, p-hacking, and flexible designs. https://doi.org/10.1098/rsos.220946
Teaching Research Quality in Sports Science
Resources for lecturers, supervisors, and educators who want to embed open science principles in their teaching.
Resources
FORRT Teaching Hub. Community-contributed syllabi, lesson plans, and teaching materials for open science across disciplines. Includes resources specifically for sport and health science. https://forrt.org/teaching/
OSF – Open & Reproducible Methods Syllabi. A collection of course syllabi from researchers who teach open science and research methods. https://osf.io/vkhbt/
R Psychologist – Interactive Teaching Visuals. A suite of beautiful, interactive visualisations for teaching core statistical concepts (power, p-values, correlation, Bayesian inference). Free to use in lectures. https://rpsychologist.com
Essentials of Exercise and Sport Psychology: An Open Access Textbook — STORK. A free, openly licensed textbook for sport and exercise psychology, produced under the STORK initiative. https://kinesiologybooks.org/index.php/stork/catalog/book/10
James Steele’s Recommended Reading
A curated reading list from James Steele, focusing on the philosophy of science, theory building, and statistical reasoning — the deeper foundations that underpin good research practice.
His Own Work
The introductory chapter for the forthcoming STORK open-access textbook on research methods in sport and exercise science. Covers the philosophical foundations of science, what it means to build and test theory, and why transparent research practices follow naturally from a coherent philosophy of science.
Philosophy by Stealth: A Periodisation Chapter — Steele (preprint). An example of smuggling rigorous philosophy of science into a practical applied topic — periodisation. Demonstrates how philosophical thinking about causation, mechanism, and theory can sharpen even the most applied research questions. https://sportrxiv.org/index.php/server/preprint/view/323
Theory Development and Testing — Talk (2024) — James Steele (YouTube). A recorded lecture using recent empirical work as a live example of theory development and testing in practice. Accessible and thought-provoking. Additional short videos on related topics are available on the same channel. https://youtu.be/39Ajm1dwFx0
Last updated: March 2026. To suggest a resource, please contact us.