An Investigation into the Barriers to Replicability and Reproducibility in Sports and Exercise Science Research

Research output: Thesis › PhD Thesis 1 (Research TU/e / Graduation TU/e)


Abstract

The first study presented a narrative review of known threats to replicability, such as questionable research practices and publication bias, which in combination with low statistical power can inflate false positives and effect sizes. While signs of these issues, such as a high proportion of significant findings and small sample sizes, have been noted in sports and exercise science, their full implications have not been adequately addressed. The study discussed how these factors reduce replicability and proposed strategies to improve transparency and rigor. In the second study, we examined the statistical power, prevalence of a priori power analyses, and reporting practices of studies published in the Journal of Sports Sciences. A meta-analysis of 89 independent p-values using z-curve analysis revealed an average statistical power of 53%. The Observed Discovery Rate (73%) exceeded the upper bound of the Expected Discovery Rate (71%), indicating the presence of publication bias. In a broader sample of 179 studies, only 46 (26%) included an a priori power analysis, the reporting and reproducibility of these power analyses were poor, and key statistical information, such as test statistics, effect sizes, and confidence intervals, was often inadequately reported. The inclusion of a power analysis did not lead to larger sample sizes, suggesting that many were poorly conducted. These shortcomings highlight the need for improved methodological rigor and transparency in sports science research. In the third study, a z-curve analysis of 269 independent p-values, each corresponding to the tested hypothesis in one of 269 studies published across 10 journals, revealed strong evidence of publication bias and low statistical power in the field. The Observed Discovery Rate (68%) exceeded the upper bound of the Expected Discovery Rate (34%), indicating the presence of publication bias.
The average statistical power was only 11%, and the expected replication rate was just 49%, suggesting that many published findings are unlikely to replicate. The fourth study assessed the reporting practices, reproducibility, and quality of a priori power analyses. Among 350 hypothesis-testing studies, only 41% reported a power analysis. Of these, many were incomplete or ambiguous: 25% were fully non-reproducible, 37% non-reproducible, and only 15% fully reproducible. Even among studies using G*Power software, default settings were often used without proper justification, leading to underestimated sample sizes. Crucially, the inclusion of a power analysis did not lead to larger sample sizes or a higher proportion of supported hypotheses, suggesting that many were poorly conducted. In the fifth study, the concept of hypothesis misalignment was introduced. While collecting data for the previous two studies, it became evident that many studies used a hypothesis test that did not align with their stated hypothesis. This misalignment, where the hypothesis being tested did not correspond to the hypothesis of interest, can lead to misleading claims. The study deconstructed various examples to identify six types of misalignment and provided practical guidance to help researchers test their hypotheses with appropriate hypothesis tests. The final study is a narrative review focused on the quality and transparency of meta-analyses published in Sports Medicine. Although the initial aim was to quantify effect-size heterogeneity, the overestimation of meta-analytic effect sizes, and the average statistical power of studies included in meta-analyses, the study was redirected due to the poor methodological rigor and transparency of the meta-analyses assessed. Our findings highlight the urgent need for more rigorous and transparent practices in meta-analyses.
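To illustrate what a fully specified, reproducible a priori power analysis reports, the sketch below computes a per-group sample size for a two-group comparison. This is a hypothetical example using the statsmodels library, not code or values from the thesis; the effect size, alpha, and power are placeholder inputs that a real study would need to justify.

```python
# Hypothetical a priori power analysis for an independent-samples t-test.
# A reproducible report states all the inputs (effect size, alpha, desired
# power, test and tails) alongside the resulting sample size.
from math import ceil

from statsmodels.stats.power import TTestIndPower

effect_size = 0.5   # assumed standardized mean difference (Cohen's d)
alpha = 0.05        # two-sided significance level
power = 0.80        # desired statistical power

# solve_power returns the (possibly fractional) sample size per group
n_per_group = TTestIndPower().solve_power(
    effect_size=effect_size,
    alpha=alpha,
    power=power,
    alternative="two-sided",
)
print(ceil(n_per_group))  # round up to whole participants per group
```

With these inputs the required sample size is about 64 participants per group; omitting any one of the inputs, or silently relying on a tool's defaults, is what makes such analyses non-reproducible.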
In conclusion, this PhD thesis reveals that sports and exercise science faces significant challenges to reproducibility and replicability. Key issues include publication bias, underpowered study designs, poor reporting practices, hypothesis misalignment, and low methodological quality in meta-analyses. Addressing these problems will require a collective effort from the research community, including the adoption of improved reporting standards, wider use of preregistration and Registered Reports, and enhanced statistical training. These steps are essential to reduce bias in the published literature, strengthen the rigor of a priori power analyses, ensure appropriate hypothesis testing, and improve the quality and transparency of meta-analytic work in the field.
Original language: English
Qualification: Doctor of Philosophy
Awarding Institution
  • Industrial Engineering and Innovation Sciences
Supervisors/Advisors
  • Lakens, Daniël, Promotor
  • Snijders, Chris C.P., Promotor
  • Warne, Joe, Copromotor, External person
Award date: 8 Jan 2026
Place of Publication: Eindhoven
Publisher
Print ISBNs: 978-90-386-6571-9
Publication status: Published - 8 Jan 2026

Bibliographical note

Proefschrift (doctoral thesis).
