Preprint / Version 1

Assessing the evidential value of mental fatigue and exercise research




Keywords: mental fatigue, meta-analysis, metascience, mental exertion


It has been repeatedly reported that mental fatigue can negatively affect exercise performance, but recent findings have questioned the strength of the effect. To further complicate this issue, an overlooked problem might be the presence of publication bias in a body of literature built on underpowered designs, a combination known to inflate the false positive report probability and effect size estimates. Altogether, such bias is likely to reduce the evidential value of the published literature on this topic, although to what extent is unknown. The purpose of the current work was to assess the evidential value of the studies published to date on the effect of mental fatigue on exercise performance by testing for publication bias and estimating the observed statistical power achieved by these studies. A traditional meta-analysis revealed a Cohen’s dz effect size of –0.49, 95% CI [–0.63, –0.34], p < 0.001. However, when we applied methods for estimating and correcting for publication bias and small-study effects, the bias-corrected effect size decreased to –0.17. Furthermore, the median observed statistical power, assuming the meta-analytic effect size (–0.49) as the true effect, was 34% (min = 16%, max = 97%), indicating that on average these studies had only a 34% chance of observing a significant result if the true effect was dz = –0.49. If the adjusted effect size (–0.17) was assumed as the true effect, the median statistical power dropped to just 9%. We conclude that the literature on the mental fatigue effect is a useful case study for illustrating the dangers of small-study effects.
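The power figures in the abstract follow from a paired-design noncentral-t calculation: power depends only on Cohen's dz, the number of participant pairs, and alpha. As a rough illustration (not the authors' actual analysis, which used dedicated R packages such as metameta), the following Python sketch computes two-sided power for a paired-samples t-test; the sample size n = 12 is a hypothetical value chosen for illustration, not one taken from the paper.

```python
# Sketch of the kind of power calculation described in the abstract: two-sided
# power of a paired-samples t-test for a given Cohen's dz. n = 12 pairs is a
# hypothetical, illustrative sample size.
from scipy.stats import nct, t

def paired_power(dz: float, n: int, alpha: float = 0.05) -> float:
    """Power of a two-sided paired t-test with effect size dz and n pairs."""
    df = n - 1
    ncp = abs(dz) * n ** 0.5           # noncentrality parameter: dz * sqrt(n)
    t_crit = t.ppf(1 - alpha / 2, df)  # critical value under H0
    # Probability of rejecting H0 when the test statistic follows a
    # noncentral t distribution with the alternative's noncentrality
    return (1 - nct.cdf(t_crit, df, ncp)) + nct.cdf(-t_crit, df, ncp)

print(paired_power(0.49, 12))  # roughly 0.34 under these assumed inputs
print(paired_power(0.17, 12))  # collapses to under 0.10: severely underpowered
```

Under these assumptions, power near the meta-analytic estimate of –0.49 sits close to the reported median of 34%, while the bias-corrected –0.17 drops it below 10%; exact per-study values depend on each study's actual sample size.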




Giboin L-S, Wolff W. 2019 The effect of ego depletion or mental fatigue on subsequent physical endurance performance: A meta-analysis. Perform. Enhanc. Health 7, 100150. (doi:10.1016/j.peh.2019.100150)

Brown DMY, Graham JD, Innes KI, Harris S, Flemington A, Bray SR. 2019 Effects of Prior Cognitive Exertion on Physical Performance: A Systematic Review and Meta-analysis. Sports Med. (doi:10.1007/s40279-019-01204-8)

Holgado D, Sanabria D, Perales JC, Vadillo MA. 2020 Mental fatigue might be not so bad for exercise performance after all: a systematic review and bias-sensitive meta-analysis. J. Cogn. 3, 1–14.

Van Cutsem J, Marcora S, De Pauw K, Bailey S, Meeusen R, Roelands B. 2017 The Effects of Mental Fatigue on Physical Performance: A Systematic Review. Sports Med. 47, 1569–1588. (doi:10.1007/s40279-016-0672-0)

Holgado D, Troya E, Perales JC, Vadillo M, Sanabria D. 2020 Does mental fatigue impair physical performance? A replication study. Eur. J. Sport Sci. (doi:10.1080/17461391.2020.1781265)

Marcora S, Staiano W, Manning V. 2009 Mental fatigue impairs physical performance in humans. J. Appl. Physiol. 106, 857–864. (doi:10.1152/japplphysiol.91324.2008)

Earp BD, Trafimow D. 2015 Replication, falsification, and the crisis of confidence in social psychology. Front. Psychol. 6, 621. (doi:10.3389/fpsyg.2015.00621)

Hagger MS et al. 2016 A Multilab Preregistered Replication of the Ego-Depletion Effect. Perspect. Psychol. Sci. J. Assoc. Psychol. Sci. 11, 546–573. (doi:10.1177/1745691616652873)

Simmons JP, Nelson LD, Simonsohn U. 2011 False-positive psychology: undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychol. Sci. 22, 1359–1366.

Bakker M, van Dijk A, Wicherts JM. 2012 The Rules of the Game Called Psychological Science. Perspect. Psychol. Sci. 7, 543–554.

Stefan A, Schönbrodt F. 2022 Big Little Lies: A Compendium and Simulation of p-Hacking Strategies. Preprint.

Button KS, Ioannidis JPA, Mokrysz C, Nosek BA, Flint J, Robinson ESJ, Munafò MR. 2013 Power failure: why small sample size undermines the reliability of neuroscience. Nat. Rev. Neurosci. 14, 365–376. (doi:10.1038/nrn3475)

Anderson SF, Kelley K, Maxwell SE. 2017 Sample-Size Planning for More Accurate Statistical Power: A Method Adjusting Sample Effect Sizes for Publication Bias and Uncertainty. Psychol. Sci. 28, 1547–1562. (doi:10.1177/0956797617723724)

Cumming G. 2011 Understanding The New Statistics: Effect Sizes, Confidence Intervals, and Meta-Analysis. New York: Routledge. (doi:10.4324/9780203807002)

Carter EC, Kofler LM, Forster DE, McCullough ME. 2015 A series of meta-analytic tests of the depletion effect: Self-control does not seem to rely on a limited resource. J. Exp. Psychol. Gen. 144, 796–815. (doi:10.1037/xge0000083)

Kvarven A, Strømland E, Johannesson M. 2020 Comparing meta-analyses and preregistered multiple-laboratory replication projects. Nat. Hum. Behav. 4, 423–434. (doi:10.1038/s41562-019-0787-z)

Lakens D. 2015 What p-hacking really looks like: A comment on Masicampo and LaLande (2012). Q. J. Exp. Psychol. 68, 829–832. (doi:10.1080/17470218.2014.982664)

Simmons JP, Simonsohn U. 2017 Power Posing: P-Curving the Evidence. Psychol. Sci. 28, 687–693. (doi:10.1177/0956797616658563)

McKay B, Bacelar M, Parma JO, Miller MW, Carter MJ. 2022 The combination of reporting bias and underpowered study designs have substantially exaggerated the motor learning benefits of self-controlled practice and enhanced expectancies: A meta-analysis. Preprint.

McMorris T, Barwood M, Hale BJ, Dicks M, Corbett J. 2018 Cognitive fatigue effects on physical performance: A systematic review and meta-analysis. Physiol. Behav. 188, 103–107. (doi:10.1016/j.physbeh.2018.01.029)

Carter EC, Schönbrodt FD, Gervais WM, Hilgard J. 2019 Correcting for bias in psychology: A comparison of meta-analytic methods. Adv. Methods Pract. Psychol. Sci. 2, 115–144. (doi:10.1177/2515245919847196)

McShane BB, Böckenholt U, Hansen KT. 2016 Adjusting for Publication Bias in Meta-Analysis: An Evaluation of Selection Methods and Some Cautionary Notes. Perspect. Psychol. Sci. J. Assoc. Psychol. Sci. 11, 730–749. (doi:10.1177/1745691616662243)

Sladekova M, Webb LEA, Field AP. 2022 Estimating the change in meta-analytic effect size estimates after the application of publication bias adjustment methods. Psychol. Methods (doi:10.1037/met0000470)


Viechtbauer W. 2010 Conducting Meta-Analyses in R with the metafor Package. J Stat Softw 36, 1–48. (doi:10.18637/jss.v036.i03)

Stanley TD. 2017 Limitations of PET-PEESE and other meta-analysis methods. Soc. Psychol. Personal. Sci. 8, 581–591. (doi:10.1177/1948550617693062)

Hong S, Reed WR. 2021 Using Monte Carlo experiments to select meta-analytic estimators. Res. Synth. Methods 12, 192–215. (doi:10.1002/jrsm.1467)

Ciria LF, Román-Caballero R, Vadillo M, Holgado D, Luque-Casado A, Perakakis P, Sanabria D. 2022 A call to rethink the cognitive benefits of physical exercise: An umbrella review of randomized controlled trials. bioRxiv, 2022.02.15.480508. (doi:10.1101/2022.02.15.480508)

Coburn KM, Vevea JL. 2015 Publication bias as a function of study characteristics. Psychol. Methods 20, 310–330. (doi:10.1037/met0000046)

Kepes S, Banks GC, McDaniel M, Whetzel DL. 2012 Publication Bias in the Organizational Sciences. Organ. Res. Methods 15, 624–662. (doi:10.1177/1094428112452760)

Bartoš F, Maier M, Quintana DS, Wagenmakers E-J. 2022 Adjusting for Publication Bias in JASP and R: Selection Models, PET-PEESE, and Robust Bayesian Meta-Analysis. Adv. Methods Pract. Psychol. Sci. 5, 25152459221109260. (doi:10.1177/25152459221109259)

Coburn KM, Vevea JL. 2019 weightr: Estimating Weight-Function Models for Publication Bias.

Lin L, Chu H. 2018 Quantifying publication bias in meta-analysis. Biometrics 74, 785–794. (doi:10.1111/biom.12817)

Lin L, Rosenberger KJ, Shi L, Wang Y, Chu H. 2022 altmeta: Alternative Meta-Analysis Methods.

Schwarzer G, Carpenter JR, Rücker G. 2022 metasens: Statistical Methods for Sensitivity Analysis in Meta-Analysis.

Bartoš F, Schimmack U. 2022 Z-curve 2.0: Estimating Replication Rates and Discovery Rates. Meta-Psychol. 6. (doi:10.15626/MP.2021.2720)

Rücker G, Schwarzer G, Carpenter JR, Binder H, Schumacher M. 2011 Treatment-effect estimates adjusted for small-study effects via a limit meta-analysis. Biostat. Oxf. Engl. 12, 122–142. (doi:10.1093/biostatistics/kxq046)

Quintana D. 2022 metameta: A Meta-meta-analysis Package for R.

Borg DN, Bon J, Sainani KL, Baguley BJ, Tierney N, Drovandi C. 2020 Sharing Data and Code: A Comment on the Call for the Adoption of More Transparent Research Practices in Sport and Exercise Science. Preprint.

Pageaux B, Lepers R, Dietz KC, Marcora SM. 2014 Response inhibition impairs subsequent self-paced endurance performance. Eur. J. Appl. Physiol. 114, 1095–1105. (doi:10.1007/s00421-014-2838-5)

Penna EM, Filho E, Wanner SP, Campos BT, Quinan GR, Mendes TT, Smith MR, Prado LS. 2018 Mental Fatigue Impairs Physical Performance in Young Swimmers. Pediatr. Exerc. Sci. 30, 208–215. (doi:10.1123/pes.2017-0128)

Thornton A, Lee P. 2000 Publication bias in meta-analysis: its causes and consequences. J. Clin. Epidemiol. 53, 207–216. (doi:10.1016/S0895-4356(99)00161-4)

Borg DN, Barnett A, Caldwell AR, White N, Stewart I. 2022 The Bias for Statistical Significance in Sport and Exercise Medicine. Preprint.

Wicherts JM, Veldkamp CLS, Augusteijn HEM, Bakker M, van Aert RCM, van Assen MALM. 2016 Degrees of Freedom in Planning, Running, Analyzing, and Reporting Psychological Studies: A Checklist to Avoid p-Hacking. Front. Psychol. 7, 1832. (doi:10.3389/fpsyg.2016.01832)

Cohen J. 1962 The statistical power of abnormal-social psychological research: a review. J. Abnorm. Soc. Psychol. 65, 145–153.


Szucs D, Ioannidis JPA. 2017 Empirical assessment of published effect sizes and power in the recent cognitive neuroscience and psychology literature. PLoS Biol. 15, e2000797. (doi:10.1371/journal.pbio.2000797)

Maxwell SE, Lau MY, Howard GS. 2015 Is psychology suffering from a replication crisis? What does ‘failure to replicate’ really mean? Am. Psychol. 70, 487–498. (doi:10.1037/a0039400)

Open Science Collaboration. 2015 Estimating the reproducibility of psychological science. Science 349, aac4716. (doi:10.1126/science.aac4716)

Murphy J, Mesquida C, Caldwell AR, Earp BD, Warne JP. 2022 Proposal of a Selection Protocol for Replication of Studies in Sports and Exercise Science. Sports Med. (doi:10.1007/s40279-022-01749-1)

Camerer CF et al. 2016 Evaluating replicability of laboratory experiments in economics. Science 351, 1433–1436. (doi:10.1126/science.aaf0918)

Maxwell SE, Kelley K, Rausch JR. 2008 Sample size planning for statistical power and accuracy in parameter estimation. Annu. Rev. Psychol. 59, 537–563. (doi:10.1146/annurev.psych.59.103006.093735)

Ioannidis JPA. 2005 Why Most Published Research Findings Are False. PLOS Med. 2, e124. (doi:10.1371/journal.pmed.0020124)

Brysbaert M. 2019 How many participants do we have to include in properly powered experiments? A tutorial of power analysis with reference tables. J. Cogn. 2, 16. (doi:10.5334/joc.72)

Higginson AD, Munafò MR. 2016 Current Incentives for Scientists Lead to Underpowered Studies with Erroneous Conclusions. PLOS Biol. 14, e2000995. (doi:10.1371/journal.pbio.2000995)

Smaldino PE, McElreath R. 2016 The natural selection of bad science. R. Soc. Open Sci. 3, 160384. (doi:10.1098/rsos.160384)

Maxwell SE. 2004 The Persistence of Underpowered Studies in Psychological Research: Causes, Consequences, and Remedies. Psychol. Methods 9, 147–163.

Turner RM, Bird SM, Higgins JPT. 2013 The Impact of Study Size on Meta-analyses: Examination of Underpowered Studies in Cochrane Reviews. PLOS ONE 8, e59202. (doi:10.1371/journal.pone.0059202)

Pageaux B, Lepers R. 2018 Chapter 16 - The effects of mental fatigue on sport-related performance. In Progress in Brain Research (eds S Marcora, M Sarkar), pp. 291–315. Elsevier. (doi:10.1016/bs.pbr.2018.10.004)

Abt G, Boreham C, Davison G, Jackson R, Nevill A, Wallace E, Williams M. 2020 Power, precision, and sample size estimation in sport and exercise science research. J. Sports Sci. (doi:10.1080/02640414.2020.1776002)

Caldwell AR et al. 2020 Moving Sport and Exercise Science Forward: A Call for the Adoption of More Transparent Research Practices. Sports Med. (doi:10.1007/s40279-019-01227-1)

Abt G, Jobson S, Morin J-B, Passfield L, Sampaio J, Sunderland C, Twist C. 2022 Raising the bar in sports performance research. J. Sports Sci. 40, 125–129. (doi:10.1080/02640414.2021.2024334)

Sainani KL et al. 2021 Call to increase statistical collaboration in sports science, sport and exercise medicine and sports physiotherapy. Br. J. Sports Med. 55, 118–122. (doi:10.1136/bjsports-2020-102607)

Brown D, Boat R, Graham J, Martin K, Pageaux B, Pfeffer I, Taylor I, Englert C. 2021 A Multi-Lab Pre-Registered Replication Examining the Influence of Mental Fatigue on Endurance Performance: Should We Stay or Should We Go? North American Society for the Psychology of Sport and Physical Activity Virtual Conference, p. 57. (doi:10.1123/jsep.2021-0103)

Vazire S. 2019 Do We Want to Be Credible or Incredible? APS Obs. 33.

Asendorpf JB et al. 2013 Recommendations for Increasing Replicability in Psychology. Eur. J. Personal. 27, 108–119.

Lakens D. 2022 Sample Size Justification. Collabra Psychol. 8, 33267. (doi:10.1525/collabra.33267)

Nosek BA, Ebersole CR, DeHaven AC, Mellor DT. 2018 The preregistration revolution. Proc. Natl. Acad. Sci. 115, 2600–2606. (doi:10.1073/pnas.1708274114)