A Comparison of Methods for Quantifying Training Load: Relationships Between Modelled and Actual Training Responses
Wallace, L.K., Slattery, K.M., Coutts, A.J.
Purpose:
To assess the validity of methods for quantifying training load in endurance athletes by using a mathematical model to estimate performance, fitness, and fatigue from each training load measure.
Methods:
Seven trained runners (VO2max: 51.7 ± 4.5 mL·kg−1·min−1, age: 38.6 ± 9.4 years, mean ± SD) completed 15 weeks of endurance running training. Training sessions were assessed using heart rate (HR), running pace, and rating of perceived exertion (RPE). Training dose was calculated using the session-RPE method, Banister’s TRIMP, and the running training stress score (rTSS). Weekly running performance (1,500-m time trial), fitness (submaximal HR, resting HR), and fatigue (Profile of Mood States, heart rate variability [HRV]) were measured. A mathematical model was applied to the training data from each runner to provide individual estimates of performance, fitness, and fatigue. Correlations were used to assess the relationships between the modelled and actual weekly performance, fitness, and fatigue measures within each runner.
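The abstract does not state the exact form of the mathematical model; the sketch below assumes the classic Banister impulse-response (fitness-fatigue) model that is standard in this literature, with session-RPE load (session RPE × duration in minutes) as one possible daily input. The parameter defaults and example data are purely illustrative; in the study, model parameters were fitted individually for each runner.

```python
import numpy as np

def session_rpe_load(rpe, duration_min):
    """Foster's session-RPE load: session RPE (CR-10 scale) x duration in minutes."""
    return rpe * duration_min

def banister_model(daily_load, p0=100.0, k1=1.0, k2=2.0, tau1=45.0, tau2=15.0):
    """
    Banister impulse-response (fitness-fatigue) model:

        p(t) = p0 + k1 * sum_{s<t} w(s) * exp(-(t - s) / tau1)
                  - k2 * sum_{s<t} w(s) * exp(-(t - s) / tau2)

    daily_load : daily training loads w(s) (e.g. session-RPE, TRIMP, or rTSS units).
    Returns modelled fitness, fatigue, and performance for each day.
    Parameter defaults here are illustrative, not the study's fitted values.
    """
    n = len(daily_load)
    fitness = np.zeros(n)
    fatigue = np.zeros(n)
    for t in range(1, n):
        # recursive form of the convolution: decay yesterday's state, add yesterday's load
        fitness[t] = np.exp(-1.0 / tau1) * (fitness[t - 1] + daily_load[t - 1])
        fatigue[t] = np.exp(-1.0 / tau2) * (fatigue[t - 1] + daily_load[t - 1])
    performance = p0 + k1 * fitness - k2 * fatigue
    return fitness, fatigue, performance

# Hypothetical example: 15 weeks (105 days) of daily session-RPE loads
rng = np.random.default_rng(0)
loads = session_rpe_load(rng.uniform(3, 7, 105), rng.uniform(30, 90, 105))
fitness, fatigue, performance = banister_model(loads)
```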
Results:
Training resulted in a 5.4 ± 2.6% improvement in 1,500-m performance. Modelled performance was correlated with actual performance in each subject, with mean within-subject correlations of r = 0.70 ± 0.11, 0.60 ± 0.10, and 0.65 ± 0.13 for the rTSS, session-RPE, and TRIMP input methods, respectively. There were moderate correlations between modelled and actual fitness (submaximal HR) for the session-RPE (−0.43 ± 0.37) and TRIMP (−0.48 ± 0.39) methods, and moderate-to-large correlations between modelled and actual fatigue (HRV indices) for both the session-RPE (−0.48 ± 0.39) and TRIMP (−0.59 ± 0.31) methods.
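As a minimal sketch of how such summary statistics could be obtained, assuming the weekly modelled and actual values for each runner are available as arrays, the within-subject Pearson correlations can be computed per runner and then summarised as mean ± SD across the seven runners (the data below are hypothetical):

```python
import numpy as np

def within_subject_correlations(modelled_by_runner, actual_by_runner):
    """Pearson r between modelled and actual weekly values for each runner,
    summarised as mean and SD across runners."""
    rs = [np.corrcoef(m, a)[0, 1] for m, a in zip(modelled_by_runner, actual_by_runner)]
    return np.mean(rs), np.std(rs, ddof=1)

# Hypothetical data: 7 runners, 15 weekly modelled/actual values each
rng = np.random.default_rng(1)
modelled = [rng.normal(size=15) for _ in range(7)]
actual = [m + rng.normal(scale=0.5, size=15) for m in modelled]
mean_r, sd_r = within_subject_correlations(modelled, actual)
```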
Conclusions:
These findings indicate that each of the training load methods investigated is appropriate for quantifying endurance training dose and that submaximal HR and HRV may be useful for monitoring fitness and fatigue, respectively.