PERFORMANCE
Study Details
Practical Takeaways from the study
Related links to learn more about the topic
Reviewer's comments on the study
James is currently the Head Strength & Conditioning Coach for the Romanian Rugby Union. He has previously worked in America's professional rugby competition Major League Rugby with Austin Elite and the NZ Women's National Rugby League Team. He is a published author and has completed an MSc in Sport & Exercise Science from AUT, Auckland, NZ.
Tom is the Head of Athletic Development at St Peters RC High School. He holds a Masters in S&C and has previously worked with West Bromwich Albion FC, Gloucester Rugby club, and Great Britain Equine. Tom is our youth research reviewer at Science for Sport.
Cody is a strength and conditioning coach and adjunct lecturer at the University of Iowa. He has an MSE in Exercise Science from the University of Kansas and also holds a CSCS from the NSCA.
James is a Performance Nutritionist for the English Football Association and works alongside the England national teams (men's and women's). He is also a SENr registered performance nutritionist and holds a PhD from Liverpool John Moores University.
Tom is currently an assistant professor in applied sports sciences and has worked in elite sport for over 10 years. Previous roles include working as a sports scientist at Liverpool FC, where he completed his PhD, and working across a number of other sports. He is passionate about physiology and has published papers on strength and conditioning, nutrition, and youth development.
Jordan is a Physical Therapist and Strength Coach who currently practices in a Sports & Orthopedic clinic in Bergen County, New Jersey. He is passionate about educating athletes on ways to optimize performance while decreasing the risk of injury.
In the UK, research suggests student athletes consume more alcohol than their non-athlete peers. It has been proposed that this may be due to certain 'unwritten rules', or 'commandments', within such groups. These include that those involved should be 'committed to the social life' and that 'excessive alcohol consumption and associated behaviours are obligatory'.
In soccer, it has been shown that drinking was a strategic social activity providing group acceptance and belonging. These dynamics, such as perceived player status, may influence consumption by group members; however, this area of investigation remains relatively unexplored.
This study sought to identify the underlying value attached to drinking practices in university rugby settings and the purpose or use of these for positioning within social settings. Additionally, this study aimed to understand environmental contexts in which drinking practices exist and the wider alcohol-related pressures athletes might be exposed to.
Participants included in this case study were rugby players, coaching staff, committee members, supporters, and the wider community that entered the field of study. These included opposition players and supporters, the wider student population that entered the environment (i.e., the student union bar or rugby stands), and student union staff.
Teams competing in the first, second, and third years were observed prior to, during, and following games throughout their season. This allowed data collection from multiple perspectives and exploration of how players of differing abilities and experience interacted with the environment and each other.
Data were analysed using a six-stage approach:
1) Collating, transcribing, and interrogating the data with the research team. This began with the first round of interviews and field observations and continued at least monthly throughout the season.
2) Once data collection had concluded, interview transcripts, documentation, and field notes were coded with tags related to alcohol use (i.e., where drinking occurred, how people behaved, and what they said).
3) Clustering codes into themes to provide a comprehensive and nuanced understanding of alcohol use in university rugby.
4) Each theme was extensively reviewed by the research team and refined, where needed, to provide a more robust interpretation of the relationship between sport and alcohol use.
5) Further interrogation of each theme, where the research team probed each theme, its underlying message, and the suitability of each piece of evidence in supporting this underlying message.
6) A coherent story was developed by the research team, which best explained the collected data.
This study showcased features of the student-athlete environment which normalised and legitimised a drinking ethos. It shows how alcohol use was embedded within the studied athletes' weekly routines when playing sport, training, and watching first-team athletes.
Specific roles and responsibilities (i.e., social secretaries), punishments (i.e., drinking forfeits) and events (i.e. initiation ceremonies) were used to ensure athletes complied with the drinking ethos. Furthermore, athletes used alcohol to gain status and reputation.
These athletes faced a multitude of pressures encouraging them to drink on a regular basis. This suggests a systemwide approach is required to address the culture around drinking at this and other institutions with an embedded drinking ethos.
The researchers suggest that this should focus on collectively addressing the physical, commercial, sociocultural, and political contributors by bringing stakeholders, including players, together to develop a shared understanding of the challenge.
This study provides more thorough and nuanced insight into the specific reinforcement mechanisms of the culture than has previously been acknowledged in the literature.
Four key themes were identified in the data: 1) Routine and ritualised consumption, where drinking was embedded within the everyday practices of playing and watching sport, 2) Enforcement and compliance, which explains how roles and responsibilities, punishments, and events were used to ensure athletes complied with the drinking ethos, 3) Showing off and gaining status, where athletes used alcohol to make a name for themselves and build a reputation (symbolic and social capital), and 4) Institutional contributors to alcohol use, which describes how elements of the institution (i.e. alcohol price, promotion, and sponsorship) normalised and legitimised the alcohol ethos.
“This study suggests that the environment identified has led to a culture where no behaviours were off limits leading to potentially harmful consequences. This occurred via the institution itself normalising and legitimising such a drinking ethos by expecting and approving of excessive alcohol use, and in some cases actively encouraging it through easy access and promotions.”
“It is important to say that many young people are now living alcohol-free, or low-alcohol lives. This is not something being adopted across the board though. To echo what the researchers propose, it is clear that a wholesale change in culture around alcohol consumption and sport is likely required in such settings. Although this was only a case study of one institution, such practices could very well be widespread. Given the negative impacts of alcohol, be they physical, psychological, and/or social, this information is important to be understood and actioned.”
This month's top research in strength & conditioning.
HOW INCREASING SUBSTITUTIONS IN SOCCER IMPACTED MATCH PERFORMANCE
VALIDITY OF WRIST-WORN HEART RATE MONITORS DURING RESISTANCE TRAINING
DO YOU HAVE TO SPRINT MAXIMALLY TO INCREASE SPEED?
LOW LOAD VS HIGH LOAD BFR: WHICH IS SUPERIOR FOR ECCENTRIC HAMSTRING MUSCLE ADAPTATION?
“PLAY WITH THE BALL AT YOUR FEET” – DOES INCLUDING A SOCCER BALL DURING PLYOMETRIC AND COD TASKS IMPACT ATHLETICISM IN YOUNG SOCCER PLAYERS?
SHORT VS. LONG MUSCLE LENGTH TRAINING FOR MAXIMIZING HYPERTROPHY?
Change is a constant in life, and the halting of competition in 2020 due to the COVID-19 global pandemic has been a catalyst for reflection and evolution in many federations (e.g. the Fédération Internationale de Football Association). Competitions were halted, and plans to reintroduce them placed a premium on the health and safety of athletes. This led to rule changes to create an environment more conducive to reduced injury risk and, hopefully, faster-paced competitions.
An increase in the number of substitutions (from three to five per match) was an opportunity to limit the accumulation of fatigue, with the hope of increasing the intensity and quality of athlete performance on the pitch. However, an understanding of how the in-game demands have evolved and the influence this has had on players has yet to be established. Therefore, this research examined the impact that rule changes regarding substitutions (following the COVID-19 pandemic) had on player performance.
Sixty-six matches from twenty-two professional Spanish teams (playing in LaLiga SmartBank) were examined both immediately before and after the COVID-19 isolation period. This allowed for examination of differing scenarios, with rounds 8-10 of the league allowing three substitutions per match, and rounds 32-34 allowing five. Total distance covered, distance at high-intensity (14-21-km.h-1), distance at very high-intensity (21.1-24-km.h-1), and sprinting (>24-km.h-1) distance were measured for each match (examined based on half and the entire match).
Only substitutions made in the second half were considered, and the second half was divided into 15-min windows. This created five ‘groups’ for comparison (i.e. full match, substitutions made at halftime, between 46-60-min, between 61-75-min, and after the 76th-min of play).
For comparison of data, when a player was substituted, the performances (distance and speed) of the incoming and outgoing player were combined to represent a ‘single-match player’ (maximum speed was noted for the higher value between the two athletes). The data were analysed and compared based on performances before and after the substitution rule change came into effect.
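To make the 'single-match player' aggregation concrete, here is a minimal sketch of how such a merge could be computed. The function name and dictionary keys are illustrative assumptions, not taken from the study's actual analysis code:

```python
def merge_single_match_player(outgoing: dict, incoming: dict) -> dict:
    """Combine a substituted pair into one 'single-match player' record.

    Distance metrics are summed across the two players, while maximum
    speed keeps the higher of the two values, as described in the study.
    All key names here are illustrative, not from the original analysis.
    """
    distance_keys = ("total_m", "high_intensity_m", "very_high_intensity_m", "sprint_m")
    merged = {k: outgoing[k] + incoming[k] for k in distance_keys}
    merged["max_speed_kmh"] = max(outgoing["max_speed_kmh"], incoming["max_speed_kmh"])
    return merged
```

This mirrors the paper's stated rule: distances combine additively because both players contribute to the same positional slot, while peak speed is a maximum rather than a sum.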
Substitutions, both the ability as well as the limitation (depending on how you look at it), are a part of competition and coaches must be strategic in how and when to use them. Importantly, once a player is substituted they cannot return. Therefore, based on the substitution trends from the teams researched, a player must have the capacity to play for a seemingly continuous 45-min half (except for stoppage time) or 90-min competition.
§ This is the basis of training and development. Athletes must withstand the competitive physical demands, capable of performing the necessary volumes (distances) and intensities (speeds), as these vary across a match based on a number of influences (e.g. score, opponent, scoring opportunity, etc., see HERE). Therefore, coaches must prepare athletes for the ‘worst-case scenario’ of what a match could potentially entail (see HERE).
§ This capacity is built through progressive exposures to these volumes and intensities in training.
§ When an athlete has the capacity to play an entire match, they also build the ability to recover between matches, which is also critical in an athlete maintaining performance and reducing injury risk across a season. Ultimately, the goal should be for a player to be able to play an entire match, being substituted only if a tactical advantage presents itself.
§ This does not mean a player must constantly sprint, but they should appreciate the various opportunities during a stoppage or other pacing strategies (see HERE) that allow for recovery and the intermittent activity during a match.
Research has shown that a limit on the number of substitutions has not held back the increase in sprint distance (see HERE) or the increase in high-speed running (see HERE) noted in professional leagues before the recent rule change. This is a testament to the training and performance of elite athletes, allowing coaches to keep their best athletes on the field for the entire match because they are faster and fitter than ever before.
As much as coaches work to mitigate injuries, they are simply a part of competition, and substitute players are necessary to support the success of a team when a starter goes down with an injury. Therefore, non-starters have to perform supplementary work (outside of competition), because they are not getting the physical stress that starters are receiving during a match.
§ Coaches can prescribe additional fitness (e.g. running, cross training, on-field drills) immediately following a match or on the subsequent day, to support the physical capacity necessary. In doing this, if a substitute is called upon, they can work towards achieving similar physical stimuli as a starter and there will hopefully be limited detriment to performance or increase in injury risk.
Distance run in the first and second halves of a match increased in the matches following the substitution rule change.
§ Distance run at high-intensity (14-21-km.h-1) in the second half was also greater in the subsequent matches.
Despite the rule change, player substitution frequency only increased ~21.7-% in the matches following the isolation period.
“The researchers in this study question whether or not further change is needed for the substitution rules in professional soccer, proposing that play would be faster-paced and injuries would decrease if more were allowed. The data from this study are not enough to support this direction, and the information is limited by the fact that the matches analysed were the first to be played following the COVID-19 isolation period. Therefore, further investigation is needed, but at the same time we have to appreciate that if you change the rules, you’re changing the game. I would challenge that maybe it’s not the game that needs to change, but the training and congestion of competitions that would help to improve performance and mitigate injury.”
“Additionally, substitutions are a part of the success of a team, which highlights the critical role of a substitute player in the squad. Player availability and player ability are two deciding factors for a team. Therefore, as the saying goes, ‘you are only as strong as your weakest link.’ If a reserve player is in fact called upon as a substitute during a match, their level of play and physical capacity needs to be at the same level as the starter. This can only be achieved through dedicated training outside of competition. This often takes additional time and effort but ultimately is critical to success.”
Measuring heart rate (HR) is a strategy to objectively assess and quantify intensity during activity (see HERE). Wrist-worn devices that utilise photoplethysmography technology (see HERE), emitting infrared light through the skin to measure variations in blood circulation, have become a common option for this, providing encouraging feedback (increasing adherence and motivation, see HERE). For more traditional steady-state activities (walking, running, cycling), the validity of these devices is generally favourable (see HERE).
However, there are numerous factors that can influence the accuracy of these devices (e.g. device type, skin tone, and exercise mode). Further, there is limited research around the accuracy of measuring HR using wrist-worn devices during resistance exercise. Therefore, this study investigated the validity of wrist-worn HR monitors (Apple Watch Series 6 and Whoop Band 3.0) during various resistance exercises.
Twenty-nine participants (16 females, age 24.5±4-yrs) completed a series of resistance training sessions while monitoring HR with an Apple Watch Series 6 (Apple Watch), a WHOOP band 3.0 (WHOOP band), and a Polar H10 chest strap (as the validated criterion measure, see HERE). Following a specific warm-up, participants completed three sets of 15 repetitions in the barbell back squat, barbell deadlift, dumbbell curl to overhead press, seated cable row, and burpees. There was 2-min between each set and 3-min between each exercise, using a submaximal load (a relative load equivalent to a 20-repetition maximum). Technique was normalised across exercises, and HR measurements were recorded from all devices after the seventh and fifteenth repetitions, and 30-sec following the completion of a set.
Using only data from the third set, comparisons were made between the criterion and wrist-worn devices for validity. Additionally, comparisons were made between devices based on exercise activity (active or static) and type (upper or lower body).
If selecting between the Apple Watch series 6 and WHOOP band 3.0, the Apple Watch appears to offer better validity for measuring HR between the two when performing tasks that keep the wrist and arm fairly stationary (barbell back squat, deadlift, and seated cable rows).
Given the method of photoplethysmography technology (emitting light through the surface of the skin), the best suggestion for improving the reliability of data would be to simply ensure that the watch is strapped securely (as tightly as is comfortable) around the wrist to optimise the interaction between the infrared light and blood circulation. This would hopefully limit the movement artefacts caused by arm movement (see HERE) and improve the validity and reliability of the HR feedback.
A strategy to measure and quantify the intensity of a session would be to note the average heart rate across the session, which can serve as an objective internal training load measure (see HERE).
§ Additionally, the validity of this measure can be supported and better understood with an athlete’s subjective report of session rating of perceived exertion, which should be taken in the 10-30-min following a training session on a consistent basis to provide valid and reliable data to interpret.
When it comes to monitoring HR during resistance exercise for increased strength and hypertrophy, probably the best opportunity to implement HR measures comes in the form of individualising rest periods between sets.
§ For example, an athlete can determine how long to rest between sets by using a target HR to descend to following a set (which may be roughly 50-% of their maximum (e.g. this would be roughly 100 beats per minute (bpm) for a 20-yr old athlete (having a predicted max of ~200-bpm)) or a rate that is similar to what was noted 45-sec following their initial working set, see HERE).
§ Ultimately, the HR prior to a subsequent set of resistance exercise should be a reduced rate where respiration has normalised, and quality of effort and execution can be maintained.
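The rest-period heuristic above can be sketched as a few lines of code. This is an illustration only: it assumes the common, population-level 220 − age estimate of maximum HR, which is not individual-specific, and the function name is hypothetical:

```python
def rest_target_hr(age: int, fraction: float = 0.5) -> float:
    """Return a heuristic HR (bpm) to descend to before the next set.

    Uses the common age-predicted maximum (220 - age); a real programme
    would substitute a measured maximum where available.
    """
    predicted_max = 220 - age  # rough population estimate, not individual-specific
    return predicted_max * fraction

# Example from the text: a 20-year-old athlete has a predicted max of
# ~200 bpm, so a 50% target is ~100 bpm before starting the next set.
```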
Given the fact the chest strap HR monitor was the criterion, as well as the ease of use and cost-effectiveness of this option, if validity is important to you as an athlete or coach, a chest strap heart rate monitor is likely the best option when compared with a wrist-worn device. The drawback is that it does not have multiple uses, like a watch does (e.g. time, alarm, heart rate variability).
Apple Watch measures were slightly more accurate than WHOOP Band’s for HR measurement during resistance activity.
§ Mean absolute percentage errors were 1.6-14-% for the Apple Watch and 4.4-14.8-% for the Whoop Band across all measurements.
Overall, validity of HR for both devices worsened with higher body-movement exercises (e.g. burpees or dumbbell curl to press).
For the Whoop Band, as exercise intensity increased (from seven to fifteen reps) validity of measurement decreased (Apple Watch data was inconclusive in this regard).
§ Additionally, the Whoop Band performed better than the Apple Watch in ‘high amplitude’ movements (e.g. dumbbell curl to overhead press and burpees). This was likely due to the slim design when compared to the traditional watch face of the Apple Watch, allowing for a more consistent exchange of infrared light through the skin’s surface.
For the Apple Watch, HR results were ‘acceptable’ measures compared to the criterion when there was limited wrist movement (e.g. barbell back squat, barbell deadlift, and seated cable row). However, HR validity relative to exercise intensity was inconclusive.
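For readers unfamiliar with the mean absolute percentage error (MAPE) figures quoted above, each device reading is compared against the chest-strap criterion; a minimal sketch of the calculation (not the study's actual analysis code):

```python
def mape(criterion: list, device: list) -> float:
    """Mean absolute percentage error of device readings vs the criterion.

    Each paired reading's absolute error is expressed as a fraction of the
    criterion value, then averaged and scaled to a percentage.
    """
    errors = [abs(c - d) / c for c, d in zip(criterion, device)]
    return 100 * sum(errors) / len(errors)

# e.g. criterion [100, 150] bpm vs device [90, 150] bpm -> MAPE of 5.0%
```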
“Measuring HR can be a motivating measurement tool for individuals aiming to simply increase physical activity (see HERE), which is surprisingly important at times for even the most elite athletes. Measurements and monitoring can be motivating feedback that improves adherence and increases basic enjoyment, because the individual can see and quantify the results of their efforts. This alone can be a vital tool to improve consistency, which is a limiting factor in the overall success of a training plan.”
“Further, for coaches, as with implementing any technology and measurement tool, validity and reliability are important factors in the practicality of use. Therefore, when it comes to utilising a wrist-worn device, noting an instantaneous or peak HR is likely less useful than simply an overall average session value. Additionally, if using an instantaneous measurement, it should likely be taken during a time that arm (or wrist) movement is limited. By doing this, the data is likely more valid, reliable, and in turn useful for the coach and athlete in making training decisions and monitoring workloads.”
Sprinting is fundamental to many sports including soccer, American football and track and field. In soccer, most goals are scored immediately after powerful actions, including sprinting. Sprint number has also increased in the English Premier League over the years, suggesting that in order to succeed an ability to move at pace is highly important.
Although this importance is clear, the way in which to best train the attribute is the subject of some debate. This stems from fitness coaches and athletic trainers having to delicately balance providing enough of a stimulus to elicit improvement without overstepping that mark and risking injury.
In order to approach this problem, previous research has considered whether 20-m sprints at 90% maximal effort might bring about adaptation without the concerns associated with exposure to maximal-effort sprints. No benefits were seen, though it was suggested this was due to the relatively short distance over which the sprints were conducted.
The aim of this study was to investigate the effect of relatively longer sprint running at 90% to 95% of maximal velocity (Vmax) on sprint performance and mechanical outputs.
Thirty-four recreationally active adults (18 females and 16 males aged 20–33 y) were randomly assigned into a control group (CG, n = 12) and a training group (TG, n = 14). Both groups completed pre-testing and post-testing in the form of a 30-m sprint separated by a 6-week period. TG performed a weekly sprint-training session consisting of 30-m flying sprints at 90% to 95% of Vmax, while the control group performed no intervention training.
The target velocity of the TG was based upon Vmax achieved during pre-intervention testing and was reached during the intervention via verbal feedback provided between repetitions.
Both groups were allowed to continue with their recreational training during the intervention period in the form of 2 to 4 weekly sessions with upper body strength training and low-intensity endurance training. The participants were requested to refrain from high-intensity interval training, plyometric training, and heavy strength training involving the lower limbs.
Macroscopic sprint mechanical outputs (maximal horizontal force [F0], theoretical maximal velocity [V0], maximal power [Pmax], and the slope of the force–velocity curve) were assessed to examine underlying mechanisms behind any potential changes in performance.
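For readers unfamiliar with these force–velocity variables: under the commonly used linear F–V model of sprinting, maximal power and the F–V slope follow directly from F0 and V0. A sketch under that assumption (the function name and example values are illustrative only):

```python
def fv_profile(F0: float, V0: float):
    """Derive Pmax and the F-V slope from a linear force-velocity model.

    Assuming F(v) = F0 * (1 - v / V0), power P(v) = F(v) * v peaks at
    v = V0 / 2, giving Pmax = F0 * V0 / 4; the slope of the linear F-V
    relationship is -F0 / V0.
    """
    p_max = F0 * V0 / 4.0
    slope = -F0 / V0
    return p_max, slope

# e.g. F0 = 8 N/kg and V0 = 10 m/s give Pmax = 20 W/kg and slope = -0.8
```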
The findings from this study showed a positive effect following sub-maximal sprint training. This manifested as significant improvements in 10-, 20-, and 30-m sprint times.
While it’s often recommended that sprint repetitions should be executed maximally, this study shows that sprinting at 90% to 95% of Vmax effectively promotes adaptation through the interaction between high intensity and larger accumulated work that can be obtained before the onset of fatigue.
Despite this, we should be careful when thinking what this might mean for our athletic populations, as the present participants were recreationally active adults with limited sprint training experience.
Sprint performance becomes more resistant to training with increasing performance level, age, and training status. Despite the interesting findings here, it would be useful for future studies to consider better-trained individuals.
Additionally, it would also be useful to determine the lowest effective sprinting intensity for stimulating adaptation. More research is warranted to explore if a more gradual progression in terms of training volume and intensity might still be effective.
Significant improvements in the training group were observed for 10-, 20-, and 30-m sprint time. These improvements were accompanied by higher step rate, theoretical Vmax, and maximal power.
All within- and between-groups differences were in the range of trivial to small.
“This study showed that weekly sprint running sessions at 90% to 95% of Vmax over 6 weeks induced positive effects on sprint performance and mechanical outputs. This is a really interesting and potentially valuable finding. If such outcomes were replicated in athletic populations this could provide some confidence when looking to increase max velocity while keeping training intensity in mind.”
“It is worth noting that this study too was not without its casualties. In fact, three of the dropouts from this study were directly related to the training within the intervention. I don’t mention this to scare anyone off from training at high velocities, in fact I encourage it. It’s important to ramp this stuff up though and keep training age and recent overall training volume and intensity in mind. Hopefully in time, replications of this study can aid with that by reducing the required percentage of max velocity needing to be trained at in a way that still provides benefits to athletes.”
Hamstring injuries are one of the most common injuries seen in sports that involve kicking and sprinting. Eccentric strength, specifically at long muscle lengths, is an important aspect in helping reduce the risk of re-injury. Factors such as location and grade of injury, tenderness to palpation, as well as pain can all impact the length of time lost due to injury.
Blood flow restriction (BFR) training is a well-known modality (see HERE) used in the rehabilitation as well as the strength and conditioning setting to help improve strength and hypertrophy at lower loads when heavier loading is contraindicated. BFR occludes venous outflow while maintaining arterial inflow, thus depriving a muscle of oxygen-rich blood. As a result, the body is “tricked” into working harder, recruiting more muscle units at much lighter loads compared to traditional resistance training.
The purpose of this article was to compare the effects of low-load BFR eccentric hamstring training intervention (BFR-ELET) vs. traditional high-load eccentric hamstring training (TRAD-ELET) without BFR.
This crossover design included forty active adults who were randomized into two groups: eccentric lower extremity training using BFR (BFR-ELET) and traditional eccentric lower extremity training without BFR (TRAD-ELET). Subjects performed training 2x/week for 6 weeks on a leg curl machine, taking 3 seconds to reach full extension during the eccentric phase.
Subjects in the BFR-ELET group performed 30% 1RM for 4 sets of 30/15/15/15 reps with 30 sec rest period between sets. After a 3-minute rest period, subjects in the TRAD ELET group trained the contralateral leg at 80% 1RM for 3 sets of 10 repetitions with 30 sec of rest between sets.
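One way to see why these two protocols are comparable despite very different loads is to compute their relative volume-loads (load fraction × total repetitions). This back-of-envelope sketch is our illustration, not a calculation from the paper:

```python
def relative_volume_load(load_fraction: float, reps_per_set: list) -> float:
    """Volume-load in multiples of 1RM: load fraction x total repetitions."""
    return load_fraction * sum(reps_per_set)

# BFR-ELET: 30% 1RM for 30/15/15/15 reps; TRAD-ELET: 80% 1RM for 3 x 10 reps
bfr = relative_volume_load(0.30, [30, 15, 15, 15])   # 22.5 x 1RM
trad = relative_volume_load(0.80, [10, 10, 10])      # 24.0 x 1RM
```

The two protocols land within roughly 7% of each other in relative volume-load, which may help explain the similar strength outcomes reported below, though volume-load is only one of several factors at play in BFR adaptations.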
Single leg power was assessed by single leg vertical max testing completed on a Jump Mat system and unilateral eccentric hamstring strength was assessed using the Nordic hamstring exercise administered on a Nordbord hamstring testing system.
Hamstring power assessed by single leg vertical max increased in the TRAD-ELET group with no change in the BFR-ELET group.
Unilateral eccentric maximal hamstring strength increased in both the TRAD-ELET and BFR-ELET groups but no differences were seen between groups.
There was an increase in perceived soreness from baseline to immediately after training in both groups with no differences between each.
BFR-ELET induced greater muscle cell swelling determined by the change in phase angle (fluid shifts between intracellular and extracellular space) compared to TRAD-ELET training.
Under traditional strength training guidelines, training at 70% 1RM or above is considered necessary for strength gains. Incorporating BFR into a resistance training program promotes musculoskeletal longevity by enabling strength gains with loads as low as 30% 1RM.
Low-load BFR was found to be just as effective compared to traditional resistance training on an eccentric hamstring training program for improvements in strength, lean mass and perceived soreness.
BFR training is a useful tool to utilize early on in the rehabilitation process to expose the muscle to low training loads while promoting muscular adaptation and less mechanical stress to the joint.
Traditional high-load resistance training will always be necessary for optimal strength gains but incorporation of BFR provides benefits.
“BFR is a great modality to use and incorporate to elicit strength gains when optimal loading can’t be tolerated as it has shown to have similar outcomes compared to traditional resistance training. This can be very useful in the rehabilitation process after injury or surgery to help prevent atrophy and deconditioning of the affected limb.”
“This article included healthy individuals, so future studies should assess eccentric hamstring strengthening and use of BFR with injured populations and assess translation to various functional tasks and demands of sport.”
In previous studies, the combination of plyometric, sprint, and change of direction (COD) training has been shown to have a greater effect than training these qualities in isolation (see HERE). Moreover, COD-specific training performed with a ball (CODB) has also resulted in greater training outcomes, such as maximal oxygen uptake, than COD or ball training drills performed alone (see HERE). However, few studies have investigated the effects of plyometric and CODB training on athletic performance. Therefore, the aim of this study was to investigate the effect of a bi-weekly training session which included plyometric and CODB training on performance in male soccer players of varied biological ages.
Forty-eight healthy young male players (U15-17) from one soccer team participated in this study. Participants were grouped based on their maturation stage using the protocol described by Mirwald et al. (HERE).
This study was a pre-to-post measurement design with the aim of examining the effects of 8 weeks of plyometric and COD with ball training (P-CODBT) on athletic performance. The experimental group included subjects who were both circa- and post-PHV (N = 24). In this group, participants replaced part of their technical practice with a P-CODBT component twice weekly. A second circa- and post-PHV group (N = 24) was assigned to the control group, completing regular training with no intervention for comparison.
The protocol for the experimental group required players to complete a series of progressive drills which began with basic plyometric drills (e.g. hurdle jumps), followed by pre-planned COD work (e.g. 45⁰ cutting task) with a soccer ball. The programme progressed by increasing the amount completed (e.g. sets) and the angle at which the COD task was completed, to increase both volume and intensity. Prior to starting this, the researchers collected varied measures including: anthropometric data, linear sprinting with and without a ball, COD speed with and without a ball, vertical jump, dynamic balance, and endurance-intensive performance. These were repeated at the 8-week mark to assess any change.
Following the eight-week intervention, the experimental group experienced greater improvements in explosive performance when compared to the control group. Explosive performance was assessed via the following tests: linear sprinting, vertical jump, and a change of direction test.
Secondly, the findings of this study indicated that, irrespective of the intervention, both the experimental and control groups improved their athletic performance over the eight-week period. These measures included data such as linear sprinting with and without the ball.
Regarding maturation status, athletes in both the control and experimental groups who were identified as circa-PHV did not show any significant differences in pre-to-post testing. However, the post-PHV groups did show improvements by comparison.
Through my experiences working in soccer academies, there remains a clear divide between the roles of the strength and conditioning (S&C) and soccer coaches. I would recommend that any S&C coach wishing to use soccer balls within their sessions communicates with all staff beforehand to avoid potential conflict. Role conflict is the grey area that exists between practitioners from different disciplines (e.g. S&C and sports coach), where one coach may perceive that another coach is stepping over their professional boundaries. This is very important for culture as discussed in the attached podcast, as the wrong type of conflict can have a detrimental impact on the athlete. Moreover, the remedy for this is very simple. You can access the reviewed paper and clearly demonstrate the benefits of including a soccer ball in training. Better yet, you could include the soccer coaches within your sessions to tap into their technical expertise and create good working relationships.
In practice, coaches could look to include the soccer ball during a multidirectional speed and agility session. Within this, I would include closed (e.g. unopposed) tasks to develop an appreciation for technique before adding in competition. Simply put, coaches could:
§ Use the ball during deceleration practice, where a player would run 5m and have a 1m “zone” to stop in. To make this harder, coaches could increase the distance or shorten the “stop zone”. Coaches could time these efforts weekly to generate a competitive edge in training.
§ Perform simple change of direction routines with a ball included. For example, coaches could set up a line of cones which players have to pass between whilst moving side-to-side. This is not only an excellent form of conditioning, but a great way to work on lateral transitions. A video example of this can be seen HERE, with the drill performed after the warm-up.
§ Utilise a football during basic plyometric tasks. This could be delivered in a way that is sensitive to the technical aspects of plyometrics (e.g. a single-leg hop where a player keeps the ball up) or in a fun manner to drive engagement (e.g. a player performs a broad jump with the ball between their legs). Irrespective of which approach you use, space, time or number of jumps could all be used to drive intensity.
“The current study found that a short-term training protocol that included plyometric training, sprint training and COD work with a ball at feet can be a safe way for coaches to create meaningful athletic benefits for youth who are post-PHV. In the attached article, coaches will see examples of plyometric, strength and COD work that are appropriate for a youth team. It is important that, as coaches, we think of inventive ways to continue to develop practices that are engaging, fun, but most importantly, meaningful. Simple tasks such as those mentioned in the practical takeaways section could be of use.”
“In closing, I believe that coaches and practitioners could look to integrate the protocols in the reviewed study on a bi-weekly basis. These could be integrated into a well-structured warm-up before soccer-specific training through actions that closely resemble competitive actions (e.g. deceleration positions with the ball at feet). In time, these strategies could prove to be a more effective way of improving soccer-related fitness and technical ability compared to completing training for these (e.g. fitness specific work and skill specific work) in isolation.”
Long muscle length training is the in-vogue trend within the fitness community. Previous research has demonstrated a regional hypertrophy and 1RM strength effect with partial range of motion training at short (i.e. flexed knee creating a shorter hamstring) and long muscle lengths (i.e. extended knee and flexed hip creating a long hamstring) (see HERE). Another study found long muscle length training superior to short muscle length training for biceps thickness (see HERE). However, this was a short 5-week study, so extrapolating these results to longer time frames is challenging. Further, 1RM strength has not been studied in long vs. short muscle length interventions within the upper body. Therefore, this study aimed to compare 1RM strength and regional muscle hypertrophy of the biceps following periods of short and long muscle length training.
19 untrained women (age = 22.8 ± 10.5 yr) performed a unilateral seated dumbbell preacher curl in a within-subject experimental design, training 3 times per week for 8 weeks. Each arm was randomised into an initial range of motion (iROM) or final range of motion (fROM) training intervention. The iROM condition performed preacher curls from 0° (fully extended) to 68° (halfway) of elbow flexion; the fROM condition started at 68° and finished at 135° (forearm perpendicular to the floor). The arm that started each session was alternated, and all 4 sets per arm in each session were taken to volitional failure.
Before and after the 8-week intervention, biceps cross-sectional area (CSA) was assessed with ultrasound at 50% and 70% of the distance from the acromion to the lateral epicondyle (shoulder to elbow; i.e. halfway and approximately three-quarters of the way down the upper arm). CSA at 50% and 70% was summed to estimate overall biceps hypertrophy. 1RM strength was also assessed on the dumbbell preacher curl through a full range of motion.
The iROM condition led to a significantly greater CSA than fROM at 70% of biceps length but not at 50%. CSA summed increase was not significantly different between conditions. iROM showed significantly greater 1RM strength increase than fROM.
These findings add further evidence to the growing pile supporting the superior stimulus achieved via long muscle length training. For example, Dustin Oranchuk reached similar conclusions in his popular isometric training review (see HERE). But what does this mean for different training populations?
§ Use a full range of motion to maximize strength and hypertrophy within general and athletic populations.
§ Use end-range partial reps as an intensity technique during overreaching or high-volume training cycles for hypertrophy. For example, performing a set of 10 biceps curls close to failure and finishing with 6-10 end-range partial reps from the bottom to ¼ or ½ of the way up. Legendary bodybuilding coach and influencer John Meadows has been doing these for years, which you can see on his YouTube channel; I'll post it in the links below.
This paper also adds to the body of research surrounding regional muscle hypertrophy. For example, end range leg extensions preferentially target the distal quadriceps (see HERE).
And we can preferentially target the upper and lower abs with crunches and leg raises, respectively (see HERE). This could be useful for targeting the lower quads during return to play training from knee injury.
One strategy that can be employed by athletes who don't want to increase muscle mass, due to performance or weight class restrictions, and who are close to peaking, is to reduce the range of motion and train at short muscle lengths. For example, quarter squats and deadlifts from blocks. These exercises can limit muscle growth while potentially leading to better speed and power improvements (see HERE).
“If your goals are to maximize hypertrophy, you’re short-changing yourself if you’re not training with a full range of motion. The mechanisms leading to the increased hypertrophic response are unknown, but we could speculate a multitude of factors like increased work, increased eccentric contraction, and greater time under tension. Regardless, save the short muscle length partial exercises for athletic performance peaking. It should also be noted this study only compared two partial range of motion conditions. Would the iROM condition outperform a full ROM condition?”
This month’s top research on technology and monitoring.
CAN A SHOE IMPROVE TRAINING PERFORMANCE?
HOW MIGHT WE USE VIRTUAL REALITY TO HELP WITH LEARNING SPORTS MOVEMENTS?
CAN YOU USE VELOCITY-BASED TRAINING WITHOUT THE TECH?
For a runner, especially mid- to long-distance (800-m to marathon) athletes, footwear can play a massive role in performance, consistency, and longevity in the sport. Recent advancements (e.g. installing a carbon-fibre plate through the midsole of the shoe, which increases bending stiffness) have been noted as the reason for numerous world, national, and continental records being broken since first being introduced in 2017 (see HERE). One shoe in particular is the Nike ZoomX Vaporfly 4% (VPF), which is lighter, with a less dense foam and a stiffer carbon-fibre plate than traditional foam running shoes (see HERE and HERE). This modification has shown the potential to improve performance by 2-6% in long-distance races (see HERE and HERE).
This increase in performance likely comes via improved running economy, impacting the mechanics of the stride, ground contact, and the fatigue an athlete accumulates during competition (see HERE). However, there has been no research around the VPF’s influence on a runner’s physiology, biomechanics, performance, and fatigue during a training session. Therefore, the purpose of this study was to examine the impact that the VPF shoe had on performance, running mechanics, and physiological responses (e.g. perception of pain and fatigue, heart rate) during a long-interval workout.
Twelve men (age 32.91±7.5-yrs) with amateur to national level middle- to long-distance running experience participated in two separate (7-days apart) ‘long-interval training sessions’ (5x1000-m with 90-sec recovery). Each participant wore the VPF shoe for one workout and a ‘traditional’ running shoe (without a carbon-fibre plate and within 50-g of the VPF) for the other training session.
For each session, following a 15-min warm-up, a countermovement jump was performed prior to the running portion of the workout, measuring jump height. A triaxial accelerometer device (estimating vertical, horizontal, and lateral forces) was worn to measure running power, vertical power, vertical ratio, vertical oscillation, step length, step frequency, contact time, flight time, and leg stiffness (see HERE and HERE) during the long-interval session. Heart rate was also measured, with the average heart rate for the session considered a marker of intensity. Finally, countermovement vertical jump height was re-assessed post-workout, as well as a subjective report of running-related muscle pain using a visual analogue scale (0-100; 0 = no pain at all to 100 = the worst pain). The subjective report of pain was also repeated 24-hr post-training session.
Data was compared between the VPF shoe and a traditional light-weight running shoe across the participants, where researchers looked specifically at session performance (time for completion), running power, running mechanics, leg stiffness, heart rate, perceived pain, and neuromuscular fatigue (change in jump height).
For running athletes looking to gain an edge on performance, the VPF shoe appears to be a worthwhile investment given the faster times reported and the kinematic improvements noted with running (increased stiffness, stride length, flight time, and decreased contact time). It appears the carbon-fibre midsole paired with the light-weight foam technology has a positive impact and competitive advantage compared to traditional running footwear (see HERE).
Given the novel stress that an athlete endures when using a carbon-fibre footplate, there is an increased risk of injury if this modality is introduced too abruptly (see HERE). Therefore, coaches need to introduce this into training conservatively and gradually. Possibly consider utilising it only during warm-ups for a week, then introducing it for a brief period in one workout the subsequent week.
Beyond improvements in performance, given the notable decreased perception of muscular fatigue, runners could possibly train at a higher frequency across the week, which would potentially allow them to increase the volume of activity at a given intensity. By doing this, there is a greater potential for improvement in performance (see HERE).
Further, this would be a favourable option for athletes who are competing at a high frequency (every 1-2 weeks), allowing them to continue to train at high intensity, yet be recovered in time to compete later in the week. For example, an athlete completes an intensive workout session on a Tuesday and has time to feel fully recovered by the following Saturday.
However, coaches should proceed with caution, as shoes that are more stiff increase the stress at the ankle joint (see HERE). This means that coaches need to be aware of the orthopaedic loads that an athlete endures in an on-ground training session and consider prescribing cross-training modalities (e.g. biking, elliptical, or swimming) as alternatives for steady-state aerobic development. Lower-intensity, long-duration training is also important and complementary to the development of endurance athletes (see HERE).
There were increases in stiffness, stride length, and flight time, and a decrease in contact time when wearing the VPF shoe. When participants used the VPF shoe, performance improved on average by 2.4% across participants, with 83% of the runners producing a better (faster) performance when using the VPF shoe compared to the control. However, pacing across the five repetitions was more inconsistent (‘unstable’) when participants were wearing the VPF shoe, which could be due to the novelty of the shoe when running 1000-m as fast as possible.
There was no difference in neuromuscular fatigue (change in jump height pre- to post-workout) when comparing shoe conditions.
There was a lower perception of pain in the 24-hr following the training session for 86% of the runners when using the VPF shoe.
Despite the faster running speed with the VPF shoe, stride frequency, running power and heart rate were similar between conditions.
Then check these out...
“An increase in performance is the pursuit of any athlete and coach, but it is important to not lose perspective and integrity with the sport of competition, as the impact of the carbon-fibre foot plate has begun re-writing the record books of long-distance events (see HERE). Therefore, there are some questions to consider regarding their use:”
“As a training tool, what is the athlete’s tolerance to the change in footwear? Because of the way that the VPF shoe changes the mechanics of the foot, ankle, and gait (see HERE), it is important for coaches to consider the potential increased risk of injury that could ensue. As with any new exercise or intensity, it should be introduced gradually.”
“What is the life of the shoe? As more research unfolds about the technology of the VPF shoe, it is important for coaches to have an understanding of the ‘life’ (effectiveness) of the shoe. At what mileage does the technology begin to wear out?”
“Lastly, as a competition device, it seems we went through a similar process with performance enhancing swimsuits previously (see HERE). As these advancements in shoe materials and design advance, similar consideration by governing bodies has begun (see HERE) and will have to continue to decide a standard for what is allowable for an athlete’s footwear as manufacturers continue their quest for improvements (see HERE).”
If an athlete is looking to learn a skill, or series of skills, such as a gymnast looking to learn a new routine, the typical way to do so would be via action observation (AO). This would involve watching a video or seeing someone perform that routine live. Research suggests that this can be a positive way of acquiring such movement performance.
It is thought this works due to shared neural pathways with physical movements, referred to as functional equivalence. When implementing this practice, considerations are made as to whether the learner observes themselves or others, and whether this is done from a third- or first-person visual perspective.
Motor simulation theory (MST) suggests an athlete observing themself may be more beneficial as it maximises the neural overlap between observed and real movements. In terms of perspective, it seems that the skill being learned plays a role here. A third-person perspective may be most appropriate for technical skills to adequately convey visual information of the skill technique (such as during a new gymnastics routine). Conversely, for outcome focused skills (like taking a penalty kick in soccer) a first-person view might be more appropriate.
One general limitation with this technique is often the quality of the video footage being used. The emerging use of 360° virtual reality (VR) may be able to assist with this though. 360° VR is able to present real-world video footage through a head-mounted display (HMD), allowing users to scan the performance environment. This potentially allows for increased realism of the experience by providing visual information that is more representative of competitive experiences.
A constraints-led approach (CLA) to learning provides practitioners with key considerations for creating practice environments that may facilitate the functional similarity between real-world and simulated settings, such as 360° VR. The CLA advocates for the inclusion of key constraints in practice (such as an opportunity for variability of movement). This provides learners with opportunities for action (such as being able to pass a ball to more than one teammate). This is obviously more representative of competitive environments.
This opinion piece sought to consider how such practices might be designed as technology continues to improve and become more cost-effective in a way that facilitates such learning.
The authors of this piece gave opinions on a number of topics with regard to the use of 360° VR when looking to improve or acquire movement skills in sport. These included:
Ecological dynamics view of skill development.
Constraints-led approach (CLA) to 360° VR practice design
Practice design considerations for 360° VR, regarding:
§ Constrain to afford.
§ Representative design
§ Repetition without repetition
These opinions and suggestions are each discussed in the next section.
Ecological dynamics view of skill development.
From an ecological dynamics perspective, skilled behaviour comes from continuous interactions between the learner and the practice/performance environment. Practitioners design practice environments to encourage learners to explore and adapt to work towards finding their own performance solutions.
With this, focus shifts from technique replication to searching for individually appropriate movement solutions (such as those that suit their height, or previous experience). This suggests reproducing “optimal” techniques may not be required for skilled behaviour, rather the capacity to adapt and produce stable individualised performance solutions is critical.
Therefore, when using 360° VR in developing skilled behaviour it may not be necessary for athletes to reproduce expert technical models, but rather to be able to explore finding their own individualised performance solutions.
Constraints-led approach (CLA) to 360° VR practice design
The constraints-led approach to learning is centred around exploration; information in the performance environment therefore regulates motor processes. In turn, an athlete’s ability to move will impact what they perceive to be an opportunity. For example, a soccer player may not consider that they can score in a situation if the ball is on their weaker foot.
The demonstrated neural similarities between AO and actual movement suggest that 360° VR is capable of producing realistic training experiences that may enhance the link between an athlete’s abilities and environmental information from which action emerges.
Utilising head movements via the HMD and scanning the environment allows for information to be perceived in a representative fashion, developing the link between information and movement (i.e., perception-action coupling). Therefore, using 360° VR learners can develop the ability to interpret relevant environmental information that is not available with 2-D flat-screen video.
Practice design considerations for 360° VR
Using 360° VR as a form of AO in skill development requires practitioners to think about how practice will replicate critical elements of performance contexts to match simulated and actual behaviours. The CLA provides a number of key design principles that can support practitioners through this process including:
Constrain to afford
From a CLA, movement variability (i.e., exploration) is key as it can aid learners in attuning to various opportunities for action and developing individual movement solutions. This contrasts with traditional skill acquisition approaches, which may aim to reduce movement variability to produce “ideal” movement patterns.
In the CLA, practitioners can manipulate task constraints to amplify exploratory behaviour. When selecting constraints, practitioners have a number of options, such as reducing the distance between defending and attacking players in soccer or rugby.
With 360° VR, time to pass in soccer could be manipulated by recording video footage from a first-person perspective of passing from multiple distances from defending players (e.g., high direct pressure) that present a range of passing options with varying difficulties.
Representative design
AO research has acknowledged the importance of presenting representative content, encouraging the use of first-person perspective footage for technical skills such as golf putting.
Such research has primarily used 2-D video emphasising replication of a prescribed motor pattern. Given the immersive nature of 360° VR, practitioners can now capture highly representative content that accurately simulates relevant opportunities for action and potentially more authentic first-person perspective. This facilitates a strengthening of perception-action coupling which importantly underpins successful performance outcomes.
Repetition without repetition
Finally, it is known that competitive environments are constantly changing, with learners rarely executing skills in a similar fashion under the same conditions. The CLA advocates for practice that repeats the same skills but not done identically, known as “repetition without repetition”.
In applying this with 360° VR, a key consideration is the skill of the learner. For higher-performing athletes, video footage could incorporate more complex task constraints requiring technical skills to be executed in various ways, such as changes in movement speed or distance.
This paper aimed to support practitioners in sport and movement settings in designing practice using 360° VR for skill acquisition and adaptation. It was proposed that 360° VR may be utilised as a form of AO to simulate critical aspects of performance contexts.
In line with the ecological dynamics perspective and CLA to skill acquisition, 360° VR may facilitate more representative practice design, as it allows users to scan the performance environment, potentially supporting perception-action coupling and helping performers to attune to relevant performance information.
“This opinion article proposes that practitioners should be encouraged to consider adopting AO using 360° VR to support skill development. They provide a number of practical guidelines based on a CLA in sport and movement settings to help with this.”
“I think that 360° VR will play a bigger and bigger role in training and skill acquisition in sport worldwide. UK-based VR companies such as Rezzil are at the forefront of this with their realistic graphics being an incredible selling point of the platform. Gone are the days of seasickness from utilising such technology.”
“As the hardware continues to develop, and researchers continue to work together to aid practitioners in their day-to-day activities, I think that VR is only going to become more important across the board, from skill acquisition, to rehabilitating players, to those simply standing within the environment while replaying pivotal movements from the weekend’s competition. Papers such as this are an important part of showing us how that might work.”
Velocity-based training (VBT) is a flexible training method based on the strong inverse relationship between load (kg) and movement velocity (m.s-1). VBT can be utilised in several ways to inform and support training practices: testing and monitoring individuals via load-velocity profiling; autoregulating load prescriptions via velocity zones and targets; motivating athletes by driving intent; and regulating both effort and volume via velocity loss thresholds.
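As a quick illustration of load-velocity profiling, the sketch below fits a straight line to mean concentric velocities recorded at submaximal loads, then predicts the load at a chosen minimum velocity threshold. All numbers, including the threshold, are hypothetical placeholders rather than validated values for any lift.

```python
# Minimal load-velocity profiling sketch (illustrative values only).
loads = [60.0, 80.0, 100.0, 120.0]        # kg, submaximal warm-up sets
velocities = [0.95, 0.75, 0.55, 0.35]     # m/s, mean concentric velocity

# Ordinary least-squares fit of v = slope * load + intercept
n = len(loads)
mean_l = sum(loads) / n
mean_v = sum(velocities) / n
slope = sum((l - mean_l) * (v - mean_v) for l, v in zip(loads, velocities)) \
        / sum((l - mean_l) ** 2 for l in loads)
intercept = mean_v - slope * mean_l

def load_at_velocity(v_target):
    """Predict the load expected to produce a given mean velocity."""
    return (v_target - intercept) / slope

mvt = 0.30  # hypothetical minimum velocity threshold for this lift
estimated_1rm = load_at_velocity(mvt)
print(round(estimated_1rm, 1))  # ≈ 125.0 kg with these illustrative numbers
```

The same fitted line is what underpins velocity zones and targets: once the profile is known, any training velocity maps back to a prescribed load.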
It has been suggested that using progressive velocity loss thresholds can indicate neuromuscular fatigue and therefore be potentially useful during resistance training. Whilst the evidence supporting the benefits for VBT compared to traditional training approaches is currently unclear, the consistent acute responses of velocity measures to resistance training bouts make them an appealing approach to tracking and manipulating training.
Strength and conditioning coaches will now often measure velocity when looking to prescribe appropriate training loads, allowing programming to be adapted accordingly. Some of the devices used for such measurement, though, can be impractical, expensive, and difficult to use.
The aim of the study was to investigate whether resistance-trained participants can accurately predict changes in barbell velocity, specifically in the deadlift exercise, without feedback from velocity monitoring devices.
Seventeen participants (16 male, 1 female; age = 24.7 ± 3.8 yr) were randomised in a counterbalanced, crossover design across two experimental sessions. These consisted of three sets of deadlifts at 60% and 80% of their one-repetition maximum (1RM).
The number of repetitions was determined by the participants, as they were asked to terminate each set when they felt the barbell velocity had reduced by 20% relative to repetition one.
A binomial mixed effects regression model was used to assess the accuracy of participants' ability to stop after reaching at least 20% velocity loss.
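The stop rule participants were asked to apply can be sketched as a simple check of each repetition against the first. The rep velocities below are illustrative, not taken from the paper.

```python
# Sketch of a 20% velocity-loss stop rule: terminate the set once a
# repetition's mean velocity has dropped by at least 20% relative to rep one.
def reps_until_velocity_loss(rep_velocities, threshold=0.20):
    """Return the number of reps completed when the loss threshold is first met."""
    v_first = rep_velocities[0]
    for rep, v in enumerate(rep_velocities, start=1):
        loss = (v_first - v) / v_first
        if loss >= threshold:
            return rep  # the set ends on this repetition
    return len(rep_velocities)  # threshold never reached within the set

print(reps_until_velocity_loss([0.60, 0.58, 0.55, 0.50, 0.46]))  # stops on rep 5
```

In the study, participants had to estimate this loss by feel, with no device computing it for them.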
Most participants were unable to accurately perceive changes in velocity without exposure to augmented feedback. When participants were asked to terminate their set when they believed their barbell velocity had reduced by at least 20%, relative to the first repetition, they tended to underestimate their velocity loss.
The authors of this study acknowledged that these relatively poor results may be due to the fact that participants were not provided with any feedback on velocity at any point. In some previous research participants were provided with anchoring points (i.e., their minimum and maximum velocities) during familiarisation.
It has previously been suggested that a lack of any feedback could lead to participants being 4.2 times more likely to underestimate velocity.
Participants tended to underestimate how close they were to a 20% velocity loss and therefore had relatively low probability of correctly stopping after reaching this threshold.
There was only a 10.5% probability that participants could accurately perceive that at least a 20% velocity loss had occurred in their set.
“There is mounting evidence that VBT could be beneficial when programming for athletes. This is especially the case as it seems that there is interindividual variability in rep-load capabilities between athletes. VBT can help to navigate this, but it can come at a cost in terms of the expense of appropriate kit”.
“This research suggests that solely basing VBT on intuition is insufficiently accurate. It proposes though, as well as pointing to other research, that using such equipment for a short period to familiarise athletes may be a beneficial halfway house. If there is potential to borrow or rent equipment to show athletes what different velocity percentages feel like then VBT perhaps can be implemented without it afterwards. Specific research into such strategies would be beneficial to confirm this as it is unclear how long such improved perception might last.”
Monitoring is a critical tool in providing athletes with effective and appropriate training. The feedback gained from various monitoring strategies (e.g. training load- or performance-monitoring) aims to provide coaches and athletes with insight that helps to support the current training and to further guide and direct future interventions. Training load monitoring aims to quantify work through external (e.g. distance covered, accelerations, changes in direction) or internal (e.g. rating of perceived exertion (RPE), heart rate (HR), perception of wellness) variables. Likewise, performance monitoring is a means of showcasing an athlete’s ability in a given task (e.g. sprint, countermovement jump (CMJ), isometric mid-thigh pull (IMTP)). All of these options can be overwhelming for coaches, and they can lean on research to help guide best practice for their athlete and situation.
However, despite the rapid growth of youth sport, there has been a lag in research around youth athlete monitoring, which typically covers collegiate and professional level athletes. Therefore, this study implemented various performance and load monitoring strategies during a 10-week training period to examine variance and potential relationships between metrics.
Fourteen youth basketball players (9 females, 5 males, age 15.1±1-yr) participated in a 10-week study that involved the use of six different monitoring strategies. Participants performed a CMJ, IMTP, wore an inertial measurement unit (IMU) to measure movement and HR, noted their session RPE following training, as well as completed a wellness questionnaire involving questions around soreness, fatigue, sleep quality and quantity, mental stress, and motivation measured on a 1-7 scale.
Weekly, prior to training, participants performed three maximal CMJs and IMTPs on a force platform to establish validity between measures. Additionally, both tests were standardised based on specific procedures and feedback to provide reliable measurements. Researchers examined peak ground reaction force, jump height (calculated from take-off velocity), and net impulse (see HERE). Wellness data was collected before every training session, and compliance was ensured with a daily reminder. The IMU measured accelerations in all directions to calculate an ‘accumulated load’ (PlayerLoadTM) score, as well as PlayerLoadTM per minute (see HERE). Lastly, peak and mean HR were noted during each training session, along with time spent in various zones (e.g. zone 1 (65-72% HRmax) up to zone 5 (>93% HRmax)). Daily measures were converted into weekly averages with standard deviations, and weekly measures were compared across the 10 weeks of data to identify variance and potential relationships between variables.
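An "accumulated load" metric of this kind can be approximated in code. PlayerLoadTM itself is a proprietary Catapult algorithm, so the sketch below uses a commonly cited formulation (summed change in triaxial acceleration between samples, divided by 100); the function names and any sample values are illustrative assumptions, not the vendor's implementation.

```python
import math

# Hedged sketch of an accumulated-load metric in the style of PlayerLoad(TM).
# A commonly cited formulation sums the sample-to-sample change in triaxial
# acceleration and scales it by 100; the real algorithm may differ.
def accumulated_load(ax, ay, az):
    """Sum the change in acceleration between samples across all three axes."""
    total = 0.0
    for t in range(1, len(ax)):
        dx = ax[t] - ax[t - 1]
        dy = ay[t] - ay[t - 1]
        dz = az[t] - az[t - 1]
        total += math.sqrt(dx * dx + dy * dy + dz * dz)
    return total / 100.0

def load_per_minute(load, session_minutes):
    """Normalise accumulated load by duration for between-session comparison."""
    return load / session_minutes
```

Normalising by session duration, as the study did, lets short and long sessions be compared on intensity rather than sheer volume.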
Observing no change or relationship is not always a bad thing. Given the phase of training examined (competitive period leading into a national level tournament), coaches were not wanting to impose a stress that created an adaptive response. Therefore, the performance and subjective reports helped to support the readiness and recovery of athletes, suggesting that the training load was not imposing an unrecoverable load.
§ Context matters: coaches need to appreciate that performance (e.g. jump height, peak force, net impulse) may simply plateau or stagnate during a competition period, which is acceptable given that competition prioritises technical and tactical execution. Performance in-season is measured by wins and losses, whereas performance monitoring in situations like this aims to identify athlete readiness.
§ Likewise, during an off-season period, performance may decrease during a ‘loading’ phase, but through the adaptive process, following an unloading phase, the hope and goal would be to see performance super-compensate and increase. This allows for an increase in performance potential, which is known as athlete preparedness. This concept supports the fact that progress is not linear and takes time through periodisation (see HERE).
Coaches should identify measurement options for performance tests (e.g. CMJ, IMTP) that are time-efficient, valid, and reliable, to support using them frequently as a means of monitoring athlete readiness. Procedures (e.g. set-up, instructions, time of the week, equipment used, etc.) should be standardised to support comparability (reliability) and decision-making between measurements.
Whether for a short bout of high-intensity work or a longer-duration training session, peak HR and PlayerLoadTM may be limited metrics in the short term that do not appropriately represent training load. However, these metrics show potential as a chronic (4-6-week) measure of training load. Given their inability to identify significant variation based on the volume and intensity of a session, session RPE may be a better metric for identifying significant variation in the short term (e.g. a single session or a given week’s worth of data).
Overall, a subjective questionnaire may be the best and most practical method given its effectiveness and efficiency as a frequent and low-cost strategy to identify an athlete’s psychophysiological readiness. A questionnaire, used consistently, provides low-invasive insight that can lead to further discussions and interventions (e.g. education around recovery strategies, decreasing or increasing training, etc.).
Although variations in activity (session RPE) were noted across the weeks, there was ‘no impactful change in any of the load or performance variables’.
Any relationship between load and performance testing was ‘trivial-small’, as was that between internal and external load variables (i.e. subjective responses vs. PlayerLoadTM). Wellness questionnaire responses were consistent across the 10-week period.
There were noted changes in HR and IMU metrics between weeks five and six, but no subsequent impact on performance or subjective reports.
“Monitoring helps coaches to identify daily fluctuations in athlete readiness. This insight provides coaches with the feedback necessary to appropriately prescribe training, so that the intended effect is achieved, and overall athletic development can be managed. This is critically important during the in-season phase, when maximising readiness on gameday is a primary goal in supporting performance and mitigating injury risk. Coaches should aim to implement a combination of monitoring strategies that allow for an objective measure to hold athletes accountable to their efforts (e.g. CMJ, IMTP, HR, speed), as well as a subjective measure that allows athletes to note their feelings and their mood state (e.g. fatigue or motivation) or potential affect (e.g. session RPE). The combination of subjective and objective measures helps to accurately represent the psychophysiological readiness state of an athlete.”
“Notably, the question of ‘how are you doing?’ or ‘how are you feeling?’ is asked on a daily basis. However, when implemented as a monitoring strategy via a questionnaire, this basic insight is recorded for reference and considered for guidance in training prescription. It is not a question or a process to be ignored. A subjective questionnaire is a practical, valuable and appreciable strategy, even for youth athletes, that helps to create a better training environment and support more effective training or education interventions. If this is important to the coaches, it will be important to the athletes. This evolves through a trusting relationship, education, and adopting an appreciation and lifestyle for how the individual operates in the hours outside of training (e.g. time and stress management, sleep, social interactions, nutrition, etc.)”
In tennis, a single point can be won following one movement or a combination of change of direction (COD), sprinting, and deceleration efforts. As such, players are expected to be competent in a range of skills outside of the technical aspects of tennis (e.g. performing a serve), in order to perform at a high level. To ensure that an athlete can complete these to a high standard, they must have adequate levels of lower-body strength. In addition to this, coaches are challenged when dealing with youth due to the impact that maturation can have on such qualities. For example, although lower-body neuromuscular performance naturally progresses as a child gets older, few studies have investigated the impact of maturation on several performance variables specific to tennis. Therefore, the aim of this study was to analyse the maturational differences that accompany lower-body neuromuscular performance in young tennis players.
One hundred and fifty-five junior tennis players (91 boys and 64 girls) participated in this study. On average, the age of the participants was 13.1 years. These subjects were divided into three groups according to their stage of peak height velocity (PHV), estimated using the Mirwald method. These were a pre-PHV group (N=57), a circa-PHV group (N=50), and a post-PHV group (N=48).
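For readers implementing a similar grouping, classifying players by PHV status from an estimated maturity offset can be sketched as below. The ±1-year bands are a common convention and an assumption here; the study's exact cut-offs may differ.

```python
# Hedged sketch of grouping players by PHV status from a Mirwald-style
# maturity offset (years from peak height velocity). The +/- 1-year bands
# are a common convention, assumed here for illustration.
def phv_group(maturity_offset_yr: float) -> str:
    """Classify a player as pre-, circa-, or post-PHV from maturity offset."""
    if maturity_offset_yr < -1.0:
        return "pre-PHV"
    if maturity_offset_yr > 1.0:
        return "post-PHV"
    return "circa-PHV"

# three illustrative players at different stages of maturation
offsets = [-2.3, -0.4, 1.8]
print([phv_group(o) for o in offsets])
```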
All participants were required to complete a speed test over 5, 10 and 20 metres, a modified 5-0-5 change of direction (COD) assessment, a hexagon agility test, and bilateral and unilateral countermovement jump (CMJ). COD deficit was also calculated by understanding the additional time that a COD requires when compared with a linear sprint of an equivalent distance (e.g. 10m time vs. 5-0-5 time). This allows practitioners to understand how well or how poorly an individual can change direction.
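The COD deficit calculation described above can be sketched in a few lines. The times used are illustrative, not study data.

```python
# Sketch of the COD-deficit calculation described above: the extra time a
# change of direction adds over a linear sprint of equivalent distance.
# Example times are illustrative only.
def cod_deficit(cod_time_s: float, linear_time_s: float) -> float:
    """COD deficit = 5-0-5 time minus linear 10 m sprint time (seconds)."""
    return round(cod_time_s - linear_time_s, 2)

# e.g. a player running the modified 5-0-5 in 2.45 s and 10 m in 1.85 s
deficit = cod_deficit(2.45, 1.85)
print(f"COD deficit: {deficit} s")
```

A larger deficit indicates that the turn itself, rather than linear speed, is costing the player time.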
The main finding of this study was that pre-PHV players presented lower levels of performance compared to circa- and post-PHV players. In the CMJ test, pre-PHV players jumped on average 20% lower than both the circa- and post-PHV groups. In addition, their sprint and COD speeds were 7-8% slower compared to the circa-PHV group and 10-13% slower compared to the post-PHV group.
In addition to the above, pre-PHV players also demonstrated lower linear sprint and COD performance across all tests when compared to those currently experiencing PHV (circa-PHV).
In contrast, COD deficit was significantly lower in the pre-PHV group than in both the circa- and post-PHV groups, suggesting that they performed better in this measure than the other maturation groups. These results highlight the importance of maturity stage, rather than chronological age, as a reference when designing training programmes.
Tennis players can perform several COD movements during a point. Therefore, COD may be considered one of the most important physical skills needed to be successful. Changing direction can be described as a multi-step action, whereby preliminary deceleration occurs over several steps in an attempt to reduce momentum. As such, coaches must utilise a host of exercises that mirror the demands of basic COD movements. For example, a COD-specific programme could include the following:
1. Lateral Lunge (3-4 x 4-5 reps) – fantastic for developing lateral, reactive strength for side steps and cutting. Examples of these can be seen in the attached article.
2. Forward and Reverse Lunges (3-4 x 6-8 reps each leg) – essential for developing braking mechanics for both acceleration and deceleration, as the majority of CODs are unilateral in nature (HERE).
3. Drop Jumps (3-4 x 3-5 reps) – offer a way to overload the hamstrings eccentrically, whilst developing the stiffness properties required to rebound. Practically, tennis players can use this strength to “push” off the floor more effectively.
Coaches working with youth need to be aware of the inter-individual differences caused by maturation. In the attached video, Professor Joe Eisenmann breaks down the issues surrounding the timing and tempo of growth. These, as shown in the reviewed study, have a marked effect on the physical attributes of youth. As such, coaches should maintain regular growth data (e.g. through use of the Mirwald equations) to be aware of sensitive periods of development where injury may occur (HERE). In tennis, common injuries include those to the shoulder and lower back (HERE). As a result, coaches may wish to monitor for rapid changes in growth (>7.5cm/yr), when a host of overuse injuries such as Osgood-Schlatter’s disease, Sever’s disease and hip apophysitis can occur (HERE).
From these results, it is evident that players need approaches sensitive to their stage of maturation. For those who are pre- or circa-PHV, training programmes should be rich in foundational movement patterns (e.g. push, pull, hinge, squat and brace) to refine and hone movement as changes in muscle and bone size occur. Post-PHV athletes, however, would benefit from more plyometric, sprint and COD work to supplement a traditional strength training programme, taking advantage of the increase in androgen production. Such training would not only drive performance, but could also serve as a valuable way to decrease COD deficit.
“This study has important implications for practice and future research. Considering that young tennis players will often train with others of varied ages and stages of maturation, coaches should be aware of the limitations of chronological age. For example, in the attached podcast, Dr Sean Cumming explains how children of the same age can have as much as 5 years of biological difference. Therefore, coaches should be aware of the factors that influence training (e.g. amount of fat-free mass and emotional maturity) that could result in injury. Tournaments that are “bio-banded”, where children compete in a maturation band (e.g. circa-PHV players enter a tournament) rather than by chronological age, have been successful.”
“With regards to COD tests, coaches may see the COD deficit measure as a way to confirm that an athlete is in fact “pre-PHV”. The authors suggested that COD deficit may be greater in more mature athletes, as faster and heavier players potentially produce greater inertia, resulting in longer contact times during COD tasks. Therefore, a greater COD deficit score, indicating that a COD movement takes longer to perform, may not reflect a more “mature” athlete in the same way that jump and speed tests may. In light of this, coaches may need to spend more time training COD in athletes who are circa- and post-PHV, with further studies needed to investigate whether this method is a valid measure of maturation status.”
This month’s top research on nutrition.
PRACTITIONER’S PERSPECTIVE: REAL-LIFE INSIGHTS INTO NUTRITIONAL PRACTICES FOR MATCH PREPARATION IN ELITE FOOTBALL
BODY COMPOSITION ASYMMETRY AND INJURY RISK IN RUGBY LEAGUE
Elite professional football clubs continue to increase the number of specialised sport science staff to provide enhanced support for their players. As a result, considerations around performance nutrition have become an essential part of the training and competition routine for elite players. Performance nutrition is one discipline of sport science that can genuinely affect the outcome of a match (i.e. a lack of fuel will result in poor performance).
Both academic researchers and field experts have produced comprehensive guidelines on performance nutrition for elite football players. A great example is the UEFA expert group statement on nutrition in elite football. However, practitioners on the ground often admit that the academic literature has limited application when working in an applied sporting environment. I have personally experienced this during my 4 years working with The Football Association.
While some have tried to provide examples of real foods that players can consume, the logistics and planning related to meal and supplement provision have so far been overlooked. In reality, sports nutritionists are not often concerned with ensuring each football player hits their exact carbohydrate intake for the day; they are more concerned with challenges such as ensuring menus are appropriate for the training day, food presentation is attractive, or organising match-day supplements. Being a successful sports nutritionist requires many soft skills, such as thorough organisation and planning, and these skills often make more of an impact than an individual’s knowledge of nutrition.
This paper aimed to outline how sports nutritionists organise and plan their match day nutrition strategy, to shed light on a less documented area of performance nutrition.
Sixty-two international performance nutritionists working with elite football players were surveyed via email or WhatsApp. The survey asked how practitioners organised food and supplements within the 24 hours surrounding a match, from the match day-1 (the day before the match) dinner to the post-match dinner.
Pre-match
Match day-1 dinner, match day breakfast, match day lunch, and the match day pre-match meal were usually provided in a hotel as a buffet. These meals mostly consisted of foods high in carbohydrate, moderately high in protein and fibre, and low in fat, with no junk food or other items that could trigger indigestion. Match day-1 dinner was the most likely to have its menu altered according to the local cuisine, whereas all other meals remained similar to maintain player routines. Some meal examples included: white carbohydrates (pasta, rice, and potatoes), protein (chicken, white fish, minced meat, salmon), fresh fruits, pudding (cake, rice pudding), cooked or raw vegetables, soup, Greek and sweetened low-fat yogurt, water, tea/coffee and juice. Some breakfast examples included: white carbohydrates (breads), protein (eggs, ham, fish), jams and honey, muesli and oats, nuts, cooked vegetables (tomato, mushroom), and fresh fruit.
During Match
Locker room nutrition was provided at 98% of matches. The foods, drinks, and supplements focused on high carbohydrate and/or protein intake. The target carbohydrate intake for a football player was 31-45 g pre-game and 15-30 g at half-time. Some food examples include dried or fresh fruits, cereal bars, sport-specific foods and drinks (gels, bars, gummies, shakes, premade drinks), fruit juice and water. Caffeine was provided by a large majority of practitioners in the form of pills, gums, or carbohydrate gels, at a target quantity of 100 to 200 mg. Pre-workout mixes were also provided at most (62%) matches, 30-60 min before kick-off, mainly composed of caffeine, carbohydrates, beta-alanine, creatine, and branched-chain amino acids (BCAAs).
Post-Match
Similar foods, drinks, and supplements were offered in the locker room as pre-match and at half-time; however, carbohydrate and protein targets were adjusted for recovery goals. The carbohydrate target was 46-60 g or >60 g and the protein target 20-30 g or 31-40 g, dependent on the player’s game time (for example, 60 mins and above). Protein was mainly provided in shakers in the form of whey protein or milk, and shakes were often mixed with carbohydrates (65%). Practitioners also offered specific foods or supplements such as tart cherry or pomegranate juice, or omega-3s.
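The game-time-adjusted recovery targets above can be sketched as a simple lookup. The helper function and its 60-minute cut-off are assumptions for illustration; the survey reported target ranges rather than a formal rule.

```python
# Illustrative sketch of game-time-adjusted post-match recovery targets.
# The ranges mirror the survey figures above; the helper function and the
# 60-minute cut-off are assumptions, not a rule stated in the paper.
def recovery_targets(minutes_played: int) -> dict:
    """Return post-match carbohydrate/protein target ranges (grams)."""
    if minutes_played >= 60:          # substantial game time
        return {"carbohydrate": ">60 g", "protein": "31-40 g"}
    return {"carbohydrate": "46-60 g", "protein": "20-30 g"}

print(recovery_targets(90))   # e.g. a starter playing the full match
print(recovery_targets(25))   # e.g. a late substitute
```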
The objective of the post-match meal is to deliver high protein and carbohydrate content, with moderate fat and fibre. When playing at home, players usually preferred to eat at the stadium or a restaurant rather than on the bus or in the locker room, because they were confident in the quality of the food preparation, the attention to detail, and the hygiene provided. Food was usually served within 30 minutes of leaving the pitch, in a buffet style.
While playing away, meals were often served on the team bus or in the locker room. Bus meals were most practical since there was usually little time to prepare a meal in the locker room and there wasn’t a dedicated space to eat. The food was usually prepared at the hotel and brought on board. As an alternative, teams could order food delivery to the stadium from a local caterer. Practitioners tended to be more flexible with post-match meals, allowing players more autonomy in their choices for both logistical and psychological reasons. Example foods included: proteins (white meat, red meat, meat in sauces, grilled fish), raw/cooked vegetables, soups, white carbohydrates (pasta, bread, rice), sandwiches, pizza, burgers, wraps, sushi, lasagnes, plain low-fat and Greek yogurt, fresh fruit, cake, good quality chocolate bars, fruit juice and water.
Most practitioners (67%) did not provide a pre-bed nutrition option for players; those who did were more likely to do so for an away match than a home match (79%). This nutrition was generally composed of protein-rich food and drink.
Snacks for travel
Snacks for travel are a challenge to organise for multiple reasons including logistics (cold vs. hot food), ability to choose products (bus/flights, own caterers), cost, and ingestion timing. Some examples include water, tea and coffee, fruit juice, fresh and dried fruit, cereal bars, sandwiches, yogurts, and nuts.
Supplements
These supplements were used by practitioners the majority of the time (>50%):
1.Caffeine (91%) for pre-match and before some training.
2.Creatine monohydrate (84%) at a maintenance dose of 3-5 g/day taken for up to a year.
3.Omega 3 (75%)
4.Vitamin D (68%) at a daily dose of 2000 to 4000 IU
5.Antioxidants (65%)
6.Nitrates (65%) as a 400-800 mg supplement or in nitrate-rich natural foods pre-match
7.Beta-alanine (65%) used in two-to-four-week blocks or at <4 g/day, in the form of powder or pills
8.Collagen (57%): mainly for injured players (40%), specific player profiles (33%) or during congested fixtures (27%), at a dose of 10 g/day, in the form of powder or homemade foods or drinks rich in collagen.
9.Probiotics (55%) for individual cases.
This supplementation strategy is in accordance with the Australian Institute of Sports Supplement framework.
Nutrition for the injured player
Individuals were supplemented differently depending on the injury type and phase. For muscle/tendon injuries:
§ Acute stage: whey (87%), omega 3 (67%) and collagen (76%).
§ Second phase: whey (80%), creatine (73%) and omega 3 (67%).
§ Last phase: whey (87%), omega 3 (67%) and collagen (60%).
For bone-related injuries, practitioners also provide vitamin D (73%) and calcium (27%).
The preferred meal structure is usually the same before and after a match, in the form of a large buffet where players can choose their food based on their preferences and dietary requirements.
Providing meals, drinks, snacks, and supplements around match day allows practitioners to have more control over players’ match-day nutrition. This can mean players are more likely to hit their match-day nutrition goals, e.g. fuelling and recovery.
Many specific food options, including lactose-free, gluten-free, vegetarian, vegan, and religion-related options, are frequently offered to accommodate individual players’ preferences, allergies, and dietary restrictions. This ensures all players have equal access to each food opportunity.
Practitioners periodise the nutrition for each meal/snack to achieve specific goals. For example, (1) carbohydrate content of foods increased towards the match to fuel an athlete during the match (2) fibre and fat content of foods decreased towards the match to aid digestive comfort (3) high protein and carbohydrate content post-game to support muscle repair and glycogen resynthesis to aid recovery.
Even in the context of such an elite and professional sport as football, many practitioners offer burgers, pizzas, and chocolate bars to players, but use quality (homemade) and timing (post-match exclusively) as safeguards. Providing foods that players enjoy helps them engage with the nutrition strategy. Often asking players what they would like and adapting it so it meets their nutrition goals can help them feel like they have more choice, which can further promote engagement.
Although this research lists the most used supplements, supplement provision is always tailored to an individual athlete’s needs. Following the SENr guidelines, a practitioner must first a) assess the risk of an athlete taking a supplement b) assess the need for providing a supplement c) assess the consequence of an athlete taking that supplement. This ensures that supplement provision is tailored to the athlete.
Want to learn more?
Then check these out...
Sports nutritionists face many challenges in preparing match-day nutrition strategies that align with the relevant literature but are practical in their application. While specific nutrition competencies and knowledge are required to be a successful sports nutritionist, additional skills such as planning and organisation, adaptability and effective communication are all needed to implement a match-day nutritional strategy in real-life conditions.
In practice, I have never worked with a team where every single player nails their pre-match or post-match strategy as aligned to the literature. There are always barriers in the way, from poor hotel nutrition service (if the quality of the chefs is terrible) to players simply not feeling like it on the day (behavioural).
Although this publication was done in football, similar themes will be present in rugby, and some of the key topic areas will likely be worse due to a lack of funding (i.e. eating in a restaurant after a match is a luxury in rugby; normally players get a sloppy lasagne on the bus home with a protein shake!).
Finally, some of my best successes with nutrition strategies have come from the support and buy-in of fellow colleagues who understand what players need and require on match day. Upskilling colleagues and ensuring they understand what I am trying to achieve has been important in my own practice.
Rugby league is an intermittent, collision-based team sport requiring players to develop speed, agility, muscular strength, and power to perform effectively. The extensive physical requirements of rugby league place players at high risk of musculoskeletal injuries, particularly to the knee, calf, back, head, and neck. Certain factors such as age, weight, and body mass index (BMI) may contribute to an athlete’s increased risk of injury.
The concept of morphological asymmetry (side-to-side differences in body size, shape, function, or composition) in athletes may also play a role in identifying future injury risk. Although asymmetry exists in sport and is influenced by sport specificity, the severity and composition of asymmetry may negatively or positively impact an athlete’s technical skills, performance, and vulnerability to injury.
An example of leg morphological asymmetry has been reported in elite Australian Football League (AFL) players. Specifically, the axially loaded, dominant supporting leg has been shown to possess greater bone strength and bone mass than the kicking leg, with a significant interaction between bilateral symmetry in lower-limb lean mass and kicking performance. This distinction is thought to be due to training exposure over time, as seen when comparing more experienced players with less experienced ones.
It is as yet unknown whether such functional and morphological asymmetries are also present in rugby league players, although this is likely given the similar routines and repetitive mechanical loading of both sports. Additionally, it is yet to be determined whether these asymmetries increase the susceptibility to injury in rugby players, and what role an awareness of them may play for athletic trainers and strength and conditioning coaches during screening and rehabilitation. Measuring any changes in side-to-side asymmetry over the course of the season could also be useful in assessing an athlete’s progress.
The primary focus of this research was to assess any differences in side-to-side asymmetries among college rugby league players and to determine how these differences may change over the competitive season. A secondary aim was to investigate whether the extent of side-to-side asymmetries was associated with lower back pain and lower limb injury.
Thirty-seven rugby players (22 women, 15 men) from a varsity university team volunteered to participate in this study. At baseline (pre-season), players completed a questionnaire which assessed demographic information (e.g. age, sex, ethnicity), position played, number of years playing competitively, leg dominance, and history of lower back pain and lower limb injury, including duration and severity.
Weight and height were measured by a trained researcher. Body composition was assessed at pre-season and again two months later at post-season using dual-energy x-ray absorptiometry (DEXA). Players played 8 to 12 rugby games and had three to four practices per week between the two body composition measurements.
Male vs Female
At baseline, men were 8.3 cm taller and had 10.0 kg greater body mass, 13.0 kg more lean mass, and 13.6 kg higher fat-free mass than women, while women had 7.6% greater fat mass than men.
Female players had more asymmetry than males. They had significantly greater lean mass in the right trunk (11,817 g) compared with the left (11,518 g), significantly higher fat mass in the left leg (4478 g) compared with the right (4124 g), and significantly more bone mass in the right arm (193 g) and leg (539 g) compared with the left arm (189 g) and leg (531 g).
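Side-to-side differences like those above are often expressed as a percentage asymmetry index. The formula below (absolute difference relative to the larger side) is one common convention, not necessarily the method used in this study; the input values are taken from the comparison above.

```python
# One common convention for a percentage asymmetry index: the absolute
# side-to-side difference expressed relative to the larger side. This is
# an illustrative formula, not necessarily the study's method.
def asymmetry_pct(left: float, right: float) -> float:
    """Percent asymmetry between left and right side measurements."""
    return round(abs(left - right) / max(left, right) * 100, 1)

# trunk lean mass (grams) from the female-player comparison above
print(asymmetry_pct(11518, 11817))
# leg fat mass (grams) from the same comparison
print(asymmetry_pct(4478, 4124))
```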
Backs vs Forwards
At baseline, forwards had 11.7 kg greater body mass, 6.4 kg more lean mass, 5.1 kg higher fat mass, 6.6 kg greater fat-free mass, and 3 kg/m2 higher BMI than backs.
Trunk fat mass was significantly greater in forwards (right side: 5033.3 ± 2240.8g, left side: 4962.7 ± 2247.6g) compared with backs (right side: 3668.2 ± 1411.7g, left side: 3708.9 ± 1349.4g). Back players had significantly greater bone mass in the right arm (209.4 ± 43.4g) compared with the left (202.1 ± 44g).
Pre-season vs Post-season
In forwards, there was a significant increase in right arm lean mass relative to the left over the season (+42.6 ± 122.4 g). In addition, forwards lost significantly more fat mass from the right leg than the left (-178.8 ± 187.1 g).
In backs, trunk fat mass decreased on the left side from baseline to post-season (-101 ± 107.4 g) and, on average, increased on the right side (+95 ± 181 g).
Lower Limb Injuries and Lower Back Pain
When examining player records of lower limb injuries, it was found that 26% of players had been injured in the last month, while 42% had been injured in the last year. The majority of lower limb injuries lasted one to six weeks and primarily affected female backs. Regarding the history of lower back pain, 42% of players reported suffering from it in the prior month, most of these being male backs and female forwards. There was no evident correlation between lower limb injury or lower back pain and body composition asymmetry.
It is interesting to note that female rugby league players had greater asymmetry than their male counterparts. More research should be conducted to better understand this discrepancy, which should be considered when creating strength and conditioning programmes.
There were no associations between lower back pain or lower limb injuries and body composition asymmetry, suggesting that slight body composition asymmetry does not contribute to a player’s injury risk.
This research might be useful for sport science departments to further inform the prescription of position specific training programmes (i.e. scrum versus kicking), nutrition and recovery strategies.
Compared to backs, forwards have a greater body mass, demanding a greater energy intake around a match or practice session (on a case-by-case basis). This highlights the importance of individualised nutrition support to meet these demands; for example, forwards will need to consume a larger carbohydrate-rich pre-match breakfast than backs.
“Thanks to recent developments in science and technology, athletes are now able to get a very detailed analysis of their physical and mental condition to be ready for competition. Understanding body composition asymmetry is an example of this. This type of analysis is especially important for sports like AFL, rugby and football, where body composition asymmetry can play a role in both injury risk and performance. Though more research still needs to be done in this area for rugby, it seems that body composition asymmetry is not linked with lower limb injuries or lower back pain.”
“In practice however, other measurement methods are used to assess potential asymmetry such as hamstring strength, power, force and jumping assessments.”
Implementing effective nutritional strategies can directly influence an athlete’s performance, recovery, immunity, and general wellbeing. Due to the energy demands of elite sport, athletes often demonstrate an increased requirement for energy, nutrients (carbohydrates, proteins, antioxidants, and electrolytes) and fluids.
A healthy diet for athletes can be represented by the Swiss Food Pyramid for Athletes. This is a revised version of the classic Food Pyramid, which considers the dietary needs of individuals who are exercising five times a week for at least one hour. It provides a visual guide to the quantity of different food groups required to achieve an optimal healthy diet.
Athlete eating behaviour is dynamic and determined by numerous factors, both environmental and individual. One psychological determinant of eating behaviours is an athlete’s personality. Personality is described as a combination of characteristics or qualities that form an individual’s distinctive character. Costa and McCrae (2008) conceptualised the Big Five model, which aims to categorise and describe an individual’s personality. This model consists of five main domains used to interpret personality traits:
1.neuroticism (calm, confident vs. anxious, pessimistic),
2.extraversion (reserved, thoughtful vs. sociable, fun-loving),
3.openness to experiences (prefers routine, practical vs. imaginative, spontaneous),
4.agreeableness (suspicious, uncooperative vs. trusting, helpful), and
5.conscientiousness (impulsive, disorganised vs. disciplined, careful).
The aim of this research was to analyse the personality traits and eating behaviours among an elite group of Polish athletes training in team sports using the Swiss Food Pyramid for Athletes and the Big Five model.
The following research questions were posed:
1.What are the nutritional behaviours of athletes?
2.What are the personality traits of athletes?
3.What are the relationships between the personality traits and eating behaviours of the athletes?
Elite professional male Polish athletes (n=213) training in team sports, including basketball (n = 54), volleyball (n = 53), football (n = 53) and handball (n = 53) volunteered for this study.
A validated questionnaire referring to the qualitative recommendations of the Swiss Food Pyramid for Athletes was used. The questionnaire consists of 23 statements on rational eating behaviours, answered on a five-point Likert scale (from one to five: “definitely not”, “probably not”, “hard to say”, “rather yes”, “definitely yes”). The questions concerned:
• regularity of consuming at least three meals a day,
• the recommended frequency of consuming vegetables and fruits,
• whole grains,
• dairy products,
• other nutritional sources of protein,
• adequate hydration before, during and after training, and
• preferred fats and limiting non-recommended products (sweets, fast foods, carbonated and non-carbonated sweetened beverages, and energy drinks).
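The Likert-based scoring described above can be sketched in code. This is a minimal illustration only: the label-to-score mapping follows the scale reported in the study, but the statement count check and the simple mean-based index are illustrative assumptions, not the validated instrument itself.

```python
# Hypothetical sketch of scoring a 23-statement nutrition-behaviour
# questionnaire answered on a five-point Likert scale. The mean-based
# index is an illustrative assumption, not the study's validated method.

LIKERT = {
    "definitely not": 1,
    "probably not": 2,
    "hard to say": 3,
    "rather yes": 4,
    "definitely yes": 5,
}

def behaviour_index(answers):
    """Convert 23 Likert answers into a mean score from 1 (poor) to 5 (good)."""
    if len(answers) != 23:
        raise ValueError("expected 23 answers, one per statement")
    scores = [LIKERT[a] for a in answers]
    return sum(scores) / len(scores)

# Example: an athlete answering "rather yes" to every statement
print(behaviour_index(["rather yes"] * 23))  # 4.0
```

A higher index would indicate eating behaviour closer to the pyramid's recommendations; the real instrument's weighting may differ.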
To assess personality traits from the Five-Factor model, the Revised NEO Personality Inventory (NEO-PI-R) by Costa and McCrae was used. The NEO-PI-R includes 240 statements referring to the five personality factors, each rated on a five-point scale (from “completely disagree” to “completely agree”).
The highest percentage of athletes (above 90%) declared consuming at least three meals a day, hydrating adequately during and after exercise, and avoiding fast food products. A high percentage (over 80%) declared avoiding carbonated and non-carbonated sweetened beverages as well as energy drinks. Approximately 50% declared regular consumption of meals and eating the most caloric meal shortly before or after the main training session. Roughly 30% of the athletes ate fish one to two times a week, consumed one to two portions of fruit daily and whole grain products at least twice a day, followed a varied diet, and limited animal fats in their diet.
Among the personality dimensions of the Five-Factor model, the tested athletes scored highest on conscientiousness (M = 128.50 AU), agreeableness (M = 123.20 AU) and extraversion (M = 121.80 AU), lower on openness to experience (M = 115.00 AU), and lowest on neuroticism (M = 72.15 AU). Regression analysis demonstrated that personality traits explained 99% of the variance in the rational nutrition behaviour index, with agreeableness, extraversion, neuroticism and conscientiousness being significant predictors.
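The kind of variance-explained analysis reported above can be illustrated with an ordinary least-squares sketch. The data below are entirely synthetic (random trait scores and made-up weights); only the method of regressing a nutrition index on the Big Five scores and computing R² is shown, not the study's actual analysis.

```python
# Illustrative OLS sketch with SYNTHETIC data: regress a rational-nutrition
# index on five trait scores and report R^2. Weights and noise level are
# invented for demonstration; they are not the study's results.
import numpy as np

rng = np.random.default_rng(0)
n = 213  # sample size matching the study

# Synthetic trait scores (columns: N, E, O, A, C) and an index built from
# them plus a little noise, purely so the regression has signal to find.
traits = rng.normal(size=(n, 5))
index = traits @ np.array([-0.3, 0.2, 0.1, 0.25, 0.4]) + rng.normal(scale=0.1, size=n)

# Ordinary least squares with an intercept column
X = np.column_stack([np.ones(n), traits])
coef, *_ = np.linalg.lstsq(X, index, rcond=None)

# Proportion of variance in the index explained by the traits
pred = X @ coef
r2 = 1 - np.sum((index - pred) ** 2) / np.sum((index - np.mean(index)) ** 2)
print(f"R^2 = {r2:.2f}")
```

With low noise, R² comes out high by construction; in real data, a 99% figure would be unusually strong and worth scrutinising.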
The neuroticism trait (anxious, pessimistic) correlated with regular consumption of meals, eating two to three portions of fruits and vegetables daily, and avoidance of energy drinks. Openness (imaginative, spontaneous) related to eating raw vegetables at least once a day, consuming two to three portions of fruits and vegetables daily, and limiting sweet and salty snacks. Agreeableness (trusting, helpful) predicted greater consumption of cereal products and reduced post-training hydration. Conscientiousness (disciplined, careful) predicted regular consumption of meals, limiting sweet and salty snacks, and eating at least two servings of dairy products.
Understanding an athlete’s personality can help anticipate how they will respond to nutrition strategies. Therefore, it is useful for sports nutritionists to assess an athlete’s personality traits before providing them with guidance. This can be done through screening athletes with questionnaires that are designed to evaluate personality traits.
Nutrition and psychology are closely linked when it comes to understanding eating behaviour. Consulting with a psychologist can help nutritionists tailor dietary guidelines and promote behaviour change for athletes. In some cases, it may also be beneficial to have both a sports psychologist and a nutritionist present in individual sessions if an athlete's nutritional practices are strongly linked to their psychological state.
The results of this research highlight several key education topics: regular consumption of meals, eating the most caloric meal before or after the main training session, consuming fish one to two times a week, eating one to two portions of fruit daily, consuming whole grain products at least twice a day, and reducing animal fats in the diet. Addressing these will help improve athletes’ knowledge of nutrition.
“The core of being a successful sports nutritionist lies in the ability to affect behaviour change to optimise an athlete’s diet, whether this is getting an athlete to eat more protein or improving an athlete’s entire relationship with food. Based on the COM-B model, this requires a sports nutritionist to improve an athlete’s opportunity, capability, and motivation to elicit behaviour change that will improve their nutrition.”
“This research shows that considering an athlete’s personality traits can also help us better understand their eating behaviour. It is an interesting new development in the relationship between nutrition and psychology, and research into how athletes from different sports and cultures behave towards nutrition should now be conducted.”
“If you do not have time to screen athlete personality traits, you can begin to gauge an athlete’s personality by getting to know them as a person, an individual and as an athlete before attempting to educate them with a nutrition strategy. This would allow you to better design 1-1 consultations, environmental cues and enablers for each athlete to better achieve successful nutrition strategies.”