Stress fractures are a relatively common issue in athletes of all levels, particularly in sports that involve repetitive loading, such as running. It is suggested that 95% of stress fractures occur in the lower limbs; in runners they account for 15-20% of injuries (Wright et al 2015). Research suggests track and field athletes have the highest incidence of stress fractures, and that the tibia, metatarsals and fibula are the areas most at risk in the running athlete (Fredericson et al 2006). The time lost to injury from stress fractures is frustrating for running athletes, as at some point they will require a period of relative rest from their usual running training. The location and grading of the stress fracture dictate the amount of time for which training will need to be modified.
While the pathophysiological change behind the formation of a stress reaction and stress fracture is relatively well understood, why one athlete is more susceptible than another has been the source of much debate in the research. To date, most research has examined intrinsic factors (eg age, sex, race, bone density and structure, hormonal profile, sleep patterns), extrinsic factors (eg biomechanics, training location, and the type, frequency and intensity of training), or a combination of both.
Extrinsic Factors
Load is the only extrinsic factor on which there is consensus both clinically and in the literature: the rate of loading alters the rate of stress fractures. When the combination of bone strain, strain rate and strain magnitude, together with the limited recovery allowed between exposures to bone strain, exceeds what the bone is able to repair, micro-damage accumulates and the development of a stress reaction or stress fracture becomes inevitable.
Stress fractures develop along a continuum, starting with asymptomatic accelerated remodelling of bone in response to bone strain. If osteoclastic bone resorption exceeds osteoblastic bone formation, microscopic cavities form. It is this lag between osteoclastic and osteoblastic activity that leaves bone temporarily weakened. If loading continues at a rate at which osteoblastic activity cannot keep pace with osteoclastic activity, the bone cannot adapt and a stress reaction or stress fracture results.
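To make this resorption-formation lag concrete, the following is a minimal toy-model sketch of the continuum, in which micro-damage accumulates with each day's loading and partial repair occurs during recovery. All names and values here (daily_strain, repair_rate, threshold) are hypothetical illustrations chosen only to show the shape of the process, not values drawn from the literature.

```python
# Illustrative toy model of the remodelling continuum described above:
# micro-damage accumulates with each loading bout, while repair removes
# a fixed fraction of accumulated damage during each recovery period.

def simulate_bone_damage(daily_strain, repair_rate=0.1, threshold=10.0):
    """Return the day a notional damage threshold is crossed, or None."""
    damage = 0.0
    for day, strain in enumerate(daily_strain, start=1):
        damage += strain             # micro-damage added by that day's loading
        damage *= (1 - repair_rate)  # partial repair during recovery
        if damage > threshold:       # remodelling can no longer keep pace
            return day
    return None

# Constant moderate load: damage plateaus below the threshold.
print(simulate_bone_damage([1.0] * 90))                # -> None
# Sustained load spike with no extra recovery: threshold is crossed.
print(simulate_bone_damage([1.0] * 30 + [2.5] * 60))   # -> 31
```

Under the constant load, repair keeps pace and damage plateaus; after the spike, accumulation outstrips repair and the notional threshold is crossed, mirroring the continuum described above.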
Tenforde and colleagues (2013) reported that running more than 32km per week doubled the risk of stress fractures in males and tripled it in females. While this demonstrates that increased repetitive bony load raises the risk of stress fractures, the specific value is of limited practical use: 32km per week is a volume many recreational runners achieve, elite runners regularly exceed 70km per week, and weekly volumes as high as 180-200km are common in some elite endurance runners.
There is limited understanding of how to predict when load may become pathological. Since stress fractures are traditionally thought to develop gradually, monitoring symptoms such as bony tenderness may be important. Monitoring training loads and identifying early symptoms of a bone pain reaction could lead to earlier intervention, which may help prevent a stress reaction progressing into a stress fracture, and so decrease the amount of modified training time the athlete requires.
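As a loose illustration of what such training-load monitoring might look like in practice, the sketch below flags any week whose running volume jumps sharply above the average of the preceding weeks. The four-week window and the 1.3 ratio are hypothetical choices made purely for illustration, not validated clinical thresholds.

```python
# Hypothetical load-monitoring sketch: flag any week whose running volume
# exceeds the average of the previous four weeks by more than 30%.

def flag_load_spikes(weekly_km, window=4, ratio=1.3):
    flags = []
    for i in range(window, len(weekly_km)):
        chronic = sum(weekly_km[i - window:i]) / window
        if chronic > 0 and weekly_km[i] / chronic > ratio:
            # record (week number, that week's km, preceding average)
            flags.append((i + 1, weekly_km[i], round(chronic, 1)))
    return flags

weeks = [40, 42, 45, 44, 60, 46, 48]
print(flag_load_spikes(weeks))  # -> [(5, 60, 42.8)]: week 5 jumps ~40% above average
```

A flagged week would simply prompt a conversation about symptoms and recovery, not a diagnosis; the point is that early identification of rapid load increases creates an opportunity to intervene before a stress reaction progresses.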
Poor biomechanics have been suggested as a contributory cause of stress fractures, and the theoretical reasoning is sound: a runner with poor biomechanics may have a decreased ability to absorb load through the musculoskeletal system, thereby increasing the load on bone. For example, a recent study by Nunns et al (2016) found that athletes who sustained a tibial stress fracture had greater peak heel pressures and a lower range of tibial rotation, the proposed consequence of which was reduced impact attenuation. Reduced impact attenuation would potentially increase bony load, leading to the increased incidence of stress fractures.
However, despite extensive research into biomechanics and injury, there is still no agreement in the literature that altered biomechanics contribute to stress fractures. In a recent systematic review and meta-analysis of stress fractures in runners (Wright et al 2015), the only two factors found to be associated with an increased risk of injury were a previous history of stress fracture (OR 4.99) and female sex (OR 2.31).
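For readers less familiar with odds ratios, the following is a brief reminder of how figures such as OR 4.99 are derived from a generic 2x2 exposure-outcome table; the letters a-d below are generic cell counts, not data from Wright et al:

```latex
% Odds ratio from a generic 2x2 table:
%                         stress fracture   no stress fracture
%   previous fracture           a                  b
%   no previous fracture        c                  d
\[
  \mathrm{OR} \;=\; \frac{a/b}{c/d} \;=\; \frac{ad}{bc}
\]
```

An OR of 4.99 therefore means the odds of sustaining a stress fracture among runners with a previous stress fracture were roughly five times the odds among those without one.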
One of the reasons for this could be a failure to sub-categorise the biomechanical variations that might load different areas of bone differently. For example, a rigid supinated foot type (Milner, Hamill & Davis 2010) would create a different load profile in runners to a hypermobile pronating foot type (Hetsroni et al 2008).
On a clinical level it is important to take into consideration extrinsic risk factors such as load and biomechanical control, but also to recognise that the development of stress fractures may be more complex than these factors alone. There may also be intrinsic factors at play that alter an individual athlete's predisposition to stress fractures.
Intrinsic Factors
The traditional concept associated with increased stress fracture risk in running is the female athlete triad. Running is a sport in which the aesthetics and potential performance benefits of a lean body mass are often considered desirable. This desire for a lean body weight is considered to contribute to the female athlete triad, in which the three factors of low energy availability, menstrual dysfunction and low bone mass combine to increase the risk of stress fractures (Rauh et al 2010). However, more recent research into the multifactorial effects of hormones, thyroid function and glucocorticoids, such as cortisone, has led to an appreciation of a more complex interaction of intrinsic factors, and the concept of the female athlete triad has now been replaced by that of relative energy deficiency in sport (RED-S).
Oestrogen inhibits bone resorption and progesterone promotes bone formation. The luteal phase of the menstrual cycle may be shortened in some exercising women; this is associated with decreases in both oestrogen and progesterone, which could have a detrimental effect on bone health. Recurrent short luteal phase cycles and anovulation were associated with spinal bone loss of approximately 2-4% per year in physically active women (William 1999). However, this analysis of menstrual dysfunction does not account for the fact that males also develop stress fractures, and it may over-simplify the complex hormonal interactions involved in the increased risk of stress fractures.
An example of this is that taking the oral contraceptive pill does not appear to alter the rate of stress fractures, and amenorrhoeic sportswomen are less responsive to oestrogen therapy than women with ovarian failure (Braam et al 2003). It is now recognised that altered levels of oestrogen and progesterone are just one of the hormonal interactions that can disrupt the hypothalamic-pituitary-ovarian axis and have a potentially detrimental effect on bony recovery.
Also associated with poor bone health are decreased luteinizing hormone, decreased thyroid function, decreased insulin-like growth factor 1, decreased leptin and increased cortisol (Lambrinoudaki & Papadimitriou 2010).
Cortisol is known to have a negative feedback effect on the hypothalamus. High chronic training loads, such as those running athletes regularly undertake, are known to place chronic stress on the body and increase cortisol levels. This can potentially affect bone recovery cycles and increase the risk of developing a stress fracture.
General life stress and altered sleep have also been shown to increase cortisol levels. Finestone and Milgrom (2008) found that, after 25 years of studying modifiable risk factors for stress fractures in the Israeli military, only two interventions reduced the incidence of stress fractures (from 31% to 10%): a decrease in cumulative marching and ensuring six hours of sleep per night. They studied bone resorption markers and found that recruits who had six hours of sleep had no change in these markers, recruits who had six hours of sleep in a vertical position had a 98% increase in bone resorption markers, and recruits who were sleep deprived for 64 hours had a 170% increase. Also of note in this study was the finding of responders and non-responders, at ratios of 40 and 60 per cent respectively, which provides support for a possible genetic predisposition to stress fractures that will no doubt be the focus of future research in this area.
The more research that is conducted in the area of stress fractures in athletes, the more apparent it becomes that their development can be complex: more than the traditional paradigms of simply too much load or too little energy intake can explain. Rather, the risk profile of an athlete developing a stress fracture can reflect a complex interaction of hormonal factors, the balance of which can be affected by training loads and also by life stresses. It is important for the treating physiotherapist to be aware of such factors and explore potentially contributing causes with any athletes they are treating.