Gaze stabilization function does not predict injury incidence among collegiate athletes

Author(s):  
Rebecca A. Bliss ◽  
Addie Long ◽  
Chloe Anderson ◽  
Allison Niederee ◽  
Hannah Arellanes ◽  
...  
2020 ◽  
Vol 30 (4) ◽  
pp. 249-257


Author(s):  
C. Quintana ◽  
N.R. Heebner ◽  
A.D. Olson ◽  
J.P. Abt ◽  
M.C. Hoch

BACKGROUND: The vestibulo-ocular reflex (VOR) integrates the vestibular and ocular systems to maintain gaze during head motion. This reflex is often negatively affected following sport-related concussion. Objective measures of gaze stability, a function mediated by the VOR, such as the computerized dynamic visual acuity test (DVAT) and gaze stabilization test (GST), may have utility in concussion management. However, normative data specific to sport, sex, or concussion history have not been established in collegiate athletes. OBJECTIVE: The objective of this study was to establish normative values for the DVAT and GST in collegiate athletes and to explore the effects of sport, sex, and concussion history on VOR assessments. METHODS: The DVAT and GST were completed by 124 collegiate athletes (72 male, 52 female; mean ± SD age: 19.71 ± 1.74 years, height: 173.99 ± 13.97 cm, weight: 80.06 ± 26.52 kg) recruited from Division-I athletic teams (football, soccer, and cheerleading). The DVAT and GST were performed in the rightward and leftward directions during a single session in a standardized environment. Normative values for DVAT and GST measures were expressed as percentiles. Non-parametric statistics were used to compare differences between groups based on sex, sport, and concussion history. Alpha was set a priori at 0.05. RESULTS: Overall, the median LogMAR unit for the 124 athletes completing the DVAT was 0 (IQR = 0.17) in both the leftward and rightward directions. The median velocities achieved on the GST were 145°/sec and 150°/sec (IQR = 45 and 40) for the leftward and rightward directions, respectively. Significant differences were observed between sports (p = 0.001–0.17) for the GST, with cheerleading demonstrating higher velocities than the other sports. However, no significant differences were identified based on sex (p ≥ 0.09) or history of concussion (p ≥ 0.15).
CONCLUSIONS: Normative estimates for the DVAT and GST may assist in the clinical interpretation of outcomes when used in post-concussion evaluation for collegiate athletes. Although sex and previous concussion history had no effect on the DVAT or GST, performance on these measures may be influenced by type of sport. Sport-related differences in the GST may reflect VOR adaptations based on individual sport-specific demands.
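For readers less familiar with the DVAT units: LogMAR is the base-10 logarithm of the minimum angle of resolution (in arcminutes), so a LogMAR of 0 corresponds to 20/20 Snellen acuity. A minimal sketch of the conversion (the function name here is ours, for illustration only):

```python
def logmar_to_snellen_denominator(logmar: float) -> float:
    """Convert a LogMAR score to the denominator of a 20/x Snellen fraction.

    LogMAR = log10(MAR), where MAR is the minimum angle of resolution in
    arcminutes; 20/20 vision corresponds to MAR = 1 arcmin (LogMAR = 0).
    """
    return 20 * 10 ** logmar

print(logmar_to_snellen_denominator(0.0))             # 20.0, i.e. 20/20 (the study median)
print(round(logmar_to_snellen_denominator(0.3), 1))   # 39.9, roughly 20/40
```

On this scale, the reported IQR of 0.17 LogMAR spans roughly 20/20 to 20/30.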


2019 ◽  
Vol 3 (Supplement_1) ◽  
Author(s):  
Kiley Field ◽  
John Gieng ◽  
Giselle Pignotti ◽  
Sofia Apsey

Abstract Objectives The relationship between the inflammatory potential of the diet, estimated by the Dietary Inflammatory Index (DII) score, and bone health has been studied in older populations, and the findings suggest that diet can influence bone mineral density (BMD) and fracture risk. These relationships have yet to be explored in other potentially vulnerable populations, such as athletes, in whom injuries may be more common due to high physical stress and overuse. The aims of this study were 1) to examine the correlation between DII scores and BMD in collegiate athletes, and 2) to assess the relationship between DII score and self-reported prior injury incidence. Methods Healthy collegiate athletes (n = 43) were recruited for this study: football, n = 12; men's soccer, n = 2; women's soccer, n = 13; women's swimming, n = 12; and women's basketball, n = 4. For each athlete, three 24-hour dietary intakes were collected using a standardized multiple-pass interview methodology (Nutrition Data System for Research), and these data were used to calculate individual DII scores. Body composition, including whole-body sub-total BMD, was measured using dual-energy X-ray absorptiometry. A modified overuse injury questionnaire (Oslo Sports Trauma Research Centre) was used to assess the incidence of injuries in the prior 12 months. Results The participants (n = 14 male, n = 29 female) had a mean age of 19.4 ± 1.1 yrs and BMI of 25.8 ± 4.1 kg/m2. Mean DII score was −0.43 ± 0.17 points (range: −3.94 to 4.34). Mean BMD was 1.251 ± 0.169 g/cm2. Overall, DII score and BMD were not correlated (P = 0.47). Furthermore, DII scores of athletes who reported no prior injury did not differ from those of athletes who reported 1 or more injuries. Conclusions Unlike research in postmenopausal women, it appears that the bone health of young healthy athletes is less vulnerable to the influence of diets with higher inflammatory potential.
Moreover, the lack of difference in DII score among athletes reporting various levels of prior injury suggests that the inflammatory potential of the diet is a poor predictor of injury risk in collegiate athletes. Funding Sources N/A.


2021 ◽  
Vol 9 (8) ◽  
pp. 232596712110322
Author(s):  
Jason M. Avedesian ◽  
Tracey Covassin ◽  
Shelby Baez ◽  
Jennifer Nash ◽  
Ed Nagelhout ◽  
...  

Background: Collegiate athletes with prior sports-related concussion (SRC) are at increased risk for lower extremity (LE) injuries; however, the biomechanical and cognitive mechanisms underlying the SRC-LE injury relationship are not well understood. Purpose: To examine the association between cognitive performance and LE land-and-cut biomechanics among collegiate athletes with and without a history of SRC and to determine the association among multiple cognitive testing batteries in the same athlete cohort. Study Design: Controlled laboratory study. Methods: A cohort of 20 collegiate athletes with prior SRC (9 men, 11 women; mean ± standard deviation [SD] age, 20.5 ± 1.3 years; mean ± SD time since last SRC, 461 ± 263 days) and 20 matched controls (9 men, 11 women; mean ± SD age, 19.8 ± 1.3 years) completed land-and-cut tasks using the dominant and nondominant limbs. LE biomechanical variables and a functional visuomotor reaction time (FVMRT) were collected during each trial. Athletes also completed the Immediate Post-Concussion Assessment and Cognitive Test (ImPACT) and Senaptec Sensory Station assessments. Results: In the SRC cohort, Pearson correlation coefficients indicated slower FVMRT was moderately correlated with decreased dominant limb (r = −0.512) and nondominant limb (r = −0.500) knee flexion, while increased dominant limb knee abduction moment was moderately correlated with decreased ImPACT Visual Memory score (r = −0.539) and slower ImPACT Reaction Time (r = 0.515). Most computerized cognitive measures were not associated with FVMRT in either cohort (P > .05). Conclusion: Decreased reaction time and working memory performance were moderately correlated with decreased sagittal plane knee motion and increased frontal plane knee loading in collegiate athletes with a history of SRC. The present findings suggest a potential unique relationship between cognitive performance and LE neuromuscular control in athletes with a history of SRC injury. 
Last, we determined that computerized measures of cognitive performance often utilized for SRC management are dissimilar to sport-specific cognitive processes. Clinical Relevance: Understanding the relationship between cognitive performance and LE biomechanics in athletes with prior SRC may inform future clinical management strategies. Future research should prospectively assess cognitive and biomechanical measures, along with LE injury incidence, to identify mechanisms underlying the SRC-LE injury relationship.


Author(s):  
Andrew Froehle ◽  
Joseph Cox ◽  
Jedediah May ◽  
Kimberly Grannis ◽  
Dana Duren

Female athletes suffer painful, costly, and career-limiting non-contact anterior cruciate ligament (ACL) injuries more often than males. Previous research suggests that pubertal neuromusculoskeletal development contributes to this sex bias, but the manner in which variation in pubertal development affects injury risk within females is poorly understood. Age at menarche is a variable, significant pubertal developmental event, signaling the onset of estrogen cycling and affecting musculoskeletal development. Earlier menarche may increase injury risk, possibly by increasing anterior knee laxity through prolonged estrogen exposure. The purpose of this case-control study was to test the primary hypothesis that collegiate athletes with previous ACL injuries have earlier age at menarche than their uninjured peers, and to test the secondary hypothesis that earlier menarche is related to greater anterior knee laxity in injured and uninjured athletes. The study sample consisted of female NCAA Division-I varsity athletes (N = 14 injured, N = 120 uninjured). Outcome measures included: menstrual history and ACL injury details (injury age, activity at time of injury, contact vs. non-contact), assessed by questionnaire; and anterior knee laxity, assessed by KT-1000 arthrometer. Correlation, t-tests, and regression analysis were used to test for associations between age at menarche, injury incidence, and knee laxity. Fourteen athletes reported ≥1 non-contact ACL injury and had significantly earlier menarche than uninjured athletes (12.6 ± 1.3 y vs. 13.4 ± 1.4 y; P = 0.05). Earlier menarche also significantly predicted injury status (Wald χ2 = 7.43; b = −1.02 ± 0.37; OR = 0.36; 95% CI: 0.17–0.75), but was not correlated with anterior knee laxity. Within injured athletes, however, laxity in the unaffected knee was significantly related to time since menarche (r2 = 0.79, P…; r2 = 0.72, P…


2019 ◽  
Vol 7 (4) ◽  
pp. 232596711984071 ◽  
Author(s):  
Jeffrey D. Trojan ◽  
Joshua A. Treloar ◽  
Christopher M. Smith ◽  
Matthew J. Kraeutler ◽  
Mary K. Mulcahey

Background: As many as 30% of patients with knee pain seen in sports medicine clinics have complaints related to the patellofemoral joint. There is a paucity of research available regarding patellofemoral injuries, mechanism of injury, and playing time lost in collegiate athletes. Purpose: To describe the rates, mechanisms, severity, and potential sex-based differences of patellofemoral injuries in collegiate athletes across 25 National Collegiate Athletic Association (NCAA) sports. Study Design: Descriptive epidemiology study. Methods: Data from the 2009-2010 through the 2013-2014 academic years were obtained from the NCAA Injury Surveillance Program and were analyzed to calculate patellofemoral injury rates, mechanisms of injury, time lost, and need for surgery. Rate ratios and injury proportion ratios were used to quantify discernible differences between sex-comparable sports and timing of injury (ie, practice vs competition), respectively. Results: The overall patellofemoral injury incidence rate was 16.10 per 100,000 athlete-exposures (AEs). Women’s volleyball had the highest incidence of all sports (39.57 per 100,000 AEs). Injuries were 66% more likely to occur in competition than during practice. Female athletes experienced significantly more patellofemoral injuries than males in similar sports. Patellar tendinitis accounted for 49.2% of all patellofemoral injuries and was the most common injury in 20 of 25 studied sports. Patellar subluxation accounted for the most total days missed, and patellar dislocation had the highest mean days missed per injury (11.42 days). Patella fracture was the most likely injury to require surgery (80%). Conclusion: Patellofemoral injuries were most common in sports that require jumping and quick changes of direction, specifically women’s volleyball, men’s and women’s basketball, and women’s soccer. The majority of patellofemoral injuries in this cohort were classified as patellar tendinitis caused by overuse. 
Most injuries resulted in no competition or practice time lost. This information may contribute to the development of prevention programs aimed at addressing the most prevalent types and mechanisms of injury in each sport to reduce the incidence of patellofemoral injury in these athletes.


2010 ◽  
Vol 45 (4) ◽  
pp. 372-379 ◽  
Author(s):  
Jingzhen Yang ◽  
Corinne Peek-Asa ◽  
John B. Lowe ◽  
Erin Heiden ◽  
Danny T. Foster

Abstract Context: Social support has been identified as an important factor in facilitating recovery from injury. However, no previous authors have prospectively assessed the change in social support patterns before and after injury. Objective: To examine the preinjury and postinjury social support patterns among male and female collegiate athletes. Design: Prospective observational study. Setting: A Big Ten Conference university. Patients or Other Participants: A total of 256 National Collegiate Athletic Association Division I male and female collegiate athletes aged 18 or older from 13 sports teams. Main Outcome Measure(s): Injury incidence was identified using the Sports Injury Monitoring System. Social support was measured using the 6-item Social Support Questionnaire. Data on preinjury and postinjury social support patterns were compared. Results: Male athletes reported more sources of social support than female athletes, whereas female athletes had greater satisfaction with the support they received. Athletes' social support patterns changed after they became injured. Injured athletes reported relying more on coaches (P = .003), athletic trainers (P < .0001), and physicians (P = .003) for social support after they became injured. Athletes also reported greater postinjury satisfaction with social support received from friends (P = .019), coaches (P = .001), athletic trainers (P < .0001), and physicians (P = .003). Conclusions: Our findings identify an urgent need to better define the psychosocial needs of injured athletes and also strongly suggest that athletic trainers have a critical role in meeting these needs.


2017 ◽  
Vol 45 (9) ◽  
pp. 2148-2155 ◽  
Author(s):  
Hongmei Li ◽  
Jennifer J. Moreland ◽  
Corinne Peek-Asa ◽  
Jingzhen Yang

Background: Psychological risk factors are increasingly recognized as important in sport-related injury prevention. Understanding how these psychological factors may affect the risk of injuries could help design effective prevention programs. Purpose: To determine the effect of reported preseason anxiety and depressive symptoms on the risk of injuries during a prospective season in a cohort of collegiate athletes. Study Design: Cohort study; Level of evidence, 2. Methods: Collegiate athletes participating in 4 men’s sports and 5 women’s sports from 2 National Collegiate Athletic Association (NCAA) Division I universities were enrolled and prospectively followed during the 2007-2011 seasons. Preseason anxiety and depressive symptoms were measured at enrollment. Injuries occurring during the season were reported by certified athletic trainers. The injury incidence rate was calculated as the total number of injuries divided by the total number of athlete-exposures (ie, games and practices). Results: Of 958 enrolled athletes (response rate of 90.3%), 389 (40.6%) athletes sustained a total of 597 injuries. At preseason, 276 (28.8%) athletes reported anxiety symptoms, and 208 (21.7%) reported depressive symptoms. Among athletes reporting any of these symptoms, 48.5% (n = 158) reported having both anxiety and depressive symptoms. Athletes with preseason anxiety symptoms had a significantly higher injury incidence rate compared with athletes without anxiety symptoms (rate ratio [RR], 2.3; 95% CI, 2.0-2.6), adjusting for age, race, body mass index, history of injuries 12 months before baseline, and university attended, and this was observed for both male and female athletes. Only male athletes who reported co-occurring preseason depressive and anxiety symptoms had a significantly increased injury risk (RR, 2.1; 95% CI, 1.6-2.6) compared with male athletes who reported no co-occurring symptoms. 
However, no such increase in the injury risk was observed among female athletes or male athletes who reported preseason depressive symptoms but no anxiety symptoms. Conclusion: Athletes with anxiety symptoms at preseason were at an increased risk of injuries during the prospective season. Targeted programs could focus on psychological health and injury prevention for athletes, especially for those exhibiting symptoms at preseason.
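The rate calculation described in the Methods (total injuries divided by total athlete-exposures) is simple enough to sketch. The counts below are hypothetical, chosen only to reproduce a rate ratio of the reported magnitude; they are not the study's raw data:

```python
def incidence_rate(n_injuries: int, athlete_exposures: int, per: int = 1000) -> float:
    """Injuries per `per` athlete-exposures (an exposure = one game or practice)."""
    return n_injuries * per / athlete_exposures

def rate_ratio(rate_exposed: float, rate_unexposed: float) -> float:
    """Unadjusted ratio of two incidence rates (the paper's RR is adjusted)."""
    return rate_exposed / rate_unexposed

# Hypothetical counts for illustration only:
rate_anxious = incidence_rate(230, 40_000)  # 5.75 injuries per 1000 AEs
rate_calm = incidence_rate(100, 40_000)     # 2.50 injuries per 1000 AEs
print(rate_ratio(rate_anxious, rate_calm))  # 2.3
```

The published RR of 2.3 was additionally adjusted for age, race, BMI, injury history, and university, so a raw ratio like this is only the starting point.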


2013 ◽  
Vol 48 (6) ◽  
pp. 782-789 ◽  
Author(s):  
Dustin R. Grooms ◽  
Thomas Palmer ◽  
James A. Onate ◽  
Gregory D. Myer ◽  
Terry Grindstaff

Context: A number of comprehensive injury-prevention programs have demonstrated injury risk-reduction effects but have had limited adoption across athletic settings. This may be due to program noncompliance, minimal exercise supervision, lack of exercise progression, and sport specificity. A soccer-specific program described as the F-MARC 11+ was developed by an expert group in association with the Fédération Internationale de Football Association (FIFA) Medical Assessment and Research Centre (F-MARC) to require minimal equipment and implementation as part of regular soccer training. The F-MARC 11+ has been shown to reduce injury risk in youth female soccer players but has not been evaluated in an American male collegiate population. Objective: To investigate the effects of a soccer-specific warm-up program (F-MARC 11+) on lower extremity injury incidence in male collegiate soccer players. Design: Cohort study. Setting: One American collegiate soccer team followed for 2 seasons. Patients or Other Participants: Forty-one male collegiate athletes aged 18–25 years. Intervention(s): The F-MARC 11+ program is a comprehensive warm-up program targeting muscular strength, body kinesthetic awareness, and neuromuscular control during static and dynamic movements. Training sessions and program progression were monitored by a certified athletic trainer. Main Outcome Measure(s): Lower extremity injury risk and time lost to lower extremity injury. Results: The injury rate was 8.1 injuries per 1000 exposures with 291 days lost in the referent season, compared with 2.2 injuries per 1000 exposures with 52 days lost in the intervention season. The intervention season showed a 72% reduction in the relative risk (RR) of lower extremity injury (RR = 0.28, 95% confidence interval = 0.09, 0.85) and a reduction in time lost to lower extremity injury (P < .01). 
Conclusions: The F-MARC 11+ program reduced the overall risk and severity of lower extremity injury compared with the referent season in collegiate-aged male soccer athletes.
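As a rough sanity check, the relative risk here is just the ratio of the two seasons' injury rates. Using the per-1000-exposure rates quoted in the Results (the published RR of 0.28 presumably reflects the unrounded exposure counts, which the abstract does not report):

```python
def relative_risk(rate_intervention: float, rate_referent: float) -> float:
    """Ratio of two injury rates; values below 1 favor the intervention."""
    return rate_intervention / rate_referent

rr = relative_risk(2.2, 8.1)  # injuries per 1000 exposures, from the abstract
print(round(rr, 2))           # 0.27, consistent with the ~72% risk reduction reported
```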


2019 ◽  
Vol 7 (3_suppl) ◽  
pp. 2325967119S0016
Author(s):  
Kevin A. Williams ◽  
Christian Askew ◽  
Christopher Mazoue ◽  
Jeffrey A. Guy ◽  
Toni Torres-McGehee ◽  
...  

Purpose: Vitamin D functions to regulate serum calcium concentrations via several cellular pathways. It has also been demonstrated to affect bone mineralization and turnover, making it essential for skeletal strength and adaptation to mechanical stress. Stress fractures are common athletic injuries caused by cyclic, repetitive skeletal loading that physically breaks down the bone's microstructure. Excessive running or jumping, coinciding with additional factors such as malnutrition or decreased sun exposure, may elevate an athlete's risk of sustaining this overuse injury. Serum 25(OH)D is used as a clinical marker of vitamin D status. Previous research suggests that adequate vitamin D status is important for the prevention of skeletal injuries. We hypothesized that supplemental vitamin D would normalize athletes' vitamin D status and reduce stress fracture incidence rates among elite Division I athletes. Methods: Prospective: 245 subjects were recruited from 17 sports teams at the university. All subjects were over 18 years of age. No subjects were excluded. Subjects' 25(OH)D status was determined twice: once in August 2016, and again in February 2017. Following each testing cycle, subjects with 25(OH)D levels below 70 ng/ml were supplemented with cholecalciferol (50,000 IU) once a week for 8 weeks. Subjects were then monitored throughout their respective sporting seasons for stress fractures or other overuse skeletal injuries. Subjects initially completed an anthropometric questionnaire, and two compliance questionnaires were to be completed following each 8-week supplementation period. Retrospective: Athlete injury reports from each of the 17 teams will be used as a control to determine the incidence of stress fractures during the 2011-2015 seasons among non-supplemented athletes. Results: 245 subjects had 25(OH)D levels tested in August (40.4 ± 14.3 ng/ml), with 18% being insufficient or deficient (<30 ng/ml). 
A total of 191 subjects were tested in February (27.7 ± 8.85 ng/ml), with 65% being insufficient or deficient, a significant decrease. All but three teams had significantly lower 25(OH)D levels in February compared to August. There was no significant difference between male and female athletes; however, both groups saw significant declines in February. There was no significant difference between indoor and outdoor sports in either August or February. Two stress injuries were diagnosed among the 118 subjects enrolled in our previous arm (1.65%). Two additional stress injuries were diagnosed among the 191 subjects remaining in our current arm (1.0%). Thirty-six stress fractures were diagnosed in 571 subjects retrospectively (6.3%). Conclusion: Several factors have been shown to influence vitamin D status, including nutrition and sun exposure. Serum 25(OH)D status is expected to decline naturally in winter months due to lack of sun exposure. Results indicate a substantial decline in serum 25(OH)D from August to February in our cohort. This supports the notion that continued supplementation may be necessary to maintain appropriate vitamin D levels. The incidence of stress fractures per year, as well as the proportion of stress fractures per academic year, has consistently declined as more attention is paid to vitamin D supplementation.


Author(s):  
Chelsea L Martin ◽  
Ellen Shanley ◽  
Chris Harnish ◽  
Amy Knab ◽  
Shefali Christopher ◽  
...  

Background Flourishing is a multi-dimensional construct that encompasses physical, psychological and social well-being. A proposed positive attribute of flourishing is resilience, which is the ability to bounce back despite the presence of stressors. A common stressor among athletes is overuse injuries, which may negatively affect well-being. Objective To examine the relationships of resilience and overuse injury with flourishing in collegiate athletes. Materials and methods 253 college athletes participated. The Flourishing Scale, Oslo Sports Trauma Research Center Overuse Injury Questionnaire (OSTRC), and Brief Resilience Scale (BRS) were administered via online questionnaire. For OSTRC scores, athletes were classified into injury and participation status groups. For BRS scores, athletes were classified into low resilience (LR), normal resilience (NR), and high resilience (HR) groups. Results Median flourishing score was 50.0 (46.5–53.5); mean BRS score was 21.6 (SD 4.3). Overuse injury and substantial overuse injury incidence proportion (IP) were 25.4 (95% CI: 20.3, 30.5) and 9.1 (95% CI: 7.0, 11.2). The IP for participants unable to play was 15.1 (95% CI: 12.9, 17.2). Significant differences were found in flourishing among resilience groups (p = 0.002) but not among overuse injury groups (p = 0.140) or participation variables (p = 0.205). Conclusion College athletes demonstrated high flourishing scores. Flourishing demonstrated a significant relationship with resilience across all groups but not among overuse injury or participation status. This finding indicates that college athlete well-being is strongly associated with resilience. Future longitudinal studies are needed to determine if resilience can be modified to positively influence athlete well-being.

