1.

Context:

The long-term implications of concussive injuries for brain and cognitive health represent a growing concern in the public consciousness. As such, identifying measures sensitive to the subtle yet persistent effects of concussive injuries is warranted.

Objective:

To investigate how concussion sustained early in life influences visual processing in young adults. We predicted that young adults with a history of concussion would show decreased sensory processing, as noted by a reduction in P1 event-related potential component amplitude.

Design:

Cross-sectional study.

Setting:

Research laboratory.

Patients or Other Participants:

Thirty-six adults (18 with a history of concussion, 18 controls) between the ages of 20 and 28 years completed a pattern-reversal visual evoked potential task while event-related potentials were recorded.

Main Outcome Measure(s):

The groups did not differ in any demographic variables (all P values > .05), yet those with a concussive history exhibited reduced P1 amplitude compared with the control participants (P = .05).

Conclusions:

These results suggest that concussion history has a negative effect on visual processing in young adults. Further, upper-level neurocognitive deficits associated with concussion may, in part, result from less efficient downstream sensory capture.

Key Words: mild traumatic brain injuries, visual processing, event-related potentials, pattern-reversal visual evoked potentials

Key Points

  • Visual processing and higher-level cognitive function were affected by concussion over the long term.
  • The potential contributions of low-level sensory deficits to higher-order neurocognitive dysfunction after concussion should be studied.
  • Event-related potentials have greater sensitivity than standard clinical tools and have the potential for clinical use.
The long-term and cumulative effects of concussive injuries represent a growing concern in the public consciousness. Concussion has been defined as “a complex pathophysiological process affecting the brain, induced by traumatic biomechanical forces.”1,2 Estimated incidence rates for this condition, described as a “silent epidemic” by the Centers for Disease Control and Prevention,2–5 range from a conservative 300 000 per year4–6 to a more liberal and recent estimate of 3.8 million cases in the United States annually.7 Because 15% to 20% of these injuries result from sport participation,8 sport-related concussion represents an increasing concern, not only in the public domain but also in clinical and research settings.

Based on clinical evaluations, concussed persons typically return to their preinjury level of functioning within 7 to 10 days of injury,9,10 a time paralleling the acute neurometabolic cascade associated with concussion.11,12 Indeed, several investigations2,13–15 of young adult athletes with a concussion history indicate normal performance on a variety of clinical tests after the acute injury stage. However, more recent studies using highly sensitive assessment measures suggest that a multitude of chronic nervous system dysfunctions and cognitive deficits stem from concussive injuries.16–28 Thus, the chronic, subclinical effects of concussion remain unclear, and measures sensitive to subtle and persistent deficits stemming from concussion are needed.

Electroencephalography, which records brain activity from electrodes placed on the scalp, has been extensively used to examine neuroelectric activity in normal and clinical populations for almost a century.
More recently, event-related potentials (ERPs; patterns of neuroelectric activity that occur in preparation for or in response to an event) have emerged as a technique to provide insight into the neural processes underlying perception, memory, and action.29 The ERPs may be obligatory responses (exogenous) to stimuli in the environment or may reflect higher-order cognitive processes (endogenous) that often require active consideration by a person.30

Over the past decade, electroencephalography and ERPs in particular have demonstrated the requisite sensitivity to detect subtle, covert deficits in neurocognitive function associated with concussion17,23,27,31–33 (for a review, see Broglio et al29). Although several groups have evaluated ERP components, such as the ERN, N2, and P3, to examine attention, perception, and memory, few authors23,34–36 have evaluated the effects of concussion on neuroelectric indexes of sensory function. In particular, only 3 studies have evaluated the influence of concussion on visual-evoked potentials (VEPs).23,34,35 Findings from these studies suggest that for a significant portion of people, concussion may lead to chronic impairment in the neuroelectric correlates of visual processing.

Believed to reflect the functional integrity of the visual system, VEPs are electrophysiologic signals passively evoked in response to visual stimuli that demonstrate a parietal-occipital maximum.37–40 Efficient visual processing and sensory integration are essential to day-to-day functioning34,41; however, the visual system of a concussed individual typically goes unevaluated.34 Thus, VEPs represent an underused and potentially valuable tool for evaluating and understanding sensory and nervous system dysfunction after injury.

One VEP paradigm of particular utility is the pattern-reversal task (PR-VEP). This task uses an inverting patterned stimulus to evoke an electrocortical waveform, which is characterized by a negative deflection at about 75 milliseconds (N75), followed by a positive deflection at about 100 milliseconds (P1).42 The PR-VEP task is a standard in clinical research assessing central nervous system function43 because of the high sensitivity, specificity, and intraindividual stability of PR-VEPs relative to VEPs elicited by other paradigms.44,45 Specifically, the P1 elicited by this paradigm is less variable than the P1 components elicited by other paradigms, making it preferable for evaluating clinical populations.40 For example, Sarnthein et al45 observed a test-retest sensitivity of 95% and specificity of 99.7% for P1 component values. Further, Mellow et al,46 evaluating binocular reproducibility in a single participant, observed test-retest coefficients of variation of 9% to 14% for P1 amplitude and 1% to 2% for P1 latency.

The P1 component is an exogenous, or obligatory, potential and is the first positive-going deflection after stimulus presentation (or inversion). The P1 is thought to reflect sensory processes such as gating, amplification, and preferential attention to sensory inputs.38,47 Within the context of the PR-VEP paradigm, the P1 is believed to index the functioning of the geniculostriatal pathway,39 which is thought to mediate visual processing. The P1 component values can provide important information to researchers and clinicians: reduced P1 amplitude may indicate neuronal atrophy,48 and increased P1 latency may indicate slowed neural conduction within the visual pathways.49

To our knowledge, only one set of authors23 has evaluated the P1 component in relation to sport-induced concussion by using a pattern-reversal task to elicit VEPs in young and middle-aged adults. Approximately one-third of the participants who reported a concussion history evidenced P1 deficits, as determined by clinical diagnostic criteria.
Such findings suggest that concussion may negatively influence the P1 component in a subset of persons, but further investigation is warranted to clarify the nature of the relationship between concussive injuries and the P1 VEP. Accordingly, the purpose of our investigation was to assess the influence of sport-related concussion on visual processing using a pattern-reversal paradigm.
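The P1 measures described above (peak amplitude and latency) are conventionally read off the averaged waveform as the largest positive deflection inside a post-stimulus search window. The sketch below illustrates that idea on a simulated waveform; the sampling rate, window bounds, and waveform shape are illustrative assumptions, not parameters from the study.

```python
# Hypothetical sketch: quantifying the P1 from an averaged pattern-reversal
# VEP waveform. FS, the search window, and the simulated waveform are
# invented for illustration.
import math

FS = 500  # sampling rate in Hz (assumed)

def p1_peak(waveform, fs=FS, window_ms=(75, 135)):
    """Return (amplitude_uV, latency_ms) of the largest positive
    deflection inside the search window."""
    lo = int(window_ms[0] * fs / 1000)
    hi = int(window_ms[1] * fs / 1000)
    idx = max(range(lo, hi), key=lambda i: waveform[i])
    return waveform[idx], idx * 1000 / fs

# Simulated waveform: an N75 trough followed by a P1 peak near 100 ms.
times_ms = [i * 1000 / FS for i in range(150)]
wave = [-3 * math.exp(-((ms - 75) / 10) ** 2)
        + 8 * math.exp(-((ms - 100) / 12) ** 2) for ms in times_ms]

amp, lat = p1_peak(wave)  # peak amplitude (a.u.) and latency (ms)
```

In practice the window would be chosen from normative latency data, and amplitude is often measured N75-to-P1 rather than from baseline; this sketch shows only the baseline-to-peak variant.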

3.

Context:

Epicondylalgia is a common condition involving pain-generating structures such as tendon, neural, and chondral tissue. The current noninvasive reference standard for identifying chondral lesions is magnetic resonance imaging. Musculoskeletal ultrasound (MUS) may be an inexpensive and effective alternative.

Objective:

To determine the intrarater reliability and validity of MUS for identifying humeroradial joint (HRJ) chondral lesions.

Design:

Cross-sectional study.

Setting:

Clinical anatomy research laboratory.

Patients or Other Participants:

Twenty-eight embalmed cadavers (14 women, 14 men; mean age = 79.5 ± 8.5 years).

Main Outcome Measure(s):

An athletic trainer performed MUS evaluation of each anterior and distal-posterior capitellum and radial head to identify chondral lesions. The reference standard was identification of chondral lesions by gross macroscopic examination. Intrarater reliability for reproducing an image was calculated using the intraclass correlation coefficient (3,k) for measurements of the articular surface using 2 images. Intrarater reliability to evaluate a single image was calculated using the Cohen κ for agreement as to the presence of chondral lesions. Validity was calculated using the agreement of MUS images and gross macroscopic examination.
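The Cohen κ used above corrects raw agreement for the agreement expected by chance. A minimal sketch of that computation, with invented presence/absence ratings rather than the study's data:

```python
# Hypothetical sketch of Cohen's kappa for intrarater agreement on the
# presence (1) or absence (0) of a chondral lesion across two readings
# of the same images. The ratings below are invented for illustration.
from collections import Counter

def cohens_kappa(r1, r2):
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n  # raw agreement
    c1, c2 = Counter(r1), Counter(r2)
    # chance agreement from each rating's marginal frequencies
    expected = sum(c1[k] * c2[k] for k in set(r1) | set(r2)) / n ** 2
    return (observed - expected) / (1 - expected)

rating1 = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]  # first reading
rating2 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]  # second reading
kappa = cohens_kappa(rating1, rating2)
```

With these invented ratings, 9 of 10 images agree but chance agreement is 0.54, so κ lands well below the raw 0.90 agreement.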

Results:

Intrarater reliability was 0.88 (95% confidence interval = 0.77, 0.94) for reproducing an image and 0.93 (95% confidence interval = 0.80, 1.06) for evaluating a single image. Identifying chondral lesions on all HRJ surfaces with MUS demonstrated sensitivity = 0.93, specificity = 0.28, positive predictive value = 0.58, negative predictive value = 0.77, positive likelihood ratio = 1.28, and negative likelihood ratio = 0.27.
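The likelihood ratios reported above follow directly from sensitivity and specificity by the standard formulas; a quick check using the reported values (small discrepancies from the published 1.28 and 0.27 would reflect rounding in the source):

```python
# Likelihood ratios derived from the reported sensitivity and specificity.
sens, spec = 0.93, 0.28

lr_pos = sens / (1 - spec)   # odds multiplier for a positive finding
lr_neg = (1 - sens) / spec   # odds multiplier for a negative finding
```

The high sensitivity with low LR− is what supports the conclusion that a negative MUS examination helps rule out a lesion, while the low specificity makes a positive finding far less informative.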

Conclusions:

Musculoskeletal ultrasound is a reliable and sensitive tool for a clinician with relatively little experience and training to rule out HRJ chondral lesions. These results may assist with clinical assessment and decision making in patients with lateral epicondylalgia.

Key Words: elbow joint, articular cartilage, reliability, assessment

Key Points

  • When used by a clinician with limited experience and training, musculoskeletal ultrasound imaging was reliable and sensitive for ruling out humeroradial joint chondral lesions.
  • More study is needed, but musculoskeletal ultrasound imaging may be helpful in assessing and managing patients with lateral epicondylalgia.
The elbow is one of the joints most commonly affected by articular cartilage degeneration and loosening of chondral fragments.1 Chondral lesions and articular cartilage degeneration primarily occur in the lateral compartment of the elbow within the humeroradial joint (HRJ).2–4 Although HRJ chondral lesions can occur secondary to acute trauma, they may be present in the absence of trauma.5

Lateral elbow conditions occur in a variety of sports,6 and up to 50% of overhead athletes experience elbow injuries.7 In overhead athletes, valgus extension overload is observed during the late cocking phase of throwing.8 This results in significant compression forces of up to 500 N on the HRJ, which can lead to chondral lesions.8,9 In the general population, HRJ chondral lesions have been demonstrated in 51% to 81% of patients with chronic lateral elbow pain.5,10 Articular cartilage lesions are easily misdiagnosed as tendinopathy of the wrist extensors because they have similar clinical presentations.5,11 Symptoms include insidious onset of lateral elbow pain, pain with resisted wrist extension, and failure to respond to conservative treatment.5 To our knowledge, no validated clinical examination exists for the differential diagnosis of HRJ chondral lesions from lateral epicondylopathy.

The current noninvasive reference standard for the diagnosis of chondral lesions is magnetic resonance imaging.12 Musculoskeletal ultrasound (MUS) is a safe and inexpensive alternative imaging technique that is effective in showing articular cartilage abnormalities and loose bodies,1,13 although most research to date has focused on the hip, knee, and hand.12,14,15 At the elbow joint, several authors16,17 have demonstrated validity for detecting loose bodies with MUS.
However, we know of no authors who have evaluated the validity of MUS for identifying HRJ chondral lesions.

Traditionally, radiologists perform and interpret MUS examinations.18 More recently, the use of MUS has expanded to sports medicine clinicians, physiotherapists, and athletic trainers.18–21 As the use of MUS extends into clinics and athletic training facilities, the reliability of clinicians' examinations with MUS must be established because the technique is user dependent.22 Furthermore, it is critical to establish whether the performance of such clinicians is equal to that of expert technicians. However, before interrater reliability can be evaluated, MUS must be validated for identifying HRJ chondral lesions to determine its usefulness.

Although our purpose was not to establish the reliability of clinicians' use of MUS, we know that for a tool to be valid, it must be reliable.23 Therefore, the purposes of our study were to (1) investigate the intrarater reliability and validity of an athletic trainer's use of MUS to identify chondral lesions in the HRJ and (2) determine the prevalence of HRJ chondral lesions in elderly specimens. We hypothesized that an athletic trainer would be reliable at reproducing and evaluating MUS images and accurate in identifying chondral lesions at the HRJ using MUS.

4.

Context

Providing students with feedback is an important component of athletic training clinical education; however, little information is known about the feedback that Approved Clinical Instructors (ACIs; now known as preceptors) currently provide to athletic training students (ATSs).

Objective

To characterize the feedback provided by ACIs to ATSs during clinical education experiences.

Design

Qualitative study.

Setting

One National Collegiate Athletic Association Division I athletic training facility and 1 outpatient rehabilitation clinic that were clinical sites for 1 entry-level master's degree program accredited by the Commission on Accreditation of Athletic Training Education.

Patients or Other Participants

A total of 4 ACIs with various experience levels and 4 second-year ATSs.

Data Collection and Analysis

Extensive field observations were audio recorded, transcribed, and integrated with field notes for analysis. The constant comparative approach of open, axial, and selective coding was used to inductively analyze data and develop codes and categories. Member checking, triangulation, and peer debriefing were used to promote trustworthiness of the study.

Results

The ACIs gave 88 feedback statements in 45 hours and 10 minutes of observation. Characteristics of feedback categories included purpose, timing, specificity, content, form, and privacy.

Conclusions

Feedback that ACIs provided included several components that made each feedback exchange unique. The ACIs in our study provided feedback that is supported by the literature, suggesting that ACIs are using current recommendations for providing feedback. Feedback needs to be investigated across multiple athletic training education programs to gain more understanding of certain areas of feedback, including frequency, privacy, and form.

Key Words: assessment, evaluation, pedagogy, preceptors

Key Points

  • Feedback had several different components that made each feedback exchange unique.
  • The feedback that the Approved Clinical Instructors (ACIs) provided was mostly aligned with recommendations in the literature, suggesting that our ACIs provided effective feedback to athletic training students and that current recommendations are applicable to athletic training clinical education.
  • Researchers should continue to assess the feedback occurring in different athletic training education programs to better understand its current use and to guide ACI training and evaluation, including the development of recommendations for the appropriate frequency of feedback.
Feedback is any information provided to a student that helps correct, reinforce, or suggest change in his or her performance.1,2 It is a type of evaluation that is less formal and judgmental than structured, summative evaluation and assessment2 and is an effective educational technique.3,4 Providing feedback to students also has been described as one of the most important characteristics of clinical instructors in athletic training,5,6 medicine,7,8 nursing,9 and physical therapy.10 In addition, feedback has been shown to improve clinical performance in medical11,12 and nursing students.13,14

Most research on feedback has been focused on the recommended characteristics of feedback, such as its specificity, timing, tone, and relation to educational and career goals.3,4,15 Much of the existing research is based on student and instructor perceptions of whether these recommendations are followed rather than on actual observed feedback.12,16 Feedback research in athletic training is much less extensive than in other areas of clinical education. Most research on feedback in athletic training education has been focused on general effective clinical instructor behaviors.5,17,18 These investigators have identified feedback as an important behavior of Approved Clinical Instructors (ACIs),5 and along with evaluation, it is considered a standard for selecting, training, and evaluating ACIs.17,18 Several authors1,19,20 have provided suggestions for giving effective feedback to athletic training students (ATSs) in clinical education. The supervision, questioning, feedback (SQF) model of clinical teaching provides guidelines for giving feedback to ATSs at different developmental levels.1 Stemmans21 compared the quantity of feedback provided by clinical instructors with different amounts of experience and found that novice clinical instructors provided less feedback to ATSs than more experienced clinical instructors did.
Berry et al22 reported that students in outpatient rehabilitation clinics spent more time engaged in active learning than did students in intercollegiate and high school settings. Because learning experiences differ among clinical settings, the feedback exchange also may differ among settings.

Providing feedback is considered to be one of the most important roles of ACIs during clinical education experiences.5,6 However, feedback has been minimally explored in practitioner-based articles and research studies specific to athletic training. Little is known about the feedback ACIs provide to ATSs. Similarly, to our knowledge, no one has examined how feedback is used in different clinical education settings, such as rehabilitation clinics and collegiate athletic training facilities. Therefore, the purpose of our study was to characterize the feedback provided by ACIs to ATSs during clinical education sessions in 1 outpatient rehabilitation clinic and 1 collegiate athletic training facility.

5.

Objective:

To introduce the characteristics of a Chance fracture and increase awareness of the mechanism of injury that may occur during athletic activity.

Background:

A T12 Chance fracture was diagnosed in an 18-year-old male rodeo athlete. The rider was forced into extreme lumbothoracic hyperflexion when the horse bucked within the chute, pinning the rider's legs to his chest.

Differential Diagnosis:

Burst fracture, abdominal organ rupture, spinal dislocation, spinal cord injury, disk herniation, pars interarticularis fracture, spinal nerve injury, paralysis.

Treatment:

The patient underwent an open reduction and fixation of the thoracic fracture. Posterior stabilization was obtained with nonsegmental instrumentation. Allograft and autografts were used for posterolateral arthrodesis at T11–T12 and T12–L1.

Uniqueness:

Motor vehicle crashes with occupants wearing lap-type–only restraints account for nearly all previously reported Chance fractures. When only lap seatbelts are worn, the pelvis is stabilized, and the torso continues moving forward with impact. In this athlete, the stabilized and moving body segments were reversed. Nearly 3 years after the initial surgery, fixation, and infection, the bareback rider has returned to full participation in rodeo.

Conclusions:

To our knowledge, this is the first reported diagnosis of a T12 Chance fracture in a rodeo athlete. When animals buck, athletes can be forced into hyperflexion, exposing them to Chance fractures. Therefore, anyone treating rodeo athletes must suspect possible spinal fracture when this mechanism is present and must treat all athletes with early conservative management and hospital referral.

Key Words: sports, flexion-distraction injuries, spinal fractures, emergency medicine

Rodeo is considered an extreme sport, and subsequently, extreme risk is implied.1 Horseback riding, including rodeo and recreational riding, has been identified as one of the most common activities resulting in injuries requiring visits to emergency departments.2 The rough-stock events in rodeo include bareback riding, saddle bronc riding, and bull riding.1 Other events, including steer wrestling, tie-down roping, team roping, and ladies' barrel racing, are considered timed events.1 In rodeo events, participants most often are injured while participating in rough-stock events.3 In an epidemiologic analysis of Canadian Professional Rodeo athletes, Butterwick et al4 found bull riders (31.2% of all injuries) were injured most frequently, followed by bareback riders and saddle bronc riders (16.0% and 14%, respectively, of all injuries).

Intercollegiate rodeo is extremely competitive; the skill required and risk of injury are comparable with those seen in the professional ranks.5 During a 7-month, 10-event rodeo season, nearly one-quarter of all collegiate rough-stock competitors sustained injuries.5 At the professional and collegiate ranks, competitors enter multiple rodeos each week and travel among towns to attend competitions.
Depending on the event, the injury rate for rodeo participants ranges from 2.3 to 19.7 per 100 animal-exposures.5,6 Rough-stock competitors are 3 to 4 times more likely to be injured than other rodeo competitors.6,7

Young athletes have an increased frequency of thoracolumbar spine injuries while participating in high-risk sporting activities, such as elite skiing, climbing, motorcycle racing, skydiving, and other extreme sports.8 More than half (53%) of all trauma to the thoracic and lumbar spine seen in adolescents has been attributed to recreational or competitive athletic activities.9 Chance10 first described unusual spinal injury patterns with a hyperflexion mechanism, usually resulting in a splitting of the posterior lumbar spine and neural arch without spinal cord damage, in patients admitted to the hospital after automobile crashes. Denis11 later postulated that the use of lap seatbelts in automobiles could explain these injuries because the restraints serve as an axis of rotation, subjecting the anterior spinal elements to compression during flexion and the posterior elements to distraction or tension.

A Chance fracture is a unique spinal hyperflexion-distraction injury occurring around a fulcrum, which most often is described in the literature as a seatbelt crossing the lap.12 Ruptures are seen in the posterior ligaments, and the injury may include fractures to the pedicles or spinous process of the vertebrae.10 Chance fractures are relatively uncommon among the general population with the exception of individuals injured in automobile accidents and high-velocity athletic activities.13,14

Denis11 proposed classifying the spine into 3 columns. The anterior column comprises the anterior longitudinal ligament, anterior annulus, and anterior vertebral body.11 The middle column comprises the posterior longitudinal ligament, posterior annulus, and posterior portion of the vertebral body.
The posterior column comprises the spinal structures located dorsally to the posterior longitudinal ligament.11 Chance fractures result in a failure of the bony or soft tissue structures of the posterior and middle columns as they are subjected to tension, whereas the anterior column remains largely intact and becomes a fulcrum for injury.11

We present this case of a collegiate bareback rider who sustained a Chance fracture of the thoracic vertebra (T12). The mechanism of injury was the same as for other reported Chance fractures; however, the thoracic spine was stabilized in this patient, whereas most researchers have indicated the legs are typically the fixated segment during injury.10–12,14,15 The athlete successfully underwent surgery; although he had medical complications, he eventually returned to bareback riding competition.

6.

Context:

Upper quarter injuries have a higher incidence in female swimmers; however, to date, there are few ways to assess the basic functional ability of this region. The upper quarter Y balance test (YBT-UQ) may assist in this process because it was developed to provide a fundamental assessment of dynamic upper quarter ability at the limit of stability.

Objective:

To examine how sex affects performance on the YBT-UQ in swimmers.

Design:

Cohort study.

Patients or Other Participants:

Forty-three male and 54 female National Collegiate Athletic Association Division I college swimmers were recruited preseason.

Main Outcome Measure(s):

We measured YBT-UQ performance for the left and right limbs in the medial, inferolateral, and superolateral directions. The maximum score for each direction was normalized to upper extremity length. The average of the greatest normalized reach scores in each reach direction was used to develop a composite score (average distance in 3 directions/limb length [LL] × 100). To examine reach symmetry between sexes, the difference in centimeters between the left and right sides was calculated for each reach direction prior to normalization. Statistical analysis was conducted using an independent-samples t test (P < .05).
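The normalization described above (average of the best reach in each of the 3 directions, divided by limb length, times 100) can be sketched as follows; the reach distances and limb length below are hypothetical, not data from the study.

```python
# Hypothetical sketch of YBT-UQ composite-score normalization: the best
# reach per direction (cm) is averaged and expressed as a percentage of
# upper limb length (%LL). All numbers are invented for illustration.
def composite_score(reaches_cm, limb_length_cm):
    """reaches_cm maps direction -> list of trial distances in cm."""
    best = [max(trials) for trials in reaches_cm.values()]
    return sum(best) / len(best) / limb_length_cm * 100

trials = {
    "medial":        [88.0, 91.5, 90.0],
    "inferolateral": [77.0, 79.5, 78.0],
    "superolateral": [62.0, 64.0, 63.5],
}
score = composite_score(trials, limb_length_cm=92.0)  # in %LL
```

Normalizing to limb length is what allows comparison between athletes (and between sexes) despite different arm lengths; the symmetry analysis instead uses raw left-right differences in centimeters, before normalization.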

Results:

Average scores in the medial (women: 92.5 ± 7.4%LL, men: 100.0 ± 8.7%LL; P < .01) and inferolateral (women: 85.6 ± 10.3%LL, men: 89.8 ± 10.8%LL; P = .05) directions and composite score (women: 83.4 ± 8.3%LL, men: 88.3 ± 8.9%LL; P < .01) were higher in men than in women. No differences were observed for reach symmetry in any direction.

Conclusions:

Performance on several YBT-UQ indices was worse for female than male collegiate swimmers. These results may have implications for the use of preseason and return-to-sport testing in swimmers as a measurement of upper quarter function and symmetry.

Key Words: Y-Balance test, core stability, shoulder function, injury risk

Key Points

  • Female collegiate swimmers exhibited worse performance than their male counterparts on the upper quarter Y-balance test in the medial and inferolateral directions as well as in the average overall score.
  • No sex differences existed for reach symmetry for any of the reach directions.
  • The worse performance in women may be associated with shoulder and core stability limitations, which may explain the increased incidence of upper quarter injuries in female swimmers.
Upper quarter injuries are the most common injuries sustained by collegiate swimmers.1,2 More specifically, female swimmers have an increased risk of upper quarter injuries of the shoulder and back/neck compared with their male counterparts.1,2 Sallis et al3 found that female swimmers sustained 21.05 and 8.19 injuries/100 participant years, whereas male swimmers experienced 6.55 and 1.45 injuries/100 participant years for the shoulder and back/neck, respectively. These injuries are disabling, often contribute to decreased performance and missed practices and competitions, and may require surgery. A number of extrinsic and intrinsic variables have been shown to contribute to the increased risk of upper quarter injuries, regardless of sex.2,4–11 Extrinsic factors include rigorous training exposure focusing on shoulder-intensive movements, reduced cross-training or participation in other sports, prior injury, and age. Intrinsic risk factors include laxity of the capsuloligamentous structures, decreased core and scapular muscle endurance, and scapulothoracic and glenohumeral muscular imbalance; screening tools to assess these components and eventually minimize the role these factors play in overall injury risk for swimmers may be helpful. Currently, insufficient data exist for upper quarter functional testing in swimmers. Additional upper quarter functional testing may further explain the possible sex-related characteristics associated with injury disparity.
Effective and functional upper quarter testing could be used to help develop offseason and dry-land training programs focused on performance enhancement and injury prevention.12,13

Swimming requires a significant amount of upper body and core strength, endurance,11,14 and shoulder mobility and stability.5 Although a number of tests have been designed to assess upper quarter function, few tests assess upper quarter stability at the limit of closed chain stability,15–18 which has been associated with performance enhancement and injury prevention in swimmers.12,13 The upper quarter Y balance test (YBT-UQ) can be conducted in the field setting with minimal equipment and examines the unilateral performance of the upper quarter at the end range of the athlete's ability to maintain stability.15–18 The YBT-UQ challenges the core and upper quarter strength, stability, and mobility that are required for the performance demands of swimming.13,16,18 The YBT-UQ is a reliable functional test,16,18 demonstrating a fair to moderate association with several tests that measure core stability (push-up and lateral side bend endurance; range, 0.38–0.45) and upper extremity function (closed kinetic chain upper extremity stability test; range, 0.43–0.49).18 Current research supports the notion that YBT-UQ performance is not affected by competition level,19 sex, or limb dominance in active adults18 and healthy college students.16 However, performance on the YBT-UQ has yet to be assessed in swimmers, who have different upper quarter demands than previously tested populations.

Previous authors3 have shown that female swimmers are at increased risk for injury to the upper quarter compared with their male counterparts. Reasons for this discrepancy are unknown. Basic tests of upper extremity function in athletes, particularly those in sports with significant upper quarter demands, may allow us to identify movement limitations that can be addressed to improve the athlete's endurance.
Such tests may also be beneficial in assessing progress in dry-land training programs aimed at improving swimming performance.13 Given the current gaps in the literature, it is beneficial to examine YBT-UQ performance in male and female swimmers to determine whether sex differences exist. Based on previous research, we expected no sex difference on the YBT-UQ.16,18

7.

Context:

Lower extremity movement patterns have been implicated as a risk factor for various knee disorders. Ankle-dorsiflexion (DF) range of motion (ROM) has previously been associated with a faulty movement pattern among healthy female participants.

Objective:

To determine the association between ankle DF ROM and the quality of lower extremity movement during the lateral step-down test among healthy male participants.

Design:

Cross-sectional study.

Setting:

Training facility of the Israel Defense Forces.

Patients or Other Participants:

Fifty-five healthy male Israeli military recruits (age = 19.7 ± 1.1 years, height = 175.4 ± 6.4 cm, mass = 72.0 ± 7.6 kg).

Intervention(s):

Dorsiflexion ROM was measured in weight-bearing and non–weight-bearing conditions using a fluid-filled inclinometer and a universal goniometer, respectively. Lower extremity movement pattern was assessed visually using the lateral step-down test and classified categorically as good or moderate. All measurements were performed bilaterally.

Main Outcome Measure(s):

Weight-bearing and non–weight-bearing DF ROM were more limited among participants with moderate quality of movement than in those with good quality of movement on the dominant side (P = .01 and P = .02 for weight-bearing and non–weight-bearing DF, respectively). Non–weight-bearing DF demonstrated a trend toward a decreased range among participants with moderate compared with participants with good quality of movement on the nondominant side (P = .03 [adjusted P = .025]). Weight-bearing DF was not different between participants with good and moderate movement patterns on the nondominant side (P = .10). Weight-bearing and non–weight-bearing ankle DF ROM correlated significantly with the quality of movement on both sides (P < .01 and P < .05 on the dominant and nondominant side, respectively).

Conclusions:

Ankle DF ROM was associated with quality of movement among healthy male participants. The association seemed weaker in males than in females.Key Words: anterior cruciate ligament, hip, knee, lateral step-down test, patellofemoral pain syndrome

Key Points

  • Healthy males with a moderate quality of movement on the lateral step-down test exhibited less ankle-dorsiflexion range of motion than those with a good quality of movement.
  • When a lower quality of movement is present in males, clinicians should consider interventions to increase ankle dorsiflexion.
An altered lower extremity movement pattern, consisting of excessive femoral adduction and internal rotation leading to excessive knee valgus alignment, has been implicated as a risk factor for patellofemoral pain syndrome (PFPS) and noncontact anterior cruciate ligament injuries.13 Various factors have been suggested to contribute to an altered movement pattern, including decreased strength of the ipsilateral hip musculature,4,5 increased subtalar joint pronation,6,7 and altered motor control.8 Assessment of movement pattern and the factors associated with it is therefore commonly performed in the evaluation of patients with PFPS, as well as in screening for the risk of knee injury.9–11 Another possible contributor to an altered movement pattern is the available ipsilateral ankle-dorsiflexion (DF) range of motion (ROM). Decreased ankle DF ROM can limit the forward progression of the tibia over the talus during activities that require simultaneous knee flexion and ankle DF (eg, squatting, stair descent). A possible compensation for the limited motion of the tibia could be subtalar pronation, which may shift the tibia and the knee medially into greater valgus alignment.6,12–14 Some evidence already exists for the association between ankle DF and the lower extremity movement pattern.
Decreased DF has been previously associated with increased knee valgus during a drop-land maneuver,14 a squat,15 and a step-down maneuver16 among healthy participants. One limitation of the current literature regarding this topic is the inclusion of only female participants in many of the studies evaluating lower extremity movement patterns and the associated factors.4,6,14,16–18 This is likely because of sex differences in kinematics, kinetics, and muscle-activation patterns during various functional activities.8,19,20 Women have been shown to perform activities such as cutting, jumping, and landing with greater knee valgus alignment and greater knee extension than men.19,20 These differences are hypothesized to account for the greater incidence of noncontact anterior cruciate ligament tears and PFPS among women.1,2,21,22 Accordingly, authors14,16 of the 2 studies that have previously linked decreased ankle DF with a faulty movement pattern included only female participants as well. A third study of a mixed population demonstrated only a statistical trend for the association between ankle DF and a faulty movement pattern.15 It is therefore unclear whether the association between ankle DF and lower extremity movement pattern is similar for both sexes. Paradoxically, another limitation of the current literature is the use of sophisticated 3-D motion-analysis systems in many of the studies evaluating lower extremity movement patterns.2,4,14,17,18 Although this type of analysis certainly contributes to a high level of precision and reliability, clinicians and coaches typically do not have the access, time, or skill to operate such systems. Instead, visual observation is often relied on to assess movement patterns in the clinic or on the field. It is unknown, however, to what extent any movement deviations identified during 3-D motion analyses correlate with movement deviations identified visually.
Consequently, the findings from 3-D motion-analysis studies may be difficult to apply in the clinical setting or on the field. We therefore decided to assess whether ankle DF ROM is related to the quality of lower extremity movement as assessed visually among healthy male participants. The lateral step-down (LSD) test is frequently used to assess movement patterns of the lower extremity.9,11,23–25 Piva et al25 suggested a visually based rating system for classifying the quality of movement during the LSD test. The reliability of this rating system has been established previously.16,25 Our hypothesis was that male participants with a lower quality of movement on the LSD would exhibit less ankle DF ROM.

8.
9.
10.

Context:

When an athlete is injured, the primary focus of the sports medicine team is to treat the physical effects of the injury. However, many injured athletes experience negative psychological responses, including anxiety, regarding their injury.

Objective:

To compare the anxiety and social support of athletes with concussions and a matched group of athletes with orthopaedic injuries.

Design:

Cross-sectional study.

Setting:

Athletic training room.

Patients or Other Participants:

A total of 525 injuries among athletes from 2 Big Ten universities were observed. Of these, 63 concussion injuries were matched with 63 orthopaedic injuries for the athlete's sex, sport, and time loss due to injury.

Main Outcome Measure(s):

Clinical measures included the State-Trait Anxiety Inventory (which measures both state and trait anxiety) and the modified 6-item Social Support Questionnaire.

Results:

The group with concussions relied on their family for social support 89% of the time, followed by friends (78%), teammates (65%), athletic trainers (48%), coaches (47%), and physicians (35%). The group with orthopaedic injuries relied on their family for social support 87% of the time, followed by friends (84%), teammates (65%), athletic trainers (57%), coaches (51%), and physicians (36%). We found no differences for the State-Trait Anxiety Inventory (t = −1.38, P = .193) between the concussed and orthopaedic-injury groups. Social Support Questionnaire scores were significant predictors for postinjury state anxiety. Specifically, increased scores were associated with decreased postinjury state anxiety (β = −4.21, P = .0001).

Conclusions:

Both the concussed athletes and those with orthopaedic injuries experienced similar state and trait anxiety and relied on similar sources of social support postinjury. However, athletes with orthopaedic injuries reported greater satisfaction with support from all sources compared with concussed athletes. In contrast, concussed athletes showed more significant predictor models of social support on state anxiety at return to play.Key Words: psychology, state anxiety, trait anxiety, return to play

Key Points

  • Athletes with concussions or orthopaedic injuries showed similar levels of state and trait anxiety.
  • Sources of social support were similar for athletes with concussions and orthopaedic injuries.
  • Compared with concussed athletes, athletes with orthopaedic injuries reported more satisfaction with social support from all sources.
  • Compared with athletes who sustained orthopaedic injuries, concussed athletes showed more significant predictor models of social support on state anxiety at return to play. These differences may reflect the nature of injury, suggesting that additional research is needed to understand the relationship of social-support satisfaction and postinjury anxiety by injury type.
With approximately 444 000 National Collegiate Athletic Association athletes competing annually, athletic injuries are likely to occur. According to the association's injury-surveillance system, about 12 500 athletic injuries are sustained each year.1 When an athlete is injured, the primary focus of the sports medicine team is to treat the physical effects of the injury. However, many injured athletes experience negative psychological responses, including anxiety.2,3 Anxiety in athletes with orthopaedic injuries may result from the cognitive appraisal of the injury rather than from the injury itself.4,5 In contrast, anxiety in athletes with concussions may result from both cognitive appraisal and physiologic sequelae.6 Social support has been shown to mediate both the physical and psychological effects of athletic injury.7 Yet the relationship between anxiety and social support in concussed athletes compared with athletes with orthopaedic injuries is unknown. Injured athletes may exhibit trait anxiety related to perceived loss of athleticism, lack of social support, pain, and fear of reinjury.3,8,9 Injured athletes with high levels of trait anxiety may also experience high levels of state anxiety postinjury.10 Factors such as injury severity and time loss from practice or competition can influence whether athletes experience high or low levels of trait anxiety.11 Although anxiety after orthopaedic injuries has received attention in the literature, considerably less research has been conducted on anxiety in concussed athletes. Concussion is often referred to as the “invisible injury.”6 A concussed athlete who is experiencing lingering headaches and memory difficulties does not outwardly look any different from uninjured peers.
Furthermore, it may be difficult to distinguish anxiety as a symptom of concussion from anxiety as a psychological effect.6,12 In a recent study,13 concussed athletes did not experience as much emotional disturbance as athletes with anterior cruciate ligament injuries. Although these authors did not exclusively examine anxiety symptoms of concussed athletes versus those with orthopaedic injuries, emotional disturbances do seem to differ between the groups, which provides a framework for future studies. When working with injured athletes who are experiencing anxiety, it is important to also consider coping mechanisms that may facilitate recovery. The integrated model of psychological response to sport injury5 suggests that coping resources may be salient factors in an athlete's postinjury psychological state. Seriously injured athletes will seek social support as a coping mechanism.11,14,15 Common types of social support that are useful for injured athletes include emotional support (eg, empathy), tangible support (eg, practical assistance), and informational support (eg, problem solving).11,15,16 The social-support network for injured athletes often consists of family and friends, health professionals, coaches, teammates, and other injured athletes.11,15,17,18 Flint18 described how modeling of successful recovery by fellow or formerly injured athletes may be a helpful form of support and confidence for currently injured athletes.
This modeling may provide currently injured athletes with information that aids their ability to successfully manage the recovery process. Literature within the sport-injury domain has shown that athletes may turn to coaches and health professionals (eg, athletic trainers [ATs]) for emotional support.7,16,17,19 Injured athletes may view ATs as an important source of emotional support given the amount of time injured athletes spend rehabilitating their injuries in the athletic training room.7,17,19 However, this may be different for athletes with concussions. Concussions often lack any outward physical signs of a sport injury (eg, braces, crutches). Additionally, athletes with concussions may spend less time in the athletic training room than athletes with orthopaedic injuries because concussion symptoms are usually managed by the athlete, with guidance from the ATs. Thus, athletes who spend more time in the athletic training room may perceive a stronger sense of social support. To our knowledge, no authors have examined differences in anxiety and social support between athletes who have sustained a concussion versus an orthopaedic injury. Understanding how anxiety and social support compare between athletes with concussions and those with orthopaedic injuries is important when the clinician attempts to provide a holistic approach to rehabilitation. Persons within an athlete's social-support network who are more aware of their roles as social-support providers can help the athlete manage postinjury anxiety. Therefore, our objective was to compare the anxiety and social support of athletes with concussions versus a matched group of athletes with orthopaedic injuries.

11.

Context:

Increasing attention is being paid to the deleterious effects of sport-related concussion on cognitive and brain health.

Objective:

To evaluate the influence of concussion incurred during early life on the cognitive control and neuroelectric function of young adults.

Design:

Cross-sectional study.

Setting:

Research laboratory.

Patients or Other Participants:

Forty young adults were separated into groups according to concussive history (0 or 1+). Participants incurred all injuries during sport and recreation before the age of 18 years and were an average of 7.1 ± 4.0 years from injury at the time of the study.

Intervention(s):

All participants completed a 3-stimulus oddball task, a numeric switch task, and a modified flanker task during which event-related potentials and behavioral measures were collected.

Main Outcome Measure(s):

Reaction time, response accuracy, and electroencephalographic activity.

Results:

Compared with control participants, the concussion group exhibited decreased P3 amplitude during target detection within the oddball task and during the heterogeneous condition of the switch task. The concussion group also displayed increased N2 amplitude during the heterogeneous version of the switch task. Concussion history was associated with response accuracy during the flanker task.

Conclusions:

People with a history of concussion may demonstrate persistent decrements in neurocognitive function, as evidenced by decreased response accuracy, deficits in the allocation of attentional resources, and increased stimulus-response conflict during tasks requiring variable amounts of cognitive control. Neuroelectric measures of cognitive control may be uniquely sensitive to the persistent and selective decrements of concussion.Key Words: concussions, inhibition, mental flexibility, attention, P3, N2

Key Points

  • Seven years after concussion, participants displayed disrupted higher-order neurocognition in the form of chronically impaired attention, working memory, inhibition, and interference control.
  • The observed deficits in attention and conflict monitoring were only evident when cognitive demands were increased. Subtle deficits may remain unrecognized with other types of testing.
During the past decade, increased research efforts have been dedicated toward understanding the causes of brain and cognitive dysfunction stemming from concussive injuries. Concussions have been described as a “silent epidemic” by the Centers for Disease Control and Prevention,1 and estimates of the incidence in the United States range from a conservative 300 000 per year2 in 1997 to the more recent estimate of nearly 4 million cases per year.3 As 15%–20% of these injuries result from sport participation,4 sport-related concussion represents a growing public health concern. Concussion can be defined as a complex pathophysiologic process affecting the brain that is caused by a direct blow or an impulsive force transmitted to the head.5 Injured persons commonly display deficits in cognition and postural control, as well as concussion-related symptoms,6 so the cost to society is heavy, with known negative effects on academic7–9 and vocational10 performance. The annual economic burden of concussions approaches $17 billion in direct and indirect expenses in the United States,11 warranting more investigation in the clinical and laboratory settings. Based on clinical evaluations, injured people typically return to a preinjury level of functioning within 7 to 10 days,12 a time frame mirroring the acute neurometabolic cascade associated with concussion.13 Investigations of young adult athletes who have progressed past the acute stages of injury indicate normal performance on a variety of clinical tests,14–17 leading to a general belief that concussion is a transient brain injury. However, the transient view of concussion has recently come into question, as a growing body of evidence illustrates numerous chronic nervous system dysfunctions and cognitive deficits stemming from these injuries.18–28 Further, recent epidemiologic reports16,29 reveal increased prevalence of mild cognitive impairment, dementia, and Alzheimer disease in retired contact-sport athletes.
Evidence from these studies appears to diverge from the concept of concussion as a transient injury. Given this divergence and the lack of a definitive diagnostic tool, identifying aspects of cognition that are sensitive to subtle concussion-related deficits during the postacute phase is warranted. Because concussive injuries are inherently difficult to assess30 and result in a wide variety of injury outcomes, assessing multiple aspects of cognitive functioning (eg, planning, memory, cognitive flexibility) may provide further insight into the nature and duration of these injuries.31 The aspects of cognitive functioning described by Aubry et al31 fall under the domain of cognitive control, which designates a subset of goal-directed, self-regulatory operations involved in the selection, scheduling, and coordination of computational processes underlying perception, memory, and action.32,33 Recent evaluations of the cognitive control of concussed persons demonstrate that tasks requiring this feature may be sensitive to detecting persistent cognitive deficits. For example, Pontifex et al26 observed deficits in cognitive control, as indicated by decreased response accuracy during a modified flanker task, which requires variable amounts of inhibitory control (ie, an aspect of cognitive control), in previously concussed persons an average of 2.9 years after injury. Also using a flanker task, de Beaumont et al20 observed decreased response accuracy in previously concussed persons an average of 34.7 years after injury relative to age-matched control participants. In addition, Ellemberg et al22 noted deficits in a group of previously concussed athletes 6 months after injury during the Stroop color-word test, which further measures inhibitory control, and the Tower of London DX task, which requires planning, working memory, and cognitive flexibility.
Together, these studies provide convergent evidence that tasks requiring various aspects of cognitive control may be well suited for examining the relationship between concussion history and prolonged cognitive dysfunction. However, further examination of task specificity appears necessary if future researchers are to adequately detail this relationship. In addition to focusing on cognitive control, investigators have recently begun to incorporate sensitive measures of brain function in sport-related concussion research. Electroencephalography and event-related potentials (ERPs) in particular have emerged as valuable tools to evaluate covert neurocognitive deficits between stimulus engagement and response execution that stem from concussion. Therefore, electroencephalography may contribute to the development and refinement of differential diagnostic information for those with atypical clinical recovery after concussive injuries. The benefit of the ERP approach lies in its temporal sensitivity, which allows researchers to parse individual components in the stimulus-response relationship. Recent investigations19,23,34–36 have demonstrated the efficacy of neuroelectric measures in detecting neurocognitive deficits associated with a concussion history. Beyond providing a unique method for researchers and clinicians to monitor enduring neurocognitive alterations, ERPs may serve as a measure of treatment effectiveness. In particular, the P3 component has been of considerable interest in recent concussion research.
The P3 component can be further divided into interrelated but distinct subcomponents, the P3b (P300) and P3a, which are differentiated by both the context in which they occur and scalp topography.37 The P3b component, which is evoked in response to an infrequently occurring target stimulus, is believed to reflect the allocation of attentional resources (as indexed by component amplitude37) and stimulus classification and evaluation speed (as indexed by component latency38,39) and demonstrates a centroparietal maximum.37 The P3a component, which is evoked in response to a distracter or novel stimulus, is believed to reflect the orienting of focal attention to such novel or distracting environmental stimuli and exhibits a frontocentral maximum.37,40,41 Therefore, the P3 components can serve as valuable measures for researchers and clinicians to evaluate multiple aspects of cognition and brain function. Authors evaluating the persistent effects of concussion on neuroelectric indexes of cognition have observed decreased P3 amplitude19,20,23 and increased P3 latency,23 suggesting that concussive injuries may negatively affect attentional resource allocation and the speed of cognitive processing during environmental interactions. Further, this effect appears to endure: deficits have been observed in participants from approximately 3 years19,23 to more than 34 years after injury.20 In addition to the P3 component, recent researchers19,26,42 have also observed enduring concussion-related deficits in the neuroelectric correlates of conflict monitoring and adaptation during cognitive control performance. These results suggest that in addition to attentional resource allocation, concussive injuries may negatively affect indexes of conflict monitoring and adaptation. One neuroelectric index of conflict is the N2 component, which immediately precedes the P3 component.
The frontocentral N2, observed during cognitive control tasks, has been linked to the conscious detection of deviance,19,43 the mismatch of a stimulus with a mental template, and increased cognitive control over response inhibition.44 Accordingly, N2 amplitudes are more negative during conditions of greater conflict,44,45 arising from competition between the execution and inhibition of a single response.46 Thus, the N2 component can serve as a valuable index of stimulus-response conflict during cognitively demanding environmental transactions. Accordingly, the goal of our study was to evaluate the chronic influence of concussion on cognitive and brain function using cognitive control tasks, which allowed us to measure neuroelectric function. We used tasks requiring cognitive flexibility, inhibitory control, and working memory. We hypothesized that, relative to uninjured control participants, those with a concussion history would demonstrate deficits in task performance (ie, reaction time and response accuracy) for conditions requiring the upregulation of cognitive control. Further, we predicted that participants with a concussion history would demonstrate a smaller P3 amplitude, reflecting deficits in the allocation of attentional resources during cognitive control operations relative to participants without a concussion history, and a longer P3 latency for those with a concussion history, indicating prolonged delays in the speed of cognitive processing. Last, we predicted that participants with a concussion history would demonstrate increased stimulus-response conflict, as evidenced by greater N2 amplitude relative to controls, during task conditions requiring the upregulation of cognitive control.

12.

Context:

Analyzing ligament stiffness between males and females at 3 maturational stages across the lifespan may provide insight into whether changes in ligament behavior with aging may contribute to joint laxity.

Objective:

To compare the stiffness of the medial structures of the tibiofemoral joint and the medial collateral ligament to determine if there are differences at 3 distinct ages and between the sexes.

Design:

Cross-sectional study.

Setting:

Laboratory.

Patients or Other Participants:

A total of 108 healthy and physically active volunteers with no previous knee surgery, no acute knee injury, and no use of exogenous hormones in the past 6 months participated. They were divided into 6 groups based on sex and age (8–10, 18–40, 50–75 years).

Main Outcome Measure(s):

Ligament stiffness of the tibiofemoral joint was measured with an arthrometer in 0° and 20° of tibiofemoral-joint flexion. The slope values of the force-strain line that represents stiffness of the medial tibiofemoral joint at 0° and the medial collateral ligament at 20° of flexion were obtained.

Results:

When height and mass were controlled, we found a main effect (P < .001) for age group: the 8- to 10-year-olds were less stiff than both the 18- to 40- and the 50- to 75-year-old groups. No effects of sex or tibiofemoral-joint position on stiffness measures were noted when height and mass were included as covariates.

Conclusions:

Prepubescent medial tibiofemoral-joint stiffness was less than postpubescent knee stiffness. Medial tibiofemoral-joint stiffness was related to height and mass after puberty in men and women.Key Words: medial collateral ligament, arthrometry, hormones, sex differences

Key Points

  • Medial tibiofemoral-joint stiffness was less in prepubescents than in postpubescents.
  • After puberty, medial tibiofemoral-joint stiffness was influenced by height and mass in both men and women.
The mechanical properties of connective tissue with respect to sex have been studied mainly in an effort to explain the greater risk of knee ligament injury in female athletes than in male athletes. Most authors13 have focused on the laxity of the anterior cruciate ligament (ACL) in postpubertal men and women. Theories have been generated and extensive research has been conducted to explain the two- to eightfold increase in ACL injuries in female athletes over male athletes.14 Although a single cause has not been identified, risk factors have been generalized into 4 categories5: environmental (external factors such as surface and footwear),5 anatomic and postural,14 hormonal,6–13 and biomechanical14–16 (such as kinematics16,17 and neuromuscular factors15,18,19). The injury rate to the collateral ligaments of the knee is also greater in females than males but not to the same extent as for ACL injury.1,2 However, the medial collateral ligament (MCL), a major stabilizing structure in the tibiofemoral joint, remains a prevalent source of injury in the general population,20 particularly as a result of sport participation.13 As males and females mature from prepuberty, through puberty and adulthood, and then reach the postfertile years, the material properties and structure of the joints change, as do hormonal levels.21–23 In this study, we examine the mechanical and material properties of the MCL and other supporting structures of the tibiofemoral joint in vivo in prepubertal and postpubertal males and females and older adults, including postmenopausal females. These properties in these groups of participants have not been previously described in the literature.
Examining the material properties of a ligament, such as stiffness, provides a way to detect joint structural differences between sexes and across age groups, thus elucidating structural differences in the ligament material properties secondary to the exposure to sex hormones. Aronson et al24,25 measured stiffness of the medial tibiofemoral joint in full extension because of the important role the medial joint structures play in minimizing valgus positioning (abduction of the joint), which has been suggested to contribute to ACL injury risk.3,5,15–17 Additionally, Aronson et al24,25 examined the extracapsular MCL in 20° of flexion to reduce the confounding contributions of possible changes in intracapsular structures, such as meniscal injury and articular degeneration, to stiffness measurements. The purpose of our investigation was to assess the stiffness of the medial tibiofemoral joint in full extension and the MCL in 20° of flexion in males and females in 3 distinct age groups (prepubertal children, postpubertal young adults, and older adults).

13.

Context:

One of the greatest catalysts for turnover among female athletic trainers (ATs) is motherhood, especially if employed at the National Collegiate Athletic Association Division I level. The medical education literature regularly identifies the importance of role models in professional character formation. However, few researchers have examined the responsibility of mentorship and professional role models as it relates to female ATs'' perceptions of motherhood and retention.

Objective:

To evaluate perceptions of motherhood and retention in relation to mentorship and role models among female ATs currently employed in the collegiate setting.

Design:

Qualitative study.

Setting:

Female athletic trainers working in National Collegiate Athletic Association Division I.

Patients or Other Participants:

Twenty-seven female ATs employed in the National Collegiate Athletic Association Division I setting volunteered. Average age of the participants was 35 ± 9 years. All were full-time ATs with an average of 11 ± 8 years of clinical experience.

Data Collection and Analysis:

Participants responded to questions by journaling their thoughts and experiences. Multiple-analyst triangulation and peer review were included as steps to establish data credibility.

Results:

Male and female role models and mentors can positively or negatively influence the career and work–life balance perceptions of female ATs working in the Division I setting. Female ATs have a desire to see more women in the profession handle the demands of motherhood and the demands of their clinical setting. Women who have had female mentors are more positive about the prospect of balancing the rigors of motherhood and job demands.

Conclusions:

Role models and mentors are valuable resources for promoting perseverance in the profession in the highly demanding clinical settings. As more female ATs remain in the profession who are able to maintain work–life balance and are available to serve as role models, the attitudes of other women may start to change.Key Words: role models, retention, quality of life

Key Points

  • Role models and mentors are being identified by female athletic trainers working in the Division I setting.
  • Perceptions of work–life balance can be positively affected by how role models and mentors maintain balance within their own lives. Conversely, those individuals who cannot maintain balance can negatively affect their proteges'' perceptions of work–life balance.
  • Female athletic trainers working in the Division I setting desire more female role models who are effectively balancing the multiple responsibilities of their personal and professional lives.
The positive and negative influences of role models and mentors have been well established in the medical literature, specifically in academic medicine.13 In a published systematic review of the literature,1 mentorship in academic medicine was reported to enhance personal and career development, as well as research productivity, including publications and grant awards. Mentoring was described in the late 1970s by Levinson,4 who characterized the mentoring relationship as one of the most significant influences an individual can have in early adulthood. Mentoring has been emphasized as a critical element for personal and career advancement and career selection.1,2 However, mentoring is not always a positive experience. Repeated negative learning experiences may adversely affect the development of professionalism in medical students and residents.5 A lack of mentoring may hinder career success in academic medicine, especially for women.1,3 Furthermore, female physicians are less likely than their male colleagues to identify role models for work–life balance (WLB).6 Role models and mentors differ in that mentors are senior members of a group who intentionally encourage and support younger colleagues in their careers.5 Mentoring often includes role modeling. A role model teaches predominantly by example and helps to form one's professional identity and commitment by promoting observation and comparison.5 Role modeling is less intentional, more informal, and more episodic than mentoring. Individuals serving as supervisors are the gatekeepers to establishing an environment that enhances a family-friendly atmosphere and ensures that their employees realize WLB. Work–life balance is attained when an individual's right to a contented life inside and outside paid work is accepted and valued as the norm. Mazerolle et al7 found that head athletic trainers (ATs) informally try to encourage WLB through role modeling.
Therefore, supervisors and bosses may incidentally act as role models. Of great concern in the field of athletic training is the subject of retention among female ATs, which has recently been heavily researched.811 The departure of female ATs from the profession has been theoretically linked to the desire to strike a balance between family obligations and personal time with work responsibilities.12,13 The National Collegiate Athletic Association Division I clinical setting holds unique professional challenges for ATs. Long road trips, nights away from home, pressure to win, supervision of athletic training students, infrequent days off, high athlete-to-AT ratios, athletes on scholarship, and extended competitive seasons are some of the stresses faced by an AT working in the Division I setting.14 Concerns regarding WLB and time for parenting influence decisions to persist at the collegiate level.12,13 Several investigators11,13 in athletic training have suggested that motherhood is a primary factor leading to the departure of females from the profession. Role models and mentoring have emerged as possible factors to aid in the retention of females in the collegiate setting once they become mothers. Limited research on mentoring exists in the context of athletic training. Two studies15,16 have examined the effects of professional socialization among high school and collegiate ATs. Similar to mentoring, professional socialization is a process by which individuals learn the knowledge, skills, values, roles, and attitudes associated with their professional responsibilities.17 The mentoring roles of ATs evolve over their careers. Initially, ATs make network connections in order to learn, but as they become more experienced, they take on more of a mentoring role.
This occurs as a result of being contacted by less experienced colleagues for advice on how to deal with problems in their clinical settings.16 Additionally, a recent study18 examined female athletic training students' perceptions of motherhood in the athletic training profession; the students felt strongly that a female mentor who had children would greatly benefit them personally as well as professionally. Though the students named mentorship as an important retention factor, they had very limited direct mentorship from a female AT with children employed in the collegiate setting. This finding mirrors research in the medical literature6 highlighting a lack of role models or mentors identified by females in various professions. Although Pitney16 highlighted the presence of mentors in the athletic training profession, little information exists regarding the part mentors and role models play for female ATs specifically and their influence on WLB perceptions. The purpose of our study, therefore, was to examine the effect of role models and mentors on perceptions of career and motherhood among female ATs working in the Division I setting. This study is the first to assess perceptions of mentors and role models among female ATs throughout the life experience spectrum (single, married, married with children). The following central research question guided this study: how do role models and mentors within athletic training influence female ATs employed in the Division I setting?

14.

Context:

Bullying has received a vast amount of attention in recent years. One form, workplace bullying (WPB), has been explored as a substantial concern in many health professions because it can negatively influence a health care provider's role in an organization. To date, however, WPB has not been investigated in athletic training contexts.

Objective:

To examine the perceptions of certified athletic trainers who experienced or witnessed WPB during employment in the collegiate setting.

Design:

Qualitative study.

Setting:

College or university.

Patients or Other Participants:

Fifteen athletic trainers (7 women, 8 men) with an average age of 42 ± 12 years.

Data Collection and Analysis:

Data were collected via semistructured, in-depth phone interviews or asynchronous online interviews. Data were analyzed using an inductive content analysis. Trustworthiness was established with member checks and peer debriefing.

Results:

Four themes emerged from the analysis: (1) antecedents of WPB, (2) consequences of WPB, (3) coping with WPB, and (4) lack of workplace environment training. The antecedents of WPB involved the bully's personality and perceptions of the athletic training profession as well as environmental factors including the pressure to win and a lack of administrative support. The consequences of WPB included increased stress, feelings of inadequacy, and increased distrust. Individuals coped with WPB by relying on emotional resilience and avoidance. A final theme, lack of workplace environment training, revealed that little attention was given to interpersonal issues and WPB in the workplace.

Conclusions:

Workplace bullying incidents occur when administrators tolerate bullying behaviors from controlling and manipulative individuals who lack respect for the athletic training professional. Bullying interactions produce several negative outcomes, including stress and anxiety; targets cope with WPB by learning to be more emotionally resilient and by avoiding confrontation. Workplace training is needed to prepare athletic trainers for such negative experiences.Key Words: interpersonal conflict, workplace harassment, negative acts

Key Points

  • The perceived antecedents of workplace bullying included not only the bully's personality characteristics, such as being controlling, insecure, arrogant, and self-centered, but also a negative perception of the athletic training profession.
  • Pressure to win from coaches and a lack of administrative support were identified as factors that led to bullying behavior.
  • The consequences of being a bullying target included increased stress and anxiety, feelings of inadequacy, and increased distrust of others.
  • Athletic trainers coped with bullying by being emotionally resilient and avoiding the bully.
As health care providers in a variety of settings, certified athletic trainers (ATs) must work closely with an array of individuals to provide quality health care. These interpersonal interactions can influence an AT's ability to perform his or her role, especially if conflict arises. Although occasional interpersonal conflicts are not uncommon in a work environment, repeated subtle episodes affecting performance or outwardly hostile acts can create a negative workplace environment due to the presence of workplace bullying (WPB). Workplace bullying is explained by Maguire and Ryan1 as
… a behavior that goes beyond simple rudeness and incivility. While WPB may include overt aggression or threat of violence, like other forms of aggression experienced … it frequently involves subtle or covert acts, rather than direct violence.1(p120)
Another feature of WPB is that an individual is repeatedly the target of negative actions from 1 or several individuals in an organization2 and that a power disparity exists between the bully and the victim.35 Some examples of WPB include intimidating behaviors, ridicule in connection with an employee's work, withholding information that affects an employee's job, gossiping, being condescending or patronizing, allocating unrealistic workloads, taking credit for others' work without acknowledging their contributions, and blocking career pathways, just to name a few.6 Harassment is commonly defined as “any unwelcome conduct based on a protected class under the federal civil rights laws that is severe, pervasive, or persistent and creates a hostile environment.”7(p6) Both bullying and harassment involve actions that attempt to degrade, intimidate, or victimize an individual, but they are not one and the same—bullying is a relationship issue, whereas harassment is a human rights issue.7(p6) Unfortunately, as Namie8 stated, “Bullying is nearly invisible. It is nonphysical, and nearly always sublethal workplace violence.”(p2) Also contributing to the effect of WPB is that no current laws in the United States protect an individual from WPB, although several states have proposed legislation.
In 2010, versions of the Healthy Workplace Bill were introduced but did not pass in New York and in Illinois.9 This was followed by versions of the bill being introduced in California, Connecticut, Hawaii, Kansas, Massachusetts, Maryland, Minnesota, Missouri, Montana, Nevada, New Hampshire, New Jersey, Oklahoma, Oregon, Utah, Vermont, Washington, West Virginia, and Wisconsin.10 Although none of these states enacted the bill, the sheer volume of states examining the issue demonstrates that the effect of WPB is beginning to be noticed. This increased attention may reflect previous research11,12 on WPB illustrating decreased productivity, increased absenteeism, and greater attrition. Victims also reported both mental health problems, including posttraumatic stress disorder, anxiety, and depression,4,5,1315 and physical problems.8,16,17 Previous research in nursing,1126 occupational therapy,2730 physiotherapy,31,32 and medicine3336 illustrates how WPB is an emergent critical concern for a variety of health care providers. Organizational factors including burnout,37 professional socialization,38,39 work–family conflict,4043 and sexual harassment4446 have been widely investigated in athletic training, but to date, WPB is absent from the research. Unlike these other organizational factors, WPB research is still in its infancy. As a result, a greater understanding of prevalence, factors contributing to a WPB-conducive environment, and the effect of WPB on ATs is needed. Therefore, the purpose of our study was to explore ATs' perceptions of WPB in the collegiate setting.

15.
16.

Context:

Of the individuals able to return to sport participation after an anterior cruciate ligament (ACL) injury, up to 25% will experience a second ACL injury. This population may be more sensitive to hormonal fluctuations, which may explain this high rate of second injury.

Objective:

To examine changes in 3-dimensional hip and knee kinematics and kinetics during a jump landing and to examine knee laxity across the menstrual cycle in women with histories of unilateral noncontact ACL injury.

Design:

Controlled laboratory study.

Setting:

Laboratory.

Patients or Other Participants:

A total of 20 women (age = 19.6 ± 1.3 years, height = 168.6 ± 5.3 cm, mass = 66.2 ± 9.1 kg) with unilateral, noncontact ACL injuries.

Intervention(s):

Participants completed a jump-landing task and knee-laxity assessment 3 to 5 days after the onset of menses and within 3 days of a positive ovulation test.

Main Outcome Measure(s):

Kinematics in the uninjured limb at initial contact with the ground during a jump landing, peak kinematics and kinetics during the loading phase of landing, anterior knee laxity via the KT-1000, peak vertical ground reaction force, and blood hormone concentrations (estradiol-β-17, progesterone, free testosterone).

Results:

At ovulation, estradiol-β-17 (t = −2.9, P = .009), progesterone (t = −3.4, P = .003), and anterior knee laxity (t = −2.3, P = .03) increased, and participants presented with greater knee-valgus moment (Z = −2.6, P = .01) and femoral internal rotation (t = −2.1, P = .047). However, during the menses test session, participants landed harder (greater peak vertical ground reaction force; t = 2.2, P = .04), with the tibia internally rotated at initial contact (t = 2.8, P = .01) and greater hip internal-rotation moment (Z = −2.4, P = .02). No other changes were observed across the menstrual cycle.

Conclusions:

Knee and hip mechanics in both phases of the menstrual cycle represented a greater potential risk of ACL loading. Observed changes in landing mechanics may explain why the risk of second ACL injury is elevated in this population.Key Words: hormones, estrogen, vertical ground reaction force, knee-valgus moment

Key Points

  • Clinicians should be aware of the high rate of second injury and biomechanical consequences of many factors related to return to sport participation after anterior cruciate ligament (ACL) injury, including sensitivity to hormonal fluctuations and asymmetrical limb loading.
  • The biomechanical profiles of women with ACL injury changed during the preovulatory phase of the menstrual cycle, possibly increasing the risk of second ACL injury.
  • Women with ACL reconstructions should have their landing mechanics evaluated before returning to sport participation.
  • Anterior knee laxity and jump-landing biomechanics changed across the menstrual cycle in women with unilateral ACL injuries.
  • Both menstrual cycle phases had biomechanical variables associated with ACL loading.
The risk of sustaining a noncontact anterior cruciate ligament (ACL) injury is not equal across the menstrual cycle.14 The menstrual cycle consists of the follicular, ovulatory, and luteal phases, which have markedly different hormonal profiles. The follicular phase is associated with the lowest concentrations of estrogen, progesterone, and testosterone. Ovulation, which follows the follicular phase, occurs between days 9 and 20 and is associated with a spike in luteinizing hormone and then a spike in estrogen.5 This is the largest concentration of estrogen during the menstrual cycle. The final phase of the menstrual cycle is the luteal phase, which is associated with prolonged elevated estrogen levels. Progesterone also increases substantially during this phase. Researchers6,7 have reached consensus that the preovulatory phase (from the follicular phase to ovulation) of the menstrual cycle presents the highest risk for noncontact ACL injuries. The risk of injury is thought to result from hormonal fluctuations influencing tissue properties that, in turn, affect neuromuscular characteristics during dynamic tasks, such as landing from a jump.8,9 Differences across menstrual cycle phases have been identified in variables believed to be associated with joint stability, including laxity,5,10,11 muscle stiffness,9 strength,1214 proprioception,15 and muscle-activation patterns.16 However, this area is not without controversy, with other researchers1720 observing no change across the menstrual cycle in similar variables. Reproductive hormones seem to influence ACL laxity in females with normal menstrual cycles and physiologic levels of estrogen and progesterone.5,11,2125 Numerous authors have concluded that anterior laxity differs between sexes, with males having less laxity than females.2531 Again, this area is not without controversy, as several authors have concluded that anterior knee laxity does not change across the menstrual cycle.18,3136 However, negative correlations have
been observed between ACL stiffness and estrogen concentration in active females, indicating that an increase in estrogen is associated with lower levels of ligament stiffness.21 Additionally, evidence8,37,38 has suggested that ACL laxity may influence muscular response during dynamic activity. Park et al8 collected biomechanical data on 26 participants and initially observed no change in kinematic and kinetic variables across the 3 phases of the menstrual cycle. However, when they reorganized participants into low-, medium-, and high-laxity time points based on their relative levels of knee laxity, the authors found that the high-laxity group had a 30% increase in adduction impulse, 20% increase in adduction moment, and 45% increase in external rotation compared with the medium- and low-laxity groups.8 This information demonstrates that knee laxity can influence joint loading and potentially contribute to noncontact ACL injury. Besides knee laxity, other biomechanical factors are associated with ACL loading and ACL injury during jumping and landing. Landing with decreased sagittal-plane motion or moment (knee and hip extension) and increased frontal- and rotational-plane motion of the hip (adduction and internal rotation) and knee (valgus and internal rotation) contributes to ACL loading.39 Researchers8,40,41 have examined changes in jump-landing mechanics across the menstrual cycle in healthy female populations without histories of ACL injury.
These authors observed no change in jump-landing hip and knee mechanics across the menstrual cycle, leading them to conclude that injury rates were most likely due to other factors, including strength or ligament properties.41 One limitation of these studies is that some women may be more responsive to hormonal fluctuations than others (ie, responders versus nonresponders).5,11 We theorize that females with histories of ACL injury may be responsive to hormonal fluctuations, and this increased sensitivity may have a greater effect on tissue and ultimately landing mechanics. Additionally, up to 25% of individuals who sustain primary ACL ruptures will have second ACL injuries, with many second injuries occurring in the contralateral limb.4244 In a recent paper on second ACL injuries, Paterno et al42 examined athletes who were returning to high-level sports and observed that 75% of second ACL injuries occurred in the contralateral limb and 88% of individuals sustaining these injuries were females. The rate of second injury is particularly high for individuals returning to sport participation even after successfully completing rehabilitation programs.44 This warrants further investigation because underlying risk factors, such as hormones, could play a role in the rates of second injury in the contralateral limb. Therefore, the purpose of our study was to examine anterior knee laxity and 3-dimensional hip and knee kinematics and kinetics across the menstrual cycle in a population of women with previous unilateral, noncontact ACL injuries. We hypothesized that biomechanical variables assessed during a jump landing would be altered at ovulation in ways that would increase ACL loading and laxity compared with menses. We based this theory on research in which investigators5,38 have identified increased ligamentous laxity at ovulation. 
Additionally, we hypothesized that hip and knee kinematics and kinetics during a jump landing would change in ways associated with increased ACL loading.5,38
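The within-participant contrasts reported in the abstract above (menses versus ovulation) rest on paired tests such as the paired t test, where a negative t indicates that the variable increased at ovulation when differences are taken as menses minus ovulation. A minimal sketch, using entirely hypothetical anterior-laxity values rather than study data:

```python
import math

def paired_t(menses, ovulation):
    """Paired t statistic for a within-participant change across
    cycle phases; differences are computed as menses - ovulation.
    Returns (t, degrees of freedom)."""
    diffs = [m - o for m, o in zip(menses, ovulation)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                              # standard error of mean diff
    return mean / se, n - 1

# Hypothetical anterior knee laxity (mm) for 5 participants; not study data.
menses    = [5.1, 6.0, 4.8, 5.5, 6.2]
ovulation = [5.9, 6.4, 5.6, 5.8, 7.0]
t, df = paired_t(menses, ovulation)  # negative t: laxity rose at ovulation
```

In practice one would use a ready-made routine such as `scipy.stats.ttest_rel` (and a Wilcoxon signed-rank test for non-normal variables, as the abstract's Z statistics suggest); the hand-rolled version above only illustrates the arithmetic.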

17.

Context:

Tennis requires repetitive overhead movements that can lead to upper extremity injury. The scapula and the shoulder play a vital role in injury-free playing. Scapular dysfunction and glenohumeral changes in strength and range of motion (ROM) have been associated with shoulder injury in the overhead athlete.

Objective:

To compare scapular position and strength and shoulder ROM and strength between Swedish elite tennis players of 3 age categories (<14, 14–16, and >16 years).

Design:

Cross-sectional study.

Setting:

Tennis training sports facilities.

Patients or Other Participants:

Fifty-nine adolescent Swedish elite tennis players (ages 10–20 years) selected based on their national ranking.

Main Outcome Measure(s):

We used a clinical screening protocol with a digital inclinometer and a handheld dynamometer to measure scapular upward rotation at several angles of arm elevation, isometric scapular muscle strength, glenohumeral ROM, and isometric rotator cuff strength.

Results:

Players older than 16 years showed less scapular upward rotation on the dominant side at 90° and 180° of arm elevation (P < .05). Although all absolute scapular muscle strength values increased with age, there was no change in the body-weight–normalized strength of the middle (P = .9) and lower (P = .81) trapezius or serratus anterior (P = .17). Glenohumeral internal-rotation ROM and total ROM tended to decrease, but these findings were not statistically significant (P = .052 and P = .06, respectively). Whereas normalized internal-rotator strength increased from the 14-to-16-years group to the older-than-16-years group (P = .009), normalized external-rotator and supraspinatus strength remained unchanged.

Conclusions:

Age-related changes in shoulder and scapular strength and ROM were apparent in elite adolescent tennis players. Future authors should examine the association of these adaptations with performance data and injury incidence.Key Words: upper extremity, scapular position, scapular muscle strength, range of motion, rotator cuff strength

Key Points

  • Elite adolescent tennis players showed some sport-specific adaptations in glenohumeral internal-rotation range of motion, rotator cuff strength, and scapular upward rotation.
  • Sport-specific adaptations seemed to change within the 10- to 20-year-old age range.
The tennis serve uses rapid upper extremity movements to create high racket and ball speeds. Optimal upper extremity strength, flexibility, and neuromuscular coordination are necessary for attaining a high-velocity outcome.1,2 Due to the high loads and forces put on the shoulder complex during serving and hitting, tennis players are at increased risk for shoulder pain. Injury risk seems to increase with age3,4 and, despite limited evidence, has been suggested to be related to the level and volume of play.35 Shoulder injuries in overhead athletes commonly result from repetitive use6 and muscle fatigue7 and may be related to scapular dyskinesis,8,9 rotator cuff injury and weakness,10 or glenohumeral internal-rotation deficit,11,12 resulting in internal impingement or labral injury (or both).13,14 In high-performance sports, athletes start full-time practice in early childhood, which overlaps with the period of skeletal and muscular development.15,16 As a result of the high demands on joint mobility, muscle strength, and complex biomechanics in the shoulder girdle during overhead sport movements, sport-specific adaptations at the glenohumeral and scapulothoracic level may occur even during adolescence.4,8,17 Numerous authors have reported glenohumeral18,19 and scapulothoracic20,21 alterations in adult overhead sport populations. In addition, changes in glenohumeral range of motion3 and rotator cuff strength17 have been described in elite junior tennis players. Only recently have some studies4,8 been published describing the scapular position, strength, and flexibility variables in this young population. However, in these investigations, only general data were established for the whole period of adolescence. The specific age-related changes within adolescents (11–18 years) and the progression over time in this age category were not apparent.
Moreover, even though the literature highlights the importance of the coupled movements at the shoulder and scapulothoracic joint for optimal kinematics during the tennis serve,1 to date no authors have combined glenohumeral and scapulothoracic measurements in adolescent tennis players. Therefore, the purpose of our study was to describe the age-related, sport-specific adaptations in the shoulder girdle in adolescent elite tennis players: in particular, glenohumeral rotational range of motion and strength and scapular upward rotation and muscle strength.
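The abstract above distinguishes absolute from body-weight–normalized strength: absolute dynamometer force rose with age while normalized values did not. The normalization itself is a simple division by body weight, sketched below with hypothetical forces and masses (not study data) chosen so that absolute strength grows while relative strength stays flat:

```python
def normalize_strength(force_newtons, body_mass_kg):
    """Body-weight-normalize an isometric strength value: handheld-
    dynamometer force divided by body weight, so growth-related gains
    in absolute force do not masquerade as relative gains."""
    body_weight = body_mass_kg * 9.81  # weight in newtons
    return force_newtons / body_weight

# Hypothetical lower-trapezius forces; not study data.
younger = normalize_strength(90.0, 45.0)   # e.g., a <14-year-old player
older = normalize_strength(140.0, 70.0)    # e.g., a >16-year-old player
# Absolute force rose (90 N -> 140 N), yet normalized strength is identical.
```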

18.

Context:

Plyometric exercise has been recommended to prevent lower limb injury, but its feasibility in and effects on those with functional ankle instability (FAI) are unclear.

Objective:

To investigate the effect of integrated plyometric and balance training in participants with FAI during a single-legged drop landing and single-legged standing position.

Design:

Randomized controlled clinical trial.

Setting:

University motion-analysis laboratory.

Patients or Other Participants:

Thirty athletes with FAI were divided into 3 groups: plyometric group (8 men, 2 women, age = 23.20 ± 2.82 years; 10 unstable ankles), plyometric-balance (integrated)–training group (8 men, 2 women, age = 23.80 ± 4.13 years; 10 unstable ankles), and control group (7 men, 3 women, age = 23.50 ± 3.00 years; 10 unstable ankles).

Intervention(s):

A 6-week plyometric-training program versus a 6-week integrated-training program.

Main Outcome Measure(s):

Postural sway during single-legged standing with eyes open and closed was measured before and after training. Kinematic data were recorded during medial and lateral single-legged drop landings after a 5-second single-legged stance.

Results:

Reduced postural sway in the medial-lateral direction and reduced sway area occurred in the plyometric- and integrated-training groups. Generally, the plyometric training and integrated training increased the maximum angles at the hip and knee in the sagittal plane, reduced the maximum angles at the hip and ankle in the frontal and transverse planes in the lateral drop landing, and reduced the time to stabilization for knee flexion in the medial drop landing.

Conclusions:

After 6 weeks of plyometric training or integrated training, individuals with FAI used a softer landing strategy during drop landings and decreased their postural sway during the single-legged stance. Plyometric training improved static and dynamic postural control and should be incorporated into rehabilitation programs for those with FAI.Key Words: plyometric training, balance training, landings, ankle injuries

Key Points

  • After 6 weeks of isolated plyometric or combined plyometric and balance training, people with functional ankle instability demonstrated increased lower extremity maximal sagittal-plane angles and decreased maximal frontal-plane and transverse-plane angles on ground contact.
  • Static and dynamic postural control improved with plyometric training, which should be included in rehabilitation programs for patients with functional ankle instability.
Ankle sprains often occur during physical activities such as basketball and soccer that require sudden stops, jumping, landing, and rotation around a planted foot. Although a patient with an ankle sprain may recover without experiencing persistent pain and swelling, most patients go on to develop chronic dysfunction, such as recurrent ankle sprain or instability.1 Athletes report a 73% recurrence rate of lateral ankle sprain,2 and the impairments associated with ankle sprain persist in 40% of patients 6 months after injury.3 These findings demonstrate that prolonged ankle dysfunction or disability is commonly attributable to ankle sprain. Functional ankle instability (FAI) is identified in those with symptoms such as frequent episodes of ankle giving way and feelings of ankle instability4 after ankle sprains and often presents with sensorimotor deficits in muscle reaction time, joint position sense, postural sway, and time to stabilization (TTS) of ground reaction force.5,6 Several outcome measures, including center-of-pressure (COP) sway, leg reaching with the Star Excursion Balance Test, surface electromyography, and kinematics, are used to evaluate the neuromuscular and biomechanical characteristics of individuals with FAI.
Measurement of COP sway during the single-legged stance is an easy way to evaluate static postural stability.7 People with ankle instability had greater variation in the magnitude of medial-lateral COP than a healthy group.8 In addition, TTS is effective for detecting differences between unstable and healthy groups.9,10 The TTS for ground reaction force is the time required to achieve stability after a dynamic perturbation, and this time is longer in those with FAI.9,11 In addition to TTS for ground reaction force, TTS for kinematics is a novel method to investigate the ability to regain balance in people with FAI; participants with FAI demonstrated longer TTS for ankle inversion after 1-legged hopping.12 The advantage of using TTS for kinematics instead of TTS for ground reaction force is that it provides more specific information about dynamic neuromuscular control of body segments. Rehabilitation programs for ankle sprain include muscle-strengthening, balance-training, neuromuscular-training, and proprioceptive-training protocols. The use of balance training for ankle reeducation has become common in recent years and is effective in reducing episodes of inversion.13 Balance training focuses on improving the ability to maintain a position through conscious and subconscious motor control.14 Certain tools, such as the balance board,15 Dura Disc, minitrampoline,16 biomechanical ankle platform system (BAPS),17 and Star Excursion Balance Test,18 can be used to assist training. In individuals with FAI, a 12-week BAPS exercise program with progressive testing reduced the radius of COP in single-legged standing.17 Another study19 showed that 4 weeks of balance training improved shank-rearfoot coupling stability during walking.
Proprioceptive training attempts to restore proprioceptive sensibility, retrain afferent pathways, and enhance the sensation of joint movement.14 Eils and Rosenbaum20 found that 6 weeks of multi-station proprioceptive exercise in individuals with ankle instability reduced the standard deviation of COP (referring to the 68.2% range of COP dispersion) and maximum sway of COP (referring to the maximum range of COP dispersion) in the medial-lateral direction. However, Coughlan and Caulfield21 reported no change in ankle kinematics during treadmill walking and running after a 4-week neuromuscular training program with the “both sides up” (BOSU) balance trainer. Plyometric training has positive effects on sport performance, including distance running,22 jumping,23 sprinting, and leg-extension force.24 The focus of plyometric training is the stretch-shortening cycle induced in the muscle-tendon complex, where soft tissues repeatedly lengthen and shorten.25 Plyometric exercise is described as “reactive neuromuscular training”26 because it increases the excitability of the neurologic receptors and improves reactivity of the neuromuscular system.
Plyometric training desensitizes the Golgi tendon organs through adaptation to the stretch-shortening exercise, which allows the elastic components of muscles to tolerate greater stretching.27 Previously, plyometric training was theorized to improve neuromuscular control and dynamic stability, reduce the incidence of serious knee injuries,28 and reduce the risk of injury by increasing functional joint stability of the lower limbs.23,28 Furthermore, 6 weeks of plyometric exercise enhanced results on functional performance testing in athletes after lateral ankle sprain.29 Plyometric exercise is thought to enable segments to absorb joint force effectively by promoting the mechanical advantage of soft tissue structures30 through increasing initial and maximal knee and hip flexion during the jump-landing task.30 The increased knee-flexion and hip-flexion angles during landing protect the knee via hamstrings tension.31,32 To date, investigations on the effect of plyometric training have emphasized functional performance28,29 or preventing anterior cruciate ligament injuries.33,34 Data on the feasibility and effectiveness of plyometric training in those with FAI are very limited.29 Therefore, our purposes were to determine the effects on lower extremity biomechanics of a 6-week plyometric-training program or a 6-week integrated program with plyometric and balance training in athletes with FAI. We hypothesized that both training programs would increase maximum joint angles in the sagittal plane and reduce the time needed to regain stability during drop-landing tasks. We further hypothesized that the integrated training would reduce postural sway during single-legged stance and decrease the center of mass (COM)-COP deviation during drop-landing tasks.
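The COP dispersion measures discussed in this introduction, the standard deviation of COP (approximately the 68.2% dispersion range for a normal distribution) and the maximum medial-lateral sway (peak-to-peak range), can be computed directly from a COP time series. A minimal sketch with a made-up trace (not study data):

```python
import statistics

def ml_sway_metrics(cop_ml):
    """Medial-lateral COP dispersion during single-legged stance:
    returns (sample standard deviation, maximum sway range)."""
    sd = statistics.stdev(cop_ml)           # ~68.2% dispersion range
    max_sway = max(cop_ml) - min(cop_ml)    # peak-to-peak excursion
    return sd, max_sway

# Hypothetical medial-lateral COP positions (cm) sampled over a trial.
cop_ml = [0.0, 0.4, -0.3, 0.6, -0.5, 0.2, -0.1, 0.3]
sd, max_sway = ml_sway_metrics(cop_ml)
```

A real force-plate trace would have thousands of samples per trial, but the two summary statistics are the same whatever the sampling rate.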

19.

Context:

Biomechanically, the motions used by baseball and softball pitchers differ greatly; however, the throwing motions of position players in both sports are strikingly similar. Although the adaptations to the dominant limb from overhead throwing have been well documented in baseball athletes, these adaptations have not been clearly identified in softball players. This information is important in order to develop and implement injury-prevention programs specific to decreasing the risk of upper extremity injury in softball athletes.

Objective:

To compare range-of-motion and humeral-retrotorsion characteristics of collegiate baseball and softball position players and of baseball and softball players to sex-matched controls.

Design:

Cross-sectional study.

Setting:

Research laboratories and athletic training rooms at the University of North Carolina at Chapel Hill.

Patients or Other Participants:

Fifty-three collegiate baseball players, 35 collegiate softball players, 25 male controls (nonoverhead athletes), and 19 female controls (nonoverhead athletes).

Intervention(s):

Range of motion and humeral retrotorsion were measured using a digital inclinometer and diagnostic ultrasound.

Main Outcome Measure(s):

Glenohumeral internal-rotation deficit, external-rotation gain, total glenohumeral range of motion, and humeral retrotorsion.

Results:

Baseball players had greater glenohumeral internal-rotation deficit, total range-of-motion difference, and humeral-retrotorsion difference than softball players and male controls. There were no differences in glenohumeral internal-rotation deficit, total range-of-motion difference, or humeral-retrotorsion difference between softball players and female controls.

Conclusions:

Few differences were evident between softball players and female control participants, although their range-of-motion and humeral-retrotorsion adaptations differed significantly from those of baseball players. The throwing motions of softball and baseball are similar, but the athletes adapt to the demands of their sport differently; thus, stretching and strengthening programs designed for baseball may not be the most effective programs for softball athletes.Key Words: upper extremity, shoulder, injury prevention

Key Points

  • Compared with softball players, baseball players had greater glenohumeral internal-rotation deficit, humeral-retrotorsion difference, and total range-of-motion difference.
  • Few differences were observed between softball players and female control participants.
Because of the repetitive nature of overhead sport, shoulder and elbow pain is common among overhead athletes. The overhead-throwing motions performed by baseball and softball players are the primary factors placing the upper extremity, particularly the shoulder and elbow, at risk for overuse injuries.1 In an analysis of National Collegiate Athletic Association Injury Surveillance System data from 1988 to 2004, 45% of all time lost from baseball and 33% of all time lost from softball (for practices and games) because of injury was attributed to the upper extremity.2,3 Overhand throwing by position players was the most common injury mechanism for the shoulder, accounting for 24.3% of shoulder injuries in high school baseball players and 50.2% in high school softball players.4 Although the shoulder of the baseball player has received substantial attention, sports medicine research assessing the shoulder of the softball player is limited, despite the fact that the prevalence of shoulder injuries in softball players parallels that in baseball athletes.2,5–7

To date, research surrounding the overhead athlete's shoulder range of motion (ROM) has focused on baseball players. Substantial evidence shows alterations in ROM of the baseball player's dominant shoulder as compared with the nondominant arm and with nonoverhead athletes, which have been theorized to be shoulder and elbow injury risk factors.8–13 The general pattern of adaptation in the baseball player's shoulder is increased external rotation and decreased internal rotation, horizontal adduction, and total ROM. In addition to ROM adaptations, baseball players show differences in humeral torsion when the throwing arm is compared with the nonthrowing arm.
Side-to-side differences in baseball players range from 0° to 29°, with baseball players demonstrating greater humeral torsion in the throwing arm, whereas control groups show no bilateral differences.9,11,14–16 These bilateral variations suggest that torsion is influenced by the degree of upper extremity activity.14 Increased humeral retrotorsion is of interest because it has been shown to contribute to increased external-rotation and decreased internal-rotation ROM in the throwing arm and has been linked to a history of upper extremity injury in baseball players.9,11,14,17,18

Although these adaptations have been identified in baseball players and hypothesized to be caused by repetitive throwing, research on ROM and humeral-retrotorsion adaptations in softball players is very limited.15,19 It is important to understand ROM alterations and potential injury risk factors in softball players because injury rates in softball and baseball athletes are comparable. Baseball and softball athletes appear to have a similar throwing motion, but on average, the female athlete has less height, mass, overall size, muscle mass, limb length, and absolute muscle strength,20 and softball uses a larger, heavier ball on a smaller field,21,22 which would influence the force production and kinetics of the throwing motion and the stresses at the shoulder joint. Differing stresses at the shoulder during baseball and softball throwing could present clinically as different physical adaptations between athletes in each sport and thus differing injury mechanisms.
Often, baseball and softball athletes are prescribed similar injury-prevention programs on the assumption that physical characteristics and injury risk factors are the same in both sports because of the similar overhead throwing motions.23 Understanding the physical adaptations present in softball players will help clinicians develop injury-prevention programs specific to softball athletes or support the use of programs aimed at the physical adaptations seen in baseball players. Therefore, the purpose of our study was to compare ROM and humeral-retrotorsion characteristics of collegiate baseball position players, softball position players, and sex-matched controls.

20.

Context:

Approved Clinical Instructors (ACIs; now known as preceptors) are expected to provide feedback to athletic training students (ATSs) during clinical education experiences. Researchers in other fields have found that clinical instructors and students often have different perceptions of actual and ideal feedback and that several factors may influence the feedback exchanges between instructors and students. However, understanding of these issues in athletic training education is minimal.

Objective:

To investigate the current characteristics and perceptions of and the influences on feedback exchanges between ATSs and ACIs.

Design:

Qualitative study.

Setting:

One entry-level master's degree program accredited by the Commission on Accreditation of Athletic Training Education.

Patients or Other Participants:

Four ACIs and 4 second-year ATSs.

Data Collection and Analysis:

Individual, semistructured interviews were conducted with participants and integrated with field notes and observations for analysis. We used the constant comparative approach to inductively analyze data and develop codes and categories. Member checking, triangulation, and peer debriefing were used to promote trustworthiness of the study.

Results:

Participants described that feedback plays an important role in clinical education and has several purposes related to improving performance. The ACIs and ATSs also discussed several preferred characteristics of feedback. Participants identified 4 main influences on their feedback exchanges, including the ACI, the ATS, personalities, and the learning environment.

Conclusions:

The ACIs and ATSs had similar perceptions of ideal feedback in addition to the actual feedback that was provided during their clinical education experiences. Most of the preferences for feedback were aligned with recommendations in the literature, suggesting that existing research findings are applicable to athletic training clinical education. Several factors influenced the feedback exchanges between ACIs and ATSs, which clinical education coordinators should consider when selecting clinical sites and training ACIs.Key Words: assessment, evaluation, pedagogy, preceptors

Key Points

  • Both Approved Clinical Instructors (ACIs) and athletic training students (ATSs) recognized that feedback has an important role in clinical education for several reasons.
  • Several characteristics of the learning environment influenced ACI-ATS interactions and student learning and should be considered when selecting and improving clinical sites, pairing ACIs and ATSs, and educating ACIs to give good feedback.
  • Researchers need to continue investigating the roles that patient volume, supervision, ACI workload, ACI experience, personalities, and similar factors have on student learning and feedback.
Approved Clinical Instructors (ACIs; now known as preceptors) are responsible for providing feedback to athletic training students (ATSs) during their clinical education experiences.1,2 Feedback provides information to students about their performances that they can use to improve and refine their clinical skills, reasoning, and professional behaviors.3–6 Several suggestions for providing effective feedback have been given in athletic training3,7,8; however, current research on the actual use of feedback in athletic training clinical education is limited.

Much of the existing research on feedback in medicine5,9 and nursing10 is based on student and instructor perceptions of feedback. Whereas students and instructors agree that feedback is important,9,10 researchers5 have found several disagreements over what students, instructors, and experts believe is good feedback. In addition, instructors and students often have different perceptions of the feedback that actually is given in clinical education settings. Clinical instructors often believe they provide effective feedback more often than their students report receiving it,9,11,12 and these opinions frequently differ from what is observed by a third party.11 These differences may stem from an inability of instructors to self-assess their behaviors or an inability of students to recognize feedback.13,14

In addition to perceptions of feedback, several investigators in medical15–17 and athletic training18,19 education have examined the factors that influence the feedback exchanges between clinical instructors and students.
These investigators found that several factors influence how feedback is given and received in the clinical education setting, including the interpersonal and communication abilities of clinical instructors,15,16 their abilities to adjust feedback based on student needs,18 and their past experiences as teachers and learners.20 In addition, student receptivity to feedback,21 the clinical environment,17 and the degree of supervision19 have been found to influence the feedback exchanges between students and teachers. These aspects of the student-instructor relationship further complicate the delivery and use of feedback in clinical education. Therefore, the purpose of our study was to gain understanding of the complex feedback interactions that occur in athletic training education by investigating the current characteristics and perceptions of and the influences on feedback exchanges between ATSs and ACIs. The findings related to the characteristics of the actual feedback being provided were presented in part I of this study. In part II, we include the findings specific to the perceptions of and influences on feedback.
