Similar Articles
1.

Background:

The high prevalence of pain and depression in persons with spinal cord injury (SCI) is well known. However, the link between pain intensity, interference, and depression, particularly in the acute period of injury, has not received sufficient attention in the literature.

Objective:

To investigate the relationship of depression, pain intensity, and pain interference in individuals undergoing acute inpatient rehabilitation for traumatic SCI.

Methods:

Participants completed a survey that included measures of depression (PHQ-9), pain intensity (“right now”), and pain interference (Brief Pain Inventory: general activity, mood, mobility, relations with others, sleep, and enjoyment of life). Demographic and injury characteristics and information about current use of antidepressants and pre-injury binge drinking also were collected. Hierarchical multiple regression was used to test depression models in 3 steps: (1) age, gender, days since injury, injury level, antidepressant use, and pre-injury binge drinking (controlling variables); (2) pain intensity; and (3) pain interference (each tested separately).
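Hierarchical regression of this kind amounts to fitting nested ordinary least squares models and attributing the increment in R² at each step to the newly entered block. Below is a minimal sketch of that logic in Python with statsmodels; the column names and synthetic data are hypothetical stand-ins, not the study's dataset.

```python
# Minimal sketch of 3-step hierarchical regression via nested OLS fits.
# All variable names and data are hypothetical, not the study's dataset.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 120
df = pd.DataFrame({
    "age": rng.normal(45, 15, n),
    "gender": rng.integers(0, 2, n),
    "days_since_injury": rng.uniform(10, 90, n),
    "injury_level": rng.integers(0, 2, n),          # 1 = tetraplegia
    "antidepressant_use": rng.integers(0, 2, n),
    "binge_drinking": rng.integers(0, 2, n),
    "pain_intensity": rng.uniform(0, 10, n),        # "right now" rating
    "interference_mood": rng.uniform(0, 10, n),     # one BPI domain
})
df["phq9"] = 5 + 0.8 * df["interference_mood"] + rng.normal(0, 3, n)

def r2(predictors):
    """R-squared of an OLS model of PHQ-9 on the given predictors."""
    X = sm.add_constant(df[predictors])
    return sm.OLS(df["phq9"], X).fit().rsquared

controls = ["age", "gender", "days_since_injury", "injury_level",
            "antidepressant_use", "binge_drinking"]
r2_1 = r2(controls)                                            # step 1
r2_2 = r2(controls + ["pain_intensity"])                       # step 2
r2_3 = r2(controls + ["pain_intensity", "interference_mood"])  # step 3

print(f"delta R2, pain intensity:    {r2_2 - r2_1:.3f}")
print(f"delta R2, pain interference: {r2_3 - r2_2:.3f}")
```

In the study's design, each BPI interference domain would be entered in its own step-3 model, mirroring the "each tested separately" wording above.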

Results:

With one exception, pain interference was the only statistically significant independent variable in each of the final models. Although pain intensity accounted for only 0.2% to 1.2% of the depression variance, pain interference accounted for 13% to 26% of the variance in depression.

Conclusion:

Our results suggest that pain intensity alone is insufficient for understanding the relationship of pain and depression in acute SCI. Instead, the ways in which pain interferes with daily life appear to have a much greater bearing on depression than pain intensity alone in the acute setting.

Key words: depression, pain, spinal cord injuries

The high incidence and prevalence of pain following spinal cord injury (SCI) is well established1-6 and associated with numerous poor health outcomes and low quality of life (QOL).1,7,8 Although much of the literature on pain in SCI focuses on pain intensity, there is emerging interest in the role of pain interference, or the extent to which pain interferes with daily activities of life.7,9 With prevalence as high as 77% in SCI, pain interference impacts life activities such as exercise, sleep, work, and household chores.2,7,10-13 Pain interference also has been associated with disease management self-efficacy in SCI.14 There is a significant relationship between pain intensity and interference in persons with SCI.7 Like pain, the high prevalence of depression after SCI is well-established.15-17 Depression and pain often co-occur,18,19 and their overlap ranges from 30% to 60%.19 Pain is also associated with greater duration of depressed mood.20 Pain and depression share common biological pathways and neurotransmitter mechanisms,19 and pain has been shown to attenuate the response to depression treatment.21,22

Despite the interest in pain and depression after SCI and the implications for the treatment of depression, their co-occurrence has received far less attention in the literature.23 Greater pain has been associated with higher levels of depression in persons with SCI,16,24 although this is not a consistent finding.25 Similarly, depression in persons with SCI who also have pain appears to be worse than for persons with non-SCI pain, suggesting that the link between pain and depression may be more intense in the context of SCI.26 In one of the few studies of pain intensity and depression in an acute SCI rehabilitation setting, Cairns et al27 found a co-occurrence of pain and depression in 22% to 35% of patients. This work also suggested an evolution of the relationship between pain and depression over the course of the inpatient stay, such that they become associated by discharge. Craig et al28 found that pain levels at discharge from acute rehabilitation predicted depression at 2-year follow-up. Pain interference also has been associated with emotional functioning and QOL in persons with SCI1,7,29,30 and appears to mediate the relationship between ambulation and depression.31

Studies of pain and depression in persons with SCI are often too limited methodologically to examine the independent contributions of pain intensity and interference to depression in an acute setting. For example, they include only pain intensity16,23,25,28,30; classify subjects by either pain plus depression23 or pain versus no pain8,28,30; use pain intensity and interference as predictor and outcome, respectively1; collapse pain interference domains into a single score1; or use only univariate tests (eg, correlations).7,8,25,30 In addition, the vast majority focus on the chronic period of injury. To fill a gap in knowledge, we examined the independent contributions of pain intensity and pain interference to depression, while accounting for injury and demographic characteristics, antidepressant treatment, and pre-injury binge drinking in a sample of persons with acute SCI.
We hypothesized that when accounting for both pain intensity and interference in the model, interference would have an independent and significant relationship with depression, above and beyond pain intensity.

2.

Background:

The relationship between cardiovascular disease (CVD) risk factors and dietary intake is unknown among individuals with spinal cord injury (SCI).

Objective:

To investigate the relationship between consumption of selected food groups (dairy, whole grains, fruits, vegetables, and meat) and CVD risk factors in individuals with chronic SCI.

Methods:

A cross-sectional substudy of individuals with SCI to assess CVD risk factors and dietary intake in comparison with age-, gender-, and race-matched able-bodied individuals enrolled in the Coronary Artery Risk Development in Young Adults (CARDIA) study. Dietary history, blood pressure, waist circumference (WC), fasting blood glucose, high-sensitivity C-reactive protein (hs-CRP), lipids, glucose, and insulin data were collected from 100 SCI participants who were 38 to 55 years old with SCI >1 year and compared to 100 matched control participants from the CARDIA study.

Results:

Statistically significant differences between SCI and CARDIA participants were identified in WC (39.2 vs 36.2 in.; P < .001) and high-density lipoprotein cholesterol (HDL-C; 39.2 vs 47.5 mg/dL; P < .001). Blood pressure, total cholesterol, triglycerides, glucose, insulin, and hs-CRP were similar between SCI and CARDIA participants. No significant relation between CVD risk factors and selected food groups was seen in the SCI participants.

Conclusion:

SCI participants had adverse WC and HDL-C compared to controls. This study did not identify a relationship between consumption of selected food groups and CVD risk factors.

Key words: cardiovascular disease risk factors, dietary intake, spinal cord injury

Cardiovascular disease (CVD) is a leading cause of death in individuals with chronic spinal cord injuries (SCIs).1-5 This is partly because SCI is associated with several metabolic CVD risk factors, including dyslipidemia,6-10 glucose intolerance,6,11-14 and diabetes.15-17 In addition, persons with SCI exhibit elevated markers of inflammation18,19 and endothelial activation20 that are correlated with higher CVD prevalence.21-23 Obesity, and specifically central obesity, another CVD risk factor,24-26 is also common in this population.12,27-29

Dietary patterns with higher amounts of whole grains and fiber have been shown to improve lipid abnormalities,30 glucose intolerance, diabetes mellitus,31-34 hypertension,35 and markers of inflammation36 in the general population. These dietary patterns are also associated with lower levels of adiposity.31 Ludwig et al reported that the strong inverse associations between dietary fiber and multiple CVD risk factors – excessive weight gain, central adiposity, elevated blood pressure, hypertriglyceridemia, low high-density lipoprotein cholesterol (HDL-C), high low-density lipoprotein cholesterol (LDL-C), and high fibrinogen – were mediated, at least in part, by insulin levels.37 Whole-grain food intake is also inversely associated with fasting insulin, insulin resistance, and the development of type 2 diabetes.32,38,39

Studies in the general population have also shown a positive association between the development of metabolic syndrome as well as heart disease and consumption of a Western diet, a diet characterized by high intake of processed and red meat and low intake of fruit, vegetables, whole grains, and dairy.40,41 Red meat, which is high in saturated fat, has been shown to have an association with adverse levels of cholesterol and blood pressure and the development of obesity, metabolic syndrome, and diabetes.40,42,43

Numerous studies have shown that individuals with chronic SCI have poor diet quality.44-49 A Canadian study found that only 26.7% of their sample was adherent to the recommendations about the consumption of fruit, vegetables, and grains from the “Eating Well with Canada’s Food Guide.”44 Individuals with chronic SCI have also been found to have low fiber and high fat intakes when their diets were compared to dietary recommendations from the National Cholesterol Education Program,46 the 2000 Dietary Guidelines for Americans,49 and the recommended Dietary Reference Intakes and the Acceptable Macronutrient Distribution Range.47,48

However, unlike in the general population, the relationship between dietary intake and obesity and CVD risk factors is unknown in the chronic SCI population. If a dietary pattern consisting of higher intake of whole grains and dietary fiber is favorably associated with obesity and CVD risk factors in individuals with chronic SCI, then trials of increased whole grain and fiber intake could be conducted to document health benefits and inform recommendations. The purpose of this pilot study is to investigate the association between selected food group intake and CVD risk factors in individuals with chronic SCI as compared to age-, gender-, and race-matched able-bodied individuals enrolled in the Coronary Artery Risk Development in Young Adults (CARDIA) study.
Data will also be used to plan future studies in the relatively understudied field of CVD and nutrition in individuals with SCI.

3.

Background:

The predictors and patterns of upright mobility in children with a spinal cord injury (SCI) are poorly understood.

Objective:

The objective of this study was to develop a classification system that measures children’s ability to integrate ambulation into activities of daily living (ADLs) and to examine upright mobility patterns as a function of their score and classification on the International Standards for Neurological Classification of Spinal Cord Injury (ISNCSCI) exam.

Methods:

This is a cross-sectional, multicenter study that used a convenience sample of subjects who were participating in a larger study on the reliability of the ISNCSCI. A total of 183 patients between 5 and 21 years old were included in this study. Patients were asked if they had participated in upright mobility in the last month and, if so, in what environment and with what type of bracing. Patients were then categorized into 4 groups: primary ambulators (PrimA), unplanned ambulators (UnPA), planned ambulators (PlanA), and nonambulators.

Results:

Multivariate analyses found that only lower extremity strength predicted being a PrimA, whereas being an UnPA was predicted by both lower extremity strength and lack of preservation of S4-5 pinprick sensation. PlanA was only associated with upper extremity strength.

Conclusions:

This study introduced a classification system based on the ability of children with SCI to integrate upright mobility into their ADLs. Similar to adults, lower extremity strength was a strong predictor of independent mobility (PrimA and UnPA). Lack of pinprick sensation predicted unplanned ambulation, but not being a PrimA. Finally, upper extremity strength was a predictor for planned ambulation.

Key words: ambulation, ISNCSCI, pediatrics, spinal cord injury

After a spinal cord injury (SCI), learning to walk often becomes the focus of rehabilitation for children and their families.1,2 Although the majority of children with SCI do not return to full-time functional ambulation, those who accomplish some level of walking report positive outcomes such as feeling “normal” again, being eye-to-eye with peers, and having easier social interactions.3 Although not frequently reported by patients, there is some evidence of physiological benefits as well.3-9 Regardless of age, upright mobility has been positively associated with community participation and life satisfaction.10-12 For children, upright mobility allows them to explore their physical environment, which facilitates independence and learning as part of the typical developmental process.13,14

With the use of standers, walkers, and other assistive devices, as well as a variety of lower extremity orthoses, it is a reasonable expectation that some children with spinal injuries achieve upright stance and mobility.7,9,13-21 However, there are 2 main challenges for clinicians and patients: understanding the factors that either encourage or discourage upright activities, and identifying how best to determine whether upright mobility is successful and meaningful. The literature on adults suggests that upright mobility is dependent on physiological and psychosocial factors. Physiological factors include the patient’s current age, neurological level, muscle strength, and comorbidities.14,22-27 Psychosocial factors include satisfaction with the appearance of the gait pattern, cosmesis, social support for donning/doffing braces, and assistance with transfers and during ambulation.3,9,19,28-32

The identification of outcome measures that provide a meaningful indication of successful upright mobility has been difficult. The World Health Organization (WHO) describes 2 constructs for considering outcomes – capacity and performance.33 Capacity refers to maximal capability in a laboratory setting. An example of a capacity measure is the Walking Index for Spinal Cord Injury (WISCI), which is an ordinal scale used to quantify walking capacity based on assistive device, type of orthosis, and amount of assistance required.34,35 Other capacity measures include the Timed Up and Go test and the 6-minute walk test.36,37 On the other hand, performance refers to actual activity during a patient’s daily activities in typical, real-life environments.33 For example, the FIM is an observation scale that scores the patient’s typical daily performance.36,38-40 The FIM is considered a burden-of-care measure that determines the amount of actual assistance provided to a patient during typical routines and environments, which may or may not reflect maximal ability or capacity.
Performance measures provide an adequate clinical snapshot of a patient’s daily function (they evaluate what patients do), whereas capacity measures are better research tools, as they are able to detect subtle changes in ambulation (they evaluate what patients can do).

In children, no capacity outcome measures of ambulation have been tested for validity or reliability. Availability of reliable and valid performance measures is also lacking. The WeeFIM is a performance measure for children, but it is not SCI specific. It is scored on the child’s burden of care, that is, on the maximal assistance required rather than the child’s maximal independence or the highest capacity of performance during a typical day. For children, another commonly used scale is the Hoffer Scale, which relies on the physician’s or therapist’s subjective determination of the purpose of the upright mobility activities (for function or for exercise).41,42 Because parents and school systems are encouraged to integrate “exercise” ambulation into daily activities, it may not be possible to distinguish between therapeutic and functional ambulation in the home, school, or community environments. In the schools, a teacher/therapist might incorporate upright mobility into the classroom setting by donning a child’s braces and then having her/him ambulate a short distance to stand at an easel in art class or to stand upright when talking to friends during recess. In this situation, walking serves the dual purpose of being functional and therapeutic.

For this study, it was decided not to rely on a subjective determination of therapeutic versus functional ambulation as the main outcome measure. Instead, we were interested in the children and adolescents who have successfully integrated independent mobility into their daily activities, regardless of frequency, distance, or purpose. Recent literature on children and adolescents suggests that spontaneity is important for participation in functional and social activities. For example, a survey of patients using functional electrical stimulation for hand function found a reduction in the dependence on others for donning splints, which facilitated independence with activities of daily living (ADLs) in adolescents.43-45 In a more recent study, Mulcahey et al46 found that a reduction of spontaneity in adolescents was a barrier to social activity; during cognitive interviews, children reported not participating in sleepovers due to planning their bowel/bladder programs.

To date, there are no measures that integrate spontaneity of standing and/or upright mobility into the daily activities of children. Toward that aim, this study introduces a new scale that attempts to categorize children into 4 mutually exclusive groups: primary ambulators, unplanned ambulators, planned ambulators, and nonambulators. The purpose of this study was to examine ambulation patterns among children and adolescents with SCI as a function of neurological level, motor level, and injury severity, as defined by the motor, sensory, and anorectal examinations of the International Standards for Neurological Classification of Spinal Cord Injury (ISNCSCI). A secondary aim of the study was to determine how performance on the ISNCSCI exam was associated with the ability of children to independently integrate ambulation into their daily routines.

4.

Background:

Functional electrical stimulation (FES) therapy has been shown to be one of the most promising approaches for improving voluntary grasping function in individuals with subacute cervical spinal cord injury (SCI).

Objective:

To determine the effectiveness of FES therapy, as compared to conventional occupational therapy (COT), in improving voluntary hand function in individuals with chronic (≥24 months post injury), incomplete (American Spinal Injury Association Impairment Scale [AIS] B-D), C4 to C7 SCI.

Methods:

Eight participants were randomized to the intervention group (FES therapy; n = 5) or the control group (COT; n = 3). Both groups received 39 hours of therapy over 13 to 16 weeks. The primary outcome measure was the Toronto Rehabilitation Institute-Hand Function Test (TRI-HFT), and the secondary outcome measures were the Graded Redefined Assessment of Strength, Sensibility and Prehension (GRASSP), the Functional Independence Measure (FIM) self-care subscore, and the Spinal Cord Independence Measure (SCIM) self-care subscore. Outcome assessments were performed at baseline, after 39 sessions of therapy, and at 6 months following the baseline assessment.

Results:

After 39 sessions of therapy, the intervention group improved by 5.8 points on the TRI-HFT’s Object Manipulation Task, whereas the control group changed by only 1.17 points. Similarly, after 39 sessions of therapy, the intervention group improved by 4.6 points on the FIM self-care subscore, whereas the control group did not change at all.

Conclusion:

The results of the pilot data justify a clinical trial to compare FES therapy and COT alone to improve voluntary hand function in individuals with chronic incomplete tetraplegia.

Key words: chronic patients, functional electrical stimulation, grasping, therapy, upper limb

In the United States and Canada, there is a steady rate of incidence and an increasing rate of prevalence of individuals living with spinal cord injury (SCI). For individuals with tetraplegia, hand function is essential for achieving a high level of independence in activities of daily living.1-5 For the majority of individuals with tetraplegia, the recovery of hand function has been rated as their highest priority.5

Traditionally, functional electrical stimulation (FES) has been used as a permanent neuroprosthesis to achieve this goal.6-14 More recently, researchers have worked toward development of surface FES technologies that are meant to be used as short-term therapies rather than permanent prostheses. This therapy is frequently called FES therapy or FET. Most of the studies published to date in which FES therapy was used to help improve upper limb function have been done in both the subacute and chronic stroke populations,15-23 and 2 have been done in the subacute SCI population.1-3 With respect to the chronic SCI population, there are no studies to date that have looked at use of FES therapy for retraining upper limb function. In a review by Kloosterman et al,24 the authors discussed studies that used various combinations of therapies for improving upper extremity function in individuals with chronic SCI; however, the authors found that the only study showing significant improvements before and after treatment was that published by Needham-Shropshire et al.25 This study examined the effectiveness of neuromuscular stimulation (NMS)–assisted arm ergometry for strengthening triceps brachii. In that study, electrical stimulation was used to facilitate arm ergometry, and it was not used in the context of retraining reaching, grasping, and/or object manipulation.

Since 2002, our team has been investigating whether FES therapy has the capacity to improve voluntary hand function in patients with complete and incomplete subacute cervical SCI who are less than 180 days post injury at the time of recruitment into the study.1-3 In randomized controlled trials (RCTs) conducted by our team, we found that FES therapy is able to restore voluntary reaching and grasping functions in individuals with subacute C4 to C7 incomplete SCI.1-3 The changes observed were transformational; individuals who were unable to grasp at all were able to do so after only 40 one-hour sessions of the FES therapy, whereas the control group showed significantly less improvement. Inspired by these results, we decided to conduct a pilot RCT with chronic (≥24 months following injury) C4 to C7 SCI patients (American Spinal Injury Association Impairment Scale [AIS] B-D), which is presented in this article. The purpose of this pilot study was to determine whether FES therapy is able to restore voluntary hand function in individuals with chronic tetraplegia.
Based on the results of our prior phase I1 and phase II2,3 RCTs in the subacute SCI population, we hypothesized that individuals with chronic tetraplegia who underwent the FES therapy (intervention group) may have greater improvements in voluntary hand function, especially in their ability to grasp and manipulate objects, and perform activities of daily living when compared to individuals who receive similar volume and duration of conventional occupational therapy (COT: control group).

5.
Primary vesicoureteral reflux (pVUR) is one of the most common causes of pediatric kidney failure. Linkage scans suggest that pVUR is genetically heterogeneous, with two loci on chromosomes 1p13 and 2q37 under autosomal dominant inheritance. Absence of pVUR in parents of affected individuals raises the possibility of a recessive contribution to pVUR. We performed a genome-wide linkage scan in 12 large families segregating pVUR, comprising 72 affected individuals. To avoid potential misspecification of the trait locus, we performed a parametric linkage analysis using both dominant and recessive models. Analysis under the dominant model yielded no signals across the entire genome. In contrast, we identified a unique linkage peak under the recessive model on chromosome 12p11-q13 (D12S1048), which we confirmed by fine mapping. This interval achieved a peak heterogeneity LOD score of 3.6 with 60% of families linked. This heterogeneity LOD score improved to 4.5 with exclusion of two high-density pedigrees that failed to link across the entire genome. The linkage signal on chromosome 12p11-q13 originated from pedigrees of varying ethnicity, suggesting that recessive inheritance of a high-frequency risk allele occurs in pVUR kindreds from many different populations. In conclusion, this study identifies a major new locus for pVUR and suggests that, in addition to genetic heterogeneity, recessive contributions should be considered in all pVUR genome scans.

Vesicoureteral reflux (VUR; OMIM no. 193000) is the retrograde flow of urine from the bladder to the ureters and the kidneys during micturition. Uncorrected, VUR can lead to repeated urinary tract infections, renal scarring, and reflux nephropathy, accounting for up to 25% of pediatric end-stage renal disease.1,2 VUR is commonly seen as an isolated disorder (primary VUR; pVUR), but it can also present in association with complex congenital abnormalities of the kidney and urinary tract or with specific syndromic disorders, such as renal-coloboma and branchio-oto-renal syndromes.3-8

pVUR has a strong hereditary component, with monozygotic twin concordance rates of 80%.9-12 Sibling recurrence rates of 30% to 65% have suggested segregation of a single gene or oligogenes with large effects.9,12-14 Interestingly, however, the three published genome-wide linkage scans of pVUR have strongly suggested multifactorial determination.15-17 Two pVUR loci have been identified with genome-wide significance on chromosomes 1p13 and 2q37 under an autosomal dominant transmission with locus heterogeneity.15,16 Multiple suggestive signals have also been reported, but remarkably, these studies show little overlap.15-17 These data suggest that pVUR may be extremely heterogeneous, with mutations in different genes each accounting for a fraction of cases. The genes underlying pVUR loci have not yet been identified, but two recent studies have reported segregating mutations in the ROBO2 gene in up to 5% of pVUR families.18,19

Despite evidence for genetic heterogeneity and different subtypes of disease, genetic studies have all modeled pVUR as an autosomal dominant trait.15-17,20 Recessive inheritance has generally not been considered because the absence of affected parents can be explained by spontaneous resolution of pVUR with older age.
However, many pVUR cohorts are composed of affected sibships or pedigrees compatible with autosomal recessive transmission, suggesting the potential for alternative modes of inheritance.9-12,16,17,20-22 Systematic family screening to clarify the mode of inheritance is not feasible for pVUR because the standard diagnostic tool, the voiding cystourethrogram (VCUG), is invasive and would expose participants to radiation. Formal assessment of a recessive contribution in sporadic pVUR has also been difficult because studies have been conducted in populations with low consanguinity rates.9-12,16,17,20-22 However, recent studies have identified an unexpected recessive contribution to several complex traits such as patent ductus arteriosus or autism.23,24 Thus, in addition to genetic heterogeneity, genes with alternative modes of transmission may segregate among pVUR families, and misspecification of the inheritance model may complicate mapping studies of this trait.

Several approaches can be considered to address the difficulties imposed by complex inheritance, variable penetrance, and genetic heterogeneity. Studying large, well-characterized cohorts with newer single-nucleotide polymorphism (SNP)-based technologies can maximize inheritance information across the genome and increase the power of linkage studies.25 In addition, in the setting of locus heterogeneity and uncertainty about the mode of transmission, analysis under a dominant and a recessive model has greater power compared with nonparametric methods and more often results in detection of the correct mode of transmission without incurring a significant penalty for multiple testing.26-29 We combined these approaches in this study and successfully localized a major gene for VUR, which unexpectedly demonstrates autosomal recessive transmission.
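For context on the statistic reported above, the heterogeneity LOD score comes from Smith's admixture model, which maximizes the likelihood over the proportion of linked families α as well as the recombination fraction θ. A standard formulation for F families is

$$\mathrm{HLOD} = \max_{\alpha,\,\theta}\ \sum_{i=1}^{F} \log_{10}\!\left[\alpha\,\frac{L_i(\theta)}{L_i(\theta = 1/2)} + (1-\alpha)\right],$$

where $L_i$ is the likelihood for family $i$. On this reading, the reported peak HLOD of 3.6 "with 60% of families linked" corresponds to the maximum being attained at α ≈ 0.6.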

6.

Background:

Understanding the related fates of muscle density and bone quality after chronic spinal cord injury (SCI) is an important initial step in determining endocrine-metabolic risk.

Objective:

To examine the associations between muscle density and indices of bone quality at the distal lower extremity of adults with chronic SCI.

Methods:

A secondary data analysis was conducted in 70 adults with chronic SCI (C2-T12; American Spinal Injury Association Impairment Scale [AIS] A-D; ≥2 years post injury). Muscle density and cross-sectional area (CSA) and bone quality indices (trabecular bone mineral density [TbBMD] at the distal tibia [4% site] and cortical thickness [CtTh], cortical area [CtAr], cortical BMD [CtBMD], and polar moment of inertia [PMI] at the tibial shaft [66% site]) were measured using peripheral quantitative computed tomography. Calf lower extremity motor score (cLEMS) was used as a clinical measure of muscle function. Multivariable linear regression analyses were performed to determine the strength of the muscle-bone associations after adjusting for confounding variables (sex, impairment severity [AIS A/B vs AIS C/D], duration of injury, and wheelchair use).
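As a point of reference for the geometry outcome named above, the polar moment of inertia from a pQCT cross-section can be written, in voxel terms, approximately as

$$\mathrm{PMI} \approx \sum_{i \,\in\, \text{cortical voxels}} a_i\, r_i^{2},$$

where $a_i$ is the area of voxel $i$ and $r_i$ its distance from the section's torsional axis; bone distributed farther from the axis therefore contributes disproportionately to torsional resistance. This is the generic definition, not a formula given by the study itself.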

Results:

Muscle density was positively associated with TbBMD (b = 0.85 [0.04, 1.66]), CtTh (b = 0.02 [0.001, 0.034]), and CtBMD (b = 1.70 [0.71, 2.69]) (P < .05). Muscle CSA was most strongly associated with CtAr (b = 2.50 [0.12, 4.88]) and PMI (b = 731.8 [161.7, 1301.9]) (P < .05), whereas cLEMS was most strongly associated with TbBMD (b = 7.69 [4.63, 10.76]) (P < .001).

Conclusion:

Muscle density and function were most strongly associated with TbBMD at the distal tibia in adults with chronic SCI, whereas muscle size was most strongly associated with bone size and geometry at the tibial shaft.

Key words: bone mineral density, bone quality, muscle density, muscle size, osteoporosis, peripheral quantitative computed tomography, spinal cord injury

Spinal cord injury (SCI) is associated with sublesional muscle atrophy,1-3 changes in muscle fiber type,4,5 reductions in hip and knee region bone mineral density (BMD),6-8 and increased central and regional adiposity after injury.9,10 Adverse changes in muscle and bone health in individuals with SCI contribute to an increased risk of osteoporosis,11-13 fragility fractures,14 and endocrine-metabolic disease (eg, diabetes, dyslipidemia, heart disease).15-17 Cross-sectional studies have shown a higher prevalence of lower extremity fragility fractures among individuals with SCI, ranging from 1% to 34%.18-20 Fragility fractures are associated with negative health and functional outcomes, including an increased risk of morbidity and hospitalization,21,22 mobility limitations,23 and a reduced quality of life.24 Notably, individuals with SCI have a normal life expectancy, yet fracture rates increase annually from 1% per year in the first year to 4.6% per year in individuals greater than 20 years post injury.25,26

Muscle and bone are thought to function as a muscle-bone unit, wherein muscle contractions impose loading forces on bone that produce changes in bone geometry and structure.27,28 A growing body of evidence has shown that individuals with SCI (predominantly those with motor complete injury) exhibit similar patterns of decline in muscle cross-sectional area (CSA) and BMD in the acute and subacute stages following injury.4,11,29 Prospective studies have shown a decrease in BMD of 1.1% to 47% per year6,7,30 and up to 73% in the 2 to 7 years following SCI.8,14,31,32 Decreases in muscle CSA have been well documented following SCI, with greater disuse atrophy observed after complete SCI versus incomplete SCI, presumably due to the absence of voluntary muscle contractions and associated mobility limitations.1,2,16 Muscle quality is also compromised early after SCI, resulting in sublesional accumulation of adipose tissue in the chronic stage of injury3,33,34; the exact time course of this event has been poorly elucidated to date. Adipose tissue deposition within and between skeletal muscle is linked to an increase in noncontractile muscle tissue and a reduction in muscle force-generating capacity on bone.35,36 Skeletal muscle fat infiltration is up to 4 times more likely to occur in individuals with SCI,1,16,37 contributing to metabolic complications (eg, glucose intolerance),16 reduced muscle strength and function,38 and mobility limitations3 – all factors that may be associated with a deterioration in bone quality after SCI.

The association between lean tissue mass and bone size (eg, BMD and bone mineral content) in individuals with SCI has been well established using dual-energy x-ray absorptiometry (DXA).9,10,29,34 However, DXA is unable to measure true volumetric BMD (vBMD), bone geometry, and bone structure.
Peripheral quantitative computed tomography (pQCT) is an imaging technique that improves our capacity to measure indices of bone quality and muscle density and CSA at fracture-prone sites (eg, the tibia).3,39 Recent evidence from cross-sectional pQCT studies has shown that muscle CSA and calf lower extremity motor score (cLEMS) were associated with indices of bone quality at the tibia in individuals with SCI.13,40 However, neither study measured muscle density (a surrogate of fatty infiltration when evaluating the functional muscle-bone unit). Fatty infiltration of muscle is common after SCI1,16,37 and may affect muscle function or the muscle-bone unit, but the association between muscle density and bone quality indices at the tibia in individuals with chronic SCI is unclear. Muscle density measured using pQCT may be an acceptable surrogate of muscle quality when it is difficult to assess muscle strength due to paralysis.3,39 Additionally, investigating which muscle outcome (muscle density, CSA, or cLEMS) is most strongly associated with vBMD and bone structure may inform modifiable targets for improving bone quality and reducing fracture risk after chronic SCI.

The primary objective of this secondary analysis was to examine the associations between pQCT-derived calf muscle density and trabecular vBMD at the tibia among adults with chronic SCI. The secondary objective was to examine the associations between calf muscle density, CSA, and function and tibial vBMD, cortical CSA and thickness, and polar moment of inertia (PMI). First, we hypothesize that calf muscle density will be a positive correlate of trabecular and cortical vBMD, cortical CSA and thickness, and PMI at the tibia in individuals with chronic SCI. Second, we hypothesize that of the key muscle variables (cLEMS, CSA, and density), calf muscle density and cLEMS will be most strongly associated with trabecular vBMD, whereas calf muscle CSA will be most strongly associated with cortical CSA and PMI.

7.

Background:

Chronic spinal cord injury (SCI) is associated with an increase in risk factors for cardiovascular disease (CVD). In the general population, atherosclerosis in women occurs later than in men and usually presents differently. Associations between risk factors and incidence of CVD have not been studied in women with SCI.

Objective:

To determine which risk factors for CVD are associated with increased carotid intima-media thickness (CIMT), a common indicator of atherosclerosis, in women with SCI.

Methods:

One hundred and twenty-two females older than 18 years with traumatic SCI at least 2 years prior to entering the study were evaluated. Participants were asymptomatic and without evidence of CVD. Exclusion criteria were acute illness, overt heart disease, diabetes, and treatment with cardiac drugs, lipid-lowering medication, or antidiabetic agents. Measures for all participants were age, race, smoking status, level and completeness of injury, duration of injury, body mass index, serum lipids, fasting glucose, hemoglobin A1c, and ultrasonographic measurements of CIMT. Hierarchical multiple linear regression was conducted to predict CIMT from demographic and physiologic variables.

Results:

Several variables were significantly correlated with CIMT during univariate analyses, including glucose, hemoglobin A1c, age, and race/ethnicity; but only age was significant in the hierarchical regression analysis.

Conclusions:

Our data indicate the importance of CVD in women with SCI.

Key words: age, cardiovascular disease, carotid intima-media thickness, hemoglobin A1c, risk factors, smoking

The secondary conditions of metabolic syndrome and cardiovascular disease (CVD) resulting from spinal cord injury (SCI) are not well understood. In particular, persons with SCI have an increase in metabolic risk factors for CVD,1-5 but researchers have not determined whether this increase is associated with an increased incidence of CVD. The association has not been shown in reports on mortality or prevalence rates for CVD in people with SCI6-12 or in the few studies that have appraised CVD in people with SCI using physiologic assessments.13-18 Either the question was not addressed, or the evidence is insufficient due to low sample sizes and a lack of objective, prospective epidemiological studies assessing this question. Nevertheless, studies consistently show that metabolic syndrome is prevalent among individuals with SCI.1-5,12 Metabolic syndrome consists of multiple interrelated risk factors that increase the risk for atherosclerotic heart disease by 1.5- to 3-fold.19,20

Compounding the uncertainty about the association of metabolic risk factors with CVD in SCI are possible gender differences.21-24 Findings from studies of men with SCI might not apply to women with SCI. For example, the correlation between physical activity and high-density lipoprotein (HDL) levels in men with SCI is not found for women with SCI.25,26 Furthermore, able-bodied women develop atherosclerosis later than do able-bodied men, and they usually present differently.27 Some studies indicate that abnormal glucose metabolism may play a particularly important role in CVD in women27; data from our group suggest that this is the case in women with SCI as well.15 Although women constitute 18% to 20% of the SCI population, no studies have evaluated cardiovascular health in women with chronic SCI.

Carotid intima-media thickness (CIMT) is the most robust, highly tested, and often used noninvasive endpoint for assessing the progression of subclinical atherosclerosis in men and women of all ages.28-46 For people with SCI, CIMT is a reliable surrogate measure of asymptomatic CVD.15,47 The incidence of asymptomatic CVD appears to increase with the duration of SCI,15 where duration of injury is a cardiac risk factor independent of age.17 Moreover, CIMT is greater in men with SCI than in matched able-bodied controls,48 indicating a subclinical and atypical presentation of CVD. A variety of studies have confirmed the usefulness of high-resolution B-mode ultrasound measurement of CIMT for quantitation of subclinical atherosclerosis.49

To better discern the association of risk factors with measures of subclinical atherosclerotic disease in women with SCI, we performed blood tests and ultrasonographic measurements of CIMT on 122 females with chronic SCI who were free of overt CVD. We tested for the 3 metabolic risk factors that are consistently identified in the varied definitions of metabolic syndrome: abnormal carbohydrate metabolism, abnormally high triglycerides, and abnormally low HDL cholesterol. We also tested for 4 other CVD risk factors: high levels of low-density lipoprotein (LDL), high total cholesterol, high body mass index (BMI), and a history of smoking.

8.

Objective:

To identify and classify tools for assessing the influence of spasticity on quality of life (QOL) after spinal cord injury (SCI).

Methods:

Electronic databases (MEDLINE/PubMed, CINAHL, and PsycInfo) were searched for studies published between 1975 and 2012. Dijkers’s theoretical framework on QOL was used to classify tools as either objective or subjective measures of QOL.

Results:

Sixteen studies met the inclusion criteria. Identified objective measures that were used to assess the influence of spasticity on QOL included the Short Form-36 (SF-36), the Sickness Impact Profile (SIP), and the Health Utilities Index-III (HUI-III). Subjective measures included the Quality of Life Index–SCI Version III (QLI-SCI), the Life Situation Questionnaire–Revised (LSQ-R), the Reciprocal Support Scale (RSS), the Profile of Mood States (POMS), the Spinal Cord Injury Spasticity Evaluation Tool (SCI-SET), and the Patient Reported Impact of Spasticity Measure (PRISM). A number of tools proved either to be insensitive to the presence of spasticity (QLI-SCI) or yielded mixed (SF-36) or weak (RSS, LSQ-R) results. Tools that were sensitive to spasticity had limited psychometric data for use in the SCI population (HUI-III, SIP, POMS), although 2 were developed specifically for assessing the impact of spasticity on daily life post SCI (SCI-SET, PRISM).

Conclusions:

Two condition-specific subjective measures, the SCI-SET and the PRISM, emerged as the most promising tools for the assessment of the impact of spasticity on QOL after SCI. Further research should focus on establishing the psychometric properties of these measures for use in the SCI population.

Key words: outcome, measurement, quality of life, spasticity, spinal cord injury

Spasticity of the upper limbs, trunk, or lower limbs is typically experienced by individuals with an upper motor neuron spinal cord injury (SCI) following spinal shock, and the resulting spasms often negatively impact quality of life (QOL).1 Although there is great variability in definitions of spasticity, the most commonly cited definition is by Lance2(p485): “Spasticity is a motor disorder characterized by a velocity-dependent increase in tonic stretch reflexes (muscle tone) with exaggerated tendon jerks, resulting from hyperexcitability of the stretch reflex, as one component of the upper motor neuron syndrome.” A wider definition of spasticity includes increased exteroceptive reflexes as well as loss of motor function (ie, muscle power and coordination).3 The notion is that muscle weakness and impaired coordination are not part of the spasticity syndrome itself, but rather are associated with spasticity.3-6 Spasticity following SCI is prevalent, with 65% to 78% of persons more than 1 year post injury reporting its occurrence.7,8

The decision to treat spasticity largely depends on the frequency, severity, and impact of the spasms on a person’s daily life.9,10 Treatment may include conservative physical therapy,11 with a possible combination of other modalities,1 including pharmacological treatments (eg, diazepam,12,13 baclofen,14 clonidine,14,15 tizanidine,12,13,16 dantrolene sodium,12,17 cyproheptadine,14,18 and cannabis17). Persons who do not respond to oral administration of medications may be surgically implanted with a pump for intrathecal administration of baclofen19-21 or receive injections of chemodenervation agents (phenol and ethanol22 or botulinum toxin16,23). Severe recalcitrant cases require surgical intervention, including dorsal rhizotomy and cordotomy.24 Continued improvements in the definition and management of spasticity depend on the development of valid and reliable tools for assessing the impact of spasticity.1

Relatedly, the valid and reliable assessment of QOL post SCI, which is an important outcome for understanding the additional burden of specific secondary health conditions that emerge post SCI and for gauging the success of rehabilitation interventions in minimizing their frequency and severity, is a challenge.25 Symptoms of spasticity may have a profound influence on an individual’s QOL,7,26 including lifestyle and sense of well-being,27 by limiting workplace participation, adding to the cost of medication, and increasing attendant care requirements.8,21,28 Despite these findings, there are problems with assessing the influence of spasticity on QOL that are related to the multidimensionality and breadth of spasticity definitions, the fluctuating nature of associated symptoms, and their clinical impact. It is therefore essential that health care professionals be made aware of available tools that are designed to assess the influence of spasticity on QOL after SCI. Furthermore, investigators should possess a broader understanding of the different conceptualizations of QOL and of which tools correspond to each of these.
This will help to ensure that the objectives of a study are well aligned with the selected QOL tool. Gaining prominence in the field is the notion that QOL can be measured from an objective or subjective perspective.29 Objective measures are based on the assumption that there is widespread agreement about what constitutes QOL.25 Such measures focus on external conditions and contain items that can be defined and quantified to reflect societal standards. Conversely, subjective measures are designed with the assumption that QOL can only be judged by the individuals experiencing it.30 Although there are advantages and disadvantages inherent in each measurement type,29 subjective measures give patients a means of providing health professionals with a greater understanding of QOL and its connection to their health and well-being following SCI, whereas objective measures can be used to inform decision makers on how to allocate funds and resources for various interventions.

To date, no systematic reviews on the influence of spasticity on QOL, or on the appropriateness of QOL measures for assessing spasticity, have been conducted. Given the substantial influence spasticity has on QOL, there is a need to improve conceptual understandings of QOL to ensure that investigators employ appropriate research designs as well as suitable outcome measures to assess this prevalent secondary health condition. Hence, the purpose of this systematic literature review was to classify and evaluate outcome measures that are used to assess the influence of spasticity on QOL following SCI.

9.

Background:

A large percentage of individuals with spinal cord injury (SCI) report shoulder pain that can limit independence and quality of life. The pain is likely related to the demands placed on the shoulder by transfers and propulsion. Shoulder pathology has been linked to altered scapular mechanics; however, current methods to evaluate scapular movement are invasive, require ionizing radiation, are subject to skin-based motion artifacts, or require static postures.

Objective:

To investigate the feasibility of applying 3-dimensional ultrasound methods, previously used to look at scapular position in static postures, to evaluate dynamic scapular movement.

Method:

This study evaluated the feasibility of the novel application of a method combining 2-dimensional ultrasound and a motion capture system to determine 3-dimensional scapular position during dynamic arm elevation in the scapular plane with and without loading.

Results:

Incremental increases in scapular rotations were noted for extracted angles of 30°, 45°, 60°, and 75° of humeral elevation. Group differences were evaluated between a group of 16 manual wheelchair users (MWUs) and a group of age- and gender-matched able-bodied controls. MWUs had greater scapular external rotation and baseline pathology on clinical exam. MWUs also had greater anterior tilting, with this difference further accentuated during loading. The relationship between demographics and scapular positioning was also investigated, revealing that increased age, pathology on clinical exam, years since injury, and body mass index were correlated with scapular rotations associated with impingement (internal rotation, downward rotation, and anterior tilting).

Conclusion:

Individuals with SCI, as well as other populations who are susceptible to shoulder pathology, may benefit from the application of this imaging modality to quantitatively evaluate scapular positioning and effectively target therapeutic interventions.

Key words: kinematics, scapula, ultrasound, wheelchair user

The shoulder is a common site of injury across many populations. Because it is the most mobile joint in the body, the high prevalence of disorders is not surprising. Individuals are at increased risk for shoulder pathology when exposed to high forces, sustained postures, and repetitive movements.1 Wheelchair users are exposed to all of these factors in activities of daily living. Among manual wheelchair users (MWUs), 35% to 67% report shoulder pain.2-7 In this population, the presence of shoulder dysfunction significantly affects function and decreases quality of life.8,9 With altered scapular kinematics linked to a multitude of shoulder problems, the identification of changes in kinematics may allow for earlier detection of pathology and targeting of appropriate interventions.10-25 However, evaluation of dynamic scapular movement is a challenging task, as the scapula rotates about 3 axes while also gliding underneath overlying tissue. Direct visualization of the bone is ideal but is often limited by cost, availability, and exposure to radiation, and skin-based systems are prone to error.26-33

The overall goal of this study was to investigate the feasibility of applying 3-dimensional ultrasound methods, previously used to look at scapular position in static postures, to evaluate dynamic scapular movement.34 The specific goals, with a decomposition sketch following the list, were as follows:
  1. Evaluate intermediate angles of functional elevation during dynamic movement (30°, 45°, 60°, and 75°). We hypothesize that we will see incremental increases in external rotation, upward rotation, and posterior tipping throughout the movement to maintain the distance between the acromion and humerus.
  2. Compare dynamic scapular movement between MWUs and able-bodied controls (ABs). We anticipate that the nature of wheelchair propulsion and demands of activities of daily living will elucidate differences between this population and ABs with comparably lower daily demands on the shoulder.
  3. Evaluate the effect of loading on scapular movement, as other studies have suggested that differences in kinematics are clearer in the presence of loading.10,35,36
  4. Investigate the relationship between shoulder pathology, age, years since injury, and body mass index (BMI) and scapular positioning.
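Extracting scapular angles at set points of humeral elevation, as in the first goal, presupposes decomposing each scapula-relative-to-thorax rotation matrix into anatomically meaningful rotations. The article does not publish its decomposition code; the following is a minimal sketch assuming the ISB-recommended Y-X-Z Euler sequence for the scapula (Wu et al., 2005), with SciPy doing the decomposition. The input matrix and all numbers are fabricated for illustration.

```python
# Sketch: decompose a scapula-relative-to-thorax rotation matrix into a
# Y-X-Z Euler sequence (ISB convention assumed). R_ts would normally come
# from motion-capture + ultrasound registration; here it is fabricated.
import numpy as np
from scipy.spatial.transform import Rotation

# Fabricated example pose: 25 deg internal rotation, -5 deg lateral
# rotation, 8 deg posterior tilt.
R_ts = Rotation.from_euler("YXZ", [25.0, -5.0, 8.0], degrees=True).as_matrix()

# Recover the three scapular rotations from the matrix.
rot_y, rot_x, rot_z = Rotation.from_matrix(R_ts).as_euler("YXZ", degrees=True)
print(f"internal(+)/external(-) rotation: {rot_y:+.1f} deg")
print(f"lateral(+)/medial(-) rotation:    {rot_x:+.1f} deg")
print(f"posterior(+)/anterior(-) tilt:    {rot_z:+.1f} deg")
```

Repeating this decomposition at the frames nearest 30°, 45°, 60°, and 75° of humeral elevation would yield the extracted angles reported in the Results.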

10.
Despite optimal immunosuppressive therapy, more than 50% of kidney transplants fail because of chronic allograft dysfunction. A noninvasive means to diagnose chronic allograft dysfunction may allow earlier interventions that could improve graft half-life. In this proof-of-concept study, we used mass spectrometry to analyze differences in the urinary polypeptide patterns of 32 patients with chronic allograft dysfunction (14 with pure interstitial fibrosis and tubular atrophy and 18 with chronic active antibody-mediated rejection) and 18 control subjects (eight stable recipients and 10 healthy control subjects). Unsupervised hierarchical clustering showed good segregation of samples into groups corresponding mainly to the four biomedical conditions. Moreover, the composition of the proteome of the pure interstitial fibrosis and tubular atrophy group differed from that of the chronic active antibody-mediated rejection group, and an independent validation set confirmed these results. The 14 protein ions that best discriminated between these two groups correctly identified 100% of the patients with pure interstitial fibrosis and tubular atrophy and 100% of the patients with chronic active antibody-mediated rejection. In summary, this study establishes a pattern for two histologic lesions associated with distinct graft outcomes and constitutes a first step toward designing a specific, noninvasive diagnostic tool for chronic allograft dysfunction.

During the past three decades, the incidence and prevalence of ESRD have increased each year all over the world.1 Kidney transplantation is the treatment of choice for ESRD because it prolongs survival,2 improves quality of life, and is less costly than dialysis3; however, despite these improvements, a substantial proportion of grafts develop progressive dysfunction and fail within a decade, even with the use of appropriate dosages of immunosuppressive drugs to prevent acute rejection.4 Chronic allograft dysfunction (CAD) causes more than 50% of graft losses.5-7 Although patients can return to dialysis after transplant failure, loss of a functioning graft is associated with a three-fold increase in the risk for death,2,8,9 a substantial decrease in quality of life in survivors, and a four-fold increase in cost.1,3

The decline in function, often associated with hypertension and proteinuria, constitutes a clinical syndrome that has been called chronic allograft nephropathy (CAN). The histopathologic hallmarks in these patients are chronic interstitial fibrosis, tubular atrophy, vascular occlusive changes, and glomerulosclerosis, usually evaluated by the Banff working classification.10 Major outcomes discussed at the last Banff Conference included the elimination of the nonspecific term CAN and recognition of the entity “chronic active antibody-mediated rejection” (CAAR).11 The rationale for this update was the improper use of “CAN” as a generic term for all causes of chronic renal allograft dysfunction with interstitial fibrosis and tubular atrophy (IF/TA), which hampers accurate diagnosis and appropriate therapy, and the increasing recognition of the role of alloantibody in chronic renal allograft deterioration and the corresponding histologic changes, making the identification of an antibody-mediated component of chronic rejection feasible.11

Effective strategies to prevent renal function deterioration should focus on the early detection and treatment of patients who develop CAD.
In addition to elevated serum creatinine, usually associated with proteinuria and arterial hypertension, more specific and sensitive markers are needed to identify high-risk patients or initial lesions without any changes in serum creatinine or proteinuria.5,11

New analytic tools that allow rapid screening and accurate protein identification in body fluids are now emerging within the field of proteomic science. High-throughput mass spectrometry (MS) methods allow simultaneous detection of a large number of proteins in a large set of biologic tissues or samples. Protein fingerprinting MS methods using modern matrix-assisted laser desorption/ionization time-of-flight MS (MALDI-MS) instrumentation can detect hundreds of peak signals that, as a whole, can be considered a reflection of the body's physiologic status.12 To date, MALDI-MS has been successfully used to detect patterns of substantial overexpression of proteins in cancer cells.13-15 Urine seems to be an ideal source of potential biomarkers, and urine proteomic approaches have been used in numerous attempts to define biomarkers for a variety of nephro-urologic disorders.16-18 The aim of this study was to evaluate whether chromatography by solid-phase extraction coupled to MS would differentiate urinary polypeptide patterns in patients with pure IF/TA, patients with CAAR, and two control groups: healthy individuals and stable renal transplant recipients.
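The unsupervised hierarchical clustering step described in the abstract has a compact generic form. A minimal sketch with SciPy follows; random numbers stand in for the real sample-by-peak intensity matrix, and the cluster count of 4 mirrors the four biomedical conditions, but nothing else here reflects the study's data.

```python
# Minimal sketch of unsupervised hierarchical clustering of MS profiles.
# X stands in for a (samples x m/z peaks) intensity matrix; values are random.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 200))   # 50 urine samples, 200 polypeptide peaks

Z = linkage(X, method="ward")    # build the agglomerative tree
groups = fcluster(Z, t=4, criterion="maxclust")  # cut into 4 clusters
print(groups)                    # compare against the known conditions
```

In practice, the resulting dendrogram would be inspected to see whether samples from the same condition fall into the same branches, which is what "good segregation" refers to above.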

11.
Proteinuria and increased renal reabsorption of NaCl characterize the nephrotic syndrome. Here, we show that protein-rich urine from nephrotic rats and from patients with nephrotic syndrome activates the epithelial sodium channel (ENaC) in cultured M-1 mouse collecting duct cells and in Xenopus laevis oocytes heterologously expressing ENaC. The activation depended on urinary serine protease activity. We identified plasmin as a urinary serine protease by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry. Purified plasmin activated ENaC currents, and inhibitors of plasmin abolished urinary protease activity and the ability to activate ENaC. In nephrotic syndrome, tubular urokinase-type plasminogen activator likely converts filtered plasminogen to plasmin. Consistent with this, the combined application of urokinase-type plasminogen activator and plasminogen stimulated amiloride-sensitive transepithelial sodium transport in M-1 cells and increased amiloride-sensitive whole-cell currents in Xenopus laevis oocytes heterologously expressing ENaC. Activation of ENaC by plasmin involved cleavage and release of an inhibitory peptide from the ENaC γ subunit ectodomain. These data suggest that a defective glomerular filtration barrier allows passage of proteolytic enzymes that have the ability to activate ENaC.

Nephrotic syndrome is characterized by proteinuria, sodium retention, and edema. Increased renal sodium reabsorption occurs in the cortical collecting duct (CCD),1,2 where a rate-limiting step in transepithelial sodium transport is the epithelial sodium channel (ENaC), which is composed of three homologous subunits: α, β, and γ.3

ENaC activity is regulated by hormones such as aldosterone and vasopressin (AVP)4,5; however, adrenalectomized rats and AVP-deficient Brattleboro rats are capable of developing nephrotic syndrome,1,6 and nephrotic patients do not consistently display elevated levels of sodium-retaining hormones,7,8 suggesting that renal sodium hyper-reabsorption is independent of systemic factors. Consistent with this, sodium retention is confined to the proteinuric kidney in the unilateral puromycin aminonucleoside (PAN) nephrotic model.2,9,10

There is evidence that proteases contribute to ENaC activation by cleaving the extracellular loops of the α- and γ-subunits.11-13 Proteolytic activation of ENaC by extracellular proteases critically involves the cleavage of the γ subunit,14-16 which probably leads to the release of a 43-residue inhibitory peptide from the ectodomain.17 Both cleaved and noncleaved channels are present in the plasma membrane,18,19 allowing proteases such as channel-activating protease 1 (CAP1/prostasin),20 trypsin,20 chymotrypsin,21 and neutrophil elastase22 to activate noncleaved channels from the extracellular side.23,24 We hypothesized that the defective glomerular filtration barrier in nephrotic syndrome allows the filtration of ENaC-activating proteins into the tubular fluid, leading to stimulation of ENaC. The hypothesis was tested in the PAN nephrotic model in rats and with urine from patients with nephrotic syndrome.

12.
13.
Donor characteristics such as age and cause of death influence the incidence of delayed graft function (DGF) and graft survival; however, the relative influence of donor characteristics (“nature”) versus transplant center characteristics (“nurture”) on deceased-donor kidney transplant outcomes is unknown. We examined the risks for DGF and allograft failure within 19,461 recipient pairs of the same donor's kidneys using data from the US Renal Data System. For the 11,894 common-donor pairs transplanted at different centers, a recipient was twice as likely to develop DGF when the recipient of the contralateral kidney developed DGF (odds ratio [OR] 2.05; 95% confidence interval [CI] 1.82 to 2.30). Similarly, for 7567 common-donor pairs transplanted at the same center, the OR for DGF was 3.02 (95% CI 2.62 to 3.48). For pairs transplanted at the same center, there was an additional 42% risk for DGF compared with pairs transplanted at different centers. After adjustment for DGF, the within-pair ORs for allograft failure by 1 yr were 1.92 (95% CI 1.33 to 2.77) and 1.77 (95% CI 1.25 to 2.52) for recipients who underwent transplantation at the same center and different centers, respectively. These data suggest that both unmeasured donor characteristics and transplant center characteristics contribute to the risk for DGF and that the former also contribute significantly to allograft failure.

Delayed graft function (DGF) is an important predictor of graft failure after kidney transplantation.1–3 The incidence of DGF after deceased-donor kidney transplants ranges between 23 and 50%.4–6 Although some studies have been mixed, several large studies have shown that DGF influences graft failure both through its association with and independent of acute rejection.5,7–10 DGF also adversely affects cost, length of hospitalization, and patient rehabilitation.11–13 Allograft failure results in half of deceased-donor kidneys being lost by 11 yr after transplantation.14

There are many known determinants of DGF and allograft failure. Studies have implicated a number of immunologic and nonimmunologic characteristics, including donor factors, recipient factors, and the transplant procedure.4,6,15–21 Limited effort has been made to evaluate the relative contribution of these risk factors by exploiting the variation in response between recipients of kidneys from the same donor.18,22–24 This approach is similar to studies of monozygotic twins reared apart, which seek to quantify the relative importance of environmental and genetic factors on the basis of variability within twin pairs and among twin pairs.22,25 Analyses that examine outcomes in two recipients of kidneys from the same deceased donor can be used to determine the donor's relative contribution to the recipients' outcomes.

We retrospectively evaluated a national cohort of deceased-donor transplant recipients to better understand the complex relationship between donor (“nature”) and transplant center (“nurture”) effects associated with DGF and kidney allograft failure. We examined the within-pair correlation of these outcomes among recipients of kidneys from the same deceased donor and adjusted for transplant center effect by estimating separate odds ratios (ORs) for recipient pairs who underwent transplantation at the same transplant center and at different transplant centers.
The transplant center effect was assessed by comparing outcomes for paired kidneys from the same deceased donor transplanted at the same versus different centers.
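Conceptually, a within-pair OR of the kind reported above can be read off a 2x2 concordance table that classifies each donor's two recipients by DGF status. A minimal sketch with invented pair counts, loosely sized to the same-center subgroup; the study's actual ORs come from regression models with covariate adjustment, so this unadjusted cross-product is only a conceptual illustration.

```python
# Within-pair association of DGF for two kidneys from the same donor.
# Cross-tabulate pairs by the DGF status of each recipient; the odds ratio
# is the cross-product ratio of the 2x2 table. Counts below are invented.
both_dgf = 480       # both recipients developed DGF
only_first = 900     # recipient A only
only_second = 880    # recipient B only
neither = 5307       # neither developed DGF

odds_ratio = (both_dgf * neither) / (only_first * only_second)
print(f"Within-pair OR for DGF: {odds_ratio:.2f}")
```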

14.
Administration of activated protein C (APC) protects from renal dysfunction, but the underlying mechanism is unknown. APC exerts both antithrombotic and cytoprotective properties, the latter via modulation of protease-activated receptor-1 (PAR-1) signaling. We generated APC variants to study the relative importance of the two functions of APC in a model of LPS-induced renal microvascular dysfunction. Compared with wild-type APC, the K193E variant exhibited impaired anticoagulant activity but retained the ability to mediate PAR-1-dependent signaling. In contrast, the L8W variant retained anticoagulant activity but lost its ability to modulate PAR-1. By administering wild-type APC or these mutants in a rat model of LPS-induced injury, we found that the PAR-1 agonism, but not the anticoagulant function of APC, reversed LPS-induced systemic hypotension. In contrast, both functions of APC played a role in reversing LPS-induced decreases in renal blood flow and volume, although the effects of PAR-1-dependent signaling were more potent. Regarding potential mechanisms for these findings, APC-mediated PAR-1 agonism suppressed LPS-induced increases in the vasoactive peptide adrenomedullin and infiltration of iNOS-positive leukocytes into renal tissue. However, the anticoagulant function of APC was responsible for suppressing LPS-induced stimulation of the proinflammatory mediators ACE-1, IL-6, and IL-18, perhaps accounting for its ability to modulate renal hemodynamics. Both variants reduced active caspase-3 and abrogated LPS-induced renal dysfunction and pathology. We conclude that although PAR-1 agonism is solely responsible for APC-mediated improvement in systemic hemodynamics, both functions of APC play distinct roles in attenuating the response to injury in the kidney.

Acute kidney injury (AKI) leading to renal failure is a devastating disorder,1 with a prevalence varying from 30 to 50% in the intensive care unit.2 AKI during sepsis results in significant morbidity and is an independent risk factor for mortality.3,4 In patients with severe sepsis or shock, the reported incidence ranges from 23 to 51%,5–7 with mortality as high as 70%, versus 45% among patients with AKI alone.1,8

The pathogenesis of AKI during sepsis involves hemodynamic alterations along with microvascular impairment.4 Although many factors change during sepsis, suppression of the plasma serine protease protein C (PC) has been shown to be predictive of early death in sepsis models,9 and clinically has been associated with early death resulting from refractory shock and multiple organ failure in severe sepsis.10 Moreover, low levels of PC have been highly associated with renal dysfunction and pathology in models of AKI.11 During vascular insult, PC becomes activated by the endothelial thrombin-thrombomodulin complex, and the activated protein C (APC) exhibits both antithrombotic and cytoprotective properties.
We have previously demonstrated that APC administration protects from renal dysfunction during cecal ligation and puncture and after endotoxin challenge.11,12 In addition, recombinant human APC [drotrecogin alfa (activated)] has been shown to reduce mortality in patients with severe sepsis at high risk of death.13 Although the ability of APC to protect from organ injury in vivo is well documented,11,14,15 the precise mechanism mediating the response has not been ascertained.

APC exerts anticoagulant properties via feedback inhibition of thrombin by cleavage of factors Va and VIIIa.16 However, APC bound to the endothelial protein C receptor (EPCR) can also exhibit direct, potent cytoprotective properties by cleaving protease-activated receptor-1 (PAR-1).17 Various cell culture studies have demonstrated that the direct modulation of PAR-1 by APC results in cytoprotection by several mechanisms, including suppression of apoptosis,18,19 leukocyte adhesion,19,20 inflammatory activation,21 and endothelial barrier disruption.22,23 In vivo, the importance of the antithrombotic activity of APC is well established in model systems24,25 and in humans.26 However, the importance of PAR-1-mediated effects of APC also has been clearly defined in protection from ischemic brain injury27 and in sepsis models.28 Hence, there has been significant debate about whether the in vivo efficacy of APC is attributable primarily to its anticoagulant (inhibition of thrombin generation) or cytoprotective (PAR-1-mediated) properties.17,29

The same active site of APC is responsible for inhibition of thrombin generation by the cleavage of factor Va and for PAR-1 agonism. Therefore, we sought to generate point mutations that would not affect catalytic activity but would alter substrate recognition, to distinguish the two functions. Using these variants, we examined the relative roles of the two known functions of APC in a model of LPS-induced renal microvascular dysfunction.

15.
16.
OBJECTIVE— A restricted region of proinsulin located in the B chain and adjacent region of C-peptide has been shown to contain numerous candidate epitopes recognized by CD8+ T-cells. Our objective was to characterize HLA class I–restricted epitopes located within the preproinsulin leader sequence.

RESEARCH DESIGN AND METHODS— Seven 8- to 11-mer preproinsulin peptides carrying anchoring residues for HLA-A1, -A2, -A24, and -B8 were selected from databases. HLA-A2–restricted peptides were tested for immunogenicity in transgenic mice expressing a chimeric HLA-A*0201/β2-microglobulin molecule. The peptides were studied for binding to purified HLA class I molecules, selected for carrying COOH-terminal residues generated by proteasome digestion in vitro, and tested for recognition by human lymphocytes using an ex vivo interferon-γ (IFN-γ) ELISpot assay.

RESULTS— Five HLA-A2–restricted peptides were immunogenic in transgenic mice. Murine T-cell clones specific for these peptides were cytotoxic against cells transfected with the preproinsulin gene. They were recognized by peripheral blood mononuclear cells (PBMCs) from 17 of 21 HLA-A2 type 1 diabetic patients. PBMCs from 25 of 38 HLA-A1, -A2, -A24, or -B8 patients produced IFN-γ in response to six preproinsulin peptides covering residues 2–25 within the preproinsulin region. In most patients, the response was against several class I–restricted peptides. T-cells recognizing preproinsulin peptides were characterized as CD8+ T-cells by staining with peptide/HLA-A2 tetramers.

CONCLUSIONS— We defined class I–restricted epitopes located within the leader sequence of human preproinsulin through in vivo (transgenic mice) and ex vivo (diabetic patients) assays, illustrating the possible role of preproinsulin-specific CD8+ T-cells in human type 1 diabetes.

Type 1 diabetes involves the activation of lymphocytes against β-cell autoantigens. In animals, the predominant role of T-cells is supported by experiments in which diabetes is transferred by diabetogenic T-cells, is prevented by antibodies that interfere with T-cell activation, or fails to develop in diabetes-prone mice in which key genes in T-cell differentiation or activation are deficient. In humans, T-cells are predominant within insulitis at early stages of diabetes. Moreover, type 1 diabetes has been reported in an immunodeficient patient deprived of B-cells (1).

Major histocompatibility complex (MHC) class II–restricted CD4+ T-cells are central in the diabetes process, but CD8+ T-cells play a pivotal role in its initiation in NOD mice (2). In humans, CD8+ T-cells are predominant, and a high percentage of interferon-γ (IFN-γ)-positive cells is detected within insulitis in recent-onset diabetes in most observations (3,4,5,6). Recurrent diabetes in recipients of isografts from a discordant twin is accompanied by predominant CD8+ T-cell infiltration (7).

Among β-cell autoantigens, proinsulin has been ascribed a key role in diabetes. In humans, insulin and proinsulin are common targets of autoantibodies (8,9) and T-cells (10,11,12,13,14,15,16,17) in diabetic and pre-diabetic individuals. Anti-insulin antibodies are the first to be detected in children at risk for diabetes and carry a high positive predictive value for diabetes (9). In NOD mice, injection of insulin-specific T-cell clones accelerates diabetes (18). Protection from diabetes is obtained by injecting insulin into pre-diabetic mice (19).
In addition, proinsulin 1−/− or proinsulin 2−/− NOD mice show delayed or accelerated diabetes, respectively (20,21).

Several new β-cell HLA class I–restricted epitopes have been reported recently (22,23,24,25,26,27). We and others have shown that a restricted region of human proinsulin, located in the B chain and adjacent C-peptide, clusters proteasome cleavage sites that generate correct COOH-termini of putative MHC class I peptides, and contains many epitopes recognized by diabetic CD8+ T-cells (22,23). Recognition of epitopes located within the C-peptide and the C-peptide–B chain junction, including residues that are excised during the secretion process, makes a strong case for proinsulin as an autoantigen in diabetes. Despite strong evidence that leader sequence peptides are presented by class I HLA molecules, especially HLA-A2.1 (28), only two HLA-A2.1 preproinsulin leader sequence peptides have been identified (26,27).

To characterize class I–restricted epitopes within the preproinsulin leader sequence, we selected 8- to 11-mer peptides carrying anchoring residues for class I molecules. These peptides were studied for immunogenicity in HLA-A*0201 transgenic mice (29). Mouse CD8+ T-cell clones specific to HLA-A*0201–restricted peptides were tested for cytotoxicity against HLA-A2 target cells transfected with the preproinsulin gene. In humans, peptides were studied for binding to common class I molecules, for carrying COOH-terminal residues generated by proteasome digestion, and for recognition by peripheral blood mononuclear cells (PBMCs) from diabetic patients.
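The peptide-selection step described here amounts to scanning a protein sequence for 8- to 11-mers that carry class I anchor residues. The sketch below applies a simplified, commonly cited HLA-A*0201 motif (hydrophobic anchor at position 2 and at the C-terminus) to the preproinsulin leader region; the motif and the sequence shown are illustrative simplifications, and the actual selection used database binding predictions and proteasome-digestion data.

```python
# Scan a protein sequence for 8- to 11-mers with a simplified HLA-A*0201
# anchor motif: L/M at position 2 and V/L/I at the C-terminus. Real epitope
# selection uses quantitative binding matrices; this is a toy filter.
# Sequence shown is the human preproinsulin leader region (illustrative).
LEADER = "MALWMRLLPLLALLALWGPDPAAA"

def candidate_peptides(seq: str, lengths=range(8, 12)):
    """Yield subsequences whose anchor positions match the toy motif."""
    for n in lengths:
        for i in range(len(seq) - n + 1):
            pep = seq[i:i + n]
            if pep[1] in "LM" and pep[-1] in "VLI":
                yield pep

for pep in candidate_peptides(LEADER):
    print(pep)
```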

17.

Purpose:

To investigate the changes in quality of life (QOL) in persons with spinal cord injury (SCI) and their close persons during the first 2 years post injury.

Method:

Longitudinal multiple-sample, multiple-wave panel design. Data included 292 patients recruited from Austrian, British, German, Irish, and Swiss specialist SCI rehabilitation centers and 55 of their close persons. Questionnaire booklets were administered to both samples at 6 weeks, 12 weeks, 1 year, and 2 years after injury.

Results:

Study 1 investigated the WHOQOL-BREF domains in individuals with SCI and found differences mostly in the physical domain, indicating that QOL increases for persons with SCI from onset. An effect of culture was observed in the psychological and environmental domains, with higher QOL scores in the German-speaking sample. Study 2 compared individuals with SCI to their close persons and found differences in the physical, environmental, and social domains over time. Scores on the psychological dimension did not significantly differ between the persons with SCI and their close persons over time.

Conclusion:

QOL measured by the WHOQOL-BREF shows that QOL changes during rehabilitation and after discharge. Apart from the physical dimension, the persons with SCI and their close persons seem to experience a similar change in QOL. Further longitudinal research is suggested to clarify the mutual adjustment process of people with SCI and their close persons and to explore cultural differences in QOL between English- and German-speaking countries.

Key words: close persons, quality of life, rehabilitation, spinal cord injury

A spinal cord injury (SCI) is a highly disruptive event in the life of an individual and requires a considerable coping process. Shortly after the injury, all attention is put into stabilizing the patient, and from that moment the individual has to cope with challenges at physical, social, environmental, and psychological levels.

The institutionalized context of rehabilitation provides a largely standardized supportive setting that helps the person with SCI become acquainted with the recently acquired disability. The health care professionals, in collaboration with the patients and their close persons (that is, relatives or significant others), work together to prepare the transition back to everyday life.

One expectation of rehabilitation is that the person with SCI will regain a satisfactory level of well-being and fulfill his or her aims in life. Many factors may facilitate the retrieval of a good quality of life (QOL). Some aspects of SCI are permanent or only susceptible to small changes (eg, the paralysis and other irrevocable neurological problems related to the injury), but many others (eg, psychological and social aspects) can be more or less actively influenced by the person with SCI.

A number of recent studies regarding QOL in SCI emphasize that QOL is not strongly affected by physical variables.1–4 Age3–7 and gender1,2–4,7 are also weakly related to the QOL of persons with SCI. Physical health aspects that can explain differences in QOL are pain1,6,8–11 or secondary conditions such as pressure sores and dysreflexia.4,6,11

Psychological resources are strong predictors of life satisfaction and well-being. Psychological resources are personal traits and characteristics that might influence the way a person perceives and manages challenges. Positive affect,3,11 high self-efficacy,1,6,9,11 optimism,6 hope,3,11 and sense of coherence11,12 have proven to be positively associated with better QOL. More dynamic psychological processes, such as appraisals or coping strategies used by persons with SCI, have also significantly contributed to predicting QOL over time.13

Studies reporting on how the environment affects perceived QOL in persons with SCI are less frequent, despite a probable relation between mobility impairments and the advantages that a well-designed, obstacle-free, secure, and friendly environment might convey to persons with SCI.14–16 A supportive familial environment6 and friends17 are important. Close persons of individuals with SCI have to adjust to the new situation, such as accepting a partner with altered needs and different degrees of daily care. Close persons of persons with SCI reported that becoming a caregiver is difficult and often dramatically life changing.18 Weitzenkamp19 reported high levels of depression in spousal caregivers, sometimes with higher rates of depression than in the partners with SCI. He also compared caregivers of persons with SCI with non-caregiving spouses and found that caregivers showed higher levels of depression and emotional and physical stress.

Familial caregivers hold a central position in the life of persons with SCI. Their health and QOL affect the health and well-being of the persons with SCI; life with SCI involves a systemic adjustment process. Ploypetch5 indicated that participants' perceived QOL was affected more if the caregiver was a member of the participant's family rather than a health professional. Lucke20 showed that the level of QOL in persons with SCI and their caregiving spouses may vary significantly over time, with a drop at 3 months post injury and an increase in QOL over the following 6 months. As a consequence, QOL has to be seen as dynamic, reflecting the adjustment to changes occurring over time in individuals and their environment. Some longitudinal studies of persons with SCI and their QOL have been conducted,21–23 but longitudinal studies involving the close persons of individuals with SCI remain scarce in the literature.20

The present report shows the changes in QOL during the first 2 years post injury in individuals with SCI and their close persons from 5 European countries. The first aim is to describe the physical, social, environmental, and psychological QOL in persons with SCI in relation to injury and sociodemographic variables.
Hypothesis 1: It is expected that QOL in persons with SCI is quite high, especially after 2 years.
Second, the experienced QOL of the individuals with SCI and their close persons is compared at 6 weeks, 12 weeks, 1 year, and 2 years post injury.
Hypothesis 2: We expected that the scores of the physical QOL domain would be lower in the persons with SCI compared to the noninjured close persons. No significant differences were expected between the scores of the psychological domain.
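Hypothesis 2 is a group-by-time comparison of repeated domain scores, for which a linear mixed model is a natural test. Below is a sketch using statsmodels; the file name and column names are hypothetical, and the analysis actually used in the study may differ.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Long-format data: one row per person per wave. Columns are hypothetical:
#   dyad_id  - links a person with SCI to his or her close person
#   group    - "SCI" or "close_person"
#   wave     - 0 (6 wk), 1 (12 wk), 2 (1 yr), 3 (2 yr)
#   physical - WHOQOL-BREF physical-domain score
df = pd.read_csv("whoqol_long.csv")  # hypothetical file

# A random intercept per dyad accounts for the pairing of observations;
# the group x wave interaction tests whether the trajectories diverge.
model = smf.mixedlm("physical ~ group * wave", data=df, groups=df["dyad_id"])
result = model.fit()
print(result.summary())
```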

18.
19.
OBJECTIVE—We have examined maternal mechanisms for adult-onset glucose intolerance, increased adiposity, and atherosclerosis using two mouse models for intrauterine growth restriction (IUGR): maternal protein restriction and hypercholesterolemia.

RESEARCH DESIGN AND METHODS—For these studies, we measured the amino acid levels in dams from two mouse models for IUGR: 1) feeding C57BL/6J dams a protein-restricted diet and 2) feeding C57BL/6J LDL receptor–null (LDLR−/−) dams a high-fat (Western) diet.

RESULTS—Both protein-restricted and hypercholesterolemic dams exhibited significantly decreased concentrations of the essential amino acid phenylalanine and the essential branched-chain amino acids leucine, isoleucine, and valine. The protein-restricted diet for pregnant dams resulted in litters with significant IUGR. Protein-restricted male offspring exhibited catch-up growth by 8 weeks of age and developed increased adiposity and glucose intolerance by 32 weeks of age. LDLR−/− pregnant dams on a Western diet also had litters with significant IUGR. Male and female LDLR−/− Western-diet offspring developed significantly larger atherosclerotic lesions by 90 days compared with chow-diet offspring.

CONCLUSIONS—In two mouse models of IUGR, we found reduced concentrations of essential amino acids in the experimental dams. This indicates that shared mechanisms may underlie the phenotypic effects of maternal hypercholesterolemia and maternal protein restriction on the offspring.

In humans, malnutrition during pregnancy results in babies with lower birth weight and an increased risk of neonatal mortality and morbidity (1). Low birth weight is also associated with an increased risk for certain chronic diseases, including type 2 diabetes, cardiovascular disease, and hypertension (2–4). One proposed explanation linking low birth weight to chronic diseases is the Barker “thrifty phenotype” hypothesis, which postulates that the lack of adequate nutrients in the intrauterine environment “programs” the offspring for survival in a nutrient-poor world. It follows that if the actual postnatal environment is not nutrient poor but instead nutrient rich, metabolic pathways will have been “malprogrammed,” leading to adult-onset metabolic syndrome diseases, including atherosclerosis and diabetes (5). A great deal of evidence now supports the Barker hypothesis (6); therefore, current research in humans and in animal models is focused on specific mechanisms for in utero programming (4).

Many types of maternal stresses in different animal models have been used to produce intrauterine growth restriction (IUGR) (7). In the current study, we used two mouse models of IUGR, one using maternal protein restriction to examine increased adiposity and glucose intolerance end points, and one using a high-cholesterol maternal environment in LDLR−/− mice to examine cardiovascular end points. Previous work using the rat model has shown that maternal protein restriction results in offspring with IUGR (4), low muscle mass (8), adult-onset glucose intolerance (9), hypertension (10,11), and early aging (12,13). Maternal effects of a low-protein diet included a significant decrease in placental 11β-hydroxysteroid dehydrogenase, an enzyme that protects the fetus from maternal glucocorticoids (14). A concomitant increase in glucocorticoid-inducible enzymes was found in the fetuses of dams on a low-protein diet (15).
Studies examining maternal programming for atherosclerosis have found a significant association between maternal hypercholesterolemia and increased atherosclerotic lesions in the offspring in newborn and adult rabbits (16,17), in adult mice (18), and in human fetuses (19) and children (20). Existing evidence for in utero programming from hypercholesterolemia (21) includes increased maternal oxidative stress (22) and an altered adaptive immune response to oxidized LDL (23). Although IUGR itself is associated with an increased risk for atherosclerosis in humans (24), high maternal cholesterol in humans has not been established as causative for IUGR. Using a rabbit model, however, it was shown that a moderate 0.2% cholesterol, low-fat chow gestational diet resulted in litters with IUGR (25). The decreased birth weight was associated with an excessive accumulation of lipids in the placenta, suggesting possible interference with nutrient transport to the fetus (25).

Because maternal protein restriction and hypercholesterolemia both create an abnormal maternal metabolic environment, we hypothesized that there may be a common disruption of metabolic pathways affecting the offspring. To test this hypothesis, we used two mouse models of in utero conditions leading to IUGR, one of protein restriction and one of hypercholesterolemia. We then looked for commonalities in the experimental dams to identify possible pathways for the developmental origins of metabolic syndrome diseases. In both models, the dams had decreased levels of certain essential amino acids.

20.

Background:

Chronic or recurrent musculoskeletal pain in the cervical and shoulder region is a common secondary problem after spinal cord injury (SCI), reported by 30% to 70% of individuals.

Objective:

The purpose of this study was to investigate the effect of electromyographic (EMG) biofeedback training, in addition to a standard exercise program, on reducing shoulder pain in manual wheelchair users with SCI.

Methods:

Fifteen individuals with SCI, C6 or lower, who were manual wheelchair users with shoulder pain were randomly assigned to 1 of 2 interventions. The Exercise group (n = 7) received instruction on a standard home-based exercise program. The EMG Biofeedback plus Exercise group (n = 8) received identical exercise instruction plus EMG biofeedback training to improve muscle balance and muscle relaxation during wheelchair propulsion. Shoulder pain was assessed by the Wheelchair User's Shoulder Pain Index (WUSPI) at baseline, at posttest 10 weeks after the start of intervention, and at follow-up 16 weeks after posttest.

Results:

The number of participants per group allowed only within-group comparisons; however, the findings indicated a beneficial effect from EMG biofeedback training. Shoulder pain, as measured by WUSPI, decreased 64% from baseline to posttest for the EMG Biofeedback plus Exercise group (P = .02). Shoulder pain for the Exercise group decreased a nonsignificant 27%. At follow-up, both groups showed continued improvement, yet the benefit of EMG biofeedback training was still discernible. The EMG Biofeedback plus Exercise group had an 82% reduction in shoulder pain from baseline to follow-up (P = .004), while the Exercise group showed a 63% reduction (P = .03) over the same time period.

Conclusions:

This study provides preliminary evidence that EMG biofeedback has value when added to an exercise intervention to reduce shoulder pain in manual wheelchair users with SCI. These findings indicate that EMG biofeedback may be valuable in remediating musculoskeletal pain as a secondary condition in SCI. This preliminary conclusion will need to be studied and verified through future work.

Key words: biofeedback, exercise, pain, spinal cord injury

Chronic or recurrent musculoskeletal pain in the cervical and shoulder region is a common secondary problem after spinal cord injury (SCI) that interferes with daily activities and reduces quality of life (QOL).1 Reported prevalence ranges from 30% to 70%.2–9 Both incidence and intensity increase with duration of injury and become an added concern for individuals with SCI as they age.8,10,11 Among manual wheelchair users, who rely on their upper extremities for mobility (wheelchair propulsion), 35% report shoulder pain; this increases to 60% for wheelchair athletes.12,13

Muscle imbalance plays a role in the development of musculoskeletal pain in manual wheelchair users, as muscles used for the forceful push phase strengthen relative to muscles used for the recovery phase.13,14 Recent studies have reported effective use of exercise programs designed to address this problem.15–18 Two of these studies were randomized trials that included an untreated control group to rule out spontaneous recovery.17,18 As a result of this research, exercise has become a standard intervention for reducing shoulder pain in manual wheelchair users with SCI.

In addition to muscle imbalance, muscle fatigue likely plays a significant role. Research on work-related musculoskeletal pain (WRMP) has demonstrated that disruption of normal work/rest cycles can promote localized muscle fatigue and pain.19–21 Electromyographic studies have found that repetitive tasks consist of alternating periods of muscle contraction and muscle relaxation; a muscle that is slow to relax – bypassing the rest portion of work/rest cycles – is at risk of developing WRMP.22–24 Clinical studies have reported slow and incomplete relaxation of the upper trapezius muscle during repetitive tasks, compared to pain-free controls, in musicians with occupational upper extremity pain25 and in individuals with cervical pain and headache.19,24 Manual wheelchair propulsion, with its highly repetitive muscle activity, puts the wheelchair user at risk of muscle fatigue and, eventually, pain.

This research suggests that EMG biofeedback interventions designed to improve work/rest cycling may be an effective addition to rehabilitation programs for cervical and shoulder pain in manual wheelchair users with SCI. At present, EMG biofeedback procedures are widely used for muscle training in rehabilitation, athletics, and the workplace and have been effective in training overactive muscles to relax quickly during functional activities.20,26–30 In the present study, EMG biofeedback training was designed to improve both muscle balance and work/rest cycles during functional wheelchair propulsion. The purpose of this study was to determine the extent to which EMG biofeedback training is associated with a reduction in shoulder pain, above and beyond the effects of a standard exercise program.
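The within-group results reported above reduce to a percent change in WUSPI totals plus a paired test. A minimal sketch with invented scores; the group size matches the study's n = 8, but the values and the choice of the Wilcoxon signed-rank test are assumptions, not the paper's data or method.

```python
import numpy as np
from scipy import stats

# Invented WUSPI totals (higher = more shoulder pain) for the
# biofeedback-plus-exercise group at baseline and follow-up; n = 8.
baseline = np.array([62, 48, 71, 55, 39, 80, 44, 67], dtype=float)
followup = np.array([10, 9, 14, 8, 7, 16, 9, 13], dtype=float)

# Percent reduction in group mean pain from baseline to follow-up.
pct_reduction = 100 * (baseline.mean() - followup.mean()) / baseline.mean()

# Paired nonparametric within-group comparison.
stat, p = stats.wilcoxon(baseline, followup)

print(f"Mean WUSPI reduction: {pct_reduction:.0f}%  (Wilcoxon p = {p:.3f})")
```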
