Similar Articles

20 similar articles found.
1.
Both Toll-like receptor 4 (TLR4) and monocytes integrate stimuli and may therefore contribute differently to chronic injury of a transplanted kidney.

Aim

The aim of our study was to determine whether monocyte TLR4 expression can serve as a diagnostic tool and possibly as a target for therapeutic intervention.

Materials

We studied 143 kidney transplant (KT) patients (88 male, 55 female; age 50.3 ± 12.8 years); the median time post-KT was 10.4, follow-up was 11.4 months, and 46 patients had a history of delayed graft function (DGF+). Monocyte TLR4 mRNA expression (TLR4ex) was measured; the control group comprised 38 healthy volunteers. DGF+ patients were divided by the median TLR4ex (−0.1034) into 2 groups: low TLR4 expression (L-TLR4ex) and high TLR4 expression (H-TLR4ex).

Results

We showed that compared with DGF−, the DGF+ group had much lower TLR4ex and worse KT function both at the time of TLR4 measurement (TLR-day) (serum creatinine [sCr], P = .002; estimated glomerular filtration rate [eGFR], P = .001) and after follow-up (sCr, P = .006; eGFR, P = .005). Comparison of DGF+ patients with L- versus H-TLR4ex showed no differences in TLR-day KT function but did show differences after follow-up (sCr, P = .01; eGFR, P = .02; ΔeGFR%, P = .001). Regression analysis showed an association of recipient age, tacrolimus concentration, and uremic milieu (ie, TLR-day sCr and eGFR) with TLR4ex. Reverse regression analysis indicated an association of TLR4ex (especially L/H-TLR4ex) with post-follow-up parameters of KT function and numeric/qualitative measures of change.

Conclusion

DGF affects the fate of a graft. Within several months after transplantation, TLR4ex of peripheral blood mononuclear cells declines in DGF patients. Low TLR4ex in DGF+ patients is associated with a poor prognosis for KT function. In DGF+ patients, the proper selection of immunosuppression (tacrolimus dosing) is very important, and higher tacrolimus concentrations may improve prognosis. Analysis of the change in TLR4ex may be a useful parameter for the real assessment of immunosuppression efficacy. It is important for transplanted organ function that peripheral blood mononuclear cells effectively leave the circulation and remain in the graft.

2.

Background

One of the serious postoperative complications associated with joint replacement is bacterial infection. In our recent investigations, iodine-supported titanium implants demonstrated antibacterial activity in both in vitro and in vivo studies. The surfaces of these implants carry a porous anodic oxide layer with the antiseptic properties of iodine. According to the literature, titanium with a porous anodic oxide layer has good osteoconductivity, but it is not clear whether the iodine affects the bone bonding of the implants.

Objectives

The aim of this study was to evaluate the influence of the iodine and the porous anodic oxide layer on the bone-bonding ability of titanium implants.

Study design & methods

Titanium rods were implanted in an intramedullary rabbit femur model, simulating a cementless hip stem. The implant rods were 5 mm in diameter and 25 mm in length. Three types of titanium rods were implanted: untreated titanium (control group [CL]), titanium with an oxide layer without iodine (oxide layer group [OL]), and iodine-treated titanium (iodine group [ID]). The rods were inserted into the distal femur. We assessed the bonding strength with a pull-out test at 4, 8, and 12 weeks after implantation. The bone-implant interfaces were evaluated at 4 weeks after implantation.

Results

Pull-out strengths of the ID implants were 202, 355, and 344 N, and those of the OL implants were 220, 310, and 329 N at 4, 8, and 12 weeks, respectively, significantly higher than those of the CL implants (102, 216, and 227 N). However, there was no significant difference between the ID and OL implants. Histological examination revealed that new bone formed on the surface of each type of implant, but significantly more bone made direct contact with the surfaces of the ID and OL implants.

Conclusions

This study showed that the new iodine coating on titanium has low toxicity and good osteoconductivity.

3.

Background

One of the main actions of vitamin D is the regulation of bone mineralization. Vitamin D is also linked to hypertension, diabetes, and cardiovascular disease. Vitamin D deficiency may result in osteomalacia, whereas its excess may mobilize calcium from bone. Kidney transplant recipients are also at risk of hypovitaminosis D because of impaired graft function. The aim of the study was to assess vitamin D concentration in patients after heart and kidney transplantation.

Material and methods

Ninety-eight stable heart transplant recipients were enrolled in the study; 80 kidney transplant recipients and 22 healthy volunteers served as controls. Laboratory tests, including 25-hydroxyvitamin D (calcidiol), were assayed using commercially available kits.

Results

Calcidiol deficiency (level below 10 ng/mL) was observed in 10% of the kidney transplant group and in 55% of the orthotopic heart transplant recipients (OHT). There was a positive correlation between calcidiol concentration and hemoglobin, kidney function, and serum glucose in kidney transplant recipients. In OHT, vitamin D correlated with age, kidney function, hemoglobin, cholesterol, low-density lipoprotein cholesterol, and glucose. Both groups had similar kidney function. In both groups, patients with an estimated glomerular filtration rate above 60 mL/min/1.73 m2 had significantly higher vitamin D. In OHT, vitamin D was higher in nondiabetic patients. In multivariate analysis in OHT, 24% of the variation in vitamin D was explained by kidney function (beta = −0.30; P = .02) and hemoglobin concentration (beta = 0.25; P = .03).

Conclusions

Vitamin D deficiency is more common in patients after heart transplantation than in kidney allograft recipients despite similar kidney function. The possible associations between the cardiovascular system and vitamin D merit further studies.

4.

Background and aim

Liver grafts from donors with a chronic and active history of alcohol abuse are usually ruled out immediately for use in liver transplantation (LT). The aim of our study was to evaluate the use of those grafts.

Methods

From 2011 to 2016, a study group (Group 1) of 5 adult LT patients transplanted with livers from donors with alcohol abuse was compared with a control group (Group 2) of 10 randomly matched liver transplant recipients. Preoperative, intraoperative, and postoperative data were compared.

Results

Among donors, serum gamma-glutamyl transferase values were significantly higher in Group 1. In recipients, post-LT laboratory exams showed significantly higher peak values of aspartate aminotransferase and alanine aminotransferase in Group 1; higher values of aspartate aminotransferase, alanine aminotransferase, and total bilirubin in Group 1 were also recorded on day 0. Early allograft dysfunction occurred at a higher rate in Group 1 (80% vs 20%, P = .025), with no differences in early rejection episodes or early surgical repeat interventions. All patients from both groups were alive after 20 ± 10 (range 6–35) months from LT.

Conclusion

Despite higher rates of early allograft dysfunction, selected liver grafts from donors with alcohol abuse can be accepted for LT with good clinical results.

5.

Background

Lumbar decompression surgery is often used to treat neurological symptoms of the lower extremity resulting from lumbar disease. The procedure also often improves accompanying low back pain (LBP). We studied the extent of LBP improvement after lumbar decompression surgery without fusion and the associated preoperative factors.

Methods

Patients (n = 140) with lumbar spinal stenosis (n = 90) or lumbar disc herniation (n = 50) were included. To evaluate the change in LBP, visual analog scale (VAS) scores and Oswestry disability index scores were measured before surgery and at 2 weeks, 3 months, and 6 months after surgery. Predictors of residual LBP were investigated using logistic regression analyses.

Results

In total, 140 patients were examined. The VAS scores for LBP before surgery and at 2 weeks, 3 months, and 6 months after surgery were 4.4 ± 3.0 (mean ± standard deviation), 1.1 ± 1.5, 1.3 ± 1.8, and 1.9 ± 2.2, respectively. LBP significantly improved by 2 weeks after surgery (P < 0.001), stabilized between 2 weeks and 3 months, but was significantly aggravated between 3 and 6 months after surgery (P < 0.001). At 6 months after surgery, 67 (47.9%) patients had a VAS score of >1. Predictors of residual LBP included severe preoperative LBP, degenerative scoliosis, and the size of the Cobb angle. The independent predictors determined by multivariate analysis were degenerative scoliosis and the size of the Cobb angle.

Conclusions

LBP was alleviated at 2 weeks after lumbar decompression surgery for lumbar disc herniation and lumbar spinal stenosis. Predictors of residual LBP after decompression included more severe LBP at baseline, degenerative scoliosis, and the size of the Cobb angle.

Level of evidence

Level 3.

6.

Purpose

This study aimed to prospectively compare the femoral tunnel enlargement at the aperture as well as inside the tunnel after anatomic anterior cruciate ligament (ACL) reconstruction with bone-patellar tendon-bone (BTB) graft to that with hamstring tendon (HST) graft.

Methods

This study included 24 patients with unilateral ACL rupture. Twelve patients underwent anatomic rectangular tunnel (ART) ACL reconstruction with BTB graft, and the remaining 12 underwent anatomic triple-bundle (ATB) ACL reconstruction with HST graft. Three-dimensional computer models of the femur and bone tunnels were reconstructed from computed tomography images obtained at 3 weeks and 1 year postoperatively. Femoral tunnel enlargement from 3 weeks to 1 year was evaluated by comparing the cross-sectional area (CSA) and was compared between the two groups.

Results

The CSA in the ART group at 1 year decreased at the aperture as well as inside the tunnel compared with that at 3 weeks. The CSAs of both tunnels in the ATB group at 1 year significantly increased at the aperture in comparison with those at 3 weeks and gradually decreased toward the inside of the tunnel. The enlargement rate at the aperture in the ART group was −12.9%, significantly smaller than that of the anteromedial graft (27.9%; P = 0.006) and the posterolateral graft (31.3%; P = 0.003) in the ATB group. The tunnel enlargement rate at 5 mm from the aperture in the ART group was also significantly smaller than that in the ATB group. At 10 mm from the aperture, there was no significant difference between the tunnel enlargement rate in the ART group and that of the anteromedial tunnel.

Conclusions

The tunnel enlargement rate around the aperture was significantly smaller after the ART procedure than after the ATB procedure. Thus, BTB graft might be preferable to HST graft with regard to femoral tunnel enlargement.

7.

Background

Cardiovascular events (CVE) contribute to serious complications and death after liver transplantation (LT). A troponin I (TnI) level >0.07 mg/L and prior cardiac disease are known independent predictors of posttransplant CVE. We evaluated a single-center cardiac workup to predict early cardiovascular morbidity and mortality after LT.

Patients and methods

We recruited 105 consecutive liver transplant recipients (male/female, 59/46; mean age, 51.66 ± 11.67 years). The cardiological assessment at evaluation for LT included medical history, electrocardiogram, echocardiography, Holter monitoring, and an exercise test. We collected data on CVE, including hypotension requiring catecholamines, arrhythmia, sudden cardiac death, pulmonary edema, and myocardial infarction, within 7 days after LT.

Results

CVE occurred during LT in 42 recipients (40%) and after LT in 9 patients (8.57%). The proposed cutoff level of TnI >0.07 mg/L did not correlate with CVE during the operation (P = .73) or after LT (P = .47). CVE during LT were associated with a history of arterial hypertension (P < .001), right ventricular systolic pressure (P < .05), and clinical scores: Child-Pugh (P = .04), Model for End-Stage Liver Disease (MELD) (P = .04), MELD incorporating serum sodium (P < .03), and integrated MELD score (P = .01). CVE after LT correlated only with perioperative arrhythmia (P < .001) and catecholamine usage (P < .05). Of interest, catecholamine usage during LT was associated with a prolonged stay in the intensive care unit (P < .05).

Conclusion

The single-center algorithm of noninvasive cardiac procedures without TnI assessment is optimal for evaluation before LT; however, medical history and the severity of the liver disease are crucial for short-term cardiovascular morbidity after LT.

8.

Background

Despite recommendations from the Japanese Society of Clinical Sports Medicine on how to prevent baseball injuries in youths, shoulder and elbow pain still occurs frequently in young baseball players. We conducted a questionnaire survey among baseball players at elementary schools across the country to understand players' practice conditions and to examine the risk factors for shoulder and elbow pain.

Methods

The questionnaire survey was conducted in September 2015 among elementary school baseball players who were members of the Baseball Federation of Japan.

Results

A total of 8354 players belonging to 412 teams (average age: 8.9 years) responded to the survey. Among the 7894 players who had no shoulder and/or elbow pain in September 2014, 12.3% experienced elbow pain, 8.0% shoulder pain, and 17.4% shoulder and/or elbow pain during the previous year. A total of 2835 players (39.9%) practiced four days or more per week, and 97.6% practiced 3 h or more per day on Saturdays and Sundays. Risk factors associated with shoulder and elbow pain included male sex, older age, playing as pitcher or catcher, and throwing more than 50 balls per day.

Conclusions

The survey revealed that Japanese elementary school baseball players train too much. Coaches should pay attention to older players, male players, pitchers, and catchers in order to prevent shoulder and elbow pain. Furthermore, elementary school baseball players should not be allowed to throw more than 50 balls per day.

Study design

Retrospective cohort study.

9.

Background

The main challenge with cytomegalovirus (CMV) prophylaxis in IgG donor-positive/recipient-negative (D+/R−) kidney transplant recipients is late-onset CMV disease. We evaluated a novel protocol for the prevention of late-onset CMV infection and disease in D+/R− organ recipients.

Methods

Our prospective, observational cohort study included 100 adult kidney transplant recipients. Prophylaxis with low-dose valganciclovir (450 mg/d, 3 times a week for 6 months) was administered to D+/R− recipients. Risk factors for CMV infection and disease were identified. Renal function and the outcomes of CMV infection and disease were compared between D+/R− (n = 15) and recipient-positive (R+; n = 81) organ recipients.

Results

D+/R− status was a significant independent risk factor, with high hazard ratios for CMV infection (2.04) and disease (10.3). The proportions of CMV infection in D+/R− and R+ recipients were 80% and 46% (P = .023), and those of CMV disease were 33% and 6.2% (P = .008), respectively. D+/R− recipients developed CMV infection and disease within 6 months after transplantation. However, both CMV infection- and disease-free survival rates beyond 1 year post-transplantation, defined as late onset, were stable in D+/R− recipients. Moreover, serum creatinine levels at 1 year post-transplantation were comparable between D+/R− and R+ recipients (1.45 ± 0.71 vs 1.16 ± 0.35 mg/dL, P = .26).

Conclusion

Our novel protocol prevented late-onset CMV infection and disease beyond 1 year post-transplantation in D+/R− recipients.

10.

Background

The selection of an optimal donor is crucial for successful hematopoietic stem cell transplantation (HSCT). It is therefore useful to know, in addition to basic human leukocyte antigen (HLA) gene matches, other immunogenic or nonimmunogenic parameters that predict transplant outcome.

Objective

A unified approach is necessary to provide a comprehensive view of the patient-donor compatibility characterization outside of standard HLA genes. The approach should be applicable as a tool for optimizing procedures for extended donor typing and/or verification typing of a donor.

Methods

The study used the summary, unification, and innovation of existing practical knowledge and experience of the Czech National Marrow Donor Registry of various factors beyond HLA matching with impact on transplant outcome.

Results

An information technology system–implemented procedure (a verification algorithm) is presented as a decision-support approach for discarding less suitable donors from the transplantation process early. It is intended primarily to help the transplant specialist establish optimal procedures for verifying and determining donor-critical factors.

Conclusions

A process defining HLA, killer cell immunoglobulin–like receptor, and cytokine typing strategies was proposed to provide support to a transplant specialist in refining the choice of a suitable donor.

11.

Background

Despite reported associations between intrapulmonary vascular shunting (IPVS) and morbidity and mortality in pediatric liver transplantation (LT), there are no guidelines for screening.

Objective

To investigate IPVS before and after pediatric LT.

Methods

Retrospective review of the records of all pediatric LT (n = 370) from 2005 to 2015 at a single institute in Japan. All children with cirrhosis and clinical suspicion of IPVS, without cardiac or pulmonary conditions, were included. Technetium-99m labelled macroaggregated albumin (99mTc-MAA) scans were performed before and after LT. The severity of IPVS was graded using shunt ratios.

Results

Twenty-four children fulfilled the inclusion criteria and underwent 99mTc-MAA scans. All showed mild (<20%) to moderate (20%-40%) grades of IPVS. Following LT, the mean shunt ratio regressed from 20.69 ± 6.26% to 15.1 ± 3.4% (P = .06). The median (range) follow-up was 17 (4–85) months. Mortality was zero. The incidences of portal vein thrombosis (4.2%), biliary strictures (12.5%), and graft loss (4.1%) in the study group did not differ significantly from those in the remainder of the 370 transplants (3.2%, 9.4%, and 3%, respectively). Subgroup analysis revealed hepatopulmonary syndrome (HPS) in 2 of the 24 children; their mean shunt ratios before and after LT were 39.2 ± 0.77% and 16.2 ± 8.5%, respectively (P = .08). There was 1 complication (an intra-abdominal abscess).

Conclusions

HPS is less likely in mild to moderate IPVS. LT may achieve comparable results when performed in the presence of mild to moderate IPVS.

12.

Background

Blood loss during liver surgery correlates with central venous pressure (CVP). The aim of this retrospective study was to identify the cutoff values of CVP and stroke volume variation (SVV) above which the risk of intraoperative blood loss exceeding 100 mL increases during living liver donor hepatectomies.

Method and Patients

Twenty-seven adult living liver donors were divided into 2 groups according to whether their intraoperative blood loss was less than (G1) or more than 100 mL (G2). The mean values of the patients' CVP and SVV at the beginning of the transection of the liver parenchyma were used as the cutoff points. Their correlation with intraoperative blood loss was evaluated using the χ2 test; P < .001 was regarded as significant.

Results

The cutoff points of CVP and SVV were 8 mm Hg and 13%, respectively. The odds ratios of having blood loss exceeding 100 mL were 91.25 (P < .001) and 0.36 (P < .001) for CVP and SVV, respectively.

Conclusion

A CVP of less than 5 mm Hg, as suggested by most authors, is not always clinically achievable. Our results show that a CVP of less than 8 mm Hg or an SVV of 13% can keep blood loss to a minimum of about 100 mL during parenchymal transection in a living donor hepatectomy. The measures used to lower CVP or increase SVV in our series were intravenous fluid restriction and the use of a diuretic.
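The cutoff analysis described in this abstract dichotomizes CVP at a threshold and tests its association with blood loss >100 mL via a 2×2 table and χ2 test, then reports an odds ratio. A minimal sketch of that kind of analysis follows; note that the counts below are invented for illustration only and are not the study's data.

```python
# Sketch of a 2x2 cutoff analysis: dichotomize CVP at a threshold (e.g. 8 mm Hg)
# and test the association with blood loss >100 mL.
# NOTE: the counts in `table` are hypothetical, NOT taken from the study.
from scipy.stats import chi2_contingency

# rows: CVP >= threshold vs CVP < threshold
# cols: blood loss > 100 mL vs <= 100 mL
table = [[12, 2],
         [3, 10]]

# Odds ratio from the 2x2 table: (a*d) / (b*c)
a, b = table[0]
c, d = table[1]
odds_ratio = (a * d) / (b * c)

# Chi-square test of independence (Yates correction applied for 2x2 by default)
chi2, p, dof, expected = chi2_contingency(table)
print(f"OR = {odds_ratio:.1f}, chi2 = {chi2:.2f}, P = {p:.4f}")
```

With these hypothetical counts the odds ratio is (12×10)/(2×3) = 20, illustrating how a large OR arises when most high-CVP donors bleed more than the cutoff volume.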

13.

Background

Model for End-Stage Liver Disease (MELD) score predicts multisystem dysfunction and death in patients with heart failure (HF). Left ventricular assist devices (LVADs) have been used for the treatment of end-stage HF.

Aim of the study

We evaluated the prognostic value of the MELD, MELD-XI, and MELD-Na scores in patients supported with the POLVAD MEV LVAD.

Materials and methods

We retrospectively analyzed data of 25 consecutive patients (22 men and 3 women) with a pulsatile-flow POLVAD MEV LVAD, divided into 2 groups: Group S (survivors), 20 patients (18 men and 2 women), and Group NS (nonsurvivors), 5 patients (4 men and 1 woman). Patients were classified as INTERMACS class 1 (7 patients) or class 2 (18 patients). Clinical data and laboratory parameters for the MELD, MELD-XI, and MELD-Na score calculations were obtained on postoperative days 1, 2, and 3. The study endpoints were death or survival at 30 days. MELD scores and complications were compared between Groups S and NS.

Results

Twenty patients survived, and 5 (4 men and 1 woman) died during observation. Demographics did not differ. MELD scores were nonsignificantly higher in the patients who died (Group NS): MELD preoperatively (21.71 vs 15.28, P = .225), on day 1 (22.03 vs 17.14, P = .126), and on day 2 (20.52 vs 17.03, P = .296); MELD-XI preoperatively (19.28 vs 16.39, P = .48), on day 1 (21.55 vs 18.14, P = .2662), and on day 2 (20.45 vs 17.2, P = .461); and MELD-Na preoperatively (20.78 vs 18.7, P = .46), on day 1 (23.68 vs 18.12, P = .083), and on day 2 (22.00 vs 19.19, P = .295), respectively.

Conclusions

The MELD scores did not identify patients with a pulsatile LVAD at high risk for mortality in our series. Further investigation is needed.

14.

Background

Cardiovascular events (CVE) may occur in 20% to 70% of liver transplant recipients, and major CVE are associated with poor long-term survival. Overall, the ability to identify patients at the highest risk of death after liver transplantation (LT) has improved. An abnormal pretransplant troponin I (TnI) level is regarded as one of the predictors of postoperative CVE. We evaluated the number of early CVE after LT and the impact of pretransplant TnI on cardiovascular morbidity.

Patients and methods

We prospectively enrolled 110 consecutive liver transplant recipients (M/F 67/43, age 53.3 ± 10.4 years, 32.7% with hepatitis C virus). Seven of them (6.4%) were on urgent protocol and 3 patients (2.7%) had re-LT. TnI level was measured at listing for LT and directly after LT; clinical outcomes were observed within the first 7 days after LT.

Results

CVE during LT occurred in 51 recipients (46.4%). CVE after LT in the intensive care unit were noted in 13 patients (11.8%). One patient (0.9%) died in the first 7 days after LT. A TnI level >0.07 did not correlate with CVE during the operation or in the 7 days after LT (P > .05), but the subgroup with TnI >0.07 before LT showed a trend toward higher TnI after LT (P = .065). Recipients with hepatitis C virus showed a trend toward higher TnI after LT (P = .061). CVE directly after LT correlated significantly with the Child-Pugh (P = .01), Model for End-Stage Liver Disease (MELD), MELD incorporating serum sodium, and integrated MELD scales (P < .001).

Conclusion

In our single-center algorithm, TnI with the canonical cutoff value of 0.07 was not an effective predictor of cardiac outcomes shortly after LT in our population.

15.

Background

The purpose of this study was to compare the mechanical stability of a relatively thin locking plate (FlexitSystem implant) with a relatively firm locking plate (TomoFix implant), both used for opening wedge high tibial osteotomy.

Methods

Seven fresh-frozen paired human cadaveric tibiae were used. The opening wedge high tibial osteotomies were fixated with the FlexitSystem implant in the left tibiae and with the TomoFix implant in the right tibiae. The tibiae were CT-scanned to determine bone mineral density. Axial loading was applied in a cyclic fashion for 50,000 cycles. Throughout the loading history, we compared the relative motions between the proximal and distal tibia using roentgen stereophotogrammetric analysis at set intervals. The strength of the reconstructions was also compared using a displacement-controlled compressive test until failure.

Results

One pair (with the lowest bone mineral density) failed during preparation of the osteotomy. The FlexitSystem implant displayed stability similar to the TomoFix implant, with low translations (mean 2.16 ± 1.02 mm vs 4.29 ± 5.66 mm) and rotations (mean 3.17 ± 2.04° vs 4.30 ± 6.78°); the differences were not significant. Although on average the FlexitSystem reconstructions were slightly stronger than the TomoFix reconstructions (mean 4867 ± 944 N vs 4628 ± 1987 N), no significant (p = 0.71) difference between the two implants was found.

Conclusion

From a biomechanical point of view, the FlexitSystem implant is a suitable alternative to the TomoFix implant for an opening wedge high tibial osteotomy.

16.
17.

Background

The purpose of this study was to evaluate the possibility of implanting the anterior atlantoaxial lateral mass intervertebral cage, a new type of fixation, by the transoral approach.

Method

This study examined the feasibility of in vivo implantation through quantitative measurements on dry atlantoaxial bones and implantation of the anterior atlantoaxial lateral mass intervertebral cage in specimens. The cages were implanted in 10 atlantoaxial joint specimens using the transoral approach. Eight anatomical parameters (width, thickness, ordinate, abscissa, and declination angle of the lateral mass) were measured on each of 30 dry atlas and axis bone specimens. These parameters determined the size and design of the cage and the manner of implantation.

Results

The course of the vertebral artery forms the safe boundary for transoral surgery. The area of working exposure was shaped like an inverted trapezoid. In the specimens, the anterior atlantoaxial lateral mass intervertebral cages could be successfully implanted using the transoral approach. The parameters of the anteriorly exposed human atlantoaxial lateral masses showed that there was enough space for safe anterior implantation of the cage. Transoral atlantoaxial reduction and plate surgery makes implantation of the anterior cage possible.

Conclusion

Implantation of the anterior atlantoaxial lateral mass intervertebral cage through the transoral approach is feasible.

18.

Background

The optimum approach in total hip arthroplasty (THA) should reduce the risk of postoperative dislocation or limping, be applicable in every case, and be reusable in the future. The purpose of this study was to introduce our transgluteal approach for THA and to evaluate the type and frequency of complications around the greater trochanter.

Methods

This study retrospectively evaluated 892 THA cases between January 2010 and March 2015 performed using our transgluteal approach that osteotomized only the lateral anteroinferior greater trochanter. The trochanteric fragment was reattached using one of three different protocols: Group A, three non-absorbable polyester sutures; Group B, two non-absorbable polyester sutures and one ultra-high molecular weight polyethylene (UHMWPE) fiber cable; or Group C, two UHMWPE fiber cables. Postoperative complications were assessed and recorded, and univariate logistic regression analyses were performed to determine whether risk factors and radiological complications around the greater trochanter were correlated.

Results

None of the hips required revision for infection, dislocation, or limping. The rate of radiological complications around the greater trochanter at 1 year was 19.2% in Group A, 16.3% in Group B, and 7.9% in Group C (p < 0.001). Risk factors for radiological complications included the patient's disease or the surgeon's experience in Group A, and the patient's age or the surgeon's experience in Group C. In the relationship between postoperative pain around the greater trochanter and radiological complications, there were no significant differences among the groups, and no group interaction was observed (p = 0.3875).

Conclusion

The UHMWPE fiber cable was effective in reducing complications of the reattached osteotomized greater trochanter in THA.

19.

Objective

This study investigated the relation between self-assessment of upper extremity function and locomotive syndrome in a general population.

Methods

Using the 25-question Geriatric Locomotive Function Scale (GLFS-25), 320 Japanese people (115 men, 205 women; mean age 67.6 years, range 40–92 years) were evaluated for locomotive dysfunction. All had completed a self-administered questionnaire including items on sex, weight, height, dominant hand, and the frequency of hand use in activities of daily living. We measured bilateral hand grip and key pinch strength as indicators of hand muscle function. Participants were assessed for upper extremity dysfunction using the Hand 10, a self-administered questionnaire for upper extremity disorders, and the Japanese Society for Surgery of the Hand version of the Disability of the Arm, Shoulder, and Hand questionnaire. Statistical analyses were conducted to clarify the association between upper extremity dysfunction and screening results for locomotive dysfunction.

Results

Of the 320 participants, 137 (47 men, 90 women) reported some upper extremity dysfunction. The GLFS-25 score had a significant positive correlation with age and Hand 10 scores, and significant negative correlations with dominant and non-dominant grip strength and with dominant and non-dominant key pinch strength. Univariate analysis revealed significant associations of locomotive syndrome with age, sex, bilateral hand grip, key pinch, and the Hand 10 score. Logistic regression analysis adjusted for age, sex, height, and weight revealed a significant association of locomotive syndrome with non-dominant hand grip (OR 0.73, 95% CI 0.61–0.87) and with the Hand 10 questionnaire score (OR 1.10, 95% CI 1.06–1.14).

Conclusion

Locomotive syndrome is associated with the decline of self-assessed and observed upper extremity function.

Study design

Cross-sectional study.

20.

Background

The objective of the study was to determine the normalization curve of the serum C-reactive protein (CRP) in elective shoulder arthroplasty.

Methods

This was a prospective study of 58 consecutive patients who had undergone elective shoulder arthroplasty: 41 received a reverse shoulder arthroplasty, 13 a total shoulder arthroplasty, and 4 a hemiarthroplasty. Based on a pilot study, blood samples to determine CRP values were obtained at baseline (1 h before surgery) and on the 1st, 2nd, 6th, 8th, and 14th postoperative days. None of the included patients had postoperative complications during the inpatient stay or any readmission during the three months after surgery.

Results

Mean CRP values showed a rapid increase on the 1st postoperative day (7-fold higher than baseline in cuff tear arthropathy, 11-fold higher in primary osteoarthritis, and 1-fold higher in acute fracture) and peaked on the 2nd postoperative day (14-fold higher than baseline in cuff tear arthropathy, 24-fold higher in primary osteoarthritis, and 2-fold higher in acute fracture). After the 2nd postoperative day, CRP values slowly decreased, reaching the normal range by the 14th postoperative day.

Conclusions

Serum CRP levels after elective shoulder arthroplasty rise rapidly to a peak on the 2nd postoperative day and then slowly decrease, returning to normal by the 14th day. Knowing the normalization curve of CRP can help in the diagnosis of acute infection after elective shoulder arthroplasty.
