Similar Articles

20 similar articles found.
1.

Purpose

To clarify whether there is any difference in mid-term clinical and radiologic outcomes between bone-grafted laminoplasty (BG LAMP) and non-bone-grafted laminoplasty (non-BG LAMP) when used to treat cervical spondylotic myelopathy.

Background

Conventional BG LAMP includes bone grafting at the lamina hinge site to prevent closure of the lamina postoperatively, but it often results in segmental fusion and sometimes causes loss of cervical mobility and lordotic alignment. Non-BG LAMP can now be performed to address this problem and preserve mobility postoperatively. However, there have been no studies comparing BG LAMP and non-BG LAMP to date.

Methods

Forty-one patients who underwent BG LAMP (n = 24) or non-BG LAMP (n = 17) and had 5 years of follow-up were enrolled in the study. Neurological status was assessed preoperatively and postoperatively using the Japanese Orthopedic Association (JOA) scoring system. The Numeric Rating Scale (NRS) was used to assess neck pain after surgery at the final visit. Radiographic parameters were evaluated at 1, 3, and 5 years after surgery. Postoperative segmental fusion was defined as the level at which the segmental flexion–extension range of motion was <1°.

Results

There was no significant difference in JOA score or recovery rate between the groups. The NRS score was significantly lower in the BG group, indicating less neck pain (P < .01). The lordotic angle and range of motion at C2–C7 were significantly decreased in the BG group (P < .05). Segmental fusion was evident from 1 year postoperatively in both groups, but the fusion rate was significantly higher in the BG group (P < .05).

Conclusions

Neurologic outcomes were similar between the two groups, whereas axial neck pain was lower in the BG group than in the non-BG group.

Level of evidence

IV

2.

Background

Cost-utility analysis of surgery for degenerative lumbar spondylolisthesis (DS) is essential for healthcare providers and patients to select appropriate treatment. The purpose of this study was to review the cost-utility of decompression alone versus decompression with fusion for DS.

Methods

A retrospective review of 99 consecutive patients who were treated for Meyerding grade 1 DS at two representative spine centers was performed. Patients with significant spinal instability were treated by decompression with fusion (F group, 40 patients); all others were treated by decompression surgery alone (D group, 59 patients). All patients were followed for three years. Demographic and radiographic data, health-related quality of life (HRQoL), and the direct cost for surgery were analyzed, and the incremental cost-effectiveness ratio (ICER) was determined using cost/quality-adjusted life years (QALY).
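The cost-utility analysis described above reduces to simple arithmetic: each arm's cost is divided by its quality-adjusted life years, and an incremental ratio compares the two arms. A minimal sketch follows; the costs echo the abstract's mean initial-surgery figures, but the QALY gains are invented placeholders, not values from the study:

```python
# Cost-utility arithmetic (illustrative only).
# COST_* echo the abstract's mean initial-surgery costs;
# QALY_* are hypothetical placeholders, not study data.
COST_D, COST_F = 7660.0, 18992.0   # mean direct cost in USD
QALY_D, QALY_F = 0.15, 0.18        # hypothetical QALY gains

def cost_per_qaly(cost, qaly_gain):
    """Cost-utility ratio of one treatment arm (USD per QALY)."""
    return cost / qaly_gain

def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    """Incremental cost-effectiveness ratio of a new treatment
    versus a reference treatment."""
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

d_ratio = cost_per_qaly(COST_D, QALY_D)          # D arm, USD per QALY
f_vs_d = icer(COST_F, QALY_F, COST_D, QALY_D)    # fusion vs decompression
```

The cost-effectiveness threshold mentioned in the conclusion (three times the per-capita gross domestic product) would then be compared against these ratios.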

Results

There were no differences between the groups in baseline demographics (D vs. F: age 68 ± 9 vs. 66 ± 7 years; 37% vs. 40% female) or HRQoL (ODI: D, 41 ± 16% vs. F, 46 ± 13%). The F group had a higher initial-surgery cost ($18,992 ± 2,932) but a lower reoperation frequency (7%) than the D group ($7,660 ± 2,182 and 12%, respectively). The three-year total direct cost was higher for F than for D ($19,222 ± 3,332 vs. $9,668 ± 6,168, p = .01). The ICER was higher for F at one year (D, $136,408 ± 187,911 vs. F, $237,844 ± 212,049, p < .01) but was comparable between F and D at three years (D, $41,923 ± 44,503 vs. F, $51,313 ± 32,849, p = .17).

Conclusion

At the three-year follow-up, the two methods had comparable cost-utility. Both methods were cost-effective (defined as an ICER within three times the per-capita gross domestic product).

3.

Background

Cutoff values for the Japanese Orthopaedic Association (JOA) shoulder score were calculated in patients undergoing rotator cuff repair, using the University of California at Los Angeles (UCLA) shoulder score as the reference.

Methods

Overall, 175 patients who underwent rotator cuff repair were enrolled in this study. The University of California at Los Angeles and Japanese Orthopaedic Association shoulder scores were evaluated before surgery and at 3, 6, 9, and 12 months after surgery. The cutoff value of the Japanese Orthopaedic Association shoulder score was determined using the 4-stage criteria of the University of California at Los Angeles shoulder score, with a University of California at Los Angeles score of 28 points as the boundary between the excellent/good and fair/poor groups.

Results

Both the JOA shoulder and UCLA shoulder scores showed significant improvement at 6, 9, and 12 months from the preoperative scores (p < 0.0001). There was a strong correlation between the total values of the two scores (r = 0.85, p < 0.0001). The cutoff value of the Japanese Orthopaedic Association shoulder score based on the highest accuracy from receiver operating characteristic curve analysis was 83 points.

Conclusion

A Japanese Orthopaedic Association shoulder score cutoff value of 83 was equivalent to a University of California at Los Angeles shoulder score cutoff value of 28 for distinguishing between excellent/good and fair/poor outcomes after rotator cuff repair.

4.

Background

The JOA (Japanese Orthopaedic Association) score has been a standard outcome measure for evaluating cervical myelopathy in Japan. Despite its reliability and convenience, the JOA score is subject to rating bias. The current study was conducted to delineate rater bias in the JOA score by comparing it with a new objective outcome measure.

Methods

Two hundred thirty-four operative candidates with cervical myelopathy were included in the study. The patients were divided into four groups according to the operating surgeon (92 patients in group A, 60 in group B, 38 in group C, and 44 in group D). Each patient's preoperative JOA score was recorded exclusively by the surgeon, whereas the JOACMEQ (Japanese Orthopaedic Association Cervical Myelopathy Evaluation Questionnaire) was completed by each patient. Disease severity, the most important prognostic factor, was equalized between patient groups by inverse-probability weighting (IPW). Cohen's d was used to assess the similarity between groups.
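Inverse-probability weighting equalizes a covariate by weighting each patient by the inverse of the estimated probability of belonging to their observed group given that covariate. A minimal stratified sketch is shown below; the records are invented toy data, not the study's patients:

```python
from collections import Counter, defaultdict

# Toy records: (surgeon_group, severity_stratum, JOA_score) -- invented.
patients = [
    ("A", "mild", 14), ("A", "mild", 13), ("A", "severe", 9),
    ("B", "mild", 12), ("B", "severe", 8), ("B", "severe", 9),
]

stratum_n = Counter(s for _, s, _ in patients)            # patients per stratum
group_in_stratum = Counter((g, s) for g, s, _ in patients)

def ipw_weight(group, stratum):
    # weight = 1 / P(group | stratum), estimated from stratum frequencies
    return stratum_n[stratum] / group_in_stratum[(group, stratum)]

# Severity-equalized (weighted) mean JOA score per surgeon group.
num, den = defaultdict(float), defaultdict(float)
for g, s, score in patients:
    w = ipw_weight(g, s)
    num[g] += w * score
    den[g] += w

weighted_mean = {g: num[g] / den[g] for g in num}
```

With these toy records the weighted means are 11.25 for group A and 10.25 for group B; in the study the weights would come from a propensity model of disease severity rather than simple stratum frequencies.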

Results

After adjustment, the difference in JOA score was only 0.1 between groups A and D and 0 between groups B and C; the corresponding values of Cohen's d were also very small (3% and 0.3%, respectively). In contrast, the averaged JOA scores of groups A and D were 0.4–0.8 points higher than those of groups B and C. Thus, surgeons A and D shared a tendency to give higher JOA scores than surgeons B and C did.

Conclusions

The current study confirmed a definite rater bias in the JOA score. The JOACMEQ should be applied as a more reliable outcome measure for evaluating myelopathy patients.

5.

Background

To increase the number of cadaveric kidney transplants in Japan, it is necessary to proactively use organs from all donors. Since the revision of the Organ Transplant Law, the number of organ donations after cardiac death (DCD) has decreased while the number of donations after brain death (DBD) has increased; overall, the number of donor organs and awareness of cadaveric transplantation have increased.

Methods

At our institution, 28 patients underwent cadaveric kidney transplantation from January 2001 to December 2016. These patients were classified into 2 groups according to DBD or DCD. Furthermore, 10 patients received transplants from expanded criteria donors (ECD) and 18 received them from standard criteria donors (SCD).

Results

Kidney graft survival and engraftment were observed for all patients. There were no significant differences in renal function at 6 months for DBD and DCD transplant recipients. Renal function at 1, 3, and 5 years and serum creatinine levels were better for the ECD group. Renal function at 5 years after transplantation was significantly better for the SCD group than for the ECD group; however, there was no difference in delayed graft function between the SCD and ECD groups. Comparisons of the 3 groups showed good renal function for transplants from DBDs, but there was no significant difference in survival rates.

Conclusions

Results were good for all patients. There were no significant differences in outcomes between patients who received transplants from ECDs and those who received transplants from SCDs.

6.

Background

Extended-release tacrolimus (TacER), administered once daily, offers improved adherence with reduced side effects while still maintaining an immunosuppressive potency equivalent to that of conventional tacrolimus preparations.

Methods

The study included 83 consecutive patients who received living-donor kidney transplants at our facility from June 2013 to December 2016. We compared 48 cases of induction with TacER and 35 cases of induction with cyclosporine (CyA). The observation period was 3 months after transplantation. Transplanted kidney function, rejection, infectious disease, lipid abnormalities, and glucose tolerance were compared.

Results

The 2 groups showed no significant difference in donor background or transplanted kidney function. Within the 3-month observation period, an acute rejection response was observed in 2 cases in the TacER group and in 8 cases in the CyA group. After transplantation, hyperlipidemia requiring medication was observed more frequently in the CyA group. The 2 groups did not show a marked difference in systemic infection or renal calcineurin inhibitor toxicity in histopathologic examination of the transplanted kidneys 3 months after surgery.

Discussion

Proactive use of TacER improves adherence while yielding immunosuppressive potency equivalent to that of conventional tacrolimus preparations. However, because tacrolimus has a potent blood glucose-elevating effect, direct comparison with the CyA group is important for assessing side effects.

Conclusion

TacER may also reduce side effects in the early postoperative period, suggesting its potential as a first-choice drug.

7.

Objectives

According to previous studies, an oversized cardiac allograft may have a negative impact on survival outcomes; however, because of the shortage of pediatric donor hearts, the use of oversized cardiac allografts is sometimes inevitable. In this study, we report the survival outcomes of pediatric patients in relation to the donor-recipient weight ratio.

Methods

Twenty-eight children, aged 3 months to 17 years, with dilated cardiomyopathy underwent primary cardiac transplantation at the National Taiwan University Hospital between 1995 and 2012. We analyzed these patients according to the donor-recipient weight ratio: group 1 (n = 19) with donor-recipient weight ratio <2.5 (median 1.1, interquartile range 1.0–1.6), and group 2 (n = 9) with donor-recipient weight ratio ≥2.5 (median 3.0, interquartile range 2.87–3.5).

Results

The 30-day survival rate was 100% for both group 1 and group 2 (P = 1). The survival rates for group 1 and group 2 were 95% vs 100% at 1 year, 84% vs 89% at 5 years, and 73% vs 61% at 10 years. The median survival was 14.4 years vs 12.9 years (P = .6313).

Conclusion

In this cohort, the use of oversized cardiac allografts in pediatric patients with dilated cardiomyopathy did not have a negative effect on short-term or long-term survival.

8.

Objectives

We analyzed the outcome of patients with implantable left ventricular assist devices (LVADs) at the University of Tokyo Hospital to compare those with centrifugal pumps (CE group: Duraheart and Evaheart) and those with axial-flow pumps (AX group: Heartmate II and Jarvik 2000).

Methods

A total of 68 patients who underwent implantation of LVADs (Duraheart: n = 15; Evaheart: n = 23; Heartmate II: n = 22; Jarvik 2000: n = 8) as a bridge to transplantation at our institution from May 2011 to April 2015 were retrospectively reviewed. All patients were followed through December 2015.

Results

The mean follow-up time was 1.95 ± 0.92 years in the CE group (74.1 patient-years in total) and 1.56 ± 0.56 years in the AX group (46.8 patient-years in total). By log-rank test, pump type (centrifugal vs axial-flow) was not associated with survival or driveline infection (1-year survival rate: 89% vs 100% [P = .221]; 1-year freedom from driveline infection: 40% vs 43% [P = .952]). The rates of freedom from cerebrovascular accident (CVA) at 1 year after LVAD implantation were 70% in the CE group and 96% in the AX group (P < .001). The CE group showed a higher frequency of CVA (0.472 vs 0.021 events per patient-year).

Conclusions

Our findings indicate that overall survival and driveline infection rates are similar between centrifugal and axial-flow pumps, but they suggest that patients with centrifugal pumps are more likely to develop CVAs than those with axial-flow pumps.

9.

Objectives

The techniques and outcomes of outflow reconstruction in living donor liver transplantation (LDLT) using cryopreserved homologous veins at the University of Tokyo Hospital are presented.

Methods

We performed 540 LDLTs from January 1996 to March 2015. Graft types included right liver graft (n = 262), left liver graft (n = 196), left lateral sector graft (n = 53), and posterior sector graft (n = 28). We routinely use cryopreserved homologous vein grafts for hepatic vein reconstruction to secure adequate outflow of the graft. In addition to presenting our techniques, we investigated cases of symptomatic outflow obstruction and their treatment.

Results

The 1-, 3-, and 5-year graft survival rates were 90.6%, 86.1%, and 83.5%, respectively. The incidence of severe complications (Clavien-Dindo grade IIIb or higher) was 38%. The overall incidence of outflow obstruction requiring invasive treatment was 1.9% (10/540), including 3 left liver grafts (1.5%, 3/196) and 7 right liver grafts (2.7%, 7/262). Regarding the patency of the reconstructed veins, the left, middle, and right hepatic veins achieved nearly 100% patency. In contrast, venous tributaries such as V5, V8, and the inferior right hepatic vein were frequently occluded in the postoperative course.

Conclusions

Outflow reconstruction is key to successful LDLT, and cryopreserved homologous vein grafts are useful for reliable hepatic vein reconstruction.

10.

Aim

We investigated clinical outcomes of patients in Japan with a history of long-term dialysis treatment.

Methods

We performed 1171 kidney transplantations between 2000 and 2015. Sixty of the patients had undergone dialysis therapy for >20 years before transplantation. We compared graft and patient survival between recipients with >20 years of dialysis (long dialysis group [LDG]) and those with <20 years (control group [CG]) in a case-control study in which donor and recipient sex and age, ABO compatibility, and calendar year of transplantation were matched.

Results

The average age was 52.8 ± 8.9 years in the LDG and 54.2 ± 12.6 years in the CG (P > .05). Durations of dialysis were 25.4 ± 1.57 vs 5.8 ± 5.8 years, respectively (P < .05). The graft survival rates at 3, 5, and 10 years were 91.6%, 89.9%, and 81.8% in the LDG vs 90.7%, 84.8%, and 78.3% in the CG, respectively (P > .05). The patient survival rates were 96.6%, 93.2%, and 88.6% in the LDG vs 94.5%, 91.0%, and 83.9%, respectively (P > .05). There was no significant difference in mean estimated glomerular filtration rate between the groups over the 10 years post-transplant.

Conclusion

The LDG showed satisfactory clinical outcomes comparable to those of the CG in graft survival, patient survival, and renal function.

11.

Introduction

Subnormothermic machine perfusion (SNMP) shows advantages for the preservation of grafts donated after cardiac death (DCD), and improvements in machine perfusion (MP) technology are important for enhancing organ preservation outcomes in liver transplantation. In this study, we focused on purified subnormothermic machine perfusion (PSNMP), in which a volume of perfusate is removed after the start of perfusion and replaced with modified University of Wisconsin-gluconate solution, and investigated the optimal perfusate purification volume by comparing several purification volumes under SNMP. Perfusate purification during MP is also proposed as a potential technique to enhance the organ quality of grafts from DCD and extended-criteria donors.

Methods

PSNMP at several purification volumes (0.5 L, 1.5 L, and 3 L) was compared with regular SNMP without any purification treatment (untreated control). In the PSNMP groups, perfusate was removed after the start of perfusion and replaced with fresh modified University of Wisconsin-gluconate perfusate of the same composition. Porcine livers subjected to 60 minutes of warm ischemia were then preserved using SNMP at 22°C for 4 hours.

Results

The concentrations of aspartate aminotransferase and lactate dehydrogenase during perfusion were significantly higher in the untreated group than in the intervention groups. There were no significant differences among the purification volume conditions.

Conclusions

The optimal volume of perfusate purification was examined through a simple experimental comparison between untreated and PSNMP conditions.

12.

Background

It remains unclear whether long fusion including lumbosacral fixation is needed in corrective surgery to obtain good global sagittal balance (GSB) in the treatment of traumatic thoracolumbar kyphotic deformity. The purposes of this study were to evaluate the compensatory mechanism of the spine after corrective surgery without lumbosacral fixation and to identify the parameters affecting the achievement of good GSB postoperatively.

Methods

Twenty subjects requiring corrective surgery (distal end of fixation at L3) were included in this study. The radiographic parameters were measured preoperatively and at one month after surgery: Sagittal Vertical Axis (SVA), Lumbar Lordosis angle altered by fracture (fLL), Thoracic Kyphosis angle altered by fracture (fTK), Pelvic Tilt (PT), Sacral Slope (SS), Pelvic Incidence (PI), Segmental Lumbar Lordosis (sLL: L3-S/L4-S), and local kyphotic angle. The correlation between correction of the local kyphotic angle (CLA) and the change in each radiographic parameter was evaluated. Postoperatively, subjects with SVA <50 mm and PI−fLL <10° were regarded as the "good GSB" group (G group). The radiographic parameters affecting the achievement of good GSB were statistically evaluated.

Results

fLL, sLL:L3-S, and sLL:L4-S decreased indirectly as the local kyphosis was corrected directly (CLA: 26.5 ± 8.6°) (P < 0.001). CLA and the change in fLL showed a significant correlation (r = 0.821), with the regression equation Y = −0.63X + 3.31 (Y: change in fLL; X: CLA). The radiographic parameters significantly affecting the achievement of good GSB were SVA, PT, PI−fLL, sLL:L3-S, and sLL:L4-S (P < 0.01).

Conclusion

The main compensatory mechanism was the decrease of lordosis in the lumbar spine. fLL was decreased to approximately 60% of CLA after surgery. SVA was not corrected by the compensatory mechanism.

13.

Study Design

Multicenter retrospective study.

Background

Postoperative surgical site infection is one of the most serious complications following spine surgery. Previous studies do not appear to have investigated pyogenic discitis following lumbar laminectomy without discectomy. This study aimed to identify risk factors for postoperative pyogenic discitis following lumbar decompression surgery.

Methods

We examined data from 2721 patients undergoing lumbar laminectomy without discectomy in five hospitals from April 2007 to March 2012. Patients who developed postoperative discitis following laminectomy (Group D) and a 4:1 matched cohort (Group C) were included. Fisher's exact test was used to determine risk factors, with values of p < 0.05 considered statistically significant.

Results

The cumulative incidence of postoperative discitis was 0.29% (8/2721 patients). All patients in Group D were male, with a mean age of 71.6 ± 7.2 years. Postoperative discitis occurred at L1/2 in 1 patient, L3/4 in 3 patients, and L4/5 in 4 patients. Except for the patient with discitis at L1/2, every patient developed discitis at the level of decompression. The associated pathogens were methicillin-resistant Staphylococcus aureus (n = 3, 37.5%), methicillin-susceptible Staphylococcus epidermidis (n = 1, 12.5%), methicillin-susceptible S. aureus (n = 1, 12.5%), and unknown (n = 3, 37.5%). In the analysis of risk factors for postoperative discitis, Group D showed a significantly lower proportion of patients who underwent surgery in winter and a significantly higher proportion of patients with Modic type 1 change in the lumbar vertebrae compared with Group C.

Conclusions

Although further prospective studies using other preoperative evaluation modalities are needed, our data suggest that Modic type 1 change is a risk factor for discitis following laminectomy. Latent pyogenic discitis should be carefully ruled out in patients with Modic type 1 change. If lumbar laminectomy is performed in such patients, more careful observation is necessary to prevent the development of postoperative discitis.

14.

Background

Sarcoidosis is a chronic systemic disease of unclear etiology characterized by the formation of noncaseating granulomas. It is unclear whether patients with sarcoidosis are suitable organ donors.

Case

We treated a 56-year-old woman with pulmonary sarcoidosis who donated her kidney. She was previously in good health and was diagnosed with pulmonary sarcoidosis during her preoperative examination. Because she presented with no symptoms and was otherwise in good condition, donor nephrectomy was performed.

Results

Baseline biopsy examination showed no evidence of sarcoidosis. One year after transplantation, both the donor and the recipient had not developed kidney dysfunction or recurrence of sarcoidosis.

Conclusion

This is a rare case in which a patient with pulmonary sarcoidosis donated a kidney for transplantation, and both the recipient and the donor remained clinically healthy. A patient with sarcoidosis and no kidney lesions may be able to donate a living kidney, as transplantation appears to be safe for both the recipient and the donor.

15.

Background

In patients eligible for organ transplantation, the Kidney Disease Improving Global Outcomes (KDIGO) guidelines specifically recommend avoiding red blood cell transfusions (RBCT) when possible to minimize the risk of allosensitization.

Objective

To assess the effect of perioperative RBCT on outcomes in living-related kidney transplantation (LRKT) recipients.

Methods

We retrospectively assessed 97 patients who underwent LRKT and whose data were evaluable at our institution between March 2009 and May 2016. We measured serum creatinine levels and calculated the estimated glomerular filtration rate (eGFR) at 3 months, 6 months, and 1 year after kidney transplantation (KTx). We evaluated the rejection rate within a year after KTx. We compared the renal function and rejection rate between those who received blood transfusions (n = 21) and those who did not (n = 76) during the perioperative period.

Results

Among patient characteristics, the rate of ABO-incompatible KTx and the mean hemoglobin level before KTx differed significantly between the groups. Serum creatinine levels and eGFR within 1 year after KTx did not differ significantly between the two groups. The rejection rates in those who received blood transfusions and those who did not were 28.6% (6/21 patients) and 25.0% (19/76 patients), respectively (P = .741).

Conclusions

We found that the rejection rate was slightly higher in patients who received perioperative RBCT than in those who did not, but the difference was not significant within a year after KTx. Perioperative RBCT may not affect renal function within a year after KTx.

16.

Objectives

Although soft tissue sarcoma (STS) is rare, its incidence is increasing among older patients. Few studies have compared the outcomes between conservative and surgical treatments for STS patients aged ≥80 years. We assessed the outcomes of both treatments in this population and the association between older age and surgical outcome.

Methods

We recruited consecutive patients with STS aged ≥80 years treated at our institution between January 2006 and May 2014. We recommended surgical resection for all patients without multiple distant metastases. Overall survival and sarcoma-specific survival were assessed using the Kaplan–Meier method.
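Survival in this study was assessed with the Kaplan–Meier method, i.e., the product-limit estimator computed over event and censoring times. A minimal sketch follows, using invented follow-up data rather than the study's records:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.

    times:  follow-up time for each patient
    events: 1 if death occurred at that time, 0 if censored
    Returns a list of (time, survival_probability) at each event time.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = seen = 0
        while i < len(data) and data[i][0] == t:  # group tied times
            deaths += data[i][1]
            seen += 1
            i += 1
        if deaths:
            surv *= (at_risk - deaths) / at_risk
            curve.append((t, surv))
        at_risk -= seen  # deaths and censored patients leave the risk set
    return curve

# Invented example: 4 patients, deaths at months 1 and 3, censoring at 2 and 4.
curve = kaplan_meier([1, 2, 3, 4], [1, 0, 1, 0])
```

Here the estimate drops to 0.75 after the first death and to 0.375 after the second, since only two patients remain at risk by month 3.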

Results

Of the 39 patients with STS who presented at our institution, 37 were included in this analysis (19 men and 18 women with a median age of 85 [range 80–94] years). Tumors were classified as Stage IB (n = 3), IIA (n = 6), IIB (n = 3) or III (n = 24). Four patients underwent conservative therapy and 33 underwent surgical resection. The most common tumor site was the lower extremity, and the majority of tumors were classified as undifferentiated pleomorphic sarcoma. The follow-up rate was 100%. One-year sarcoma-specific survival rates were 25.0% in the conservative therapy group and 90.9% in the surgical resection group. No associations were found between age ≥85 years and perioperative complications or clinical outcome.

Conclusions

Surgical resection was associated with relatively few complications given the age group and improved the prognosis of older patients with STS. Surgical resection of STS with curative intent should be considered in older patients.

17.

Purpose

After the Kasai procedure for biliary atresia (BA), most patients develop severe splenomegaly, which tends to improve after liver transplantation. However, fluctuations in splenic volume long after transplantation remain to be elucidated.

Patients and Methods

Seventy-one consecutive patients who had undergone pediatric living donor liver transplantation (LDLT) for BA were followed up in our outpatient clinic for 5 years. They were classified into 3 groups according to clinical outcome: a good course group (GC, n = 41) maintained on one immunosuppressant or none, a liver dysfunction group (LD, n = 18) maintained on 2 or 3 types of immunosuppressants, and a vascular complication group (VC, n = 11). Splenic and hepatic volumes were calculated by computed tomography in 464 examinations, and values were compared before and after treatment, especially in the VC group.

Results

Splenic volume decreased exponentially in the GC group, with a mean (SD) splenic volume to standard splenic volume ratio of 1.59 (0.33) at 5 years after liver transplantation. Splenic volume to standard splenic volume ratios were greater in the VC and LD groups than in the GC group. Patients in the VC group with portal vein stenosis developed liver atrophy and splenomegaly, whereas those with hepatic vein stenosis developed hepatomegaly and splenomegaly. Interventional radiology treatment tended to improve the associated symptoms.

Conclusions

Fluctuations in splenic volume long after pediatric LDLT for BA may reflect various clinical conditions. Evaluating both splenic and hepatic volumes can facilitate understanding of clinical conditions following pediatric LDLT.

18.

Background

Although living donor liver transplantation for obese recipients has increased, whether posttransplant outcomes in obese recipients are inferior to those in nonobese recipients has not been determined.

Methods

From January 2001 to December 2016, there were 58 (6%) obese patients (body mass index ≥30) in a cohort of 973 adult patients who underwent living donor liver transplantation. Propensity score matching and classification were performed based on the type of obesity, yielding 58 patients in the obese group and 141 in the nonobese group. We compared posttransplant outcomes, including the Model for Early Allograft Function (MEAF) score and early allograft dysfunction (EAD).

Results

EAD was found in 11 (19%) and 31 (22%) patients in the obese and nonobese groups, respectively (P = .71). The obese group had a higher MEAF score than the nonobese group (5.2 vs 4.5, P = .007). The mean hospitalization of the obese group was shorter than in the nonobese group (32 vs 42 days, P = .003). Other posttransplant outcomes were similar between the obese and nonobese groups, including acute cellular rejection (8 vs 10 cases, P = .17), early graft failure (8 vs 12 cases, P = .30), index hospital mortality (6 vs 11 cases, P = .58), and comprehensive complication index (26.0 vs 24.6, P = .76).

Conclusion

Posttransplant outcomes of the obese group were not inferior to those of the nonobese group. However, based on the significantly higher MEAF scores, obesity may affect the severity of EAD and the incidence of early graft failure.

19.

Background

A few cohort studies have examined which patients with lumbar spinal stenosis are likely to need surgery because of deterioration of symptoms. However, data on the management of lumbar spinal stenosis remain insufficient because of a lack of established prognostic factors for the need for surgery. The purpose of this study was to identify prognostic factors associated with the need for surgical treatment in patients with lumbar spinal stenosis.

Methods

Patients with lumbar spinal stenosis from our hospital and related facilities were enrolled. Eligibility criteria were as follows: age 50–85 years; a condition meeting the definition of lumbar spinal stenosis (LSS); neurogenic intermittent claudication caused by numbness and/or pain in the lower limbs; and magnetic resonance imaging-confirmed symptomatic LSS. We followed 274 patients (151 men; mean age, 71 ± 7.4 years) for 3 years to identify prognostic factors. We used a multivariate logistic regression model to investigate the association between the indication for surgical treatment (within 3 years) and age, sex, complications, depression, illness duration, the presence of cauda equina symptoms, and the presence of degenerative spondylolisthesis/scoliosis.

Results

In the survey conducted 3 years after treatment, 185 patients responded (follow-up rate 67.5%). In 82 patients, surgery was performed during the follow-up period. The multivariate logistic regression model showed that the presence of cauda equina symptoms and the presence of degenerative spondylolisthesis/scoliosis were significantly associated with the indication for surgical treatment within 3 years.

Conclusions

This study showed that the presence of cauda equina symptoms and degenerative spondylolisthesis/scoliosis were prognostic factors associated with the indication for surgery in patients with lumbar spinal stenosis.

20.

Background

Meniscus surgery is the most commonly performed orthopedic surgery, yet despite the recent emphasis on saving the meniscus, little is known about the current status of meniscus surgery in many countries, including Japan. The National Database of Health Insurance Claims and Specific Health Checkups of Japan and the Statistics of Medical Care Activities in Public Health Insurance track meniscus surgeries through health insurance claims: the National Database provides numbers for 2014 and 2015, and the Statistics of Medical Care Activities provides numbers from June 2011 to June 2016. Our aim was to analyze isolated meniscus surgery numbers and meniscus repair ratios by age group based on the National Database and to evaluate trends in meniscus repair ratios over the latest six years from the Statistics of Medical Care Activities.

Methods

Meniscus surgeries by age group were counted from the National Database for 2014–2015, and meniscus repair ratios (meniscus repairs/meniscus surgeries) were calculated. The numbers were also counted from the Statistics of Medical Care Activities in 2011–2016. For statistical analysis of annual trends of meniscus repair ratios, the Cochran–Armitage trend test was used. Meniscus surgeries with concomitant knee ligament surgeries were excluded.
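The Cochran–Armitage trend test named above can be computed directly from yearly counts of repairs and total surgeries. A minimal sketch follows; the counts are invented for illustration and are not the database figures:

```python
from math import sqrt

def cochran_armitage_z(successes, totals, scores=None):
    """Cochran-Armitage test for linear trend in proportions.

    successes[i] / totals[i] is the proportion (e.g. repairs/surgeries)
    in ordered category i; scores default to 0, 1, 2, ...
    Returns the z statistic (positive = increasing trend);
    |z| > 1.96 corresponds to two-sided p < 0.05.
    """
    if scores is None:
        scores = list(range(len(totals)))
    n = sum(totals)
    p = sum(successes) / n  # pooled proportion
    t = sum(s * (r - m * p) for s, r, m in zip(scores, successes, totals))
    var = p * (1 - p) * (
        sum(s * s * m for s, m in zip(scores, totals))
        - sum(s * m for s, m in zip(scores, totals)) ** 2 / n
    )
    return t / sqrt(var)

# Invented yearly counts: repairs out of 100 surgeries per year.
z = cochran_armitage_z([9, 15, 25], [100, 100, 100])
```

With these toy counts the steadily increasing repair proportion yields z of about 3.1, consistent with a significant upward trend.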

Results

According to the National Database, isolated meniscus surgeries totaled 34,966 in 2015, with peak ages of patients in their late teens and 60s. The meniscus repair ratio was 19% in 2014 and 24% in 2015. According to the Statistics of Medical Care Activities, the meniscus repair ratio was 9% in 2011 and significantly increased to 25% in 2016 (p = 0.0008). The ratio also increased significantly in each age group between the early 20s and late 70s.

Conclusions

Approximately 35,000 meniscus surgeries are performed in Japan annually, with peak ages in the late teens and 60s. The number of meniscus repairs has increased over the past six years.
