Similar documents
20 similar documents found (search time: 31 ms)
1.

Objective

Clinical characteristics of recipients of deceased donor renal transplantations were evaluated in the periods before and after implementation of the National Allocation System (NAS).

Patients and Methods

We retrospectively evaluated the clinical profiles of 42 consecutive deceased donor renal transplant recipients after implementation of the NAS (June 2008-December 2010) versus 42 before it. Patient and graft survival rates were assessed using the Kaplan-Meier method; graft function was assessed based on creatinine clearance estimated with the Cockcroft-Gault equation. Patient and donor data were obtained from medical records.
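The Cockcroft-Gault estimate used here can be sketched as follows (a minimal illustration, not the study's analysis code; the 0.85 correction factor for women is the standard one):

```python
def cockcroft_gault_crcl(age_years, weight_kg, serum_creatinine_mg_dl, female):
    """Estimate creatinine clearance (mL/min) with the Cockcroft-Gault equation.

    CrCl = (140 - age) * weight / (72 * sCr), multiplied by 0.85 for women.
    """
    crcl = (140 - age_years) * weight_kg / (72 * serum_creatinine_mg_dl)
    return 0.85 * crcl if female else crcl
```

For example, a 40-year-old, 72-kg male recipient with a serum creatinine of 1.0 mg/dL has an estimated clearance of 100 mL/min.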

Results

Recipients were older in the pre-NAS group (39 ± 8 vs 33 ± 8 years; P = .001), and the duration of preoperative dialysis was longer in the post-NAS group (103 ± 61 vs 50 ± 36 months; P < .001). The average number of human leukocyte antigen (HLA)-mismatched antigens was 3.4 ± 1.0 pre-NAS versus 3.9 ± 1.2 post-NAS (P = .05). Regarding recipient serological status, 9 patients were hepatitis C virus (HCV)(+) and 2 hepatitis B virus (HBV)(+) post-NAS, versus only 1 HCV(+) and no HBV(+) patients pre-NAS. Kaplan-Meier graft survival rates were 90% at 1 year and 85% at 3 years pre-NAS, similar to the 95% and 86% observed in the post-NAS group (P > .05). Likewise, patient survival rates in both groups at 1 and 3 years were 97%. Donor age, allograft loss, cold ischemia time, patient death, number of retransplantations, HBV(+) patients, and delayed graft function were similar between the groups (P > .05).
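Survival figures such as these come from the Kaplan-Meier product-limit estimator; a minimal sketch with hypothetical follow-up data (not the study's data):

```python
def kaplan_meier(records):
    """Kaplan-Meier product-limit estimator.

    records: iterable of (time, event) pairs, where event=True means graft
    loss and event=False means censoring. Returns [(time, survival)] at
    each distinct event time.
    """
    event_times = sorted({t for t, event in records if event})
    survival = 1.0
    curve = []
    for t in event_times:
        deaths = sum(1 for time, event in records if time == t and event)
        at_risk = sum(1 for time, _ in records if time >= t)
        survival *= 1 - deaths / at_risk  # product-limit update
        curve.append((t, survival))
    return curve
```

With records [(1, True), (2, False), (3, True), (4, True), (5, False)], the curve drops to 0.80 at t = 1 and to about 0.53 at t = 3; the censored subjects leave the risk set without lowering the curve.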

Discussion

After introduction of the NAS, transplant recipients were older, had undergone a longer duration of dialysis, had a greater number of HLA-mismatched antigens, and more were HCV(+). No differences were observed in short-term patient or graft survival rates.

2.

Introduction

Grafts from older donors, or grafts in recipients with a greater body mass index (BMI) than the donor, may develop hyperfiltration syndrome, which shortens renal graft survival.

Objectives

To assess whether the differences in weight and BMI between donor and recipient correlated with renal function, proteinuria, or graft survival among recipients of grafts from expanded criteria donors.

Materials and methods

We undertook a prospective, observational study of 180 recipients of grafts from expanded criteria donors transplanted between 1999 and 2006. All grafts had been biopsied previously for viability. The recipients underwent immunosuppression with basiliximab, late introduction of tacrolimus, mycophenolate mofetil, and steroids. The study population was divided into three groups by tertile of the donor-to-recipient weight ratio (<1, n = 64; 1-1.2, n = 56; >1.2, n = 60) and of the donor-to-recipient BMI ratio (<0.97, n = 59; 0.97-1.13, n = 60; >1.13, n = 60). The glomerular filtration rate was estimated with the Modification of Diet in Renal Disease (MDRD) equation.
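The abbreviated (4-variable) MDRD estimate referred to here takes roughly this form; a sketch assuming the common IDMS-traceable coefficient of 175 (some versions use 186):

```python
def mdrd_egfr(serum_creatinine_mg_dl, age_years, female=False, black=False):
    """Estimate GFR (mL/min/1.73 m2) with the 4-variable MDRD equation."""
    egfr = 175.0 * serum_creatinine_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742  # standard sex correction factor
    if black:
        egfr *= 1.212  # standard race correction factor
    return egfr
```

For a 60-year-old male with a serum creatinine of 1.0 mg/dL this yields roughly 76 mL/min/1.73 m2.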

Results

The mean age of the donors was 63.54 years and of the recipients, 58.38 years. The male-to-female ratio was 52:48 among donors and 57.8:42.2 among recipients (P = NS). No significant differences in overall graft survival were observed between the tertiles. There was a negative correlation between the donor-to-recipient weight ratio and serum creatinine at 1 (P < .001), 3 (P = .013), and 12 months (P = .005) after transplantation, and a positive correlation with the MDRD estimate at 1 month (P < .001). No relation was noted between weight and proteinuria at 1 (P = .25), 3 (P = .51), or 12 months (P = .90). The results were similar when the BMI ratio was analyzed against creatinine, the MDRD estimate, or proteinuria, as well as in cases of a female donor to a male recipient.

Conclusions

Differences in weight between donor and recipient did not appear to affect graft survival or proteinuria among patients receiving grafts from expanded criteria donors, though they may be related to renal function during the early posttransplant period.

3.

Objective

To study the influence of nonimmunologic factors on the outcome of extended criteria deceased donor (DD) kidney transplants.

Method

This is a retrospective study of DD transplantation carried out from January 1, 2003 to December 31, 2007, to investigate the impact on graft survival and function of donor renal function at retrieval, cold ischemia time (CIT), delayed graft function (DGF), acute rejection episodes (ARE), age, and weight of donors and recipients, transplant center activities, cause of donor death, donor-recipient gender pairing and size of the donating intensive care unit (ICU).

Results

At retrieval, 31.7% of donors had a creatinine clearance <60 mL/min by the Cockcroft-Gault formula and 32% were aged >40 years. CIT >24 hours, DGF, and ARE occurred in 27.1%, 33.4%, and 16.5% of cases, respectively. Overall 1- and 5-year graft survival rates were 88% and 79.8%, and patient survival rates 96.6% and 92.3%, respectively. Graft function was inferior with ARE (P = .0001), DGF (P = .0001), CIT >20 hours (P = .005), nontraumatic donor death (P = .022), and donor ICU bed capacity <20 (P = .03). The odds ratios (OR) for graft loss with DGF, ARE, and donor right kidneys were 7.74 (95% confidence interval [CI], 6-13.4; P = .0001), 4.47 (95% CI, 2.6-7.6; P = .0001), and 1.7 (95% CI, 1-2.8; P = .045), respectively. Graft function was not influenced by donor renal function at retrieval, donor weight, or donor-recipient gender pairing.

Conclusion

CIT and ARE had an impact on both graft survival and function. DGF and cerebrovascular accident as the cause of donor death negatively affected graft function during follow-up. ICU center experience had a positive impact on graft survival. Patient survival was affected by recipient age >50 years and by female-to-male donation versus other gender pairings. Neither donor age nor an acute terminal rise in donor serum creatinine affected graft function, graft survival, or patient mortality.

4.

Introduction

In patients who receive a kidney transplant from expanded criteria donors (ECDs), few studies are available concerning the relation between the clinical characteristics, pretransplant biopsies, and graft outcomes.

Aim

To identify early clinical markers predicting worse graft survival in recipients of kidneys from ECDs.

Materials and methods

Between 1999 and 2006, we performed a prospective, observational study in 180 recipients of kidney grafts from ECDs that had undergone a preoperative biopsy to evaluate viability. The patients received immunosuppression with basiliximab, late introduction of tacrolimus, mycophenolate mofetil, and steroids. Data were gathered on demographic and posttransplantation clinical characteristics at 1, 3, 6, and 9 months, including estimates of proteinuria and of the glomerular filtration rate using the Modification of Diet in Renal Disease (MDRD) formula.

Results

The mean age of the donors was 63.54 years and of the recipients, 58.38 years. A creatinine clearance below the median (40 mL/min; interquartile range, 32-50 mL/min) in the first posttransplant year was significantly associated with worse death-censored graft survival (log-rank 14.22; P < .0001). A proteinuria value above the median (100 mg/24 h; interquartile range, 40-275 mg/24 h) at 1 year posttransplant also significantly reduced death-censored graft survival (log-rank 14.3; P < .0001). Multivariate Cox analysis showed that a creatinine clearance <40 mL/min in the first year (hazard ratio [HR], 5.7; 95% confidence interval [CI], 1.62-20.37; P = .007) and proteinuria greater than 100 mg/24 h at 1 year (HR, 8.3; 95% CI, 2.15-32.06; P = .002) were independent risk factors for death-censored graft loss after adjusting for donor age and acute rejection episodes.

Conclusions

Limited renal function and/or elevated proteinuria at 1 year posttransplant were associated with worse kidney graft survival among recipients of kidneys from ECDs.

5.

Background

Autosomal dominant polycystic kidney disease (ADPKD) is a hereditary disease that frequently leads to end-stage renal disease and is a common indication for kidney transplantation. We sought to evaluate the demographic characteristics, graft and patient survival, and some posttransplantation complications among ADPKD recipients.

Methods

This retrospective study included 445 renal transplant recipients, among whom 48 had ADPKD. We excluded patients with pretransplantation diabetes mellitus. We evaluated patient and graft survivals as well as posttransplantation complications.

Results

There was no difference between the 2 groups with respect to demographic or transplant characteristics, except for older age in the ADPKD group (51.2 ± 8.6 vs 44 ± 13.1 years; P < .001). We also observed no significant difference with regard to immediate graft function, immunological complications, or graft and patient survival. Although not significant, there was a lower incidence of proteinuria and a greater number of acute rejection episodes among ADPKD patients. As for posttransplantation complications, there was no difference in the prevalence of hypertension, but erythrocytosis was more common in the ADPKD group. The incidence of posttransplantation diabetes mellitus was significantly greater among ADPKD patients (33.3% vs 17.1%; P = .009) and remained so after adjusting for confounding variables by multivariate analysis, with an adjusted odds ratio of 2.3 (95% confidence interval, 1.008-5.136; P = .048).

Conclusion

Our results suggest that ADPKD patients display a greater incidence of posttransplantation diabetes mellitus; ADPKD emerged as an independent predictor of this complication.

6.

Background

Multidetector computerized tomography (MDCT) is less invasive than conventional angiography and has the advantage of allowing assessment of vessels and surrounding anatomic variants before laparoscopic nephrectomy.

Methods

From May 2005 to March 2011, 62 consecutive living kidney donors with a mean age of 45.3 ± 12.7 years (range, 24-70 years; male:female, 26:36) underwent laparoscopic nephrectomy, donating to paired recipients with a mean age of 44.8 ± 14.0 years (range, 17-74 years; male:female, 38:24). The clinical characteristics and laboratory data of donors and recipients were collected for analysis. Graft function, as indicated by the estimated glomerular filtration rate (eGFR), was taken from the last stable visit for donors and the best value displayed by the recipients.

Results

There was no significant correlation between CT kidney volume and eGFR. By univariate analysis, donor age was associated with worse graft function (0.51 mL/min lower eGFR per year of donor age; P < .0001). Female sex and a higher effective renal plasma flow (ERPF)/body mass index (BMI) ratio were associated with better graft function; conversely, body weight and BMI were associated with poorer graft function on both univariate and multivariate analysis. An ERPF <220 mL/min and a donor age >45 years were associated with significantly lower eGFR. CT kidney volume <100 mL had no effect.

Conclusions

Our preliminary data suggest that CT kidney volume does not predict posttransplantation graft function, but MDCT is still important for analysis of anatomy before laparoscopic nephrectomy among living donors.  相似文献   

7.

Background

It is generally recognized that living donor kidney transplantation (LDKT) grafts are superior to deceased donor kidney transplantation (DDKT) grafts. We compared survival and functional outcomes of LDKT and DDKT grafts.

Methods

Among 1000 kidneys transplanted from 1995 to 2008, we selected grafts surviving >5 years, excluding pediatric, multi-organ transplantation, and retransplantations (n = 454).

Results

There were 179 kidneys from deceased donors and 275 from living donors. Recipients showed no difference in age, gender, or cause of renal failure. Donors were younger in the DDKT group (30.6 vs 38.5 years; P < .05). There were more male donors in the DDKT group (73.2% vs 54.5%; P < .05). Deceased donors showed a greater mean number of HLA mismatches (4.2 vs 2.7; P < .05). Death-censored graft survival at 10 years showed no difference (DDKT 88.9% vs LDKT 88.9%; P = .99). Mean serum creatinine at 5 years was 1.41 mg/dL for DDKT and 1.44 mg/dL for LDKT (P = .75). Mean estimated glomerular filtration rate at 5 years was 67.8 mL/min/1.73 m2 for DDKT and 62.1 mL/min/1.73 m2 for LDKT (P = .23). Twenty-three DDKT grafts (12.8%) and 47 LDKT grafts (17.1%) experienced acute rejection episodes (P = .22). DDKT recipients showed more cases of viral and bacterial infections compared with LDKT recipients (viral, 11.7% vs 2.2% [P < .05]; bacterial, 21.8% vs 7.3% [P < .05]).

Conclusion

Among kidney grafts surviving >5 years, there was no difference in survival or serum creatinine levels at 5 and 10 years between DDKT and LDKT grafts.

8.

Objectives

The optimal use of kidneys from small pediatric cadaveric donors remains controversial. The aim of this study was to analyze short-term graft and patient survivals of en bloc kidney transplantations compared with single cadaveric adult donor kidney transplantations.

Patients and methods

We compared the 1-year evolution of 14 adult recipients of en bloc pediatric donor kidneys (EBKT), donor median age 13.5 ± 14.5 months (range, 3 days to 48 months), with 182 recipients of ideal adult cadaveric donor kidneys (ADT), donor median age 30 ± 21 years (range, 14-45 years).

Results

Besides the different donor age and weight, EBKT recipients were more commonly women (P = .05) and more often received thymoglobulin induction treatment (P < .001). Delayed graft function was more frequent in the EBKT group (46.2% vs 22.2%; P = .05), with no difference in the incidence of acute rejection episodes. Mean serum creatinine values at 3, 6, and 12 months after transplantation were 1.1 ± 0.3, 1.1 ± 0.2, and 1.0 ± 0.2 mg/dL in the EBKT group, compared with 1.3 ± 0.5 (P = .16), 1.3 ± 0.5 (P = .02), and 1.3 ± 0.6 (P < .01) in the ADT group. Vascular allograft complications were more frequent among EBKT recipients. The graft survival rate at 1 year was 92% in both groups, with no difference in patient survival (100% EBKT vs 92% ADT; P = .49).

Conclusions

En bloc kidney transplants from small pediatric donors show excellent graft function and 1-year survival and should be considered for transplantation into adults.

9.

Introduction

Because kidneys show remarkable resilience and can recover function, we examined the impact on long-term graft survival in deceased donor renal transplants of both immediate graft function (IGF) and the rate of renal function recovery over the first 3 months after transplantation.

Methods

We included all cadaveric renal transplants from 1990 to 2007 (n = 583). Delayed graft function (DGF) was defined as the need for dialysis in the first 7 days posttransplant. Slow graft function (SGF) and IGF were defined by serum creatinine falls of <20% and >20%, respectively, in the first 24 hours posttransplant. Recovery of renal function was expressed either as the best creatinine clearance (CrCl) in the first 3 months post-renal transplantation (BCrCl-3mos), calculated using the Cockcroft-Gault formula, or as a percentage of the actual versus expected value (calculated from the donor's CrCl at procurement).
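The definitions above can be expressed as a simple classifier (a sketch of the stated criteria, not the authors' code; an exact 20% fall is treated as SGF here since the text leaves that boundary open):

```python
def classify_early_graft_function(dialysis_first_week, scr_day0, scr_day1):
    """Classify early graft function per the study's definitions.

    DGF: dialysis needed within 7 days posttransplant.
    SGF: serum creatinine fall <20% in the first 24 hours.
    IGF: serum creatinine fall >20% in the first 24 hours.
    """
    if dialysis_first_week:
        return "DGF"
    fall = (scr_day0 - scr_day1) / scr_day0
    return "IGF" if fall > 0.20 else "SGF"
```

For example, a fall from 10.0 to 7.0 mg/dL (30%) without dialysis classifies as IGF, while a fall to 9.0 mg/dL (10%) classifies as SGF.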

Results

There were 140 subjects (23.6%) who received extended criteria donor (ECD) organs. Overall graft survival at 1 and 5 years was 87.8% and 74%, respectively. The 5-year graft survival rates for patients with IGF, SGF, and DGF were 85%, 76%, and 54%, respectively (P < .02). ECD kidneys showed twice the DGF rate (49% vs 23%; P < .001). Patients with a BCrCl-3mos <30 mL/min displayed a 5-year graft survival of 34%; 30 to 39 mL/min, 72%; 40 to 49 mL/min, 85%; and >50 mL/min, 82% (P < .001). Similarly, recovery to within 90% of the expected CrCl in the first 3 months posttransplant correlated with a 5-year graft survival of 81%; recovery of 70% to 90%, with 65%; and recovery of <70%, with 51% (P < .001).

Conclusion

Early graft function in the first 3 months had a significant impact on long-term graft survival after deceased donor renal transplantation.

10.

Background

The advancing age of the population in the Western world and improvements in surgical techniques and postoperative care have resulted in an increasing number of very elderly patients undergoing cardiac operations. The aim of this study was therefore to evaluate the surgical outcome in 115 octogenarians after aortic valve replacement.

Methods

We retrospectively identified 115 patients (47 men, 68 women) aged 82.3 ± 2.1 years (range, 80 to 92 years) who underwent aortic valve replacement alone (71 patients, 62.1%) or combined with coronary artery bypass grafting (44 patients, 37.9%) between January 1992 and April 2003. These patients had severe aortic stenosis, with a mean valve area of 0.62 ± 0.15 cm2 and a mean gradient of 88.62 ± 24.06 mm Hg.

Results

The in-hospital mortality rate was 8.5%. The late follow-up was 100% complete. Actuarial survival at 1 and 5 years was 86.4% and 69.4%, respectively. Predictors of late mortality were ejection fraction (p < 0.01), preoperative heart failure (p < 0.03), and the type of prosthesis (p < 0.03).

Conclusions

The outcome after aortic valve replacement in octogenarians is excellent: the operative risk is acceptable and the late survival rate is good. Therefore, cardiac surgery should not be withheld on the basis of age alone.

11.

Introduction

Preoperative quantification of survival after transplantation would assist in assessing patients. We have developed a preliminary preoperative scoring system, called the Cambridge-Miami (CaMi) score, for transplantation of the small intestine either alone or as a composite graft.

Methods

The score combines putative risk factors for early-, medium-, and long-term survival. The factors included were loss of venous access and impairment of organs or systems not corrected by transplantation. Each factor was scored 0-3: a factor that might become, but was not currently, an adverse risk scored 1; one presenting a definite but moderate increase in risk scored 2; and comorbidity approaching a contraindication to transplantation scored 3. The preoperative scores of 20 patients who had received intestinal transplants, either isolated or as part of a cluster graft, and who had either been followed up postoperatively for at least 10 years or had died within 10 years, were compared with their survivals.

Results

Postoperative survival and the CaMi score were inversely correlated on Spearman analysis (rs = −0.82; P = .0001). A score <3 was associated with survival ≥3 years (12/12 patients) and a score >3 with survival <6 months (4/4). Kaplan-Meier (KM) survival curves for patients grouped by CaMi score became significantly different between group 0 and group 3. Using 3 as a threshold, patients with scores <3 versus ≥3 had significantly different survival rates (log-rank P = .0001), with a KM median survival hazard ratio (HR) of 6 and a rate-of-death KM HR of 5. Receiver-operating characteristic analysis indicated a high degree of accuracy for prediction of death, with areas under the curve (C statistics) of 0.98 at 3 years, 0.82 at 5 years, and 0.65 at 10 years.

Conclusion

This initial validation suggests that the preoperative CaMi score predicts postoperative survival.

12.

Introduction

Posttransplant hepatitis C virus (HCV) recurrence has been shown to negatively impact graft and patient survivals. It has been suggested that HCV recurrence among HIV- and HCV-coinfected transplant recipients is even more aggressive.

Objective

To compare the histological severity and survival of posttransplant HCV recurrence between HIV- and HCV-coinfected and HCV-monoinfected patients.

Patients and methods

Seventy-two adult patients underwent primary liver transplantation at our institution for HCV-related cirrhosis between October 2001 and April 2007. We excluded one coinfected patient who died on postoperative day 5, leaving 12 HIV- and HCV-coinfected patients for comparison with 59 monoinfected patients. When listed, all coinfected patients fulfilled the criteria of the Spanish Consensus Document for transplantation in HIV patients. Immunosuppression did not differ between the two groups: all were treated with tacrolimus plus steroids (slow tapering). Aggressive HCV recurrence was defined as cholestatic hepatitis and/or a fibrosis grade ≥2 during the first posttransplant year.

Results

Coinfected patients were younger than monoinfected patients: 45 ± 6 vs 55 ± 9 years (P = .0008). There were no differences between the groups in Child score, Model for End-stage Liver Disease score, donor age, graft steatosis, ischemia time, or HCV pretransplant viral load or genotype. Significant rejection episodes were also equally distributed (25% vs 14%; P = .38). Seven coinfected and 29 monoinfected patients developed aggressive HCV recurrence (58% vs 49%; P = .75). Median follow-up was 924 days. Global survival at 3 years was 80%. Survival at 1, 2, and 3 years was 83%, 75%, and 62% in the coinfected versus 98%, 89%, and 84% in the monoinfected patients, respectively (log-rank P = .09).

Conclusions

The severity of histological recurrence in the first posttransplant year was similar between HIV- and HCV-coinfected and HCV-monoinfected liver recipients. Mortality attributed to recurrent HCV was similar in the groups, and there were no short-term (3-year) differences in survival between them.

13.

Introduction

Prolonged cold ischemic time (CIT) in cadaveric renal transplants has been associated with a high rate of delayed graft function, acute rejection, and even reduced graft survival. We analyzed the influence of CIT on both initial graft function (IGF) and survival rate.

Methods

We studied 2525 noncombined cadaveric cases in recipients over 17 years of age between 2000 and 2008, using data from the renal transplant records of Andalusia. We defined IGF as the need to resume dialysis within the first week or a nonfunctional kidney. The multivariate analyses were corrected by center and year of transplantation.

Results

The mean and median cold ischemic times were both 17 hours. Longer CIT was significantly associated (P < .001) with older donor and recipient age. The frequency of IGF increased progressively with longer CIT and older donor age, and the influence of CIT persisted across all donor age strata. Logistic regression using both donor and recipient age as covariables showed a relative risk per hour of CIT of 1.05 (95% confidence interval, 1.04-1.07; P < .0001). In univariable analysis, longer CIT was associated with significantly reduced recipient and graft survival rates. Multivariate (Cox) analysis using preprocedure covariables showed that CIT significantly worsened survival for both recipients (relative risk, 1.03; 95% CI, 1.005-1.05; P = .02) and grafts (relative risk, 1.03; 95% CI, 1.01-1.04; P = .002). However, survival rates showed no clear trend with CIT within each individual donor age stratum.

Conclusions

A longer CIT was associated with an increased frequency of IGF, independent of both donor and recipient age. Our data also suggest that CIT influences patient and graft survival rates.

14.

Introduction

In a previous prospective single-center study using validated self-administered instruments, we demonstrated a correlation between depression and nonadherence in kidney transplant recipients. The purpose of this study was to confirm, in a large database of transplant recipients, the finding that depression is associated with nonadherence, using the United States Renal Data System (USRDS).

Methods

We conducted a retrospective cohort study of 32,757 Medicare primary renal transplant recipients in the USRDS who underwent transplantation from January 1, 2000 to July 31, 2004 and were followed up through December 31, 2004, assessing Medicare claims showing depression and nonadherence based on codes of the International Classification of Diseases, 9th Revision.

Results

Logistic regression analysis (adjusted odds ratio, 1.69; 95% confidence interval, 1.48-1.92) and the log-rank test (P < .0005) showed a strong association between depression and nonadherence. Depression was associated with nonadherence whether it occurred pretransplantation (P < .001) or posttransplantation (P < .001). Nonadherence was also associated with black race (P < .001), younger age (P < .001), less HLA mismatch (P < .005), receipt of a living donor kidney, and longer time since transplantation (P < .001). Furthermore, patients with 12 or fewer years of education were more nonadherent (P < .001). Among the donor factors investigated, donor black race (P < .001) and expanded criteria donor kidneys (P < .001) were strongly associated with nonadherence, whereas donor age and delayed graft function were not.

Conclusions

Future clinical trials of immunosuppressive therapy should assess the impact of depression on graft survival.

15.

Objective

We assessed the impact of hypertension on renal transplant function and survival in the past decade after introduction of mycophenolate mofetil and rituximab.

Methods

We examined 184 patients who underwent renal transplantation from March 1982 to September 1999 and attended our outpatient clinic from 2001 to 2011. They were divided into group 1, with a mean systolic blood pressure (mSBP) >130 mm Hg, and group 2, with mSBP <130 mm Hg. We compared mean serum creatinine (sCr) levels over 9 years and 12-year actuarial graft survival rates. Risk factors for graft survival were assessed by Cox regression analysis.

Results

There were 75 recipients in group 1 and 109 in group 2. The mean sCr level was 1.59 ± 0.12 mg/dL in group 1 and 1.54 ± 0.10 mg/dL in group 2 (P < .0001). Of note, mean sCr levels in group 1 started to increase about 3 years after transplantation. Although the 5-year graft survival rate was 100% in both groups, the 9- and 12-year rates were 97.3% and 90.5% in group 1 versus 99.1% and 98.1% in group 2 (P = .0195). Cox univariate and multivariate analyses showed mSBP to be the only significant risk factor for graft survival (P < .05).

Conclusions

The hypertensive group showed deteriorating renal function from around 3 years after transplantation, which lowered subsequent graft survival and produced a clear separation from the nonhypertensive group at around 10 years. Mean SBP was a significant risk factor for graft survival; hypertension may be a surrogate for a poor long-term renal graft prognosis.

16.

Background

Kidney transplantation is widely recognized as the best treatment in patients who require renal replacement therapy. Although considered a clinical and surgical triumph, it is also a source of frustration because of lack of donor organs and the growth of waiting lists. Strategies need to be developed to increase the supply of organs. One measure is use of expanded criteria for donation.

Objective

To evaluate the effect of donor age on cadaver graft survival.

Materials and Methods

We reviewed the medical records for 454 patients who underwent kidney transplantation with cadaver donors from April 1987 to December 2003.

Results

Donor age had a significant effect on kidney transplant survival. Survival of grafts from donors aged 16 to 40 years (mean, 143.30 months) was significantly greater than that of grafts from donors older than 40 years (mean, 66.46 months) (P = .005). HLA matching and cold ischemia time did not significantly affect transplant survival (P = .98 and P = .16, respectively).

Conclusions

Kidneys from cadaver donors older than 40 years significantly compromised graft survival, with a negative effect via early return of recipients to waiting lists and increased rates of repeat transplantation, risk of death, and unnecessary costs.

17.
Background

Delayed graft function (DGF), a frequent complication after kidney transplantation, occurs in about 60% of recipients of kidneys from deceased donors. DGF has a multifactorial etiology and is characterized by acute tubular necrosis (ATN) on biopsy. In this study we sought to identify the incidence of, risk factors for, and impact on patient and graft survival of DGF among recipients of kidneys from deceased donors.

Materials and Methods

We retrospectively analyzed medical records from renal transplant recipients aged >18 years who received a deceased donor kidney graft between January 2003 and December 2006. Kidneys lost during the first week posttransplantation were excluded from this series.

Results

Among 165 transplants, 111 (67%) displayed DGF, defined as the need for dialysis during the first week posttransplantation. The incidence of DGF was higher among patients with a cold ischemia time (CIT) >24 hours (85% vs 60%; P < .05), as well as with grafts from older donors. After 1 year of follow-up, the DGF group showed worse graft function (serum creatinine, 1.6 ± 0.7 vs 1.3 ± 0.4 mg/dL; P < .05) as well as a greater incidence of graft loss.

Conclusion

Prolonged cold ischemia and older donor age were associated with a greater incidence of DGF in this series, leading to prolonged hospitalization, increased risk of acute rejection episodes, and reduced graft function and survival at 1 year.

18.

Introduction

Cardiac allograft vasculopathy remains the leading cause of late morbidity and mortality in heart transplantation. The main diagnostic methods, coronary angiography or intracoronary ultrasound (when angiography is normal), are invasive. Other study methods, such as coronary computed tomography (CT) and virtual histological analysis, have not been widely assessed in this condition.

Objective

The objective of this study was to assess the correlation between data obtained from virtual histology analysis and those obtained from coronary CT in cardiac transplant recipients.

Materials and Methods

During the same admission we performed coronary angiography and intravascular ultrasound with virtual histological analysis (automatic pull-back in anterior descending artery and one additional vessel if the former was normal) as well as coronary CT.

Results

The study included 10 patients. Virtual histology was performed in segments with intimal thickening >0.5 mm, defining 2 plaque groups: those with an inflammatory component (necrotic core >30% plus calcium) versus those without it (the combination of both <30%). The calcium component of inflammatory plaque allowed its detection on coronary CT.

Conclusions

The detection of inflammatory plaque in graft vessel disease can begin with a noninvasive method such as coronary CT, although confirmation requires further study.

19.

Purpose

To design effective pediatric trauma care delivery systems, it is important to correlate site of care with corresponding outcomes. Using a multistate administrative database, we describe recent patient allocation and outcomes in pediatric injury.

Methods

The 2000 Kids' Inpatient Database, containing 2,516,833 inpatient discharge records from 27 states, was filtered by E-code to yield pediatric injury cases. Injury Severity Scores (ISSs) were derived for each discharge using ICDMAP-90 (Tri-Analytics, Inc, Forest Hill, MD). After weighting to estimate national trends, cases were grouped by age (0-10, >10-20 years), ISS (≤15, >15), and National Association of Children's Hospitals and Related Institutions-designated site of care. Measured outcomes included mortality, length of stay, and total charges. Analysis was completed using Student's t test and χ2.
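ICDMAP-90 maps ICD-9 diagnosis codes to Abbreviated Injury Scale (AIS) severities, from which the ISS is computed; the standard calculation can be sketched as follows (illustrative only, not the study's software):

```python
def injury_severity_score(region_ais):
    """Compute ISS from the highest AIS (0-6) in each of the six body regions.

    ISS is the sum of squares of the three highest region scores; by
    convention, any AIS of 6 (unsurvivable) sets ISS to the maximum of 75.
    """
    if 6 in region_ais:
        return 75
    top_three = sorted(region_ais, reverse=True)[:3]
    return sum(a * a for a in top_three)
```

For region maxima [3, 2, 1, 0, 0, 0] the ISS is 9 + 4 + 1 = 14, below the >15 cutoff used above.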

Results

Among 79,673 injury cases, mean age was 12.2 ± 6.2 years and ISS was 7.4 ± 7.6. Eighty-nine percent of injured children received care outside of children's hospitals. In the subgroup of patients aged 0 to 10 years with ISS of greater than 15, the mean ISS for adult hospitals and children's hospitals was not significantly different (18.9 ± 9.1 vs. 19.4 ± 9.3, P = .08). However, in-hospital mortality, length of stay, and charges were all significantly higher in adult hospitals (P < .0001).

Conclusions

Younger and more seriously injured children have improved outcomes in children's hospitals. Appropriate triage may improve outcomes in pediatric trauma.

20.

Objective

Kidney transplantation is the treatment of choice for end-stage renal disease. Although most previous studies have concluded that living kidney donation achieves better graft survival, some factors may limit this type of donation. This study investigated the survival rates of living and deceased donor kidney transplantations among Iranian patients.

Materials and Methods

The records of kidney transplantations performed up to 2005 were used to compare 50 deceased donor (group I) with 50 living donor transplants (group II). Recipients were matched by transplantation time. Data were analyzed using SPSS version 15.

Results

Group I included 28 males and 22 females with a mean age of 38 ± 13 years; group II included 26 males and 24 females with a mean age of 34.6 ± 14 years. The rejection and graft nephrectomy rates were significantly higher in group I than in group II (P = .01 and P = .02, respectively). First-year graft survival was higher in group II (P = .001). Graft survival was significantly lower in recipients who needed a biopsy or dialysis (P = .006 and P = .02, respectively) and higher in those with a urine volume >4200 mL within the first 24 hours after transplantation (P = .003). Patient survival was not significantly different between the groups.

Conclusion

Living donor kidney transplantations showed higher graft survival and lower acute rejection rates compared with those from deceased donors.
