Similar Articles

20 similar articles found (search time: 491 ms)
1.

Objective

To evaluate the influence of traditional risk factors on major kidney transplantation outcome.

Patients and Methods

Data from kidney transplantation procedures performed between 2003 and 2006 were retrospectively analyzed for the influence of traditional risk factors on transplantation outcome. Of 2364 transplants, 67% were from living donors, 27% were from donors who met standard criteria, and 6% were from donors who met expanded criteria. Two hundred thirty-nine procedures (10%) were performed in pediatric patients. Immunosuppression was selected on the basis of the patient subgroup.

Results

At 1 year posttransplantation, cumulative freedom from a treated acute rejection episode (ARE) was 76.7%, with no difference between black vs nonblack recipients (75.0% vs 73.4%; P = .79). At 2 years, survival for patients (95.3% vs 88.3% vs 82.1%; P < .001) and grafts (92.3% vs 80.3% vs 70.9%; P < .001) was better in recipients of living donor grafts compared with donors who met standard or expanded criteria, respectively. Moreover, graft survival was poorer in black vs nonblack patients (83.6% vs 88.7%; P < .05) because of higher mortality (13% vs 7%; P < .001). Risk factors associated with death included cadaveric donor organ (odds ratio [OR], 2.4) and black race (OR, 1.8), and risk factors associated with graft loss included cadaveric donor organ (OR, 2.1), expanded-criteria donor organ (OR, 2.0), delayed graft function (OR, 1.8), and any ARE (OR, 3.5). At 6 months posttransplantation, risk factors associated with death included cadaveric donor organ (OR, 2.5) or ARE (OR, 2.4), and risk factors associated with graft loss included cadaveric donor organ (OR, 2.0), expanded-criteria donor organ (OR, 2.6), ARE (OR, 9.5), and impaired graft function (creatinine concentration >1.5 mg/dL; OR, 2.1).

Conclusion

Traditional risk factors are still associated with transplantation outcome. Poorer graft survival in black vs nonblack recipients was due to higher mortality rather than graft loss.

2.

Introduction

The use of monoclonal antibodies in renal transplantation for induction therapy has been associated with a marked reduction in acute rejection rates with an impact on graft and patient survivals.

Objective

We sought to evaluate the efficacy of renal transplant induction protocols using basiliximab based on the rates of acute rejection episodes (ARE), delayed graft function (DGF), and infectious complications in the first 6 months posttransplant, as well as patient and graft survivals.

Methods

We retrospectively evaluated all renal transplants performed between 2000 and 2008 that were primary grafts from cadaveric heart-beating donors, into recipients with a panel reactive antibody titer <5% and who were treated with an immunosuppression scheme based on cyclosporine, mycophenolate mofetil/mycophenolic acid plus corticosteroids, with (group 1) or without basiliximab (group 2).

Results

We enrolled 52 recipients in group 1 (induction with basiliximab) and 189 in group 2 (without basiliximab). The baseline characteristics were similar between the groups, except for time on dialysis, which was longer in group 1, and the number of HLA matches, which was lower in group 1. The ARE rate was lower in group 1 (7.8% vs 27.8%; P = .001); rates of DGF and infectious complications were similar. There was no significant difference in graft and patient survivals.

Conclusion

In this study, induction with basiliximab was associated with a reduced rate of ARE, despite a lower number of HLA matches and a longer previous time on dialysis. The use of this induction modality was not associated with a greater rate of infectious complications.

3.

Purpose

Gastric fundoplication (GF) for gastroesophageal reflux disease (GERD) may protect against the progression of chronic rejection in lung transplant (LT) recipients. However, the association of GERD with acute rejection episodes (ARE) is uncertain. This study sought to identify if ARE were linked to GERD in LT patients.

Methods

This single-center retrospective observational study of patients transplanted from January 1, 2000, to January 31, 2009, correlated results of pH probe testing for GERD with ARE (grade A1 or B1 or higher per the International Society for Heart and Lung Transplantation classification). We compared rates of ARE in patients with GERD (DeMeester score > 14.7) versus patients without GERD, expressed as the number of ARE per 1,000 patient-days after LT. Patients undergoing GF prior to LT were excluded.

Results

The analysis included 60 LT recipients and 9,249 patient-days: 33 with GERD versus 27 without GERD. We observed 51 ARE among the 60 recipients. The rate of ARE was higher among patients with GERD: 8.49 versus 2.58 per 1,000 patient-days, an incidence density ratio (IDR) of 3.29 (P = .00016). Upon multivariate negative binomial regression modeling, only GERD was associated with ARE (IDR, 2.15; P = .009). Furthermore, GERD was associated with multiple ARE (36.4% vs 0%; P < .0001) and earlier onset compared with patients without GERD: the ARE proportion at 2 months was 0.55 versus 0.26 (P = .004).
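
The rate comparison above follows directly from incidence-density arithmetic. Below is a minimal Python sketch, for illustration only: the per-group patient-day denominators are not reported in the abstract, so the sketch reuses only the published group rates (8.49 and 2.58 per 1,000 patient-days) and the overall totals (51 ARE over 9,249 patient-days).

```python
def rate_per_1000_patient_days(events, patient_days):
    """Incidence density: events per 1,000 patient-days of follow-up."""
    return 1000.0 * events / patient_days

def incidence_density_ratio(rate_exposed, rate_unexposed):
    """Ratio of two incidence densities (here, GERD vs no GERD)."""
    return rate_exposed / rate_unexposed

# Overall rate: 51 ARE over 9,249 patient-days.
overall = rate_per_1000_patient_days(51, 9249)

# Ratio of the published group rates reproduces the reported IDR.
idr = incidence_density_ratio(8.49, 2.58)
print(round(overall, 2), round(idr, 2))  # 5.51 3.29
```

Dividing the two published rates recovers the reported crude IDR of 3.29; the multivariate IDR of 2.15 comes from the regression model and cannot be reproduced from the abstract alone.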

Conclusion

In LT recipients, GERD was associated with a higher rate, multiple events, and earlier onset of ARE. The efficacy of GF to reduce ARE among patients with GERD needs further evaluation.

4.

Objectives

The aim of this study was a comparison of contrast-enhanced sonography (CEUS) and power Doppler ultrasound (US) findings in renal grafts within 30 days posttransplantation.

Methods

A total of 39 kidney recipients underwent CEUS (SonoVue bolus injection) and US examinations at 5 (T0), 15 (T1), and 30 (T2) days after grafting. The results were correlated with clinical findings and functional evolution. Fourteen patients displayed early acute kidney dysfunction: 10 with acute tubular necrosis (ATN group) and 4 with acute rejection episodes (ARE group); the 25 patients with normal evolution served as controls (C group). Renal biopsies were performed to obtain a diagnosis in 4 of the ATN cases and in all ARE patients. Creatinine and estimated glomerular filtration rate were used as kidney function parameters. CEUS analysis was performed on both cortical and medullary regions, while US resistivity indexes (RI) were obtained on the main, infrarenal, and arcuate arteries. From an analysis of CEUS time-intensity curves, we computed peak enhancement (PEAK), time to peak (TTP), mean transit time (MTT), regional blood flow (RBF) and volume (RBV), and the cortical-to-medullary ratio of these indices (RATIO).

Results

An increased RI was present in the ATN and ARE groups, as well as a reduced PEAK and RBF. RATIO-RBV and RATIO-MTT were lower than in C among ATN cases, while TTP was higher compared with C in ARE. No statistical difference in RI was evident between the ATN and ARE groups. MTT (T0) was significantly related to creatinine at follow-up (T2).

Conclusions

US and CEUS identified grafts with early dysfunction, but only some CEUS-derived parameters distinguished ATN from ARE, adding prognostic information.

5.

Background

Immune rejection has been implicated as one of the major causes of allograft aortic valve (AAV) degeneration. The purpose of this study was to prospectively and serially measure the magnitude and evolution of the recipient anti-HLA class I antibody response up to 6 years after AAV implantation and to correlate serologic data with valve performance by means of a concurrent echocardiographic survey.

Methods

Cryopreserved AAVs were obtained from multiorgan HLA-typed donors. Nineteen patients younger than 50 years (mean age, 43.3 ± 8 years) were prospectively studied. After successful surgery, all AAV recipients underwent concomitant serum sample collection and two-dimensional transthoracic echocardiography at 3 and 6 months and yearly thereafter (mean follow-up, 71.9 months). The presence of anti-HLA antibodies was tested against a panel of lymphocytes obtained from 30 blood donors.

Results

Progressive structural valve deterioration was seen in 6 patients (31.5%), of whom 4 (21%) were reoperated. All pretransplantation recipient sera were panel-reactive antibody negative. Seventeen patients (89.4%) demonstrated significant panel-reactive antibody levels, which peaked at 6 months postoperatively, declined from 6 to 24 months, and slowly decreased afterward. In 14 of 19 cases (73.6%), donor-specific HLA antibodies were identified. Strong immunization (6-year persistence of panel-reactive antibody > 70% and peak panel-reactive antibody > 80%) was detected in 31.5% and 36.8% of recipients, respectively. Strong immunization was significantly associated with progressive structural deterioration.

Conclusions

The immune reaction after cryopreserved AAV implantation is a distinctive, long-lasting response occurring in the majority of recipients younger than 50 years. An association between sustained, pronounced immunization and aggressive AAV degeneration was observed.

6.

Background

The number of obese kidney transplant candidates has been growing. However, there are conflicting results regarding the effect of obesity on kidney transplantation outcome. The aim of this study was to investigate the association between body mass index (BMI) and graft survival by using continuous versus categoric BMI values as an independent risk factor in renal transplantation.

Methods

We retrospectively reviewed 376 kidney transplant recipients to evaluate graft and patient survivals among normal-weight, overweight, and obese patients at the time of transplantation, considering BMI as a categoric variable.
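
Treating BMI as a categoric variable amounts to cutoff-based binning. The sketch below assumes the conventional WHO cutoffs (normal < 25, overweight 25 to < 30, obese ≥ 30 kg/m²); the abstract does not state which cutoffs the study applied, so these values are an assumption.

```python
def bmi(weight_kg, height_m):
    """Body mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def bmi_category(value):
    """Bin a BMI value using WHO-style cutoffs (assumed, not from the study)."""
    if value < 25.0:
        return "normal-weight"
    if value < 30.0:
        return "overweight"
    return "obese"

print(bmi_category(bmi(95.0, 1.75)))  # 95 / 1.75^2 ≈ 31.0 → obese
```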

Results

Obese patients were more likely to be male and older than normal-weight recipients (P = .021 and P = .002, respectively). Graft loss was significantly higher among obese compared with nonobese recipients. Obese patients displayed significantly lower survival compared with nonobese subjects at 1 year (76.9% vs 35.3%; P = .024) and 3 years (46.2% vs 11.8%; P = .035).

Conclusions

Obesity may represent an independent risk factor for graft loss and patient death. Careful patient selection with pretransplantation weight reduction is mandatory to reduce the rate of early posttransplantation complications and to improve long-term outcomes.

7.

Background

Exocrine tissue is commonly cotransplanted with islets in autografting and allotransplantation of impure preparations. Proteases and insulin are released by acinar cells and islets, respectively, during pretransplantation culture and also systemically after transplantation. We hypothesized that released proteases could cleave insulin molecules and that addition of alpha-1 antitrypsin (A1AT) to impure islet cultures would block this cleavage, improving islet recovery and function.

Methods

Trypsin, chymotrypsin, and elastase (TCE) activity and insulin levels were measured in culture supernates of pure (n = 5) and impure (n = 5) islet fractions, which were isolated from deceased donors. Sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE) was used to detect insulin after incubation with proteases. We assessed the effects of A1AT supplementation (0.5 mg/mL; n = 4) on TCE activity, insulin levels, culture recovery, and islet quality. The ultrastructure of islets exposed to TCE versus control medium was examined using electron microscopy (EM).

Results

Protease (TCE) activity in culture supernatants was inversely proportional to the percentage purity of islets: pure, impure, or highly impure. Increasingly lower levels of insulin were detected in culture supernatants when higher protease activity levels were present. Insulin levels measured from supernatants of impure and highly impure islet preparations were 61 ± 23.7% and 34 ± 33% of that in pure preparations, respectively. Incubation with commercially available proteases (TCE) or exocrine acinar cell supernatant cleaved insulin molecules as assessed using SDS-PAGE. Addition of A1AT to impure islet preparations reduced protease activity and restored normal insulin levels as detected using enzyme-linked immunosorbent assay (ELISA) and SDS-PAGE of culture supernates. A1AT improved insulin levels to 98% ± 1.3% in impure and 78% ± 34.2% in highly impure fractions compared with pure islet fractions. A1AT supplementation improved postculture recovery of islets in impure preparations compared with nontreated controls (72% ± 9% vs 47% ± 15%). Islet viability as measured using membrane integrity assays was similar in both the control (98% ± 2%) and the A1AT-treated groups (99% ± 1%). EM results revealed a reduction or absence of secretory granules after exposure to proteases (TCE).

Conclusion

Culture of impure human islet fractions in the presence of A1AT prevented insulin cleavage and improved islet recovery. A1AT supplementation of islet culture media, therefore, may increase the proportion of human islet products that meet release criteria for transplantation.

8.

Introduction and Objectives

Inhaled nitric oxide (iNO) is a gaseous drug with known properties of specific pulmonary vasodilation and improved oxygenation. In animal studies of lung transplantation (LT), it has been demonstrated to reduce primary graft dysfunction (PGD) by limiting neutrophil adhesion and the inflammatory cascade. Our objective was to assess whether iNO showed this immunomodulatory effect by determining interleukin (IL)-6, -8, and -10 levels in blood and bronchoalveolar lavage (BAL) in LT patients, and its relationship with PGD incidence.

Materials and Methods

Forty-nine LT patients were recruited and assigned to either the iNO group or the control group. Patients in the iNO group were given iNO (10 ppm) from the start of LT until 48 hours afterward. BAL and blood samples were taken preimplantation and at 12, 24, and 48 hours after graft reperfusion.

Results

The iNO group displayed a significantly lower incidence (P < .035) of PGD (17.2%) than the control group (45%). Significant differences (P < .05) were also observed in the iNO group with lower levels of IL-6 (in blood at 12 hours), IL-8 (in blood and BAL at 12 and 24 hours), and IL-10 (in blood at 12 and 24 hours and BAL at 24 hours).

Conclusions

PGD is associated with the development of an inflammatory process that is reduced by giving iNO to lung recipients. In our series, the iNO group displayed significantly lower content of IL-6, IL-8, and IL-10 in the majority of samples at 12 and 24 hours compared with the control group.

9.
10.

Background

The psychosocial status of donors before and after living-donor kidney transplantation has been an important concern. Psychosocial issues in the corresponding recipients have been investigated less frequently.

Aim

The aims of this study were to evaluate and compare psychopathologic dimensions in donors and recipients before and after transplantation.

Methods

Thirty-five recipients and 45 donors completed a psychosocial evaluation before and after transplantation. We applied Pearson chi-square, McNemar, Fisher, Wilcoxon, and Mann-Whitney tests as well as linear and logistic regression statistical methods.

Results

Before transplantation, 100% of recipients presented total anxiety, compared with 64.4% of donors, with higher anxiety levels in all dimensions (P < .001). Also, 38.7% of recipients and 16.3% of donors had moderate/serious depression (P = .029). Men showed higher levels of cognitive anxiety before transplantation (odds ratio [OR] = 4.3; P = .008). After transplantation, central nervous system and cognitive anxiety had diminished in recipients (P = .031 and P = .035, respectively), although recipients still showed higher levels of cognitive anxiety than donors (P = .007). Depression showed no significant change in either recipients or donors, and the recipient-donor difference was no longer significant: there were fewer severely depressed recipients but more severely depressed donors. Male recipients and donors showed greater cognitive anxiety (P = .02 and P = .04, respectively) at both times. Female recipients presented with more severe depression (P = .036).

Conclusions

Anxiety is an important symptom, and surgery had a positive impact in lowering it among recipients. Most participants displayed little or no depression; it was more prevalent among recipients. Donors and recipients maintained some psychopathologic symptoms after surgery. We identified vulnerable groups among these cohorts.

11.

Background

Neoadjuvant chemotherapy reduces tumor size before surgery in women with breast cancer. The aim of this study was to assess the ability of mammography and ultrasound to predict residual tumor size following neoadjuvant chemotherapy.

Methods

In a retrospective review of consecutive breast cancer patients treated with neoadjuvant chemotherapy, residual tumor size estimated by diagnostic imaging was compared with residual tumor size determined by surgical pathology.

Results

One hundred ninety-two patients with 196 primary breast cancers were studied. Of 104 tumors evaluated by both imaging modalities, ultrasound was able to size 91.3%, whereas mammography was able to size only 51.9% (χ², P < .001). Ultrasound also was more accurate than mammography in estimating residual tumor size (62 of 104 [59.6%] vs 33 of 104 [31.7%]; P < .001). There was little difference in the ability of mammography and ultrasound to predict pathologic complete response (receiver operating characteristic area, 0.741 vs 0.784).

Conclusions

Breast ultrasound was more accurate than mammography in predicting residual tumor size following neoadjuvant chemotherapy. The likelihood of a complete pathologic response was 80% when both imaging modalities demonstrated no residual disease.

12.

Study Objective

To evaluate the effect of clonidine when added to local anesthetics on duration of postoperative analgesia during retrobulbar block.

Design

Prospective, randomized controlled trial.

Setting

Operating room and Postanesthesia Care Unit of a university-affiliated hospital.

Subjects

80 ASA physical status 1, 2, and 3 patients undergoing vitreoretinal surgery with or without scleral buckling.

Interventions

Patients in the control group (n = 40) received a retrobulbar block with 4.5 mL of lidocaine-bupivacaine and 0.5 mL of saline. Clonidine group patients (n = 40) received 4.5 mL of lidocaine-bupivacaine and 0.5 μg/kg of clonidine in a 0.5 mL volume.

Measurements

The time to first analgesic request, frequency of postoperative pain, and number of postoperative analgesic requests per patient were assessed.

Main Results

37 patients in the control group (92.5%) versus 24 patients (60%) in the clonidine group reported pain postoperatively (P = 0.001), with a shorter time to first analgesic request noted in the control group (4.9 ± 3 vs 11.9 ± 5.3 hrs; P < 0.001). The median number of postoperative analgesic requests per patient during the first 24 hours was higher in the control group than the clonidine group [2 (0-3) vs. 1 (0-3); P < 0.001].

Conclusions

The addition of clonidine 0.5 μg/kg to the local anesthetics of a retrobulbar block for vitreoretinal surgery decreases the frequency of postoperative pain and prolongs the duration of analgesia.

13.

Background/purpose

Previous reports indicate that complete resection of high-risk neuroblastoma improves outcome but may entail high surgical complication rates. The authors evaluated the effect of complete primary site resection on event-free survival (EFS), overall survival (OS), and complication rates in patients entered on a high-risk neuroblastoma treatment protocol.

Methods

A total of 539 eligible patients with high-risk neuroblastoma were entered on protocol CCG-3891. Patients were randomly assigned to continuation chemotherapy or autologous bone marrow transplantation. Surgical resection was performed at diagnosis or after induction chemotherapy. Surgeons assessed resection as complete (CR), minimal residual (<5%; MR), or partial (PR). Patients with incomplete resection underwent secondary resection or received 10 Gy of external-beam radiation. Patients were evaluated for EFS, OS, and surgical complications based on completeness of the overall best resection.

Results

The proportion of patients resectable at diagnosis was 27% for CR and 14% for MR; after chemotherapy, these proportions improved to 45% and 25%, respectively. Complication rates based on completeness of resection were 29%, 38%, and 36% for CR, MR, and PR, respectively. The estimated 5-year EFS rate was 30% ± 3% for patients who achieved CR (n = 210) compared with 25% ± 3% (P = .1010) for those with less than CR (n = 258).

Conclusions

Resectability improved after neoadjuvant chemotherapy. Complete resection did not increase complications. There was a small survival benefit for complete resection. This study suggests that complete resection may still be important in the current era of intense chemotherapy and transplant.

14.

Background

Delayed graft function (DGF) has a negative effect on the results of living-donor kidney transplantation.

Objective

To investigate potential risk factors for DGF.

Methods

This prospective study included 200 consecutive living donors and their recipients between January 2002 and July 2007. Delayed graft function was defined as need for dialysis within the first postoperative week.

Results

Delayed graft function was diagnosed in 12 patients (6%). Intraoperative complications occurred in 10 donors (5%), and postoperative complications in 24 donors (13.5%). One-year kidney graft survival with vs without DGF was 52% vs 98%, respectively (P < .002). In donors, 2 univariate risk factors for DGF were identified: lower counts per second at peak activity during scintigraphy and multiple renal veins. In recipients, the important factors were undergoing 2 or more kidney transplantations and occurrence of an acute rejection episode. At multivariate analysis, increased risk of DGF was associated with the presence of multiple renal veins (odds ratio, 151.57; 95% confidence interval, 2.53-9093.86) and an acute rejection episode (odds ratio, 78.87; 95% confidence interval, 3.17-1959.62).
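
The odds ratios and confidence intervals above are standard quantities computed from a 2 × 2 exposure-outcome table. A minimal sketch of that calculation follows; the cell counts used in the example are made up for illustration and are not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio from a 2x2 table with a Wald 95% CI on the log scale.
    a = exposed with outcome, b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical counts, for illustration only.
or_, lo, hi = odds_ratio_ci(10, 90, 5, 95)
print(round(or_, 2))  # 2.11
```

Note how small cell counts inflate the standard error of log(OR); that is why the study's multivariate estimates, based on only 12 DGF events, carry such wide confidence intervals.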

Conclusion

Hand-assisted laparoscopic donor nephrectomy is a safe procedure. The presence of multiple renal veins and occurrence of an acute rejection episode are independent risk factors for DGF.

15.

Objectives

We investigated the incidence and risk factors for the metabolic syndrome (MS) and posttransplant diabetes mellitus (PTDM) among renal transplant recipients on tacrolimus-based immunosuppressive regimens during the first year posttransplant. In addition, we studied the relationship between MS and PTDM with transplant renal function at 1 year.

Methods

We included 100 patients who received a renal transplant in our unit between January 2007 and June 2008, collecting demographic, clinical, and biochemical characteristics at 1, 6, and 12 months posttransplantation. We excluded the 15% of patients who had pretransplantation diabetes. MS was defined according to National Cholesterol Education Program criteria and PTDM according to World Health Organization criteria. Insulin resistance at 1 year posttransplant was measured using the homeostasis model assessment (HOMA) index.
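
The HOMA index mentioned above is a closed-form calculation from fasting glucose and insulin. The sketch below uses the classic HOMA-IR formula (fasting glucose in mmol/L × fasting insulin in µU/mL ÷ 22.5); the abstract does not state which HOMA variant was used, so this formula is an assumption.

```python
def homa_ir(glucose_mmol_l, insulin_uU_ml):
    """HOMA-IR, classic formula:
    fasting glucose (mmol/L) x fasting insulin (uU/mL) / 22.5."""
    return glucose_mmol_l * insulin_uU_ml / 22.5

def homa_ir_from_mg_dl(glucose_mg_dl, insulin_uU_ml):
    """Same index when glucose is reported in mg/dL
    (divide by 18 to convert to mmol/L first)."""
    return homa_ir(glucose_mg_dl / 18.0, insulin_uU_ml)

print(round(homa_ir(5.0, 12.0), 2))  # 5 * 12 / 22.5 ≈ 2.67
```

On this scale, the study's group means of 3.2 (MS) versus 2.3 (no MS) sit on either side of commonly quoted insulin-resistance thresholds, which is consistent with the authors' use of HOMA as a discriminating test.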

Results

Insulin therapy was required in 46% of patients during the first hospitalization, and hyperglycemia was present in 65% of cases. The incidence of PTDM decreased throughout the first year posttransplant: 44%, 24%, and 13% at 1, 6, and 12 months, respectively. The incidence of MS increased over the same period: 33%, 48%, and 50% at 1, 6, and 12 months, respectively. Age, body mass index, plasma fasting glucose levels at 1 month posttransplant, and pretransplant fasting triglyceridemia predicted PTDM. Rejection and in-patient hyperglycemia predicted MS. PTDM and MS were closely correlated (P = .004). The HOMA index at 1 year posttransplant was higher among patients with MS than among other subjects: 3.2 (1.2) versus 2.3 (0.9) (P = .035). Neither PTDM nor MS was associated with impaired plasma creatinine levels at 1 year after kidney transplantation.

Conclusion

There was a high incidence of PTDM and MS among kidney transplant recipients treated with tacrolimus as the main immunosuppressive agent. The HOMA index was a good test of insulin resistance in this population. Screening and treatment of risk factors may prevent the development of these entities, which are related to poor cardiovascular outcomes.

16.

Objective

Renal allografts with excellent graft function show good long-term outcomes, while grafts with delayed function have been associated with poor long-term survivals, although few reports have analyzed outcomes among these groups. We compared first-week postoperative graft function among renal transplant patients to analyze the impact of slow graft function (SGF) and delayed graft function (DGF) on graft survival.

Materials and Methods

Renal transplantations were performed from 362 unrelated, 46 related, and 163 deceased donors. Kidney transplant patients were divided into 3 groups according to their initial graft function. Patients dialyzed in the first week formed the DGF group. Nondialyzed patients were assigned to an SGF or an excellent graft function (EGF) cohort according to whether the serum creatinine at day 7 was higher or lower than 2.5 mg/dL, respectively.
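
The three-way grouping described above is a simple decision rule. A minimal sketch using the study's own definitions (dialysis in the first week → DGF; otherwise, day-7 serum creatinine above 2.5 mg/dL → SGF, else EGF):

```python
def graft_function_group(dialyzed_first_week, creatinine_day7_mg_dl):
    """Classify initial graft function per the definitions in the abstract."""
    if dialyzed_first_week:
        return "DGF"  # delayed graft function: dialysis in the first week
    if creatinine_day7_mg_dl > 2.5:
        return "SGF"  # slow graft function: day-7 creatinine > 2.5 mg/dL
    return "EGF"      # excellent graft function

print(graft_function_group(False, 1.4))  # EGF
print(graft_function_group(True, 1.4))   # DGF
```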

Results

Of the 570 renal transplant recipients, DGF was observed in 39 patients (6.8%), SGF in 64 (11.2%), and EGF in 467 (81.8%). There was no significant difference in SGF vs DGF between patients who received kidneys from unrelated vs related living or deceased donors. Graft survival was worse among the DGF than the SGF or EGF patients, with no significant difference between the last 2 groups. The 6-month graft survivals were 74%, 93%, and 96%; the 3-year graft survivals were 70%, 88%, and 90%, respectively (P < .001).

Conclusions

We observed a similar impact of EGF and SGF on kidney graft survival. Kidney transplant recipients who developed DGF showed worse graft survival than those with EGF or SGF.

17.

Background

Inhibitors of mammalian target of rapamycin (mTORi) have been suggested as an alternative to calcineurin inhibitors (CNIs) to treat stable renal transplant recipients. However, their use has been significantly limited owing to a high incidence of side effects.

Objective

To compare the rate of dropout (mTORi elimination and CNI reintroduction) caused by side effects among renal transplant patients converted to everolimus (EVL) or sirolimus (SRL).

Methods

Between October 1999 and February 2010, 409 subjects were converted to an mTORi at least 3 months after transplantation, including 220 (53.8%) to EVL and 189 (46.2%) to SRL. Most patients were under CNI therapy. Patients were followed for a median of 35 months (interquartile range [IQR], 18-50 months).

Results

mTORi treatment was prematurely eliminated due to adverse events in 112 patients. The median time between initiation of the mTORi and discontinuation was 5.7 months (IQR, 1.9-15.7 months; range, 0.2-48 months): 5.5 (IQR, 1.6-16.3) in the EVL group and 7.4 (IQR, 2.6-15.6) in the SRL group. The drug was stopped in 69 patients (31.4%) in the EVL group and in 43 patients (22.8%) in the SRL group (P = .051). The most important causes of discontinuation were severe infections (2.3% in EVL group and 4.8% in SRL group; P = .17), pneumonitis (6.8% in EVL group and 4.8% in SRL group; P = .38), acute rejection episode (4.1% in EVL group and 1.6% in SRL group; P = .13), proteinuria (4.1% in EVL group and 1.6% in SRL group; P = .13), renal function deterioration (2.3% in EVL group and 2.1% in SRL group; P = .91), and severe dermal eruption (2.3% in EVL group and 0.5% in SRL group; P = .14).

Conclusions

Although the overall incidence of discontinuations due to side effects was higher in the EVL group, there was no greater frequency of severe side effects, such as pneumonitis, proteinuria, acute rejection episodes, renal function deterioration, or dermal eruptions.

18.

Background

High levels of soluble CD30 (sCD30), a marker for T-helper 2-type cytokine-producing T cells, before or after renal transplantation serve as a useful predictor of acute rejection episodes. Over the course of 1 year, we evaluated the accuracy of serial sCD30 tests to predict acute rejection episodes versus other pathologies that affect graft outcome.

Patients and methods

Fifty renal transplant recipients were randomly selected to examine sCD30 on days 0, 3, 5, 7, 14, and 21 followed by 1, 3, 6, and 12 months. The results were analyzed for development of an acute rejection episode, acute tubular necrosis (ATN), or other pathology as well as the graft outcome at 1 year.

Results

Compared with pretransplantation sCD30, there was a significant reduction in the average sCD30 immediately posttransplantation from day 3 onward (P < .0001). Patients were divided into four groups: (1) uncomplicated courses (56%); (2) acute rejection episodes (18%); (3) ATN (16%); and (4) other diagnoses (10%). There was a significant reduction in sCD30 immediately posttransplantation for groups 1, 2, and 3 (P < .0001, .004, and .002, respectively), unlike group 4 (P = .387). Patients who developed an acute rejection episode after 1 month showed higher pretransplantation sCD30 values than those who displayed rejection before 1 month (P = .019). All groups experienced significant improvement in graft function over the 1-year follow-up, without any significant differences.

Conclusion

Although a significant drop in sCD30 posttransplantation was recorded, serial measurements of sCD30 did not show a difference among subjects who displayed acute rejection episodes, ATN, or other diagnoses.

19.

Introduction

Cardiac allograft vasculopathy remains the leading cause of late morbidity and mortality in heart transplantation. The main diagnostic methods, coronary angiography or intracoronary ultrasound (when angiography is normal), are invasive. Other study methods, such as coronary computed tomography (CT) and virtual histological analysis, have not been widely assessed in this condition.

Objective

The objective of this study was to assess the correlation between data obtained from analysis of virtual histology compared with those obtained from the performance of coronary CT in cardiac transplant recipients.

Materials and Methods

During the same admission we performed coronary angiography and intravascular ultrasound with virtual histological analysis (automatic pull-back in anterior descending artery and one additional vessel if the former was normal) as well as coronary CT.

Results

The study included 10 patients. Virtual histology was performed in segments with intimal thickening >0.5 mm, defining 2 plaque groups: those with an inflammatory component (necrotic core >30% plus calcium) and those without it (both components <30%). The calcium component of inflammatory plaque allowed its detection on coronary CT.

Conclusions

The detection of inflammatory plaque in graft vessel disease can be based on an initial noninvasive method, such as coronary CT, although confirmation requires further study.

20.

Background

Noncompliance with immunosuppressive treatment is one of the risk factors for kidney graft loss. The once-daily, prolonged-release tacrolimus formulation may improve treatment adherence. We sought to compare the pharmacokinetics of both tacrolimus formulations in older de novo recipients of a cadaveric renal transplant from an expanded-criteria donor.

Patients and Methods

This randomized study included 27 patients (14 receiving the once-daily prolonged-release formulation [QD] and 13 the twice-daily formulation [BID]), who were treated with tacrolimus 0.1 mg/kg per day (target blood level, 5-8 ng/mL) in combination with mycophenolate mofetil, prednisone, and basiliximab induction.

Results

At 24 hours, the blood levels were 4.70 ± 2.50 versus 4.70 ± 3.04 ng/mL (P = NS). There were no significant differences in the AUC0-24 of tacrolimus (QD/BID) at 3 days (300.8 ± 60.15 vs 287.7 ± 125.78 ng·h/mL) or 21 days (303.05 ± 99.79 vs 275.26 ± 75.37 ng·h/mL), nor in blood levels (ng/mL) at 1 month (8.76 ± 2.46 vs 8.8 ± 1.89), 3 months (7.30 ± 1.72 vs 8.80 ± 1.89), and 6 months (7.19 ± 1.89 vs 6.60 ± 1.71). At 3 days, there was a strong correlation between AUC0-24 and Cmin both for tacrolimus QD (r = .872) and BID (r = 1.0). The incidences of acute rejection episodes were 0% versus 16.6%; graft survivals, 100% versus 92.3% (P = NS); and patient survivals, both 100%.
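
AUC0-24 values such as those above are typically obtained from a concentration-time profile by the linear trapezoidal rule. A sketch under that assumption; the study's sampling times are not reported, so the profile in the example is purely illustrative.

```python
def auc_trapezoid(times_h, conc_ng_ml):
    """Area under the concentration-time curve (ng·h/mL) by the
    linear trapezoidal rule over the sampled interval."""
    auc = 0.0
    for i in range(1, len(times_h)):
        dt = times_h[i] - times_h[i - 1]
        auc += dt * (conc_ng_ml[i] + conc_ng_ml[i - 1]) / 2.0
    return auc

# Hypothetical 0-24 h profile (times in hours, concentrations in ng/mL).
print(auc_trapezoid([0, 2, 6, 12, 24], [5.0, 18.0, 12.0, 8.0, 5.0]))  # 221.0
```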

Conclusion

For older de novo recipients of kidneys from expanded-criteria donors, tacrolimus QD at the same dose is comparable to the BID formulation, with similar short-term transplant outcomes.
