Similar Articles
 20 similar articles found (search time: 31 ms)
1.
Coding variants in the apolipoprotein L1 gene (APOL1) are strongly associated with nephropathy in African Americans (AAs). The effect of transplanting kidneys from AA donors with two APOL1 nephropathy risk variants is unknown. APOL1 risk variants were genotyped in 106 AA deceased organ donors and graft survival assessed in 136 resultant kidney transplants. Cox proportional hazards models tested for association between time to graft failure and donor APOL1 genotypes. The mean follow‐up was 26.4 ± 21.8 months. Twenty‐two of 136 transplanted kidneys (16%) were from donors with two APOL1 nephropathy risk variants. Twenty‐five grafts failed; eight (32%) had two APOL1 risk variants. A multivariate model accounting for donor APOL1 genotype, overall African ancestry, expanded criteria donation, recipient age and gender, HLA mismatch, CIT and PRA revealed that graft survival was significantly shorter in donor kidneys with two APOL1 risk variants (hazard ratio [HR] 3.84; p = 0.008) and higher HLA mismatch (HR 1.52; p = 0.03), but not for overall African ancestry excluding APOL1. Kidneys from AA deceased donors harboring two APOL1 risk variants failed more rapidly after renal transplantation than those with zero or one risk variant. If replicated, APOL1 genotyping could improve the donor selection process and maximize long‐term renal allograft survival.  相似文献
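
An illustrative sketch (not the authors' code) of the kind of analysis summarized above: a Cox proportional hazards model of time to graft failure against donor APOL1 genotype and covariates, using the Python lifelines library. The file name and column names are hypothetical placeholders.

```python
# Hedged sketch: Cox proportional hazards model of graft survival vs. donor
# APOL1 genotype, analogous to the analysis described above. All file and
# column names are hypothetical, not the study's actual data.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("transplants.csv")  # hypothetical file: one row per transplanted kidney
covariates = ["months_to_failure_or_censor", "graft_failed",
              "two_apol1_risk_variants", "african_ancestry_proportion",
              "expanded_criteria_donor", "recipient_age", "recipient_female",
              "hla_mismatch", "cold_ischemia_time", "peak_pra"]

cph = CoxPHFitter()
cph.fit(df[covariates],
        duration_col="months_to_failure_or_censor",
        event_col="graft_failed")
cph.print_summary()  # the exp(coef) column gives hazard ratios, e.g. for the two-risk-variant indicator
```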

2.
We conducted this study using the updated 2005‐2016 Organ Procurement and Transplantation Network database to assess clinical outcomes of retransplant after allograft loss as a result of BK virus–associated nephropathy (BKVAN). Three hundred forty‐one patients had first graft failure as a result of BKVAN, whereas 13 260 had first graft failure as a result of other causes. At median follow‐up time of 4.70 years after the second kidney transplant, death‐censored graft survival at 5 years for the second renal allograft was 90.6% for the BK group and 83.9% for the non‐BK group. In adjusted analysis, there was no difference in death‐censored graft survival (P = .11), acute rejection (P = .49), and patient survival (P = .13) between the 2 groups. When we further compared death‐censored graft survival among the specific causes for first graft failure, the BK group had better graft survival than patients who had prior allograft failure as a result of acute rejection (P < .001) or disease recurrence (P = .003), but survival was similar to those with chronic allograft nephropathy (P = .06) and other causes (P = .05). The better allograft survival in the BK group over acute rejection and disease recurrence remained after adjusting for potential confounders. History of allograft loss as a result of BKVAN should not be a contraindication to retransplant among candidates who are otherwise acceptable.  相似文献   

3.
Allografts from living kidney donors with hypertension may carry subclinical kidney disease from the donor to the recipient and, thus, lead to adverse recipient outcomes. We examined eGFR trajectories and all-cause allograft failure in recipients from donors with versus without hypertension, using mixed-linear and Cox regression models stratified by donor age. We studied a US cohort from 1/1/2005 to 6/30/2017; 49 990 recipients of allografts from younger (<50 years old) donors including 597 with donor hypertension and 21 130 recipients of allografts from older (≥50 years old) donors including 1441 with donor hypertension. Donor hypertension was defined as documented predonation use of antihypertensive therapy. Among recipients from younger donors with versus without hypertension, the annual eGFR decline was −1.03 versus −0.53 ml/min/m2 (P = 0.002); 13-year allograft survival was 49.7% vs. 59.0% (adjusted allograft failure hazard ratio [aHR] 1.23; 95% CI 1.05–1.43; P = 0.009). Among recipients from older donors with versus without hypertension, the annual eGFR decline was −0.67 versus −0.66 ml/min/m2 (P = 0.9); 13-year allograft survival was 48.6% versus 52.6% (aHR 1.05; 95% CI 0.94–1.17; P = 0.4). In secondary analyses, our inferences remained similar for risk of death-censored allograft failure and mortality. Hypertension in younger, but not older, living kidney donors is associated with worse recipient outcomes.  相似文献   
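
A minimal sketch of an eGFR-trajectory analysis like the one summarized above, using a mixed-effects linear model with random intercepts and slopes per recipient in statsmodels. The dataset layout and column names are assumptions for illustration, not the study's actual variables.

```python
# Hedged sketch: mixed-effects linear model of post-transplant eGFR trajectories.
# The donor_hypertension x years interaction term estimates how much steeper the
# annual eGFR decline is for recipients of allografts from hypertensive donors.
# File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

visits = pd.read_csv("egfr_long.csv")  # hypothetical: one row per recipient per follow-up eGFR measurement
model = smf.mixedlm("egfr ~ years_since_transplant * donor_hypertension",
                    data=visits,
                    groups=visits["recipient_id"],
                    re_formula="~years_since_transplant")  # random intercept and slope per recipient
result = model.fit()
print(result.summary())
```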

4.
Renal transplant recipients have an increased risk of non‐melanoma skin cancer (NMSC) compared with the general population. Here, we show that polygenic risk scores (PRS) calculated from genome‐wide association studies (GWAS) of NMSC in a general, nontransplant setting can predict risk of, and time to, posttransplant skin cancer. Genetic variants reaching predefined P‐value thresholds were chosen from published squamous cell carcinoma (SCC) and basal cell carcinoma (BCC) nontransplant GWAS. Using these GWAS, BCC and SCC PRS were calculated for each sample across three European ancestry renal transplant cohorts (n = 889) and tested as predictors of case:control status and time to NMSC posttransplant. BCC PRS calculated at P‐value threshold 1 × 10⁻⁵ was the most significant predictor of case:control status of NMSC posttransplant (OR = 1.61; adjusted P = .0022; AUC [full model adjusted for clinical predictors and PRS] = 0.81). SCC PRS at P‐value threshold 1 × 10⁻⁵ was the most significant predictor of time to posttransplant NMSC (adjusted P = 9.39 × 10⁻⁷; HR = 1.41, concordance [full model] = 0.74). PRS of nontransplant NMSC is predictive of case:control status and time to NMSC posttransplant. These results are relevant to how genomics can risk stratify patients to help develop personalized treatment regimens.  相似文献
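
A simplified sketch of how a P-value-thresholded polygenic risk score of the sort described above can be computed from nontransplant GWAS summary statistics: each sample's score is the sum of its risk-allele dosages weighted by the GWAS effect sizes at SNPs passing the threshold. The file formats and column names below are assumptions, not the published pipeline.

```python
# Hedged sketch: P-value-thresholded polygenic risk score (PRS).
# GWAS file: one row per SNP with columns snp, beta, pval (hypothetical layout).
# Dosage file: samples x SNPs matrix of 0/1/2 risk-allele counts (hypothetical).
import pandas as pd

gwas = pd.read_csv("nmsc_gwas_summary.tsv", sep="\t")
dosages = pd.read_csv("transplant_cohort_dosages.csv", index_col="sample_id")

threshold = 1e-5                                   # e.g. the 1 x 10-5 threshold quoted above
hits = gwas[gwas["pval"] <= threshold].set_index("snp")
shared = [s for s in hits.index if s in dosages.columns]

prs = dosages[shared].mul(hits.loc[shared, "beta"], axis=1).sum(axis=1)
print(prs.describe())  # one PRS per sample; this score can then enter a logistic or Cox model
```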

5.
Ganciclovir (GCV) inhibits spermatogenesis in preclinical studies, but long-term effects on fertility in renal transplant patients are unknown. In a prospective, multicenter, open-label, nonrandomized study, male patients were assigned to Cohort A [valganciclovir (VGCV), a prodrug of GCV] (n = 38) or B (no VGCV) (n = 21) by cytomegalovirus prophylaxis requirement. Changes in semen parameters and DNA fragmentation were assessed via a mixed-effects linear regression model accounting for baseline differences. Sperm concentration increased post-transplant, but between baseline and treatment end (mean 164 days Cohort A, 211 days Cohort B), the model-based change was lower in Cohort A (difference: 43.82 × 10⁶/ml; P = 0.0038). Post-treatment, sperm concentration increased in Cohort A so that by end of follow-up (6 months post-treatment) changes were comparable between cohorts (difference: 2.09 × 10⁶/ml; P = 0.92). Most patients’ sperm concentration improved by end of follow-up; none with normal baseline concentrations (≥20 × 10⁶/ml) were abnormal at end of follow-up. Changes in seminal volume, sperm motility/morphology, DNA fragmentation, and hormone levels were comparable between cohorts at end of follow-up. Improvement in semen parameters after renal transplant was delayed in men receiving VGCV, but 6 months post-treatment parameters were comparable between cohorts.  相似文献

6.
Apolipoprotein L1 gene (APOL1) nephropathy variants in African American deceased kidney donors were associated with shorter renal allograft survival in a prior single‐center report. APOL1 G1 and G2 variants were genotyped in newly accrued DNA samples from African American deceased donors of kidneys recovered and/or transplanted in Alabama and North Carolina. APOL1 genotypes and allograft outcomes in subsequent transplants from 55 U.S. centers were linked, adjusting for age, sex and race/ethnicity of recipients, HLA match, cold ischemia time, panel reactive antibody levels, and donor type. For 221 transplantations from kidneys recovered in Alabama, there was a statistical trend toward shorter allograft survival in recipients of two‐APOL1‐nephropathy‐variant kidneys (hazard ratio [HR] 2.71; p = 0.06). For all 675 kidneys transplanted from donors at both centers, APOL1 genotype (HR 2.26; p = 0.001) and African American recipient race/ethnicity (HR 1.60; p = 0.03) were associated with allograft failure. Kidneys from African American deceased donors with two APOL1 nephropathy variants reproducibly associate with higher risk for allograft failure after transplantation. These findings warrant consideration of rapidly genotyping deceased African American kidney donors for APOL1 risk variants at organ recovery and incorporation of results into allocation and informed‐consent processes.  相似文献   

7.
Uncovering additional causal clinical traits and exposure variables is important when studying osteoporosis mechanisms and for the prevention of osteoporosis. Until recently, the causal relationship between anthropometric measurements and osteoporosis had not been fully revealed. In the present study, we utilized several state-of-the-art Mendelian randomization (MR) methods to investigate whether height, body mass index (BMI), waist-to-hip ratio (WHR), hip circumference (HC), and waist circumference (WC) are causally associated with two major characteristics of osteoporosis, bone mineral density (BMD) and fractures. Genomewide significant (p ≤ 5 × 10−8) single-nucleotide polymorphisms (SNPs) associated with the five anthropometric variables were obtained from previous large-scale genomewide association studies (GWAS) and were utilized as instrumental variables. Summary-level data of estimated bone mineral density (eBMD) and fractures were obtained from a large-scale UK Biobank GWAS. Of the MR methods utilized, the inverse-variance weighted method was the primary method used for analysis, and the weighted-median, MR-Egger, mode-based estimate, and MR pleiotropy residual sum and outlier methods were utilized for sensitivity analyses. The results of the present study indicated that each increase in height equal to a single standard deviation (SD) was associated with a 9.9% increase in risk of fracture (odds ratio [OR] = 1.099; 95% confidence interval [CI] 1.067–1.133; p = 8.793 × 10−10) and a 0.080 SD decrease of estimated bone mineral density (95% CI −0.106–(−0.054); p = 2.322 × 10−9). We also found that BMI was causally associated with eBMD (beta = 0.129, 95% CI 0.065–0.194; p = 8.113 × 10−5) but not associated with fracture. The WHR adjusted for BMI, HC adjusted for BMI, and WC adjusted for BMI were not found to be related to fracture occurrence or eBMD. In conclusion, the present study provided genetic evidence for certain causal relationships between anthropometric measurements and bone mineral density or fracture risk. © 2021 American Society for Bone and Mineral Research (ASBMR).  相似文献   
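
For readers unfamiliar with the inverse-variance weighted (IVW) method named above as the primary analysis, the sketch below computes a fixed-effect IVW causal estimate from per-SNP summary statistics. The numbers are invented and serve only to show the arithmetic.

```python
# Hedged sketch: fixed-effect inverse-variance weighted (IVW) Mendelian
# randomization estimate. beta_x are SNP-exposure effects (e.g. on height),
# beta_y are SNP-outcome effects (e.g. on fracture), se_y their standard errors.
# The values below are made up for illustration only.
import numpy as np

beta_x = np.array([0.05, 0.08, 0.03])
beta_y = np.array([0.010, 0.018, 0.004])
se_y   = np.array([0.004, 0.006, 0.003])

weights = beta_x**2 / se_y**2                        # inverse variances of the per-SNP ratio estimates
beta_ivw = np.sum(beta_x * beta_y / se_y**2) / np.sum(weights)
se_ivw = np.sqrt(1.0 / np.sum(weights))
print(f"IVW causal estimate per SD of exposure: {beta_ivw:.3f} (SE {se_ivw:.3f})")
```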

8.
Morbid obesity is a barrier to kidney transplantation due to inferior outcomes, including higher rates of new‐onset diabetes after transplantation (NODAT), delayed graft function (DGF), and graft failure. Laparoscopic sleeve gastrectomy (LSG) increases transplant eligibility by reducing BMI in kidney transplant candidates, but the effect of surgical weight loss on posttransplantation outcomes is unknown. Reviewing single‐center medical records, we identified all patients who underwent LSG before kidney transplantation from 2011‐2016 (n = 20). Post‐LSG kidney recipients were compared with similar‐BMI recipients who did not undergo LSG, using 2:1 direct matching for patient factors. McNemar's test and signed‐rank test were used to compare groups. Among post‐LSG patients, mean BMI ± standard deviation (SD) was 41.5 ± 4.4 kg/m2 at initial encounter, which decreased to 32.3 ± 2.9 kg/m2 prior to transplantation (P < .01). No complications, readmissions, or mortality occurred following LSG. After transplantation, one patient (5%) experienced DGF, and no patients experienced NODAT. Allograft and patient survival at 1‐year posttransplantation was 100%. Compared with non‐LSG patients, post‐LSG recipients had lower rates of DGF (5% vs 20%) and renal dysfunction–related readmissions (10% vs 27.5%) (P < .05 each). Perioperative complications, allograft survival, and patient survival were similar between groups. These data suggest that morbidly obese patients with end‐stage renal disease who undergo LSG to improve transplant candidacy achieve excellent posttransplantation outcomes.  相似文献

9.
Renal transplant recipients are at an increased risk of Methicillin‐resistant Staphylococcus aureus (MRSA) infection due to their immunosuppressed status. Herein, we investigate the incidence of MRSA infection in patients undergoing renal transplantation and determine the effect of MRSA colonisation on renal allograft function and overall mortality. Between January 1st 2007 and December 31st 2012, 1499 consecutive kidney transplants were performed in our transplant unit, and a retrospective 1:2 matched case‐control study was performed on this patient cohort. The 1‐, 3‐ and 5‐year overall graft survival rates were 100%, 86% and 78%, respectively, in MRSA positive recipients compared with 100%, 100% and 93%, respectively, in the control group (P < 0.05). The 1‐, 3‐ and 5‐year overall patient survival rates were 100%, 97% and 79%, respectively, in MRSA positive recipients compared with 100%, 100% and 95%, respectively, in the control group (P = 0.1). In a multiple logistic regression analysis, colonisation with MRSA pre‐operatively was an independent predictor for renal allograft failure at 5 years (hazard ratio: 4.6, 95% confidence interval: 1–30.7, P = 0.048). These findings demonstrate that the incidence of long‐term renal allograft failure is significantly greater in this patient cohort compared with a matched control population.  相似文献

10.
Long‐term outcomes are lacking for renal transplant recipients withdrawn from steroids and submitted to further minimization of the immunosuppressive regimen after 1 year. In this multicenter study, 204 low immunological risk kidney transplant recipients were randomized 14.2 ± 3.7 months post‐transplantation to receive either cyclosporine A (CsA) + azathioprine (AZA; n = 53), CsA + mycophenolate mofetil (MMF; n = 53), or CsA monotherapy (n = 98). At 3 years postrandomization, the occurrence of biopsy for graft dysfunction was similar in the bitherapy and monotherapy groups (21/106 vs. 26/98; P = 0.25). At 10 years postrandomization, patient survival was 100%, 94.2%, and 95.8% (P = 0.25), and death‐censored graft survival was 94.9%, 94.7%, and 95.2% (P = 0.34) in the AZA, MMF, and CsA groups, respectively. Mean estimated glomerular filtration rate was 70.4 ± 31.1, 60.1 ± 22.2, and 60.1 ± 19.0 ml/min/1.73 m2, respectively (P = 0.16). The incidence of biopsy‐proven acute rejection was 1.4%/year in the whole cohort. None of the patients developed polyomavirus‐associated nephropathy. The main cause of graft loss (n = 12) was chronic antibody‐mediated rejection (n = 6). De novo donor‐specific antibodies were detected in 13% of AZA‐, 21% of MMF‐, and 14% of CsA‐treated patients (P = 0.29). CsA monotherapy after 1 year is safe and associated with prolonged graft survival in well‐selected renal transplant recipients (ClinicalTrials.gov number: 980654).  相似文献

11.
Antibody-mediated rejection (ABMR) is a major cause of graft loss in renal transplantation. We assessed the predictive value of clinical, pathological, and immunological parameters at diagnosis for graft survival. We investigated 54 consecutive patients with biopsy-proven ABMR. Patients were treated according to our current standard regimen followed by triple maintenance immunosuppression. Patient characteristics, renal function, and HLA antibody status at diagnosis, baseline biopsy results, and immunosuppressive treatment were recorded. The risk of graft loss at 24 months after diagnosis and the eGFR slope were assessed. Multivariate analysis showed that eGFR at diagnosis and chronic glomerulopathy independently predict graft loss (HR 0.94; P = 0.018 and HR 1.57; P = 0.045) and eGFR slope (beta 0.46; P < 0.001 and beta −5.47; P < 0.001). Cyclophosphamide treatment (6× 15 mg/m2) plus high-dose intravenous immunoglobulins (IVIG) (1.5 g/kg) was superior compared with single-dose rituximab (1× 500 mg) plus low-dose IVIG (30 g) (HR 0.10; P = 0.008 and beta 10.70; P = 0.017) and one cycle of bortezomib (4× 1.3 mg/m2) plus low-dose IVIG (HR 0.16; P = 0.049 and beta 11.21; P = 0.010) regarding the risk of graft loss and the eGFR slope. In conclusion, renal function at diagnosis and histopathological signs of chronic ABMR seem to predict graft survival independent of the applied treatment regimen. Stepwise modifications of the treatment regimen may help to improve outcome.  相似文献   

12.
Apolipoprotein L‐1 (APOL1) gene variants are associated with end‐stage renal disease in African Americans (AAs). Here we investigate the impact of recipient APOL1 gene distributions on kidney allograft outcomes. We conducted a retrospective analysis of 119 AA kidney transplant recipients and found that 58 (48.7%) carried two APOL1 kidney disease risk variants. Contrary to the association seen in native kidney disease, there was no difference in allograft survival at 5 years posttransplant for recipients with high‐risk APOL1 genotypes. Thus, we conclude that recipient APOL1 genotypes do not increase the risk of allograft loss after kidney transplantation, and carrying two APOL1 risk alleles should not be an impediment to transplantation.  相似文献

13.
Simultaneous liver‐kidney transplantation (SLKT) is indicated for patients with end‐stage liver disease (ESLD) and concurrent renal insufficiency. En bloc SLKT is an alternative to traditional separate implantations, but studies comparing the two techniques are limited. The en bloc technique maintains renal outflow via donor infrahepatic vena cava and inflow via anastomosis of donor renal artery to donor splenic artery. Comparison of recipients of en bloc (n = 17) vs traditional (n = 17) SLKT between 2013 and 2017 was performed. Recipient demographics and comorbidities were similar. More recipients of traditional SLKT were dialysis dependent (82.4% vs 41.2%, P = .01) with lower baseline pretransplant eGFR (14 vs 18, P = .01). En bloc SLKT was associated with shorter kidney cold ischemia time (341 vs 533 minutes, P < .01) and operative time (374 vs 511 minutes, P < .01). Two en bloc patients underwent reoperation for kidney allograft inflow issues due to kinking and renal steal. Early kidney allograft dysfunction (23.5% in both groups), 1‐year kidney graft survival (88.2% vs 82.4%, P = 1.0), and posttransplantation eGFR were similar between groups. In our experience, the en bloc SLKT technique is safe and feasible, with comparable outcomes to the traditional method.  相似文献   

14.
Serial monitoring of peripheral blood lymphocyte subpopulation (PBLS) counts might be useful in predicting post‐transplant opportunistic infection (OI) after kidney transplantation (KT). PBLSs were prospectively measured in 304 KT recipients at baseline and at post‐transplant months 1 and 6. Areas under receiver operating characteristic curves were used to evaluate the accuracy of different subpopulations in predicting the occurrence of overall OI and, specifically, cytomegalovirus (CMV) disease. We separately analyzed patients not receiving (n = 164) or receiving (n = 140) antithymocyte globulin (ATG) as induction therapy. In the non‐ATG group, a CD8+ T‐cell count at month 1 <0.100 × 10³ cells/μl had negative predictive values of 0.84 and 0.86 for the subsequent occurrence of overall OI and CMV disease, respectively. In the multivariate Cox model, a CD8+ T‐cell count <0.100 × 10³ cells/μl was an independent risk factor for OI (adjusted hazard ratio: 3.55; P‐value = 0.002). In the ATG group, a CD4+ T‐cell count at month 1 <0.050 × 10³ cells/μl showed a negative predictive value of 0.92 for the subsequent occurrence of both overall OI and CMV disease. PBLS monitoring effectively identifies KT recipients at low risk of OI, providing an opportunity for individualizing post‐transplant prophylaxis practices.  相似文献
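
As a reminder of how the negative predictive values quoted above are obtained, the sketch below computes the NPV of a month-1 CD8+ T-cell cutoff from invented data. In this framing a count below the cutoff is the "positive" test, so the NPV is the proportion of patients at or above the cutoff who remain free of opportunistic infection.

```python
# Hedged sketch with invented data: negative predictive value (NPV) of a
# month-1 CD8+ T-cell count below 0.100 x 10^3 cells/uL for later opportunistic
# infection (OI). A low count is treated as the "positive" test.
import numpy as np

cd8_month1 = np.array([0.05, 0.32, 0.21, 0.08, 0.45, 0.12])      # x10^3 cells/uL, hypothetical
had_oi     = np.array([True, False, False, True, True, False])   # hypothetical outcomes

below_cutoff = cd8_month1 < 0.100        # "test positive"
npv = np.mean(~had_oi[~below_cutoff])    # share of test-negative patients who stayed OI-free
print(f"NPV of CD8+ count >= 0.100: {npv:.2f}")
```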

15.
IgA nephropathy (IgAN) is a frequent cause of chronic kidney disease (CKD) and progressive renal impairment. A native renal biopsy diagnosis of IgAN is a predictor of graft loss, with a relative risk of 47% but it is difficult to predict graft survival and progressive allograft dysfunction in these patients. Deletion of complement factor H-related genes 1 and 3 (delCFHR3-1) has been associated with a decreased risk of developing IgAN on native kidneys, but the impact on the graft in IgAN-transplanted patients is unknown. We hypothesized that delCFHR3-1 is also associated with the processes that influence graft survival in transplant recipients with IgAN and tested whether cellular senescence is involved in mediating graft damage. We found that patients carrying two copies of CFHR1-3 had a worse outcome (P = .000321) and presented increased FHR1 deposits at glomerular and tubulointerstitial level associated with higher expression of the senescence marker p16INK4a (P = .001) and tubulointerstitial fibrosis (P = .005). Interestingly, FHR1 deposits were associated with increased complement activation as demonstrated by C5b-9 deposits. These data support both the role of FHR1 in mediating complement activation and tubular senescence, and suggest the possibility of genotyping delCFHR3-1 to predict graft survival in IgAN-transplanted patients.  相似文献   

16.
The advent of direct-acting antivirals (DAAs) has provided the impetus to transplant kidneys from hepatitis C virus-positive donors into uninfected recipients (D+/R−). Thirty D+/R− patients received DAA treatment. Sustained virologic response (SVR12) was defined as an undetectable viral load 12 weeks after treatment. An age-matched cohort of uninfected donor and recipient pairs (D−/R−) transplanted during the same time period was used for comparison. The median day of viral detection was postoperative day (POD) 2. The detection of viremia in D+/R− patients was 100%. The initial median viral load was 531 copies/μL (range: 10-1 × 10⁸ copies/μL) with a median peak viral load of 3.4 × 10⁵ copies/μL (range: 804-1.0 × 10⁸ copies/μL). DAAs were initiated on median POD 9 (range: 5-41 days). All 30 patients had confirmed SVR12. During a median follow-up of 10 months, patient and graft survival was 100%, and acute rejection was 6.6% with no major adverse events related to DAA treatment. Delayed graft function was significantly decreased in D+/R− patients as compared to the age-matched cohort (27% vs 60%; P = .01). D+/R− transplantation offers patients an alternative strategy to increase access to transplantation.  相似文献

17.
Early pancreas loss in simultaneous pancreas–kidney (SPK) transplants has been associated with longer perioperative recovery and reduced kidney allograft function. We assessed the impact of early pancreas allograft failure on transplant outcomes in a contemporary cohort of SPK patients (n = 218). Early pancreas allograft loss occurred in 12.8% (n = 28) of recipients. Delayed graft function (DGF) was more common (21.4% vs. 7.4%, p = 0.03) in the early pancreas loss group, but there were no differences in hospital length of stay (median 6.5 vs. 7.0, p = 0.22), surgical wound complications (p = 0.12), or rejection episodes occurring in the first year (p = 0.87). Despite differences in DGF, both groups had excellent renal function at 1 year post‐transplant (eGFR 64.1 ± 20.8 vs. 65.8 ± 22.9, p = 0.75). There were no differences in patient (HR 0.58, 95% CI 0.18–1.87, p = 0.26) or kidney allograft survival (HR 0.84, 95% CI 0.23–3.06, p = 0.77). One‐ and 2‐year protocol kidney biopsies were comparable between the groups and showed minimal chronic changes; the early pancreas loss group showed more cv changes at 2 years (p = 0.04). Current data demonstrate good outcomes and excellent kidney allograft function following early pancreas loss.  相似文献   

18.
The post-transplant outcomes of patients with a Model for End-stage Liver Disease (MELD) score primarily driven by renal dysfunction are poorly understood. This was a retrospective cohort study of liver transplant (LT) alone recipients between 2005 and 2017 using the United Network for Organ Sharing (UNOS) database. The proportion of the MELD Sodium score attributable to creatinine (“KidneyMELD”) was calculated as (9.57 × ln(creatinine) × 100)/(MELD-Na − 6.43). The association of KidneyMELD with (a) all-cause mortality and (b) estimated glomerular filtration rate ≤30 mL/min/1.73 m2 at 1 year post-LT was evaluated. Recipients with KidneyMELD ≥50% had a 52% higher risk of post-LT mortality (adjusted hazard ratio 1.52 vs KidneyMELD 0%, 95% CI: 1.36-1.69; P < .001). This risk was significantly greater for older patients, particularly those older than 50 years at LT (interaction P < .001). KidneyMELD ≥50% was also associated with an 11-fold increase in the odds of advanced renal dysfunction at 1 year post-LT (adjusted odds ratio 11.53 vs KidneyMELD 0%; 95% CI 8.9-14.93; P < .001). Recipients prioritized for LT primarily on the basis of renal dysfunction have marked post-LT mortality and morbidity independent of MELD Sodium score. The implications of these results in the context of the new UNOS “safety net” kidney transplant policy require further study.  相似文献
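
A worked example of the “KidneyMELD” proportion defined above, i.e. the share of the MELD-Na score attributable to the creatinine term. The creatinine and MELD-Na values are invented purely to illustrate the arithmetic.

```python
# Hedged worked example of the KidneyMELD share defined above.
# KidneyMELD (%) = (9.57 * ln(creatinine) * 100) / (MELD-Na - 6.43)
# Input values are made up for illustration.
import math

creatinine = 3.0   # mg/dL (hypothetical)
meld_na = 28       # MELD-Na score (hypothetical)

kidney_meld_pct = (9.57 * math.log(creatinine) * 100) / (meld_na - 6.43)
print(f"KidneyMELD = {kidney_meld_pct:.1f}% of the MELD-Na score")  # about 48.7% here, near the >=50% high-risk group
```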

19.
Among the factors determining long-term kidney allograft outcome, pretransplant renal replacement therapy (RRT) is the most easily modifiable. Previous studies analysing the impact of RRT modality on patient and graft survival are conflicting, and studies on allograft function are scarce and lack sufficient size and follow-up. We retrospectively studied patient and allograft survival together with allograft function and its decline in 2277 allograft recipients during 2000–2014. Pretransplant RRT modality for ≥60 days, grouped into “no RRT” (n = 136), “haemodialysis (HD)” (n = 1847), “peritoneal dialysis (PD)” (n = 159), and “HD + PD” (n = 135), was evaluated. Kaplan–Meier analysis demonstrated superior 5-/10-/15-year patient (93.0/81.8/73.1% vs. 86.2/71.6/49.8%), death-censored graft (90.8/85.4/71.5% vs. 84.4/75.2/63.2%), and 1-year rejection-free graft survival (73.8% vs. 63.8%) in PD versus HD patients. Adjusted Cox regression revealed 34.5% [1.5–56.5%] lower hazards of death, whereas death-censored graft loss was similar [HR = 0.707 (0.469–1.064)] and rejection was less frequent [HR = 0.700 (0.508–0.965)]. Allografts showed higher 1-/3-/5-year estimated glomerular filtration rates (eGFR) in the “PD” versus “HD” group. The benefit of living donation for allograft function was most pronounced in the “no RRT” and “PD” groups. Functional allograft decline (eGFR slope) was lowest for “PD”. Allograft recipients on pretransplant PD versus HD demonstrated superior all-cause patient and rejection-free graft survival along with better allograft function (eGFR).  相似文献
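
A minimal sketch of the Kaplan–Meier comparison described above, plotting death-censored graft survival by pretransplant dialysis modality with the lifelines library. The data file and column names are assumptions, not the registry's actual fields.

```python
# Hedged sketch: Kaplan-Meier curves of death-censored graft survival by
# pretransplant renal replacement therapy modality (e.g. PD vs HD).
# File and column names are hypothetical placeholders.
import pandas as pd
import matplotlib.pyplot as plt
from lifelines import KaplanMeierFitter

df = pd.read_csv("allograft_recipients.csv")  # hypothetical: one row per recipient
ax = plt.subplot(111)
kmf = KaplanMeierFitter()
for modality in ["PD", "HD"]:
    grp = df[df["pretransplant_rrt"] == modality]
    kmf.fit(grp["years_to_graft_loss_or_censor"],
            event_observed=grp["death_censored_graft_loss"],
            label=modality)
    kmf.plot_survival_function(ax=ax)
ax.set_xlabel("Years after transplantation")
ax.set_ylabel("Death-censored graft survival")
plt.show()
```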

20.

Purpose

This study was conducted to determine the association between single-nucleotide polymorphisms (SNPs) in apoptosis-related genes and survival outcomes of patients with early-stage non-small-cell lung cancer (NSCLC).

Methods

Three hundred ten consecutive patients with surgically resected NSCLC were enrolled. Twenty-five SNPs in 17 apoptosis-related genes were genotyped using a Sequenom mass spectrometry-based genotyping assay. The genotype associations with overall survival (OS) and disease-free survival (DFS) were analyzed.

Results

Three SNPs (TNFRSF10B rs1047266, TNFRSF1A rs4149570, and PPP1R13L rs1005165) were significantly associated with survival outcomes on multivariate analysis. When the three SNPs were combined, OS and DFS decreased as the number of bad genotypes increased (P trend for OS and DFS = 7 × 10⁻⁵ and 1 × 10⁻⁴, respectively). Patients with one bad genotype and patients with two or three bad genotypes had significantly worse OS and DFS compared with those with no bad genotypes [adjusted hazard ratio (aHR) for OS = 2.27, 95% confidence interval (CI) = 1.22–4.21, P = 0.01, and aHR for DFS = 1.74, 95% CI = 1.08–2.81, P = 0.02; aHR for OS = 4.11, 95% CI = 2.03–8.29, P = 8 × 10⁻⁵, and aHR for DFS = 2.89, 95% CI = 1.64–5.11, P = 3 × 10⁻⁴, respectively].

Conclusion

Three SNPs in apoptosis-related genes were identified as possible prognostic markers of survival in patients with early-stage NSCLC. The SNPs, and particularly their combined genotypes, can be used to identify patients at high risk for poor disease outcome.  相似文献   
