Similar Documents
20 similar documents found (search time: 15 ms)
1.
2.
Hydrogen sulfide (H2S), produced from the metabolism of dietary sulfur-containing amino acids, has been proposed as a renoprotective compound. Twenty-four-hour urinary sulfate excretion (USE) may reflect H2S bioavailability. We aimed to investigate the association of USE with graft failure in a large prospective cohort of renal transplant recipients (RTR). We included 704 stable RTR, recruited at least 1 year after transplantation. We applied log-rank testing and Cox regression analyses to study the association of USE, measured from baseline 24 h urine samples, with graft failure. Median age was 55 [45–63] years (57% male; eGFR 45 ± 19 ml/min/1.73 m²). Median USE was 17.1 [13.1–21.1] mmol/24 h. Over a median follow-up of 5.3 [4.5–6.0] years, 84 RTR experienced graft failure. RTR in the lowest sex-specific tertile of USE experienced a higher rate of graft failure during follow-up than RTR in the middle and highest sex-specific tertiles (18%, 13%, and 5%, respectively; log-rank P < 0.001). In Cox regression analyses, USE was inversely associated with graft failure [HR per 10 mmol/24 h: 0.37 (0.24–0.55), P < 0.001]. The association remained independent of adjustment for potential confounders, including age, sex, eGFR, proteinuria, time between transplantation and baseline, BMI, smoking, and high-sensitivity C-reactive protein [HR per 10 mmol/24 h: 0.51 (0.31–0.82), P = 0.01]. In conclusion, this study demonstrates a significant inverse association of USE with graft failure in RTR, suggesting high H2S bioavailability as a novel, potentially modifiable factor for the prevention of graft failure in RTR.
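An analysis of this type (an adjusted hazard ratio per 10 mmol/24 h of USE) can be sketched with standard survival-analysis tooling. Below is a minimal, hedged illustration using the Python lifelines package; the file name, column names, and covariate list are hypothetical placeholders, not the study's actual dataset or variables.

```python
# Minimal sketch of a Cox proportional-hazards model for graft failure versus
# urinary sulfate excretion (USE), in the spirit of the abstract above.
# File and column names are hypothetical placeholders.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("rtr_cohort.csv")  # hypothetical file, one row per recipient

# Rescale USE so the hazard ratio is expressed per 10 mmol/24 h, as in the abstract.
df["use_per10"] = df["use_mmol_24h"] / 10.0

covariates = ["use_per10", "age", "sex_male", "egfr", "proteinuria",
              "years_since_tx", "bmi", "smoking", "hs_crp"]

cph = CoxPHFitter()
cph.fit(df[["followup_years", "graft_failure"] + covariates],
        duration_col="followup_years", event_col="graft_failure")
cph.print_summary()  # exp(coef) for use_per10 is the adjusted HR per 10 mmol/24 h
```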

3.
4.
There has been an increase in the number of older patients on the transplant waiting list and in the acceptance of older donor kidneys. Although kidneys from older donors have been associated with poorer graft outcomes, whether donor age has a differential impact on outcomes in older recipients remains unclear. The aim of this study was to evaluate the effect of donor age on graft and patient survival in renal transplant (RT) recipients ≥60 years. Using the Australia and New Zealand Dialysis and Transplant Registry, outcomes of 1,037 RT recipients ≥60 years transplanted between 1995 and 2009 were analyzed. Donor age was categorized into 0-20, >20-40, >40-60, and >60 years. Compared with recipients receiving donor kidneys >60 years, those receiving donor kidneys >20-40 years had a lower risk of acute rejection (odds ratio 0.46, 95% CI 0.27, 0.79; P<0.01) and death-censored graft failure (HR 0.37, 95% CI 0.19, 0.72; P<0.01). There was no association between donor age groups and death. Given the corresponding growth in the availability of older donor kidneys and the observed lack of association between donor age and patient survival in RT recipients ≥60 years, preferential allocation of older donor kidneys to RT recipients ≥60 years may not disadvantage the life expectancy of these patients.
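The acute-rejection comparison above is a logistic model with donor age as a categorical predictor and the oldest donor group as reference. A minimal sketch with statsmodels follows; the file name, column names, and adjustment set are hypothetical, and this reproduces only the general form of such a model, not the registry analysis itself.

```python
# Sketch of a logistic model for acute rejection by donor age category,
# with donors >60 years as the reference group, as in the abstract above.
# File and column names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("rt_recipients_60plus.csv")  # hypothetical registry extract

# donor_age_group holds the categories "0-20", ">20-40", ">40-60", ">60"
model = smf.logit(
    "acute_rejection ~ C(donor_age_group, Treatment(reference='>60'))",
    data=df,
).fit()

odds_ratios = np.exp(model.params)    # odds ratios vs the >60-year reference group
conf_int = np.exp(model.conf_int())   # 95% CIs on the odds-ratio scale
print(pd.concat([odds_ratios, conf_int], axis=1))
```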

5.
In this study we aimed to compare patient and graft survival of kidney transplant recipients who received a kidney from a living-related donor (LRD) or living-unrelated donor (LUD). Adult patients in the ERA-EDTA Registry who received their first kidney transplant in 1998–2017 were included. Ten-year patient and graft survival were compared between LRD and LUD transplants using Cox regression analysis. In total, 14 370 patients received a kidney from a living donor. Of those, 9212 (64.1%) grafts were from a LRD, 5063 (35.2%) from a LUD and for 95 (0.7%), the donor type was unknown. Unadjusted five-year risks of death and graft failure (including death as event) were lower for LRD transplants than for LUD grafts: 4.2% (95% confidence interval [CI]: 3.7–4.6) and 10.8% (95% CI: 10.1–11.5) versus 6.5% (95% CI: 5.7–7.4) and 12.2% (95% CI: 11.2–13.3), respectively. However, after adjusting for potential confounders, associations disappeared with hazard ratios of 0.99 (95% CI: 0.87–1.13) for patient survival and 1.03 (95% CI: 0.94–1.14) for graft survival. Unadjusted risk of death-censored graft failure was similar, but after adjustment, it was higher for LUD transplants (1.19; 95% CI: 1.04–1.35). In conclusion, patient and graft survival of LRD and LUD kidney transplant recipients was similar, whereas death-censored graft failure was higher in LUD. These findings confirm the importance of both living kidney donor types.
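The abstract contrasts two endpoint definitions: graft failure with death counted as an event, and death-censored graft failure. A small sketch of how the two event indicators might be coded is shown below; the file and column names are hypothetical placeholders, and follow-up time is assumed to be time to graft failure, death, or censoring, whichever comes first.

```python
# Sketch: coding the two graft-survival endpoints contrasted in the abstract.
# File and column names are hypothetical placeholders for registry fields.
import pandas as pd

df = pd.read_csv("living_donor_transplants.csv")  # hypothetical extract

# "Graft failure including death as event": death with a functioning graft counts as an event.
df["event_composite"] = (
    (df["graft_failed"] == 1) | (df["died_with_function"] == 1)
).astype(int)

# Death-censored graft failure: death with a functioning graft is treated as censoring.
df["event_death_censored"] = (df["graft_failed"] == 1).astype(int)
```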

6.
7.
The aim was to evaluate the association of molecular-level human leukocyte antigen (HLA) mismatching with post-transplant graft survival, rejection, and cardiac allograft vasculopathy (CAV). We retrospectively analyzed all primary cardiac transplant recipients between 01/1984 and 06/2016. 1167 patients fulfilled the inclusion criteria and had HLA typing information available. In 312 donor-recipient pairs, typing at the serological split antigen level was available. We used the Epitope MisMatch Algorithm to calculate the number of amino acid differences in antibody-verified HLA eplets (amino acid mismatch load, AAMM) between donor and recipient. Patients with a higher HLA-DR AAMM load had inferior 1-year graft survival (hazard ratio [HR], 1.14; 95% confidence interval [CI], 1.01–1.28). The HLA-AB AAMM load showed no impact on graft survival. In the subgroup with available split-level information, we observed inferior graft survival for a higher HLA-DR AAMM load 3 months after transplantation (HR, 1.22; 95% CI, 1.04–1.44) and a higher risk of rejection with increasing HLA-AB (HR, 1.70; 95% CI, 1.29–2.24) and HLA-DR (HR, 1.32; 95% CI, 1.09–1.61) AAMM loads. No impact on the development of CAV was found. Molecular-level HLA mismatch analysis could serve as a tool for risk stratification after heart transplantation and might take us one step further toward precision medicine.
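Conceptually, a mismatch load of this kind counts donor epitopes (eplets or polymorphic amino-acid positions) that the recipient's own HLA repertoire lacks. The toy sketch below illustrates that idea as a simple set difference; it is not the Epitope MisMatch Algorithm used in the study, and the eplet labels are invented.

```python
# Toy illustration of a mismatch load as a set difference: the number of donor
# eplets (or polymorphic amino-acid positions) absent from the recipient.
# NOT the actual Epitope MisMatch Algorithm; eplet labels below are invented.
def mismatch_load(donor_eplets: set[str], recipient_eplets: set[str]) -> int:
    """Count donor eplets that the recipient's own HLA repertoire does not carry."""
    return len(donor_eplets - recipient_eplets)

donor = {"45GV", "62GE", "71TD", "74EL"}   # hypothetical eplet labels
recipient = {"45GV", "62GE", "80N"}
print(mismatch_load(donor, recipient))      # -> 2
```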

8.
Extracorporeal circulation is accompanied by changes in red blood cell (RBC) morphology and structural integrity that affect cell function and survival, and may thereby contribute to the various side effects of heart–lung machine-assisted surgery. Our main objective was to determine the effect of circulating red blood cells in a stand-alone extracorporeal circuit on several parameters that are known to be affected by, as well as to contribute to, red blood cell aging. As a source of RBCs, we used blood bank storage units of different ages. To assess the relevance of our in vitro observations for the characterization of extracorporeal circulation technology, we compared these changes with those observed in patients undergoing extracorporeal circulation-assisted cardiac surgery. Our results show that circulation in a heart–lung machine is accompanied by changes in red blood cell volume, an increase in osmotic fragility, changes in deformability and aggregation behavior, and alterations in phosphatidylserine exposure and microvesicle generation. RBCs from 1-week-old concentrates showed the highest similarity to the in vivo situation. These changes in key characteristics of the red blood cell aging process likely increase the susceptibility of red blood cells to the various mechanical, osmotic, and immunological stress conditions encountered during and after surgery in the patient's circulation, and thereby contribute to the side effects of surgery. Thus, aging-related parameters of red blood cell structure and function provide a foundation for the validation and improvement of extracorporeal circulation technology.

9.
Hypertension and nephrotoxicity are frequent complications of cyclosporine-based immunosuppression in renal transplant recipients. Long-term antihypertensive treatment is obligatory for hypertensive transplant patients in order to protect allograft function. The use of angiotensin-converting enzyme (ACE) inhibitors for antihypertensive treatment of renal transplant recipients receiving cyclosporine immunosuppression has long been controversial. The aim of this prospective study, with a duration of 2 years and a follow-up of another 3 years, was to estimate the long-term antihypertensive potential of quinapril compared with that of the beta-blocker atenolol, and to compare their effects on renal allograft function and proteinuria in 96 hypertensive renal transplant recipients who received cyclosporine A as immunosuppressive therapy. Patients were randomly assigned to receive either quinapril (group Q) or atenolol (group A) as antihypertensive treatment. Forty patients in each group completed the 5-year observation period according to protocol. Intention-to-treat and according-to-protocol analyses were performed. Starting from similar baseline blood pressure values, both agents, atenolol and quinapril, decreased systolic and diastolic blood pressure (SBP, DBP) as well as mean arterial pressure (MAP) and pulse pressure (PP) to a similar extent (ΔSBP: group Q -8±3 vs group A -5±3 mmHg; ΔDBP: -5±2 vs -4±2 mmHg; ΔMAP: -6±2 vs -5±2 mmHg; ΔPP: -2±2 vs -1±3 mmHg; mean ± SEM). Neither serum creatinine levels nor Cockcroft-Gault clearance had changed significantly in either group after the 5-year period (Δcreatinine: 0.1±0.1 vs 0.2±0.2 mg/dl; ΔCockcroft-Gault clearance: 3.9±4.6 vs 2.8±4.3 ml/min; mean ± SEM). Urinary protein excretion remained stable among the quinapril-treated patients, whereas a significant increase was observed in the atenolol group during the 5-year study period (group Q: from 0.52±0.08 to 0.54±0.14 g/24 h; group A: from 0.34±0.03 to 0.72±0.13 g/24 h, P<0.02; mean ± SEM). Albuminuria increased comparably in both groups, while the excretion of alpha-microglobulin increased slightly in the atenolol group but decreased slightly in the quinapril group; the difference between the groups did not reach statistical significance (ANOVA, P<0.056). In conclusion, quinapril and atenolol may be considered suitable and safe agents for the long-term treatment of hypertensive renal transplant recipients, since both proved effective as antihypertensive treatment and kept allograft function stable over a period of 5 years.

10.
11.
The COVID-19 pandemic has significantly changed the landscape of kidney transplantation in the United States and worldwide. In addition to adversely affecting allograft and patient survival in kidney transplant recipients, the pandemic has affected all aspects of transplant care, including transplant referrals and listing, organ donation rates, organ procurement and shipping, and waitlist mortality. Critical decisions were made during this period by transplant centers and individual transplant physicians, taking into consideration patient safety and resource utilization. As countries have begun administering COVID-19 vaccines, new and important considerations pertinent to the transplant population have arisen. This comprehensive review focuses on the impact of COVID-19 on kidney transplantation rates, mortality, policy decisions, and the clinical management of transplant recipients infected with COVID-19.

12.
13.
14.
Kidney transplant recipients (KTRs) have an increased cancer risk compared to the general population, but absolute risks, which better reflect the clinical impact of cancer, are seldom estimated. All KTRs in Sweden, Norway, Denmark, and Finland with a first transplantation between 1995 and 2011 were identified through national registries. Post-transplantation cancer occurrence was assessed through linkage with cancer registries. We estimated standardized incidence ratios (SIR), absolute excess risks (AER), and cumulative incidence of cancer in the presence of competing risks. Overall, 12 984 KTRs developed 2215 cancers. The overall cancer incidence rate was increased threefold (SIR 3.3, 95% confidence interval [CI]: 3.2–3.4). The AER of any cancer was 1560 cases (95% CI: 1468–1656) per 100 000 person-years. The highest AERs were observed for nonmelanoma skin cancer (838, 95% CI: 778–901), non-Hodgkin lymphoma (145, 95% CI: 119–174), lung cancer (126, 95% CI: 98.2–149), and kidney cancer (122, 95% CI: 98.0–149). The five- and ten-year cumulative incidence of any cancer was 8.1% (95% CI: 7.6–8.6%) and 16.8% (95% CI: 16.0–17.6%), respectively. Excess cancer risks were observed among Nordic KTRs for a wide range of cancers. Overall, 1 in 6 patients developed cancer within ten years, supporting extensive post-transplantation cancer vigilance.
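Both summary measures quoted above have simple closed forms: SIR = observed/expected cases, and AER = (observed − expected)/person-years × 100 000. A minimal sketch follows; the example numbers are invented for illustration and are not the study's counts.

```python
# Sketch of standardized incidence ratio (SIR) and absolute excess risk (AER).
# The example values are invented for illustration; they are not the study's data.
def sir(observed: float, expected: float) -> float:
    """Standardized incidence ratio: observed / expected cases."""
    return observed / expected

def aer_per_100k(observed: float, expected: float, person_years: float) -> float:
    """Absolute excess risk per 100 000 person-years."""
    return (observed - expected) / person_years * 100_000

obs, exp_cases, pyrs = 2215, 670.0, 99_000.0   # hypothetical example values
print(f"SIR = {sir(obs, exp_cases):.2f}")
print(f"AER = {aer_per_100k(obs, exp_cases, pyrs):.0f} per 100 000 person-years")
```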

15.
Accurate assessment of renal function is of key importance, given its prognostic value. However, gold standard measures are cumbersome, and serum creatinine itself is an insensitive predictor, especially in renal transplant recipients. Although GFR-estimating formulae are widely relied upon, they have their own limitations. Renal biomarkers such as neutrophil gelatinase-associated lipocalin (NGAL) and cystatin C, among others, are now emerging as potentially useful indicators of GFR. We aimed to evaluate the diagnostic performance of NGAL versus cystatin C and eGFR estimated using the CKD-EPI, MDRD, and cystatin C-based equations in renal transplant recipients and non-transplant CKD patients. We found a significant correlation between NGAL, serum creatinine, cystatin C, and eGFR; the latter parameters were also strong predictors of serum NGAL levels. However, the performance of NGAL, based on receiver operating characteristic curves, was inferior to that of the reference tests. In renal transplant recipients, NGAL appears to correlate well with cystatin C and eGFR, most strongly with the cystatin C-based equation. Although this suggests a potential use of NGAL as a screening test, its weaker diagnostic performance raises some concern about its clinical usefulness. Larger studies are needed to explore this further.
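Diagnostic performance comparisons of this kind are typically summarized by the area under the receiver operating characteristic curve. The sketch below shows such a comparison with scikit-learn; the data file, column names, and the eGFR cut-off used to define the binary reference are hypothetical choices, not the study's protocol.

```python
# Sketch: comparing candidate biomarkers by ROC AUC against a binary reference.
# File and column names, and the eGFR cut-off of 60 ml/min/1.73 m², are hypothetical.
import pandas as pd
from sklearn.metrics import roc_auc_score

df = pd.read_csv("biomarker_cohort.csv")              # hypothetical dataset
impaired = (df["egfr_ckd_epi"] < 60).astype(int)       # reference: reduced GFR yes/no

for marker in ["ngal", "cystatin_c", "creatinine"]:
    # Higher marker values indicate worse renal function, so raw values are scored directly.
    auc = roc_auc_score(impaired, df[marker])
    print(f"{marker}: AUC = {auc:.2f}")
```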

16.
Renal replacement therapy (RRT) in the setting of acute kidney injury (AKI) is generally provided through either tunneled or nontunneled dialysis catheters (TDCs or NTDCs), used immediately after insertion. Current consensus guidelines suggest using NTDCs rather than TDCs for vascular access in AKI primarily for logistical reasons, including ease of insertion and timeliness. However, there is increasing evidence that, compared to NTDCs, TDCs are associated with fewer complications (mechanical and infectious) and better dialysis delivery. Nevertheless, this evidence must be balanced against the feasibility and practicality of implementing a "TDC-first" approach. In this paper, we assess the current evidence base for vascular access choice in AKI requiring RRT. We make the case for increased use of TDCs as first-line vascular access, given growing observational evidence for improved patient outcomes compared to NTDCs, including decreased risk of infection and thrombosis, increased blood flow rates, and fewer treatment interruptions. We advocate for further research to test the feasibility and outcomes associated with a TDC-first approach to AKI-RRT access. A TDC-first approach has the potential to improve RRT clinical outcomes and reduce resource utilization and cost.

17.
18.
Study objective: To assess the risk of postoperative acute kidney injury (AKI) after major urologic surgery for different intraoperative hypotension thresholds, expressed as time below a fixed threshold. We hypothesized that the duration of hypotension below a given threshold is a risk factor for AKI in major urologic procedures as well.
Design: Retrospective observational cohort series.
Setting: Single tertiary high-caseload center.
Patients: 416 consecutive patients undergoing open radical cystectomy, pelvic lymph node dissection, and urinary diversion between 2013 and 2019.
Interventions: None.
Measurements: We analyzed intraoperative data and their correlation with postoperative AKI, classified according to the Acute Kidney Injury Network criteria. Patients were grouped by time spent below MAP thresholds of <65 mmHg, <60 mmHg, and <55 mmHg. The probability of developing postoperative AKI, using all risk variables as well as the hypotension threshold variables (minutes below a given threshold), was calculated using logistic regression methods.
Main results: Postoperative AKI was diagnosed in 128/416 patients (30.8%). Multiple logistic regression analysis showed that minutes below a threshold of 65 mmHg (OR 1.010 [1.005–1.015], P < 0.001) and 60 mmHg (OR 1.012 [1.001–1.023], P = 0.02) were associated with an increased risk of AKI. On average, 26.5% (MAP <65 mmHg), 50.0% (MAP <60 mmHg), and 76.5% (MAP <55 mmHg) of minutes below a given threshold occurred between induction of anesthesia and start of surgery and are thus fully attributable to anesthesiological management.
Conclusions: Our results suggest that avoiding intraoperative MAP lower than 65 mmHg, and especially lower than 60 mmHg, may help protect postoperative renal function in cystectomy patients. The time between induction of anesthesia and surgical incision warrants special attention, as a relevant share of hypotension occurs in this period.
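The threshold analysis described under Measurements amounts to a logistic regression of AKI on minutes spent below a given MAP threshold, with the odds ratio expressed per additional minute. A minimal sketch with statsmodels follows; the file name, column names, and adjustment covariates are hypothetical placeholders, not the study's variable set.

```python
# Sketch: logistic regression of postoperative AKI on minutes below a MAP threshold,
# fitted separately per threshold, with the OR expressed per additional minute.
# File and column names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cystectomy_cohort.csv")  # hypothetical extract, one row per patient

for col in ["min_below_map65", "min_below_map60", "min_below_map55"]:
    model = smf.logit(f"aki ~ {col} + age + baseline_egfr + asa_class",
                      data=df).fit(disp=False)
    or_per_min = np.exp(model.params[col])
    print(f"{col}: OR per minute = {or_per_min:.3f}")
```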

19.
The dynamic response of heart and its injury involving chest impact (total citations: 2; self-citations: 0; citations by others: 2)
Research Institute of Surgery, Daping Hospital, Third Military Medical University, Chongqing 400042, China (Liu BS, Wang ZG, Weng GW, Yang ZH, L...)

20.
Ganciclovir (GCV) inhibits spermatogenesis in preclinical studies, but its long-term effects on fertility in renal transplant patients are unknown. In a prospective, multicenter, open-label, nonrandomized study, male patients were assigned to Cohort A [valganciclovir (VGCV), a prodrug of GCV] (n = 38) or Cohort B (no VGCV) (n = 21) according to cytomegalovirus prophylaxis requirement. Changes in semen parameters and DNA fragmentation were assessed via a mixed-effects linear regression model accounting for baseline differences. Sperm concentration increased post-transplant, but between baseline and the end of treatment (mean 164 days in Cohort A, 211 days in Cohort B), the model-based change was lower in Cohort A (difference: 43.82 × 10⁶/ml; P = 0.0038). Post-treatment, sperm concentration increased in Cohort A so that by the end of follow-up (6 months post-treatment) changes were comparable between cohorts (difference: 2.09 × 10⁶/ml; P = 0.92). Most patients' sperm concentration improved by the end of follow-up; none with normal baseline concentrations (≥20 × 10⁶/ml) were abnormal at the end of follow-up. Changes in seminal volume, sperm motility/morphology, DNA fragmentation, and hormone levels were comparable between cohorts at the end of follow-up. Improvement in semen parameters after renal transplantation was delayed in men receiving VGCV, but 6 months post-treatment the parameters were comparable between cohorts.
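A mixed-effects linear regression of the kind described above handles repeated semen measurements per patient while comparing change over time between cohorts. Below is a minimal sketch with statsmodels; the file name, column names, and the simple random-intercept structure are hypothetical choices, not the study's actual model specification.

```python
# Sketch of a mixed-effects linear model for repeated semen measurements,
# comparing change over time between cohorts (A = VGCV, B = no VGCV).
# File and column names, and the random-intercept structure, are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("semen_followup_long.csv")  # hypothetical long-format data:
                                             # one row per patient per visit

# Fixed effects: cohort, visit, and their interaction (cohort-specific change over time);
# a random intercept per patient accounts for repeated measurements.
model = smf.mixedlm("sperm_concentration ~ C(cohort) * C(visit)",
                    data=df, groups=df["patient_id"]).fit()
print(model.summary())
```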
