Similar articles
20 similar articles found.
1.
The outcomes of split-liver transplantation are controversial. This study compared outcomes and morbidity after extended right lobe liver transplantation (ERLT) and whole liver transplantation (WLT) in adults. MEDLINE and Web of Science databases were searched systematically and without restrictions for studies on ERLT and its impact on graft and patient survival and postoperative complications. Odds ratios (OR) and 95% confidence intervals (CI) for graft loss and patient mortality were assessed by meta-analyses using Mantel–Haenszel tests with a random-effects model. Vascular and biliary complications, primary nonfunction, 3-month, 1-, and 3-year graft and patient survival, and retransplantation after ERLT and WLT were analyzed. The literature search yielded 10 594 articles. After exclusion, 22 studies (n = 75 799 adult transplant patients) were included in the analysis. ERLT was associated with lower 3-month (OR = 1.43, 95% CI = 1.09–1.89, P = 0.01), 1-year (OR = 1.46, 95% CI = 1.08–1.97, P = 0.01), and 3-year (OR = 1.37, 95% CI = 1.01–1.84, P = 0.04) graft survival. Whole liver grafts were less often associated with retransplantation (OR = 0.57; 95% CI = 0.41–0.80; P < 0.01), vascular complications (OR = 0.53, 95% CI = 0.38–0.74, P < 0.01), and biliary complications (OR = 0.67; 95% CI = 0.47–0.95; P = 0.03). Classifying ERLT as a major extended donor criterion is justified because ERL grafts are associated with vasculobiliary complications and the need for retransplantation, and have a negative influence on graft survival.
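The random-effects pooling of odds ratios described above can be sketched in a few lines. Below is a minimal DerSimonian–Laird implementation; the two 2×2 tables in the example are invented for illustration and are not data from the study:

```python
import math

def pooled_or_random_effects(studies):
    """DerSimonian-Laird random-effects pooling of odds ratios.

    studies: list of (a, b, c, d) 2x2 tables where
      a = events in exposed, b = non-events in exposed,
      c = events in control, d = non-events in control.
    Returns (pooled_OR, ci_low, ci_high) with a 95% CI.
    """
    y = [math.log((a * d) / (b * c)) for a, b, c, d in studies]   # per-study log ORs
    v = [1/a + 1/b + 1/c + 1/d for a, b, c, d in studies]         # per-study variances
    w = [1 / vi for vi in v]                                      # fixed-effect weights
    y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    # Cochran's Q and the DL estimate of between-study variance tau^2
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(studies) - 1)) / c)
    w_star = [1 / (vi + tau2) for vi in v]                        # random-effects weights
    y_pooled = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return (math.exp(y_pooled),
            math.exp(y_pooled - 1.96 * se),
            math.exp(y_pooled + 1.96 * se))

# two illustrative (made-up) studies: (graft losses, survivors) per arm
or_, lo, hi = pooled_or_random_effects([(30, 70, 20, 80), (25, 75, 18, 82)])
```

With these toy tables the pooled OR lands between the two study estimates, and the CI widens if between-study heterogeneity (tau²) is nonzero.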

2.
The success of direct-acting antiviral (DAA) therapy has led to near-universal cure for patients chronically infected with hepatitis C virus (HCV) and improved post–liver transplant (LT) outcomes. We investigated the trends and outcomes of retransplantation in HCV and non-HCV patients before and after the introduction of DAA. Adult patients who underwent re-LT were identified in the Organ Procurement and Transplantation Network/United Network for Organ Sharing database. Multiorgan transplants and patients with >2 total LTs were excluded. Two eras were defined: pre-DAA (2009-2012) and post-DAA (2014-2017). A total of 2112 re-LT patients were eligible (HCV: n = 499 pre-DAA and n = 322 post-DAA; non-HCV: n = 547 pre-DAA and n = 744 post-DAA). HCV patients had both improved graft and patient survival after re-LT in the post-DAA era. One-year graft survival was 69.8% pre-DAA and 83.8% post-DAA (P < .001). One-year patient survival was 73.1% pre-DAA and 86.2% post-DAA (P < .001). Graft and patient survival were similar between eras for non-HCV patients. When adjusted, the post-DAA era represented an independent positive predictive factor for graft and patient survival (hazard ratio [HR]: 0.67; P = .005, and HR: 0.65; P = .004) only in HCV patients. The positive post-DAA era effect was observed only in HCV patients with first graft loss due to disease recurrence (HR: 0.31; P = .002 and HR: 0.32; P = .003, respectively). Among HCV patients, receiving a re-LT in the post-DAA era was associated with improved patient and graft survival.

3.
We aimed to assess the effect of donor pancreas extraction time (ET) on postoperative complications and graft function after pancreas transplantation (PT). We analyzed all consecutive donor pancreas procurements for simultaneous pancreas and kidney transplantation (SPK) and the associated PTs in a Swiss transplant center over a 20-year period. Pancreas ET was defined as the time from cold flush to static storage of the pancreas on ice. The primary endpoint was the effect of ET on surgical complications; secondary endpoints comprised its effect on graft function (insulin-free survival) and graft pancreatitis. Of 115 procured pancreas grafts, the median donor pancreas ET was 65 min (IQR: 48–78 min). In multivariable analysis, ET did not significantly affect major complications (OR 1.41 [95% CI: .59–3.36]; p = .438) or insulin-free survival (HR 1.42 [95% CI: .55–3.63]; p = .459). The median CIT was 522 (441–608) min. CIT was associated with major complications (OR 2.51 [95% CI: 1.11–5.68]; p = .027) but had no impact on insulin-free survival (HR 1.94 [95% CI: .84–4.48]; p = .119). Patients with and without graft pancreatitis had no statistically significant differences in ET and CIT (p = .164 and p = .47, respectively). In multivariable analysis, amylase levels > 270 U/L on postoperative day 1 were significantly associated with major complications (OR 3.61 [95% CI: 1.06–12.32]; p = .040). Our results suggest that ET has no effect on complications or graft function after PT, whereas shorter CIT may reduce major surgical complications. These results could have been influenced by the exceptional quality of the pancreas donors in Switzerland, with short travel distances and preservation times.

4.
Background: Liver transplantation (LT) has gained interest in the treatment of unresectable colorectal liver metastases (CRLM) over the last two decades. Despite the initial poor outcomes, recent reports from countries with graft abundance have provided further insights into the potential of LT as a treatment for unresectable CRLM. Methods: A systematic literature search was conducted in the MEDLINE (PubMed), Embase, Scopus, Cochrane Library, Google Scholar, Virtual Health Library, Clinicaltrials.gov, and Web of Science databases (end-of-search date: January 27th, 2020) to identify relevant studies. Pooled overall and recurrence-free survival analysis at 6 months, 1, 2, 3, and 5 years was conducted with the Kaplan-Meier (product-limit) method. Results: Eighteen studies comprising 110 patients were included. The population consisted of 59.8% males with a mean age of 52.3 ± 9.3 years. CRLM diagnosis was synchronous in 83%, while 99% received chemotherapy and 39% underwent liver resection prior to LT. The mean time from primary tumor resection to LT was 39.5 ± 32.5 months, the mean post-LT follow-up was 32.1 ± 22.2 months, and the mean time to recurrence was 15.0 ± 11.3 months. The pooled 6-month, 1-, 2-, 3-, and 5-year overall survival rates were 95.7% (95% CI: 89.1%–98.4%), 88.1% (95% CI: 79.6%–93.2%), 74.6% (95% CI: 64.2%–82.3%), 58.4% (95% CI: 47.2%–62.0%), and 50.5% (95% CI: 39.0%–61.0%), respectively. The pooled 6-month, 1-, 2-, 3-, and 5-year recurrence-free survival rates were 77.2% (95% CI: 67.2%–84.5%), 59.9% (95% CI: 49.0%–69.2%), 42.4% (95% CI: 31.8%–52.6%), 30.7% (95% CI: 20.9%–41.1%), and 25.6% (95% CI: 16.2%–36.0%), respectively. Conclusion: LT should be considered in patients with unresectable liver-only CRLM under strict selection criteria and only within well-designed research protocols. Ongoing studies are expected to further elucidate the indications for and prognosis of patients undergoing LT for unresectable CRLM.

5.
Small series have suggested that split liver transplantation (SLT) carries an increased frequency of peri-operative acute kidney injury (AKI). However, optimal donor selection in this setting could have a favourable impact on renal outcomes. This was a retrospective single-centre study of 76 adults who underwent SLT (right extended lobe) and 301 adults who underwent elective full-size donation-after-brain-death liver transplantation (FSLT). SLT recipients were less likely than unmatched FSLT recipients to develop AKI (≥stage 1 by KDIGO criteria) (40.3% vs. 56.1%, P = 0.016) and had a reduced frequency of renal replacement therapy (11.8% vs. 21.9%, P = 0.049). In 72 pairs of SLT patients and propensity score-matched FSLT controls, the incidence of AKI was not significantly different (40.3% vs. 47.2%, P = 0.473). However, SLT patients were less likely to require renal replacement therapy (11.1% vs. 23.6%, P = 0.078; adjusted OR 0.32; 95% CI 0.11–0.87, P = 0.026). There was no association between SLT and the development of chronic kidney disease (eGFR <60 ml/min/1.73 m², log-rank P = 0.534). In conclusion, SLT is not associated with an increased frequency of AKI. These observations support the hypothesis that the favourable donor status in SLT may result in less graft injury, with renal-sparing effects.
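Propensity-score matching of the kind used above (matched FSLT controls) is often implemented as a greedy 1:1 nearest-neighbour match within a caliper. A minimal sketch follows; the patient IDs, scores, and caliper value are invented for illustration, and the study's actual matching algorithm is not stated:

```python
def nearest_neighbor_match(treated, controls, caliper=0.05):
    """Greedy 1:1 propensity-score matching without replacement.

    treated, controls: lists of (id, propensity_score) pairs.
    Returns a list of (treated_id, control_id) pairs whose score
    difference is within the caliper.
    """
    available = dict(controls)                  # control id -> score
    pairs = []
    # match the hardest-to-match cases first: treated sorted by descending score
    for tid, ps in sorted(treated, key=lambda x: -x[1]):
        if not available:
            break
        cid = min(available, key=lambda c: abs(available[c] - ps))
        if abs(available[cid] - ps) <= caliper:
            pairs.append((tid, cid))
            del available[cid]                  # each control used at most once

    return pairs

pairs = nearest_neighbor_match(
    treated=[("S1", 0.31), ("S2", 0.42)],
    controls=[("F1", 0.30), ("F2", 0.44), ("F3", 0.90)],
)
```

A treated patient with no control inside the caliper simply goes unmatched, which is why matched cohorts (72 pairs here) can be smaller than the treated group (76 SLT recipients).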

6.
《Transplantation proceedings》2022,54(7):1786-1794
Background: The aim of this study was to evaluate the effect of recipient obesity on posttransplant complications and patient and graft survival. Methods: A single-institution, retrospective study was performed on obese renal transplant recipients (BMI ≥ 30 kg/m², n = 102) transplanted from January 2010 to December 2018, matched with nonobese recipients (BMI < 30 kg/m², n = 204). For every obese patient, we selected 2 nonobese patients of similar age, sex, and period of transplantation. The comparative analysis included patient and graft survival as primary outcomes and graft function and postoperative complications as secondary outcomes. Results: Recipient demographics were comparable in both groups except for diabetic nephropathy, which was more frequent in obese patients (P = .0006). Obesity was strongly related to poorer patient survival (risk ratio [RR] = 2.83; 95% CI 1.14-7.04; P = .020), but there was no observed difference in graft survival (P = .6). While early graft function was inferior in the obese population (RR = 2.41; 95% CI 1.53-3.79; P = .00016), during late follow-up no statistically significant differences were observed between the groups (P = .36). Obese recipients had a significantly higher risk of delayed graft function (RR = 1.93; 95% CI 1.19-3.1; P = .0077), myocardial infarction (RR = 7; 95% CI 1.68-29.26; P = .0042), wound infections (RR = 8; 95% CI 1.96-32.87; P = .0015), diabetes aggravation (RR = 3.13; 95% CI 1.29-7.6; P = .011), and surgical revision for eventration (RR = 8; 95% CI 1.22-52.82; P = .026) when compared with nonobese recipients. Conclusions: Despite the inferior early kidney graft function in obese recipients, there was no difference at long-term follow-up. However, recipient obesity had a negative effect on patient survival and postoperative complications.

7.
J. Zhang  W. Jiang  Q. Zhou  M. Ni  S. Liu  P. Zhu  Q. Wu  W. Li  M. Zhang  X. Xia 《Andrologia》2016,48(9):970-977
A CAG repeat in the POLG gene, which encodes the mitochondrial DNA polymerase γ, is important to spermatogenesis. While some researchers have reported that alterations of the CAG repeat affect male reproductive ability, others have found no association between CAG-repeat polymorphisms and male infertility. A comprehensive meta-analysis was therefore needed to determine the association; 13 case–control studies were identified using a keyword search, and study characteristics were extracted for the meta-analysis. Odds ratios (OR) and 95% confidence intervals (CI) were used to describe the results. The results indicated that the CAG-repeat allele was not a risk factor for male infertility (pooled OR = 1.03, 95% CI: 0.79–1.34, P = 0.828). Four different genetic comparisons also gave negative results: the heterozygote comparison (not 10/10 versus 10/10; pooled OR = 0.99, 95% CI: 0.77–1.27, P = 0.948), the homozygote comparison (not 10/not 10 versus 10/10; pooled OR = 1.08, 95% CI: 0.56–2.06, P = 0.816), the recessive comparison (not 10/not 10 versus not 10/10 + 10/10; pooled OR = 1.07, 95% CI: 0.58–1.95, P = 0.829), and the dominant comparison (not 10/not 10 + not 10/10 versus 10/10; pooled OR = 0.97, 95% CI: 0.72–1.29, P = 0.804). Based on current research, this meta-analysis demonstrated no apparent association between the POLG CAG repeat and male infertility; the CAG repeat does not appear to be a susceptibility site for male infertility.

8.
Background/purpose  Graft survival is affected by various factors, such as preoperative state and the ages of the recipient and donor, as well as graft size. The objective of this study was to analyze the risk factors for graft survival. Methods  From September 1997 to July 2005, 24 patients who had undergone living-donor liver transplantation (LDLT) were retrospectively analyzed. Sixteen patients survived, and the eight graft-loss cases were classified into two groups according to the cause of graft loss: graft dysfunction without major post-transplantation complications (graft dysfunction group; n = 3), and graft dysfunction with such complications (secondary graft dysfunction group; n = 5). Various factors were compared between these groups and the survival group. Results  Mean donor age was 31.9 years in the survival group and 49.2 years in the secondary graft dysfunction group (P = 0.024). Graft weight/recipient standard liver volume ratios (G/SLVs) were 36.7% in the survival group and 26.2% in the graft dysfunction group (P = 0.037). The postoperative mean PT% for 1 week was 48.6% in the survival group and 38.1% in the secondary graft dysfunction group (P = 0.05). Conclusions  Our surgical results demonstrated that G/SLV and donor age were independent factors affecting graft survival rates.
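The G/SLV ratio above is graft weight divided by the recipient's estimated standard liver volume. A sketch of the computation follows; the choice of the Urata SLV formula and DuBois body-surface-area formula is an assumption (the abstract does not state which formulas were used), and the example values are invented:

```python
def standard_liver_volume(height_cm, weight_kg):
    """Estimate recipient standard liver volume (ml) with the Urata formula,
    SLV = 706.2 * BSA + 2.4, using DuBois body surface area.
    (Formula choice is an assumption, not stated by the study.)"""
    bsa = 0.007184 * height_cm ** 0.725 * weight_kg ** 0.425  # DuBois BSA in m^2
    return 706.2 * bsa + 2.4

def g_slv_ratio(graft_weight_g, height_cm, weight_kg):
    """Graft weight / standard liver volume ratio in percent (assumes 1 g ~ 1 ml)."""
    return 100.0 * graft_weight_g / standard_liver_volume(height_cm, weight_kg)

# hypothetical recipient: 165 cm, 60 kg, receiving a 450 g graft
ratio = g_slv_ratio(450, 165, 60)
```

For this hypothetical recipient the ratio is roughly 38%, which would sit near the survival-group mean of 36.7% reported above.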

9.
Post‐transplantation lymphoproliferative disorders (PTLD) are associated with poor patient and graft survival. The risk of rejection and subsequent graft loss is increased by the reduction of immunosuppressive therapy, the cornerstone of PTLD treatment. This multicentre, retrospective, nonrandomized cohort study included 104 adults who developed PTLD after renal or simultaneous renal/pancreatic transplantation between 1990 and 2007 and examined the effect of calcineurin inhibitor (CNI) withdrawal on long‐term graft and patient survival. At 10 years post-onset of PTLD, the Kaplan–Meier graft loss rate was 43.9%, and the rate of graft loss or death with a functioning graft was 64.4%. Cox multivariate analysis identified PTLD stage greater than I‐II and CNI withdrawal as risk factors for graft loss; for graft loss and mortality combined, these remained risk factors along with age over 60 years. Type and location of PTLD, year of diagnosis, and chemotherapy regimen were not independent risk factors. Multivariate analysis determined CNI withdrawal to be the most important risk factor for graft loss (HR = 3.07, 95% CI: 1.04–9.09; P = 0.04) and death (HR = 4.00, 95% CI: 1.77–9.04; P < 0.001). While long‐term stable renal function after definitive CNI withdrawal for PTLD has been reported, this study found that withdrawal is associated with reduced graft and patient survival.

10.
Allocation of donors with regard to human leukocyte antigen (HLA) matching is controversial in heart transplantation. This paper is a systematic review and meta‐analysis of the available evidence. PubMed, Embase, and the Cochrane Library were searched systematically for studies that addressed the effects of HLA matching on outcome after heart transplantation. Fifty‐seven studies met the eligibility criteria. Thirty-four studies had graft rejection as an outcome, with 26 reporting a significant reduction in graft rejection with increasing degree of HLA matching. Thirteen of 18 articles that reported on graft failure found that it decreased significantly with increasing HLA match. Two multicenter studies and nine single‐center studies provided sufficient data for summary estimates at 12 months. Pooled comparisons showed that graft survival increased with fewer HLA‐DR mismatches [0–1 vs. 2 mismatches: risk ratio (RR) = 1.09; 95% confidence interval (CI): 1.01–1.19; P = 0.04]. Having fewer HLA‐DR mismatches (0–1 vs. 2) also reduced the incidence of acute rejection [RR = 0.81 (95% CI: 0.66–0.99); P = 0.04]. Despite the considerable heterogeneity between studies, the short observation time, and older data, HLA matching improves graft survival in heart transplantation. Prospective HLA‐DR matching is clinically feasible and should be considered as a major selection criterion.
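Risk ratios with confidence intervals like those above are computed on the log scale with a normal approximation. A minimal sketch, with an invented 2×2 table (not the study's pooled data):

```python
import math

def risk_ratio(events_a, total_a, events_b, total_b):
    """Risk ratio of group A vs group B with a 95% CI
    via the log-RR normal approximation."""
    p_a, p_b = events_a / total_a, events_b / total_b
    rr = p_a / p_b
    # standard error of log(RR)
    se = math.sqrt(1/events_a - 1/total_a + 1/events_b - 1/total_b)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# hypothetical counts: acute rejection in 0-1 vs 2 HLA-DR mismatch groups
rr, lo, hi = risk_ratio(40, 200, 60, 240)
```

With these invented counts the RR is 0.8 but the CI crosses 1, illustrating why single small studies can miss an effect that pooling detects.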

11.
Neighborhood socioeconomic deprivation is associated with adverse outcomes after pediatric liver transplant. We sought to determine if this relationship varies by transplant center. Using SRTR, we included patients <18 years transplanted 2008–2013 (N = 2804). We matched patient ZIP codes to a deprivation index (range [0,1]; higher values indicate increased socioeconomic deprivation). A center-level patient-mix deprivation index was defined by the distribution of patient-level deprivation. Centers (n = 66) were classified as high or low deprivation if their patient-mix deprivation index was above or below the median across centers. Center quality was classified as low or high graft failure if graft survival rates were better or worse than the overall 10-year graft survival rate. Primary outcome was patient-level graft survival. We used random-effect Cox models to evaluate center-level covariates on graft failure. We modeled center quality using stratified Cox models. In multivariate analysis, each 0.1 increase in the patient-mix deprivation index was associated with increased hazard of graft failure (HR 1.32; 95%CI: 1.05, 1.66). When stratified by center quality, patient-mix deprivation was no longer significant (HR 1.07, 95%CI: 0.89, 1.28). Some transplant centers care for predominantly high deprivation children and maintain excellent outcomes. Revealing and replicating these centers’ practice patterns should enable more equitable outcomes.

12.
Current short-term kidney post–transplant survival rates are excellent, but longer-term outcomes have historically been unchanged. This study used data from the national Scientific Registry of Transplant Recipients (SRTR) and evaluated 1-year and 5-year graft survival and half-lives for kidney transplant recipients in the US. All adult (≥18 years) solitary kidney transplants (n = 331,216) from 1995 to 2017 were included in the analysis. Mean age was 49.4 years (SD ±13.7), 60% were male, and 25% were Black. The overall (deceased and living donor) adjusted hazard of graft failure steadily decreased from 0.89 (95%CI: 0.88, 0.91) in era 2000–2004 to 0.46 (95%CI: 0.45, 0.47) for era 2014–2017 (1995–1999 as reference). Improvements in adjusted hazards of graft failure were more favorable for Blacks, diabetics, and older recipients. Median survival for deceased donor transplants increased from 8.2 years in era 1995–1999 to an estimated 11.7 years in the most recent era. Living kidney donor transplant median survival increased from 12.1 years in 1995–1999 to an estimated 19.2 years for transplants in 2014–2017. In conclusion, these data show continuous improvement in long-term outcomes, with more notable improvement among higher-risk subgroups, suggesting a narrowing in the gap for those disadvantaged after transplantation.
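Median graft survival figures like those above come from a Kaplan–Meier curve: the median is the first time the estimated survival probability drops to 0.5 or below. A from-scratch sketch of the estimator, on a six-patient toy cohort rather than the SRTR data:

```python
def kaplan_meier(times):
    """Kaplan-Meier estimator for right-censored survival data.

    times: list of (t, event) pairs, event=1 for graft failure,
    event=0 for censoring (e.g. still functioning at last follow-up).
    Returns the curve as [(t, S(t))] at each failure time, plus the
    median survival time (first t with S(t) <= 0.5, or None if never reached).
    """
    at_risk = len(times)
    s, curve, median = 1.0, [], None
    for t, event in sorted(times):
        if event:                        # a failure steps the curve down
            s *= (at_risk - 1) / at_risk
            curve.append((t, s))
            if median is None and s <= 0.5:
                median = t
        at_risk -= 1                     # failures and censorings both leave the risk set
    return curve, median

# toy cohort: graft failures at years 2, 4, 5, 7; censored at years 3 and 6
curve, median = kaplan_meier([(2, 1), (3, 0), (4, 1), (5, 1), (6, 0), (7, 1)])
```

Censored patients shrink the risk set without stepping the curve down, which is how registries estimate a 19-year median even when many grafts are still functioning at analysis time.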

13.
Pressure injuries (PIs) are among the most serious and costliest medical problems, with severe implications for patients. Cardiovascular surgery patients are at higher risk of developing surgery-related PIs. This study was therefore conducted to investigate the prevalence of, and factors associated with, PIs in patients undergoing open heart surgery. We identified articles through electronic databases (Web of Science, Scopus, PubMed, ProQuest) and the Persian databases SID, Magiran, and Irandoc, without restriction on language or publication period (from inception through June 2022). Seventeen studies that fulfilled the eligibility criteria were included in the final systematic review and meta-analysis. Data analyses were conducted using STATA version 14. The pooled prevalence of PI in patients undergoing open heart surgery was 24.06% (95% CI: 17.85–30.27). High heterogeneity was observed across the included studies (I² = 96.0%, P < 0.001). The prevalence was 25.19% (95% CI: 13.45–36.93) in men and 33.36% (95% CI: 19.99–46.74) in women. There were statistically significant associations between PI and female sex (pooled estimate: 1.551, 95% CI: 1.199–2.006, z = 3.345, P = 0.001), diabetes (pooled estimate: 1.985, 95% CI: 1.383–2.849, z = 3.719, P < 0.001), advanced age (SMD: 0.33; 95% CI: 0.09–0.57), duration of surgery (SMD: 0.47; 95% CI: 0.19–0.75), and preoperative serum albumin level (SMD: 0.56; 95% CI: 0.14–0.98). The relatively high PI prevalence among patients undergoing open heart surgery suggests that typical PI prevention methods are insufficient for this population. Targeted prevention measures must be developed and implemented.
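The I² statistic quoted above quantifies between-study heterogeneity: it is the share of the variability across studies attributable to heterogeneity rather than chance, derived from Cochran's Q. A minimal sketch, with invented prevalence estimates rather than the study's data:

```python
def i_squared(estimates, variances):
    """Higgins' I^2 from study effect estimates and their variances,
    using fixed-effect inverse-variance weights for Cochran's Q."""
    w = [1 / v for v in variances]                                  # inverse-variance weights
    pooled = sum(wi * e for wi, e in zip(w, estimates)) / sum(w)    # fixed-effect mean
    q = sum(wi * (e - pooled) ** 2 for wi, e in zip(w, estimates))  # Cochran's Q
    df = len(estimates) - 1
    return max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

# three hypothetical prevalence estimates with equal variances
i2 = i_squared([0.10, 0.25, 0.40], [0.0004, 0.0004, 0.0004])
```

Widely scattered, precisely estimated prevalences drive I² toward 100%, which is why pooled prevalences like the 24.06% above are usually reported with a random-effects model.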

14.
Despite continuously growing knowledge of the impact of factors such as donor age, body mass index, and cold ischemia time on kidney graft function, few data are available regarding anastomosis time (AT) and its impact on long‐term results. We investigated whether surgical AT correlates with patient and graft survival after kidney transplantation in a retrospective analysis of 1245 consecutive deceased donor kidney transplantations performed between 01/2000 and 12/2010 at Innsbruck Medical University. Kaplan–Meier and log‐rank analyses were carried out for 1‐ and 5‐year patient and graft survival. AT was defined as the time from the start of anastomosis until reperfusion. Median AT was 30 min. Five‐year survival of allografts with an AT >30 min was 76.6%, compared with 80.6% in the group with AT <30 min (P = 0.027). Patient survival in the group with longer AT was similarly inferior: 85.7% after 5 years compared with 89.6% (P < 0.0001). Cox regression analysis revealed AT as an independent significant factor for patient survival (HR 1.021 per minute; 95% CI 1.006–1.037; P = 0.006). As longer AT closely correlates with inferior long‐term patient survival, it has to be considered a major risk factor for inferior long‐term results after deceased donor kidney transplantation.

15.
Donor cardiac arrest and cardiopulmonary resuscitation (CACPR) has been viewed critically because of concerns over hypoperfusion and mechanical trauma to the donor organs. We retrospectively analyzed 371 first simultaneous pancreas–kidney transplants performed at the Medical University of Innsbruck between 1997 and 2017, comparing short- and long-term outcomes in recipients of organs from donors with and without a history of CACPR. A total of 63 recipients received a pancreas and kidney graft from a CACPR donor. At 1 and 5 years, patient survival was similar: 98.3% and 96.5% in the CACPR group and 97.0% and 90.2% in the non-CACPR group (log-rank P = 0.652). Death-censored pancreas graft survival was superior in the CACPR group, at 98.3% and 91.4%, compared with 86.3% and 77.4% in the non-CACPR group (log-rank P = 0.028); this remained statistically significant even after adjustment (aHR 0.49 [95% CI 0.24–0.98], P = 0.044). Similar relative risks were observed in both groups for postoperative complications (Clavien–Dindo > 3a), pancreatitis, abscess, immunologic complications, delayed pancreas graft function, and relative length of stay. Donors with a history of CACPR are, in current practice, safe for transplantation. Stringent donor selection and short CPR durations may allow for outcomes surpassing those of donors without CACPR.

16.
The aim of this retrospective single-center study was to investigate the short- and long-term impact of neutropenia occurring within the first year after kidney transplantation, with a special emphasis on the different neutropenia grades. In this unselected cohort, 225/721 patients (31%) developed 357 neutropenic episodes within the first year post-transplant. Based on the nadir neutrophil count, patients were grouped as neutropenia grade 2 (<1.5–1.0 × 10⁹/l; n = 105), grade 3 (<1.0–0.5 × 10⁹/l; n = 65), and grade 4 (<0.5 × 10⁹/l; n = 55). Most neutropenia episodes were presumably drug-related (71%) and managed by reduction/discontinuation of potentially responsible drugs (mycophenolic acid [MPA] 51%, valganciclovir 25%, trimethoprim/sulfamethoxazole 19%). Steroids were added/increased as replacement for reduced/discontinued MPA. Granulocyte colony-stimulating factor was used in only 2/357 neutropenia episodes (0.6%). One-year incidence of (sub)clinical rejection, one-year mortality, and long-term patient and graft survival were not different among patients without neutropenia and those with neutropenia grade 2/3/4. However, the incidence of infections was about 3 times higher during neutropenia grade 3 and 4, but not increased during grade 2. In conclusion, neutropenia within the first year after kidney transplantation represents no increased risk for rejection and has no negative impact on long-term patient and graft survival. Adding/increasing steroids as replacement for reduced/discontinued MPA might supplement the management of neutropenia.
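The grade cut-offs above map directly to code. A small sketch follows; returning grade 0 for counts of 1.5 × 10⁹/l and above is my assumption, since the study only analysed grades 2–4:

```python
def neutropenia_grade(anc):
    """Map an absolute neutrophil count (in 10^9/l) to the neutropenia
    grade used in the study: grade 2 (<1.5-1.0), grade 3 (<1.0-0.5),
    grade 4 (<0.5). Counts >= 1.5 return 0 (no graded neutropenia here;
    this fallback is an assumption, not from the study)."""
    if anc < 0.5:
        return 4
    if anc < 1.0:
        return 3
    if anc < 1.5:
        return 2
    return 0

def nadir_grade(counts):
    """Grade an episode by its nadir (lowest) count, as the study did."""
    return neutropenia_grade(min(counts))
```

Grading by the nadir means a single transient dip to 0.9 × 10⁹/l classifies the whole episode as grade 3, which matters because infections rose only at grades 3 and 4.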

17.
Purpose: Mammographic screening programmes have increased detection rates of non-palpable breast cancers. In these cases, wire-guided localization (WGL) is the most common approach used to guide breast conserving surgery (BCS). Several RCTs have compared WGL to a range of novel localization techniques. We aimed to perform a network meta-analysis (NMA) of randomized controlled trials (RCTs) comparing methods of non-palpable breast cancer localization. Methods: A NMA was performed according to PRISMA-NMA guidelines. Analysis was performed using R packages and Shiny. Results: 24 RCTs assessing 9 tumour localization methods in 4236 breasts were included. Margin positivity and reoperation rates were 16.9% (714/4236) and 14.3% (409/2870), respectively. Cryo-assisted localization had the highest margin positivity (28.2%, 58/206) and reoperation (18.9%, 39/206) rates. Compared to WGL (n = 2045 from 24 RCTs), only ultrasound-guided localization (USGL) (n = 316 from 3 RCTs) significantly lowered margin positivity (odds ratio (OR): 0.192, 95% confidence interval (CI): 0.079–0.450) and reoperation rates (OR: 0.182, 95% CI: 0.069–0.434). Anchor-guided localization (AGL) (n = 52, 1 RCT) significantly lowered margin positivity (OR: 0.229, 95% CI: 0.050–0.938), and magnetic-marker localization improved patient satisfaction (OR: 0.021, 95% CI: 0.001–0.548). There was no difference in operation duration, overall complications, haematoma, seroma, surgical site infection rates, or specimen size/volume/weight between methods. Conclusion: USGL and AGL are non-inferior to WGL for the localization of non-palpable breast cancers. The reported data suggest that these techniques confer reduced margin positivity rates and a reduced requirement for re-operation. However, caution is warranted when interpreting results from RCTs with small sample sizes, and further validation is required in larger prospective, randomized studies.
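When two techniques have each been trialled only against WGL, a network meta-analysis can still compare them head-to-head via the common comparator. The simplest form is the Bucher adjusted indirect comparison, sketched below; the example plugs in the USGL-vs-WGL and AGL-vs-WGL margin-positivity ORs quoted above, but the resulting indirect estimate is illustrative, not a result reported by the study:

```python
import math

def bucher_indirect(or_ab, ci_ab, or_cb, ci_cb):
    """Bucher adjusted indirect comparison of A vs C via common comparator B.

    or_ab, ci_ab: odds ratio and 95% CI (lo, hi) for A vs B;
    or_cb, ci_cb: likewise for C vs B.
    Returns (OR_ac, ci_lo, ci_hi)."""
    def log_se(lo, hi):                          # back out SE(log OR) from a 95% CI
        return (math.log(hi) - math.log(lo)) / (2 * 1.96)
    d = math.log(or_ab) - math.log(or_cb)        # indirect log odds ratio
    se = math.sqrt(log_se(*ci_ab) ** 2 + log_se(*ci_cb) ** 2)
    return math.exp(d), math.exp(d - 1.96 * se), math.exp(d + 1.96 * se)

# USGL vs AGL for margin positivity, each measured against WGL
or_ac, lo, hi = bucher_indirect(0.192, (0.079, 0.450), 0.229, (0.050, 0.938))
```

The variances add, so the indirect CI is far wider than either direct one; with a single 52-patient AGL trial in the network, the USGL-vs-AGL comparison is inconclusive, consistent with the non-inferiority framing of the conclusion.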

18.
Lung transplant recipients are at high risk for herpes zoster, and preventive measures are a significant unmet need. We investigated the safety and immunogenicity of two doses of a recombinant zoster vaccine (RZV) in lung transplant recipients (≥50 years). We enrolled 50 patients, of whom 49 received at least one vaccine dose. Anti-glycoprotein E (gE) antibody levels (n = 43) increased significantly compared to baseline (median optical density [OD] 1.96; interquartile range [IQR]: 1.17–2.89) after the first (median OD 3.41, IQR 2.54–3.81, p < .0001) and second vaccine dose (median OD 3.63, IQR 3.39–3.86, p < .0001). gE-specific polyfunctional CD4+ T cell frequencies (n = 38) also increased from baseline (median 85 per 10⁶ CD4+ T cells; IQR: 46–180) after the first (median 128 per 10⁶ CD4+ T cells; IQR: 82–353; p = .023) and second dose (median 361 per 10⁶ CD4+ T cells; IQR: 146–848; p < .0001). Tenderness (83.0%; 95% CI: 69.2–92.4%) and redness (31.9%; 95% CI: 19.1–47.1%) at the injection site were common. One rejection episode within 3 weeks of vaccination was observed. This is the first study demonstrating that RZV is safe and elicits significant humoral and cell-mediated immunity in lung transplant recipients. RZV is a new option for the prevention of shingles in this population.

19.
Rift Valley fever (RVF) is a zoonotic viral disease of domestic ruminants in Africa and the Arabian Peninsula caused by a mosquito‐borne Phlebovirus. Outbreaks in livestock and humans occur after heavy rains favour breeding of vectors, and the virus is thought to survive dry seasons in the eggs of floodwater‐breeding aedine mosquitoes. We recently found high seroconversion rates to RVF virus (RVFV) in cattle and goats, in the absence of outbreaks, in far northern KwaZulu‐Natal (KZN), South Africa. Here, we report the prevalence of, and factors associated with, neutralizing antibodies to RVFV in 326 sera collected opportunistically from nyala (Tragelaphus angasii) and impala (Aepyceros melampus) culled during 2016–2018 in two nature reserves in the same area. The overall seroprevalence of RVFV, determined using the serum neutralization test, was 35.0% (114/326; 95% CI: 29.8%–40.4%) and tended to be higher in Ndumo Game Reserve (11/20; 55.0%; 95% CI: 31.5%–76.9%) than in Tembe Elephant Park (103/306; 33.6%; 95% CI: 28.4%–39.3%) (p = .087). The presence of antibodies in juveniles (6/21; 28.6%; 95% CI: 11.3%–52.2%) and sub‐adults (13/65; 20.0%; 95% CI: 11.1%–37.8%) confirmed that infections had occurred at least until 2016, well after the 2008–2011 RVF outbreaks in South Africa. Odds of seropositivity were higher in adults than in sub‐adults (OR = 3.98; 95% CI: 1.83–8.67; p = .001), in males than in females (OR = 2.66; 95% CI: 1.51–4.68; p = .001), and in animals collected ≤2 km from a swamp or floodplain compared with those collected further away (OR = 3.30; 95% CI: 1.70–6.38; p < .001). Under similar ecological conditions, domestic and wild ruminants may play a similar role in the maintenance of RVFV circulation, and either or both may serve as the mammalian host in a vector–host reservoir system.
The study confirms the recent circulation of RVFV in the tropical coastal plain of northern KZN, providing a basis for investigation of factors affecting virus circulation and of the role of wildlife in RVF epidemiology.
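Seroprevalence confidence intervals like those above are binomial proportion intervals. The study's exact method is not stated, so the sketch below uses the Wilson score interval, which closely reproduces the reported overall bounds (29.8%–40.4%) but may differ slightly from an exact (Clopper–Pearson) interval:

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion,
    given k positives out of n trials."""
    p = k / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return centre - half, centre + half

# overall seroprevalence reported above: 114 positives of 326 sera
lo, hi = wilson_ci(114, 326)
```

Unlike the naive Wald interval, the Wilson interval stays inside [0, 1] and behaves well for small samples such as the 11/20 Ndumo subgroup.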

20.
Pediatric en bloc kidney transplants (EBKs) from small deceased pediatric donors are associated with increased early graft loss and morbidity. Yet, urologic complications after EBK and their potential impact on graft survival have not been systematically studied. We retrospectively studied urological complications requiring intervention in 225 EBKs performed at our center from January 2005 to September 2017, from donors ≤20 kg into recipients ≥18 years. The overall ureteral complication incidence after EBK was 9.8% (n = 22) (12% vs 2% for EBK donors ≤10 vs >10 kg, respectively [P = .031]). The most common post‐EBK urologic complication was stricture (55%), followed by urine leak (41%). In all, 95% of urologic complications occurred early, within 5 months posttransplant (median, 138 days). Urologic complications could be successfully managed nonoperatively in 50% of all cases and had no impact on graft or patient survival. In summary, urologic complications after EBK were common, associated with lower donor weights, occurred early posttransplant, and were often amenable to nonoperative treatment, without adversely affecting survival. We conclude that the higher urologic complication rate after EBK (1) should not prevent increased utilization of small pediatric donor en bloc kidneys for properly selected recipients, and (2) warrants specific discussion with EBK recipients during the preoperative consent process.
