Similar Documents
A total of 20 similar documents were retrieved.
1.
Objectives: Belatacept may provide benefit in delayed graft function, but its association with infectious complications is understudied. We aimed to assess the incidence of CMV and BK viremia in patients treated with sirolimus or belatacept as part of a three-drug immunosuppression regimen after kidney transplantation.
Materials and methods: Kidney transplant recipients from 01/01/2015 to 10/01/2021 were retrospectively reviewed. Maintenance immunosuppression was either tacrolimus, mycophenolate, and sirolimus (B0) or tacrolimus, mycophenolate, and belatacept (5.0 mg/kg monthly) (B1). Primary outcomes of interest were BK and CMV viremia, which were followed until the end of the study period. Secondary outcomes included graft function (serum creatinine, eGFR) and acute rejection through 12 months.
Results: Belatacept was initiated in patients with a higher mean kidney donor profile index (B0: 0.36 vs. B1: 0.44, p = .02) and more delayed graft function (B0: 6.1% vs. B1: 26.1%, p < .001). Belatacept therapy was associated with more "severe" CMV viremia >25,000 copies/mL (B0: 1.2% vs. B1: 5.9%, p = .016) and CMV disease (B0: 0.41% vs. B1: 4.2%, p = .015). However, there was no difference in the overall incidence of CMV viremia >200 IU/mL (B0: 9.4% vs. B1: 13.5%, p = .28). There was no difference in the incidence of BK viremia >200 IU/mL (B0: 29.7% vs. B1: 31.1%, p = .78) or BK-associated nephropathy (B0: 2.4% vs. B1: 1.7%, p = .58), but belatacept was associated with "severe" BK viremia, defined as >10,000 IU/mL (B0: 13.0% vs. B1: 21.8%, p = .03). The mean serum creatinine was significantly higher with belatacept therapy at 1-year follow-up (B0: 1.24 mg/dL vs. B1: 1.43 mg/dL, p = .003). Biopsy-proven acute rejection (B0: 1.2% vs. B1: 2.6%, p = .35) and graft loss (B0: 1.2% vs. B1: 0.84%, p = .81) were comparable at 12 months.
Conclusions: Belatacept therapy was associated with an increased risk of CMV disease and "severe" CMV and BK viremia. However, this regimen did not increase the overall incidence of infection, and rates of acute rejection and graft loss were comparable at 12-month follow-up.
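The group comparisons above (for example, "severe" CMV viremia in B0 1.2% vs. B1 5.9%, p = .016) are two-group proportion comparisons. A minimal sketch of how such a comparison could be run is shown below; the counts are invented placeholders, not the study data, and this is not the authors' analysis code.

```python
# Illustrative only: chi-square and Fisher's exact comparison of an event rate
# between two immunosuppression groups, with made-up counts.
import numpy as np
from scipy.stats import chi2_contingency, fisher_exact

# rows: regimen (B0 = sirolimus-based, B1 = belatacept-based); columns: event, no event
table = np.array([[3, 242],    # hypothetical B0: 3 events out of 245 recipients
                  [7, 112]])   # hypothetical B1: 7 events out of 119 recipients

chi2, p_chi2, dof, expected = chi2_contingency(table)
odds_ratio, p_fisher = fisher_exact(table)   # exact test, useful when counts are sparse

print(f"chi-square p = {p_chi2:.3f}; Fisher OR = {odds_ratio:.2f}, p = {p_fisher:.3f}")
```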

2.
Transplantation Proceedings 2023;55(7):1568-1574
Background: The incidence of delayed graft function (DGF) among kidney transplant recipients (KTRs) in the United States continues to increase. The effect of immediate-release tacrolimus (tacrolimus) compared with extended-release tacrolimus (Envarsus) among recipients with DGF is unknown.
Methods: This was a single-center, open-label, randomized controlled trial among KTRs with DGF (ClinicalTrials.gov, NCT03864926). KTRs were randomized 1:1 either to continue tacrolimus or to switch to Envarsus. Duration of DGF (study period), number of dialysis treatments, and need for adjustment of calcineurin inhibitor (CNI) doses during the study period were the outcomes of interest.
Results: A total of 100 KTRs were enrolled, 50 in the Envarsus arm and 50 in the tacrolimus arm; of those, 49 in the Envarsus arm and 48 in the tacrolimus arm were included for analysis. There were no differences in baseline characteristics (all P > .5), except that donors in the Envarsus arm had a higher body mass index than those in the tacrolimus arm (mean 32.9 ± 11.3 vs 29.4 ± 7.6 kg/m2, P = .007). The median duration of DGF (5 days vs 4 days, P = .71) and the number of dialysis treatments (2 vs 2, P = .83) were similar between the groups. However, the median number of CNI dose adjustments during the study period was significantly lower in the Envarsus group (3 vs 4, P = .002).
Conclusions: Envarsus patients had less fluctuation in CNI levels, requiring fewer CNI dose adjustments. However, there were no differences in DGF recovery duration or number of dialysis treatments.

3.
Aim: To examine and characterize post-transplant eosinophilic gastrointestinal disorders (PTEGID) and post-transplant lymphoproliferative disorder (PTLD) in pediatric liver transplant recipients.
Methods: This is a single-center retrospective study of all liver transplant recipients aged 0-18 years from 1999 to 2019 who received tacrolimus as their primary immunosuppressant. Demographic and clinical/laboratory data, including PTEGID, PTLD, liver transplant types, Epstein-Barr virus status, and blood eosinophil count, were reviewed. Analysis was performed with logistic regression and the Mann-Whitney U test.
Results: Ninety-eight pediatric liver transplant recipients were included, with a median age at transplantation of 3.3 years (IQR: 1.1-9.3). The major indication for transplantation was biliary atresia, 51 (52%) cases. Eight (8%) children had PTLD and 14 (14%) had PTEGID. Receiving a liver transplant at age ≤1 year was associated with developing PTEGID (OR = 11.9, 95% CI = 3.5-45.6, p < 0.001). Additionally, an eosinophil count of ≥500/μL was associated with having PTLD (OR = 10.7, 95% CI = 1.8-206.0, p = 0.030) as well as having at least one liver rejection (OR = 2.8, 95% CI = 1.2-7.0, p = 0.024). The frequency of food-induced anaphylaxis increased significantly post-transplantation (p = 0.023).
Conclusions: PTEGID and PTLD are common in this cohort and are associated with certain risk factors that help screen children to improve recipient survival. Further studies are needed to evaluate the clinical benefits of these findings.
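The abstract names logistic regression (for the odds ratios with 95% CIs) and the Mann-Whitney U test. A hedged sketch of that kind of analysis follows; the data frame, column names, and synthetic data are hypothetical, not the authors' cohort.

```python
# Illustrative sketch: univariable logistic regression OR with 95% CI, plus a
# Mann-Whitney U comparison, on synthetic placeholder data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "ptegid": rng.integers(0, 2, 98),          # hypothetical outcome: PTEGID (0/1)
    "tx_age_le_1yr": rng.integers(0, 2, 98),   # hypothetical exposure: transplant at <=1 year
    "eos_count": rng.poisson(300, 98),         # hypothetical blood eosinophil count (/uL)
})

model = smf.logit("ptegid ~ tx_age_le_1yr", data=df).fit(disp=0)
or_point = np.exp(model.params["tx_age_le_1yr"])          # odds ratio
or_ci = np.exp(model.conf_int().loc["tx_age_le_1yr"])     # 95% CI on the OR scale
print(f"OR = {or_point:.2f}, 95% CI = ({or_ci[0]:.2f}, {or_ci[1]:.2f})")

# nonparametric comparison of eosinophil counts between outcome groups
u_stat, p_mwu = mannwhitneyu(df.loc[df.ptegid == 1, "eos_count"],
                             df.loc[df.ptegid == 0, "eos_count"])
print(f"Mann-Whitney U p = {p_mwu:.3f}")
```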

4.
In the phase II IM103-100 study, kidney transplant recipients were first randomized to belatacept more-intensive-based (n = 74), belatacept less-intensive-based (n = 71), or cyclosporine-based (n = 73) immunosuppression. At 3-6 months posttransplant, belatacept-treated patients were re-randomized to receive belatacept every 4 weeks (4-weekly, n = 62) or every 8 weeks (8-weekly, n = 60). Patients initially randomized to cyclosporine continued to receive cyclosporine-based immunosuppression. Cumulative rates of biopsy-proven acute rejection (BPAR) from first randomization to year 10 were 22.8%, 37.0%, and 25.8% for belatacept more-intensive, belatacept less-intensive, and cyclosporine, respectively (belatacept more-intensive vs cyclosporine: hazard ratio [HR] = 0.95; 95% confidence interval [CI] 0.47-1.92; P = .89; belatacept less-intensive vs cyclosporine: HR = 1.61; 95% CI 0.85-3.05; P = .15). Cumulative BPAR rates from second randomization to year 10 for belatacept 4-weekly, belatacept 8-weekly, and cyclosporine were 11.1%, 21.9%, and 13.9%, respectively (belatacept 4-weekly vs cyclosporine: HR = 1.06, 95% CI 0.35-3.17, P = .92; belatacept 8-weekly vs cyclosporine: HR = 2.00, 95% CI 0.75-5.35, P = .17). Renal function trends were estimated using a repeated-measures model. Estimated mean GFR values at year 10 for belatacept 4-weekly, belatacept 8-weekly, and cyclosporine were 67.0, 68.7, and 42.7 mL/min per 1.73 m2, respectively (P < .001 for overall treatment effect). Although not statistically significant, rates of BPAR were 2-fold higher in patients administered belatacept every 8 weeks vs every 4 weeks.
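Cumulative BPAR rates to year 10, as reported above, are typically read off a time-to-event (Kaplan-Meier) estimate of time to first rejection. The sketch below is an assumed illustration using the lifelines package on synthetic data, not the trial's analysis.

```python
# Illustrative only: cumulative incidence of biopsy-proven acute rejection (BPAR)
# at 10 years from a Kaplan-Meier fit on made-up follow-up data.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(1)
n = 74  # e.g. one treatment arm
df = pd.DataFrame({
    "years_to_bpar_or_censor": rng.uniform(0.1, 10.0, n),
    "bpar_observed": rng.integers(0, 2, n),   # 1 = BPAR occurred, 0 = censored
})

kmf = KaplanMeierFitter()
kmf.fit(df["years_to_bpar_or_censor"], event_observed=df["bpar_observed"])
cumulative_bpar_10y = 1.0 - kmf.predict(10.0)   # 1 - survival = cumulative incidence
print(f"Cumulative BPAR at 10 years: {cumulative_bpar_10y:.1%}")
```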

5.
We performed a prospective, 12-month, single-center, nonrandomized, open-label pilot study to investigate the use of belatacept therapy combined with alemtuzumab induction in renal allografts with preexisting pathology, as these kidneys may be more susceptible to additional toxicity when exposed to calcineurin inhibitors posttransplant. Nineteen belatacept recipients were matched retrospectively to a cohort of tacrolimus recipients on the basis of preimplantation pathology. The estimated glomerular filtration rate was not significantly different between belatacept and tacrolimus recipients at either 3 or 12 months posttransplant (59 vs 45, P = 0.1, and 56 vs 48 mL/min/1.73 m2, P = 0.3). Biopsy-proven acute rejection rates at 12 months were 26% in belatacept recipients and 16% in tacrolimus recipients (P = 0.7). Graft survival at 1 year was 89% in both groups. Alemtuzumab induction combined with either calcineurin inhibitor or costimulatory blockade therapy resulted in similar, acceptable one-year outcomes in kidneys with preexisting pathologic changes. Longer-term follow-up may be necessary to identify preferential strategies to improve outcomes of kidneys at a higher risk for poor function (ClinicalTrials.gov: NCT01496417).

6.
Background: Belatacept has been demonstrated to be an effective alternative immunosuppressant in kidney transplant recipients. This study focuses on outcomes of early and late conversion to belatacept-based immunosuppression after kidney transplant.
Materials and methods: This retrospective analysis of a prospectively collected database included all adult kidney transplant patients at SUNY Upstate Medical Hospital from 1 January 2014 to 30 December 2022. Early conversion was defined as conversion at <6 months after kidney transplantation, and late conversion as conversion at >6 months after kidney transplantation.
Results: Of the 61 patients included in this study, 33 (54%) were in the early conversion group and 28 (46%) in the late conversion group. The mean eGFR in the early conversion group was 26.73 ± 16.26 ml/min/1.73 m2 before conversion to belatacept, which improved to 45.3 ± 21.01 ml/min/1.73 m2 at one year post-conversion (p = 0.0006). In contrast, the eGFR change in the late conversion group was not significant: 46.30 ± 15.65 ml/min/1.73 m2 before conversion and 44.76 ± 22.91 ml/min/1.73 m2 after one year of follow-up (p = 0.72). All four biopsy-proven allograft rejections in the early conversion group were acute T-cell-mediated rejections (ATMR). In the late conversion group, of three biopsy-proven rejections, one was chronic antibody-mediated rejection (CAMR), one was ATMR, and one was mixed ATMR/CAMR. All four patients with ATMR received mycophenolic acid (MPA) as part of their immunosuppressive regimen, and none received tacrolimus. The one-year post-conversion allograft survival rate was 100% in both groups. However, the one-year post-conversion patient survival rate was 90.9% in the early conversion group and 100% in the late conversion group (P = 0.11).
Conclusions: Early post-transplant conversion to belatacept improved eGFR more meaningfully than late conversion. Patients who receive belatacept and MPA rather than tacrolimus may have increased rates of T-cell-mediated rejection.
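The abstract does not state which test produced the within-group p-values (e.g., p = 0.0006 for the pre- versus post-conversion eGFR change); a paired t-test on within-patient values is one common choice and is sketched below with made-up numbers, purely for illustration.

```python
# Illustrative only: paired comparison of eGFR before conversion vs. one year after,
# using placeholder values rather than study data.
import numpy as np
from scipy.stats import ttest_rel

egfr_before = np.array([22.0, 31.5, 18.4, 40.2, 27.9, 15.6])   # ml/min/1.73 m2
egfr_1yr    = np.array([41.3, 52.0, 35.8, 58.4, 44.1, 30.2])

t_stat, p_value = ttest_rel(egfr_1yr, egfr_before)
mean_change = np.mean(egfr_1yr - egfr_before)
print(f"mean eGFR change = {mean_change:.1f} ml/min/1.73 m2, paired t-test p = {p_value:.4f}")
```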

7.
Transplantation Proceedings 2021;53(6):1998-2003
Background: Although effective for curtailing alloimmune responses, calcineurin inhibitors (CNIs) have an adverse-effect profile that includes nephrotoxicity. In lung transplant (LTx) recipients, the optimal serum levels of the CNI tacrolimus necessary to control alloimmune responses and minimize nephrotoxicity are unknown.
Methods: This retrospective, single-center study reviewed tacrolimus whole blood trough levels (BTLs), grades of acute cellular rejection (ACR), acute rejection scores, and creatinine clearance (CrCl) obtained in LTx recipients within the first year after their transplant procedure. Comparisons were made between the first 90 days post LTx (when tacrolimus BTLs were maintained >10 µg/L) and the remainder of the post-LTx year (when BTLs were <10 µg/L).
Results: Although tacrolimus mean BTLs were higher during the first 90 days post LTx than during the remainder of the first post-LTx year (10.4 ± 0.3 µg/L vs 9.5 ± 0.3 µg/L, P < .0001), there was no association with lower grades of ACR (P = .24). The intensity of ACR (as determined by acute rejection scores) did not correlate with tacrolimus mean BTLs at any time during the first posttransplant year (P = .79). During the first 90 days post LTx there was a significant decline in CrCl and a correlation between increasing tacrolimus mean BTLs and declining CrCl (r = −0.26, P = .03); this correlation was not observed during the remainder of the year (r = −0.09, P = .52).
Conclusions: In LTx recipients, maintaining BTLs of the CNI tacrolimus >10 µg/L did not result in superior control of acute rejection responses but was associated with declining renal function.

8.
Objective: To compare the safety and short-term outcomes of robotic-assisted and laparoscopic left hemi-hepatectomy in a single academic medical center.
Methods: A cohort of 52 patients who underwent robotic-assisted or laparoscopic left hemi-hepatectomy between April 2015 and January 2020 in the Department of Pancreatobiliary Surgery, the First Affiliated Hospital of Sun Yat-Sen University, was recruited into the study. Their clinicopathological features and short-term outcomes were analyzed retrospectively.
Results: There were 25 robotic-assisted and 27 laparoscopic cases, with a median age of 55 years (range 34-77 years). There was one conversion to open surgery in the laparoscopic group. There were no significant differences in clinicopathological features between the two groups, except that the robotic group had a higher body mass index (23.9 vs. 22.0 kg/m2, p = 0.047). The robotic-assisted and laparoscopic groups had similar operative time (300 vs. 310 min, p = 0.515), length of hospital stay (8 vs. 8 days, p = 0.981), and complication rates (4.0% vs. 14.8%, p = 0.395), but the robotic group had less blood loss (100 vs. 200 ml, p < 0.001) and a lower incidence of blood transfusion (0% vs. 22.2%, p = 0.023) than the laparoscopic group. R0 resection was achieved for all patients with malignancies. There was no perioperative mortality in either group. The cost was higher in the robotic group than in the laparoscopic group (105,870 vs. 64,191 RMB yuan, p = 0.02).
Conclusion: The robotic-assisted and laparoscopic approaches had similar safety and short-term outcomes in left hemi-hepatectomy, and the former can reduce operative blood loss and blood transfusion. However, costs were higher in the robotic group.

9.
Journal of Pediatric Surgery 2021;56(12):2299-2304
Background/Purpose: To examine the influence of parenteral nutrition (PN) on clinical outcomes and cost in children with complicated appendicitis.
Methods: Retrospective study of 1,073 children with complicated appendicitis from 29 hospitals participating in the NSQIP-Pediatric Appendectomy Pilot Collaborative (1/2013-6/2015). Mixed-effects regression was used to compare 30-day postoperative outcomes between high and low PN-utilizing hospitals after propensity matching on demographic characteristics, BMI, and postoperative LOS as a surrogate for disease severity.
Results: Overall PN utilization was 13.6%, ranging from 0-10.3% at low utilization hospitals (n = 452) and 10.3-32.4% at high utilization hospitals (n = 621). Outcomes were similar between low and high utilization hospitals for rates of overall complications (12.3% vs. 10.5%, OR: 0.80 [0.46, 1.37], p = 0.41), SSIs (11.3% vs. 8.8%, OR: 0.72 [0.40, 1.32], p = 0.29), and revisits (14.7% vs. 15.9%, OR: 1.10 [0.75, 1.61], p = 0.63). Adjusted mean 30-day cumulative hospital cost was 22.9% higher for patients receiving PN ($25,164 vs. $20,478, p < 0.01) after controlling for postoperative LOS.
Conclusion: Following adjustment for patient characteristics and postoperative length of stay, higher rates of PN utilization in children with complicated appendicitis were associated with higher cost but not with lower rates of overall complications, surgical site infections, or revisits.
Level of Evidence: Level III (treatment study, retrospective comparative study).
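The methods above describe propensity matching before the outcome comparison. The sketch below shows one simple way such matching can be done (propensity score from a logistic model, then greedy 1:1 nearest-neighbor matching); it is an assumed illustration on synthetic data with hypothetical column names, not the collaborative's actual pipeline.

```python
# Illustrative propensity-score matching sketch on made-up data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 400
df = pd.DataFrame({
    "high_pn_hospital": rng.integers(0, 2, n),   # hypothetical "treatment": high-PN-utilization hospital
    "age": rng.uniform(3, 17, n),
    "bmi": rng.normal(19, 4, n),
    "postop_los": rng.integers(2, 10, n),        # surrogate for disease severity
})

ps_model = smf.logit("high_pn_hospital ~ age + bmi + postop_los", data=df).fit(disp=0)
df["ps"] = ps_model.predict(df)

treated = df[df["high_pn_hospital"] == 1]
available = df[df["high_pn_hospital"] == 0].copy()

# greedy 1:1 nearest-neighbor matching on the propensity score, without replacement
matches = []
for idx, row in treated.iterrows():
    if available.empty:
        break
    j = (available["ps"] - row["ps"]).abs().idxmin()
    matches.append((idx, j))
    available = available.drop(j)

print(f"{len(matches)} matched pairs formed")
```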

10.
Introduction: Gastro-esophageal reflux disease (GERD) is the most frequent long-term morbidity among survivors of congenital diaphragmatic hernia (CDH). Performing a preventive fundoplication during CDH repair remains controversial. This study aimed to: (1) analyze the variability in practices regarding preventive fundoplication; (2) identify predictive factors for fundoplication; and (3) evaluate the impact of preventive fundoplication on gastro-intestinal outcomes in children with a CDH patch repair.
Methods: This prospective multi-institutional cohort study (French CDH Registry) included CDH neonates born in France between January 1st, 2010 and December 31st, 2018. Patch CDH was defined as the need for a synthetic patch or muscle flap repair. Main outcome measures included need for curative fundoplication, tube feed supplementation, failure to thrive, and oral aversion.
Results: Of 762 CDH neonates included, 81 (10.6%) underwent fundoplication, either preventive or curative. Median follow-up was 3.0 years (IQR: 1.0-5.0). (1) Preventive fundoplication is considered in only 31% of centers. The rates of both curative fundoplication (9% vs 3%, p = 0.01) and overall fundoplication (20% vs 3%, p < 0.0001) are higher in centers that perform preventive fundoplication than in those that do not. (2) Predictive factors for preventive fundoplication were prenatal diagnosis (p = 0.006), intra-thoracic liver (p = 0.005), fetal tracheal occlusion (p = 0.002), CDH grade C-D (p < 0.0001), and patch repair (p < 0.0001). After CDH repair, 8% (n = 51) required curative fundoplication (median age: 101 days), for which patch repair was the only independent predictive factor identified upon multivariate analysis. (3) In neonates with patch CDH, preventive fundoplication did not decrease the need for curative fundoplication (15% vs 11%, p = 0.53) and was associated with higher rates of failure to thrive (discharge: 81% vs 51%, p = 0.03; 6 months: 81% vs 45%, p = 0.008), tube feeds (6 months: 50% vs 21%, p = 0.02; 2 years: 65% vs 26%, p = 0.004), and oral aversion (6 months: 67% vs 37%, p = 0.02; 1 year: 71% vs 40%, p = 0.03).
Conclusions: Children undergoing a CDH patch repair are at high risk of requiring a curative fundoplication. However, preventive fundoplication during a patch repair does not decrease the need for curative fundoplication and is associated with worse gastro-intestinal outcomes in children.
Level of evidence: II (prospective study).

11.
While belatacept has shown favorable short- and midterm results in kidney transplant recipients, only projections exist regarding its potential impact on long-term outcomes. Therefore, we performed a retrospective case-match analysis of the 14 belatacept patients originally enrolled in the phase II multicenter trial at our center. Fifty-six cyclosporine (CyA)-treated patients were matched according to age at transplantation, first/retransplant, and donor type. Ten years after kidney transplantation, kidney function remained superior in belatacept-treated patients compared with the CyA control group. Moreover, none of the belatacept-treated patients had donor-specific antibodies ≥10 years post-transplantation, compared with 38.5% of tested CyA-treated subjects (0/10 vs. 5/13; P = 0.045). Notably, however, patient and graft survival was virtually identical in both groups (71.4% vs. 71.3%; P = 0.976). In the present single-center study population, patients treated with belatacept demonstrated patient and graft survival at 10 years post-transplant comparable to that of similarly selected CNI-treated patients. Larger studies with sufficient statistical power are necessary to definitively determine long-term graft survival with belatacept.

12.
Background: To date, no factors influencing blood everolimus (EVL) concentrations have been identified. Our aim was to identify factors that affect the ratio of the trough blood concentration to dose level (C0/D ratio) of EVL in kidney transplant recipients.
Methods: We retrospectively analyzed 448 patients who had undergone kidney transplantation and were subsequently managed at our hospital between 2011 and 2015. Multivariate analysis was performed to identify factors affecting the EVL C0/D ratio.
Results: The numbers of patients receiving a calcineurin inhibitor (CNI)-free regimen, a cyclosporine (CsA)-containing regimen, and a tacrolimus (TAC)-containing regimen were 47, 137, and 264, respectively. The EVL C0/D ratio did not differ significantly between the TAC(+) group and the CNI-free group, whereas it was significantly higher in the CsA(+) group than in the TAC(+) group (p < 0.0001) and the CNI-free group (p = 0.0003). In the multivariate analysis, age, gender, diabetes mellitus as the cause of end-stage renal disease (ESRD), CsA, serum creatinine, and hemoglobin were selected as factors affecting the EVL C0/D ratio (R2 = 0.269, p < 0.0001). Mycophenolate mofetil treatment was selected by the stepwise method but was not significant in the multiple linear regression analysis. Concomitant use of CsA was identified as the most influential factor, based on the standardized partial regression coefficients (β = 0.341).
Conclusions: We identified patient characteristics that influence the EVL C0/D ratio. However, the accuracy of the regression equation obtained from the multiple regression analysis was poor (R2 = 0.269), and it was not accurate enough to predict the EVL C0/D ratio for use in therapeutic drug monitoring.
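The analysis described above reports R2 and standardized partial regression coefficients (β) from a multiple linear regression. A rough sketch of that approach is given below on synthetic data; the variable names and values are hypothetical, not the authors' data or code.

```python
# Illustrative multiple linear regression of the EVL C0/D ratio on candidate covariates,
# reporting R2 and standardized (beta) coefficients. Data are made up.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 448
df = pd.DataFrame({
    "c0_d_ratio": rng.lognormal(0.5, 0.4, n),
    "age": rng.uniform(20, 75, n),
    "csa": rng.integers(0, 2, n),            # concomitant cyclosporine (0/1)
    "serum_cr": rng.normal(1.3, 0.4, n),
    "hemoglobin": rng.normal(12.5, 1.8, n),
})

fit = smf.ols("c0_d_ratio ~ age + csa + serum_cr + hemoglobin", data=df).fit()
print(f"R2 = {fit.rsquared:.3f}")

# standardized (beta) coefficients: refit the same model on z-scored variables
z = (df - df.mean()) / df.std()
fit_z = smf.ols("c0_d_ratio ~ age + csa + serum_cr + hemoglobin", data=z).fit()
print(fit_z.params.drop("Intercept"))
```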

13.
Background: Immunosuppressive treatment is often interrupted in the first months following kidney transplant failure (KTF) to limit side effects. The aim of this study was to assess the effect of prolonged treatment (PT) of more than 3 months' duration after KTF on HLA sensitization and treatment tolerance.
Methods: We performed a retrospective observational study involving 119 patients with KTF in 3 French kidney transplant centers between June 2007 and June 2017. Sensitization was defined as the development of HLA donor-specific antibodies (DSA).
Results: In the PT group receiving calcineurin inhibitor (CNI) treatment, 30 of 52 patients (57.7%) were sensitized vs 52 of 67 patients (77.6%) who had early cessation of treatment (P = .02). The results were confirmed by multivariate analysis (odds ratio [OR] = 0.39, 95% confidence interval [CI] [0.16; 0.98], P = .04). The development of de novo DSAs after cessation of CNI treatment (n = 63/90 [70.0%]) was significantly more frequent than during CNI treatment (n = 18/52 [34.6%], P = .01). Panel-reactive antibody ≥85% was less frequent in the PT group in multivariate analysis (OR = 0.28, 95% CI [0.10; 0.78], P = .02). No differences in the rates of infection, cardiovascular complications, neoplasia, or death were observed between the 2 groups. In multivariate analysis, continuation of corticosteroid treatment had no influence on sensitization but was associated with a higher rate of infection (OR = 2.66, 95% CI [1.09; 6.46], P = .03).
Conclusion: Maintenance of CNI treatment after return to dialysis in patients requesting a repeat transplant could avoid the development of anti-HLA sensitization, with good tolerance.

14.
Background: Information is needed regarding the complex relationships between long-term functional outcomes and health-related quality of life (HRQoL) in Hirschsprung's disease (HSCR). We describe long-term outcomes across multiple domains, completing a core outcome set through to adulthood.
Methods: HSCR patients operated on at a single center over a 35-year period (1978-2013) were studied. Patients completed detailed questionnaires on bowel and urologic function and HRQoL. Patients with learning disability (LD) were excluded. Outcomes were compared with normative data. Data are reported as median [IQR] or mean (SD).
Results: 186 patients (median age 28 [18-32] years; 135 males) completed surveys. Bowel function was reduced (BFS 17 [14-19] vs. 19 [19-20], p < 0.0001; η2 = 0.22). Prevalence and severity of fecal soiling and fecal awareness improved with age (p < 0.05 for both). Urinary incontinence was more frequent than in controls, most markedly in females aged 13-26 years (65% vs. 31%, p = 0.003). In adults, this correlated independently with constipation symptoms (OR 3.18 [1.4-7.5], p = 0.008). HRQoL outcomes strongly correlated with functional outcome: 42% of children demonstrated clinically significant reductions in overall PedsQL score, and poor bowel outcome was strongly associated with impaired QOL (B = 22.7 [12.7-32.7], p < 0.001). In adults, GIQLI scores were more often impacted in patients with extended-segment disease. SF-36 scores were reduced relative to population-level data in most domains, with large effect sizes noted for females in General Health (g = 1.19) and Social Wellbeing (g = 0.8).
Conclusion: Functional impairment is common after pull-through, but bowel function improves with age. Clustering of poor functional outcomes across multiple domains identifies a need for early recognition and long-term support for these patients.
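The effect sizes quoted above (g = 1.19, g = 0.8) are standardized mean differences (Hedges' g). A hedged example of computing this statistic from two independent samples follows; the samples are synthetic placeholders, not the study's SF-36 data.

```python
# Illustrative computation of Hedges' g (bias-corrected standardized mean difference).
import numpy as np

def hedges_g(a: np.ndarray, b: np.ndarray) -> float:
    """Standardized mean difference (a - b) with the small-sample correction factor."""
    na, nb = len(a), len(b)
    pooled_sd = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2))
    d = (a.mean() - b.mean()) / pooled_sd          # Cohen's d
    correction = 1 - 3 / (4 * (na + nb) - 9)       # approximate bias correction
    return d * correction

rng = np.random.default_rng(4)
controls = rng.normal(75, 15, 120)   # hypothetical normative comparison scores
patients = rng.normal(60, 18, 40)    # hypothetical patient scores (e.g. an SF-36 domain)
print(f"Hedges' g = {hedges_g(controls, patients):.2f}")
```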

15.
Post-transplantation lymphoproliferative disorders (PTLD) are associated with poor patient and graft survival. The risk of rejection and subsequent graft loss is increased by the reduction of immunosuppressive therapy, the cornerstone of PTLD treatment. This multicentre, retrospective, nonrandomized cohort study included 104 adults who developed PTLD after renal or simultaneous renal/pancreatic transplantation between 1990 and 2007. It examines the effect of calcineurin inhibitor (CNI) withdrawal on long-term graft and patient survival. At 10 years after the onset of PTLD, the Kaplan-Meier graft loss rate was 43.9%, and the rate of graft loss or death with a functioning graft was 64.4%. Multivariate Cox analysis identified PTLD stage greater than I-II and CNI withdrawal as risk factors for graft loss; for graft loss and mortality, these remained risk factors along with age over 60 years. Type and location of PTLD, year of diagnosis, and chemotherapy regimen were not independent risk factors. Multivariate analysis identified CNI withdrawal as the most important risk factor for graft loss (HR = 3.07, 95% CI: 1.04-9.09; P = 0.04) and death (HR = 4.00, 95% CI: 1.77-9.04; P < 0.001). Although long-term stable renal function after definitive CNI withdrawal for PTLD has been reported, this study found that withdrawal is associated with reduced graft and patient survival.
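The hazard ratios above come from a multivariate Cox proportional hazards model. A minimal sketch of such a model, using the lifelines package on synthetic data with hypothetical column names (not the study's dataset), is shown below.

```python
# Illustrative multivariate Cox model: hazard ratios for graft loss with
# CNI withdrawal, PTLD stage, and age as covariates. Data are made up.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 104
df = pd.DataFrame({
    "years_followup": rng.uniform(0.5, 10.0, n),
    "graft_loss": rng.integers(0, 2, n),       # 1 = graft loss observed, 0 = censored
    "cni_withdrawal": rng.integers(0, 2, n),
    "stage_gt_I_II": rng.integers(0, 2, n),
    "age_over_60": rng.integers(0, 2, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_followup", event_col="graft_loss")
# exp(coef) is the hazard ratio; the summary also carries 95% CIs and p-values
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%", "p"]])
```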

16.
Kidney transplant recipients who switched from a calcineurin inhibitor (CNI) to belatacept demonstrated higher calculated glomerular filtration rates (cGFRs) at 1 year in a Phase II study. This report addresses whether improvement was sustained at 2 years in the long-term extension (LTE). Patients receiving cyclosporine or tacrolimus were randomized to switch to belatacept or continue CNI. Of 173 randomized patients, 162 completed the 12-month main study and entered the LTE. Two patients (n = 1 each group) had graft loss between Years 1-2. At Year 2, mean cGFR was 62.0 ml/min (belatacept) vs. 55.4 ml/min (CNI). The mean change in cGFR from baseline was +8.8 ml/min (belatacept) and +0.3 ml/min (CNI). Higher cGFR was observed in patients switched from either cyclosporine (+7.8 ml/min) or tacrolimus (+8.9 ml/min). The frequency of acute rejection in the LTE cohort was comparable between the belatacept and CNI groups by Year 2. All acute rejection episodes occurred during Year 1 in the belatacept patients and during Year 2 in the CNI group. There were more non-serious mucocutaneous fungal infections in the belatacept group. Switching to a belatacept-based regimen from a CNI-based regimen resulted in a continued trend toward improved renal function at 2 years after switching.

17.
Study objective: To determine the effect of cognitive impairment (CI) and dementia on adverse outcomes in older surgical patients.
Design: A systematic review and meta-analysis of observational studies and randomized controlled trials (RCTs). Various databases were searched from their inception dates to March 8, 2021.
Setting: Preoperative assessment.
Patients: Older patients (≥60 years) undergoing non-cardiac surgery.
Measurements: Outcomes included postoperative delirium, mortality, discharge to assisted care, 30-day readmissions, postoperative complications, and length of hospital stay. Effect sizes were calculated as odds ratios (OR) and mean differences (MD) based on random-effects model analysis. The quality of included studies was assessed using the Cochrane risk-of-bias tool for RCTs and the Newcastle-Ottawa Scale for observational cohort studies.
Results: Fifty-three studies (196,491 patients) were included. Preoperative CI was associated with a significant risk of delirium in older patients after non-cardiac surgery (25.1% vs. 10.3%; OR: 3.84; 95% CI: 2.35, 6.26; I2: 76%; p < 0.00001). Cognitive impairment (26.2% vs. 13.2%; OR: 2.28; 95% CI: 1.39, 3.74; I2: 73%; p = 0.001) and dementia (41.6% vs. 25.5%; OR: 1.96; 95% CI: 1.34, 2.88; I2: 99%; p = 0.0006) significantly increased the risk of 1-year mortality. In patients with CI, there was an increased risk of discharge to assisted care (44.7% vs. 38.3%; OR: 1.74; 95% CI: 1.05, 2.89; p = 0.03), 30-day readmissions (14.3% vs. 10.8%; OR: 1.36; 95% CI: 1.00, 1.84; p = 0.05), and postoperative complications (40.7% vs. 18.8%; OR: 1.85; 95% CI: 1.37, 2.49; p < 0.0001).
Conclusions: Preoperative CI in older surgical patients significantly increases the risk of delirium, 1-year mortality, discharge to assisted care, 30-day readmission, and postoperative complications. Dementia increases the risk of 1-year mortality. Cognitive screening in the preoperative assessment of older surgical patients may be helpful for risk stratification so that appropriate management can be implemented to mitigate adverse postoperative outcomes.
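The pooled ORs, 95% CIs, and I2 values above come from a random-effects meta-analysis. The sketch below shows one standard way to compute a DerSimonian-Laird pooled odds ratio and I2 from per-study 2x2 counts; the counts are invented for illustration and are not the included studies' data.

```python
# Illustrative DerSimonian-Laird random-effects pooling of study-level odds ratios.
import numpy as np

# each row: (events_exposed, n_exposed, events_unexposed, n_unexposed) -- made-up studies
studies = np.array([
    [30, 120, 14, 130],
    [22,  95, 18, 150],
    [45, 210, 25, 220],
], dtype=float)

a, n1, c, n2 = studies.T
b, d = n1 - a, n2 - c
log_or = np.log((a * d) / (b * c))
var = 1 / a + 1 / b + 1 / c + 1 / d                       # variance of each log-OR

w = 1 / var                                               # fixed-effect weights
fe_mean = np.sum(w * log_or) / np.sum(w)
q = np.sum(w * (log_or - fe_mean) ** 2)                   # Cochran's Q
k = len(studies)
tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))   # DL tau^2
i2 = max(0.0, (q - (k - 1)) / q) * 100                    # heterogeneity I^2 (%)

w_re = 1 / (var + tau2)                                   # random-effects weights
pooled = np.sum(w_re * log_or) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
lo, hi = np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)
print(f"pooled OR = {np.exp(pooled):.2f}, 95% CI = ({lo:.2f}, {hi:.2f}), I2 = {i2:.0f}%")
```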

18.
Background: Postoperative feeding practices are not uniform in children undergoing bowel anastomosis surgery. The primary aim of this review was to evaluate the safety and efficacy of early enteral nutrition (EEN) as an isolated component of enhanced recovery in children undergoing bowel anastomosis surgery.
Methods: Medical search engines (PubMed, CENTRAL, Google Scholar) were accessed from inception to January 2021. Randomized controlled trials (RCTs), non-randomized controlled trials, observational studies, and retrospective studies comparing EEN (initiated within 48 h) vs late enteral nutrition (LEN, initiated after 48 h) in children ≤18 years undergoing bowel anastomosis surgery were included. The primary outcome measure was the incidence of postoperative complications (anastomotic leak, abdominal distension, surgical site infection, wound dehiscence, vomiting, and septic complications). Secondary outcome measures were the time to passage of first feces and the length of hospital stay.
Results: Twelve hundred and eighty-six children from 10 studies were included in this review. No difference was seen between the EEN and LEN groups in the incidence of anastomotic leak (1.69% vs 4.13%; p = 0.06), abdominal distention (13.87% vs 12.31%; p = 0.57), wound dehiscence (3.07% vs 2.69%; p = 0.69), or vomiting (8.11% vs 8.67%; p = 0.98). The incidence of surgical site infections (7.51% vs 11.72%; p = 0.04) and septic complications (14.02% vs 26.22%; p = 0.02), as well as pooled overall complications (8.11% vs 11.27%; RR 0.71; 95% CI 0.56 to 0.89; p = 0.003; I2 = 33%), were significantly lower in the EEN group. The time to passage of first feces (MD −17.23 h; 95% CI −23.13 to −11.34; p < 0.00001; I2 = 49%) and the length of hospital stay (MD −2.95 days; 95% CI −3.73 to −2.17; p < 0.00001; I2 = 93%) were significantly shorter in the EEN group.
Conclusion: EEN is safe and effective in children following bowel anastomosis surgery and is associated with a lower overall incidence of complications compared with LEN. EEN also promotes early bowel recovery and hospital discharge. However, further well-designed RCTs are required to validate these findings.
Level of evidence: V.

19.
Journal of Pediatric Surgery 2021;56(12):2180-2191
Background: Esophageal growth using the Foker process (FP) for long-gap esophageal atresia (LGEA) has evolved over time.
Methods: Contemporary LGEA patients treated from 2014-2020 were compared to historical controls (2005 to <2014).
Results: 102 contemporary LGEA patients (type A 50%, B 18%, C 32%; 36% with a prior anastomotic attempt; 20 with esophagostomy) underwent either primary repair (n = 23), jejunal interposition (JI; n = 14), or the Foker process (FP; n = 65; 49 primary [p], 16 rescue [r]). The contemporary p-FP cohort experienced significantly fewer leaks on traction (4% vs 22%), bone fractures (2% vs 22%), anastomotic leaks (12% vs 37%), and Foker failures (FP→JI; 0% vs 15%) than historical p-FP patients (n = 27), all p ≤ 0.01. Patients who underwent a completely (n = 11) or partially (n = 11) minimally invasive FP experienced fewer median days paralyzed (0 vs 8 vs 17) and intubated (9 vs 15 vs 25) compared to open FP patients, respectively (all p ≤ 0.03), with equivalent leak rates (18% vs 9% vs 26%, p = 0.47). At one year post-FP, most patients (62%) were predominantly orally fed.
Conclusion: With continued experience and technical refinements, the Foker process has evolved with improved outcomes, less morbidity, and maximal esophageal preservation.

20.
Journal of Pediatric Surgery 2021;56(11):2052-2057
Purpose: Trauma team activation is essential to provide rapid assessment of injured patients; however, excessive utilization can overburden systems. We aimed to identify predictors of over-triage and to evaluate the impact of prehospital personnel-discretion trauma activations on the over-triage rate.
Methods: Retrospective comparative study of pediatric trauma patients (<18 years) treated between 2010 and 2013, comparing those evaluated after activation of the trauma team with those evaluated as trauma consults. Trauma-activated and consult patients were cohort-matched on the basis of age and ISS.
Results: 1363 patients, including 359 trauma team activations, were evaluated. Median age was 6 years and median Injury Severity Score (ISS) was 4; 116 (8.5%) required operative intervention and 20 (1.4%) died. Matched analysis using age and ISS showed trauma-activated patients were more likely to have a penetrating mechanism of injury (4.7% vs. 1.7%; p = 0.03) and to need ICU admission (32.9% vs. 16.7%; p = 0.0001). Patients activated under the State of Florida's discrete criteria, compared with paramedic-discretion activations, had a higher ISS (9 vs. 5; p = 0.014), greater need for ICU admission (36.5% vs. 20.4%; p = 0.004), longer ICU LOS (2 vs. 0 days; p = 0.02), longer hospital LOS (2 vs. 2 days; p = 0.014), and a higher likelihood of death (4.9% vs. 0%; p = 0.0001). Moreover, paramedic-discretion trauma-activated patients were similar to trauma consult patients in terms of ISS (p = 0.86), need for ICU admission (p = 0.86), operative intervention (p = 0.86), death (p = 0.86), and hospital LOS (p = 0.86), with a considerably higher cost of care (p = 0.0002).
Conclusion: Discrete criteria-based trauma team activations appear to more reliably identify patients likely to benefit from initial multidisciplinary management.
