Similar literature: 20 similar records retrieved (search time 31 ms)
1.
Background: Anti-HLA immunization, determined by Panel Reactive Antibody (PRA), is known to have a negative impact on patient and graft survival. The predictive value of peak PRA (pPRA) for immunologic outcome, however, and the individual effects of anti-HLA class I and II antibodies remain uncertain.
Methods: The influence of HLA immunization on immunologic outcome parameters and graft survival was investigated in 1150 adult patients without pretransplant donor-specific antibodies (DSA) and in a subgroup of elderly kidney recipients aged ≥65 years (n = 264). Anti-HLA immunization was defined as a pPRA > 0%. We investigated the influence of class I and II pPRA by dividing all kidney recipients into four pPRA groups (0%, 1–20%, 21–80%, >80%).
Results: Patients with non-donor-specific pretransplant anti-HLA immunization were at higher risk of developing de novo DSA (49.9% vs. 18.7%; p < 0.001) and antibody-mediated rejection (ABMR) (15.7% vs. 5.1%; p < 0.001), had poorer death-censored graft survival (69.2% vs. 86.2%; p < 0.001) and showed a greater decline in calculated GFR. In elderly patients, anti-HLA immunization significantly influenced only the development of DSA (40.5% vs. 27.4%; p = 0.004). A multivariate model adjusted for all relevant factors identified class I, but not class II, pretransplant HLA immunization as a significant independent risk factor for de novo DSA, ABMR and death-censored graft loss (HR 2.76, p < 0.001; HR 4.16, p < 0.001; and HR 2.07, p < 0.001, respectively).
Conclusion: Non-donor-specific pretransplant HLA class I immunization in particular is an independent risk factor for the development of de novo DSA, ABMR and graft loss.
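A minimal sketch of the kind of pPRA stratification described above, assuming a per-recipient table recipients.csv with hypothetical columns peak_pra_class1 and de_novo_dsa; it groups recipients into the four pPRA strata and compares de novo DSA rates with a chi-square test rather than reproducing the authors' multivariate model.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical per-recipient table with peak class I PRA (%) and de novo DSA status (0/1)
df = pd.read_csv("recipients.csv")

# Reproduce the four pPRA strata used above: 0%, 1-20%, 21-80%, >80%
bins = [-0.001, 0, 20, 80, 100]
labels = ["0%", "1-20%", "21-80%", ">80%"]
df["ppra_group"] = pd.cut(df["peak_pra_class1"], bins=bins, labels=labels)

# Compare de novo DSA rates across the strata with a chi-square test
table = pd.crosstab(df["ppra_group"], df["de_novo_dsa"])
chi2, p_value, dof, expected = chi2_contingency(table)
print(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")
```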

2.
Introduction and aim: Angiotensin II type 1 receptor antibodies (AT1R-Ab) are associated with graft rejection and poor graft outcomes in kidney transplantation (KT). We aimed to assess the frequency of preformed AT1R-Ab and their impact on graft function and survival at 1 year after KT in a low immunological risk cohort.
Methods: We performed a prospective, observational cohort study in 67 adult KT recipients transplanted between 2018 and 2019. A cut-off value >10 U/ml was used for AT1R-Ab detection.
Results: The frequency of preformed AT1R-Ab was 10.4% and the median antibody level was 8.4 U/ml (IQR: 6.8–10.4). Donor-specific anti-human leukocyte antigen antibodies (HLA-DSA) were absent, no case of biopsy-proven rejection was reported, and the incidence of graft failure was 7.5%. Estimated glomerular filtration rate (eGFR) was significantly reduced in the AT1R-Ab group [35 (29.8–55.2) vs 56.1 (41.3–66.5) ml/min; p = 0.02] at 1 year after KT. In multivariate linear regression analysis, preformed AT1R-Ab were an independent determinant of eGFR at 1 year after KT (β: −15.395; 95% CI: −30.49 to −0.30; p = 0.04). In Cox multivariate regression analysis, preformed AT1R-Ab were not associated with graft failure (HR: 1.36; 95% CI: 0.10–14.09; p = 0.80).
Conclusion: Preformed AT1R-Ab are an independent determinant of graft function but do not affect graft survival at 12 months after transplantation in this prospective, low immunological risk cohort of KT recipients.
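A hedged sketch of a multivariable linear model of the type reported above: an ordinary least-squares regression of 1-year eGFR on preformed AT1R-Ab status plus illustrative covariates. The file kt_cohort.csv and the variable names (egfr_1y, at1r_ab_pos, recipient_age, donor_age, dialysis_vintage) are assumptions, not the study's dataset.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("kt_cohort.csv")  # hypothetical cohort file

# Linear regression of 1-year eGFR on AT1R-Ab positivity, adjusted for example covariates
model = smf.ols(
    "egfr_1y ~ at1r_ab_pos + recipient_age + donor_age + dialysis_vintage",
    data=df,
).fit()

print(model.summary())                       # beta for at1r_ab_pos = adjusted effect on eGFR
print(model.conf_int().loc["at1r_ab_pos"])   # 95% CI for the AT1R-Ab coefficient
```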

3.
Background: The presence of neutrophils in the lung has been identified as a factor associated with CLAD but requires invasive samples. The aim of this study was to assess the kinetics of peripheral blood neutrophils after lung transplantation as an early predictor of CLAD.
Methods: We retrospectively included all recipients transplanted in our center between 2009 and 2014. Kinetics of blood neutrophils were evaluated to predict early CLAD by mathematical modeling, using unadjusted and adjusted analyses.
Results: 103 patients were included, 80 in the stable group and 23 in the CLAD group. Bacterial infections at 1 year were associated with CLAD occurrence. Neutrophil counts rose sharply postoperatively and then decreased progressively toward the normal range. Recipients with CLAD had higher neutrophil counts (mixed-effect coefficient beta over 3 years = +1.36 G/L, 95% confidence interval [0.99–1.92], p < .001). A coefficient of celerity (S, for speed) was calculated to model the kinetics of the return to the normal range before CLAD occurrence. After adjustment, lower values of S (slower decrease of neutrophils) were associated with CLAD (odds ratio = 0.26, 95% confidence interval [0.08–0.66], p = .01).
Conclusion: A slower return of blood neutrophils to the normal range was associated early with CLAD occurrence.
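A rough sketch of a longitudinal mixed-effects model like the one summarized above, fitting blood neutrophil counts over time with a random intercept per recipient. The long-format file neutrophil_kinetics_long.csv and the columns neutrophils_gl, months_post_tx, clad and patient_id are assumed names, and this is a plain linear mixed model rather than the authors' full adjusted analysis.

```python
import pandas as pd
import statsmodels.formula.api as smf

long_df = pd.read_csv("neutrophil_kinetics_long.csv")  # one row per blood draw (hypothetical)

# Fixed effects: time post-transplant, CLAD status and their interaction;
# random intercept per patient to account for repeated measures
mixed = smf.mixedlm(
    "neutrophils_gl ~ months_post_tx * clad",
    data=long_df,
    groups=long_df["patient_id"],
).fit()

print(mixed.summary())  # the 'clad' coefficient approximates the group difference in G/L
```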

4.
Background: Despite improvements in general health and life expectancy in people with cystic fibrosis (CF), lung function decline continues unabated during adolescence and early adult life.
Methods: We examined factors present at age 5 years that predicted lung function decline from childhood to adolescence in a longitudinal study of Australasian children with CF followed from 1999 to 2017.
Results: Lung function trajectories were calculated for 119 children with CF from childhood (median 5.0 [25th–75th percentile: 5.0–5.1] years) to early adolescence (median 12.5 [25th–75th percentile: 11.4–13.8] years). Lung function fell progressively, with mean (standard deviation) annual changes of -0.105 (0.049) for forced vital capacity (FVC) Z-score (p<0.001), -0.135 (0.048) for forced expiratory volume in 1 second (FEV1) Z-score (p<0.001), -1.277 (0.221) for FEV1/FVC% (p<0.001), and -0.136 (0.052) for the Z-score of forced expiratory flow between 25% and 75% of FVC (FEF25–75) (p<0.001). Factors present in childhood predicting lung function decline to adolescence, in multivariable analyses, were hospitalisation for respiratory exacerbations in the first 5 years of life (FEV1/FVC p = 0.001, FEF25–75 p = 0.01) and bronchoalveolar lavage neutrophil elastase activity (FEV1/FVC% p = 0.001, FEV1 p = 0.05, FEF25–75 p = 0.02). No examined factor predicted a decline in the FVC Z-score.
Conclusions: Action in the first 5 years of life to prevent and/or treat respiratory exacerbations and counteract neutrophilic inflammation in the lower airways may reduce lung function decline in children with CF, and these should be targets of future research.

5.
6.
Background: This study assessed inter-hospital variability in operative versus nonoperative management of pediatric adhesive small bowel obstruction (ASBO).
Methods: A multi-institutional retrospective study was performed examining patients 1–21 years of age presenting with ASBO from 2010 to 2019, utilizing the Pediatric Health Information System. Multivariable mixed-effects logistic regression was performed to assess inter-hospital variability in operative versus nonoperative management of ASBO.
Results: Among 6410 pediatric ASBO admissions identified at 46 hospitals, 3,239 (50.5%) underwent surgery during that admission. The hospital-specific rate of surgery ranged from 35.3% (95% CI: 28.5–42.6%) to 74.7% (66.3–81.6%) in the unadjusted model (p < 0.001), and from 35.1% (26.3–45.1%) to 73.9% (66.7–79.9%) in the adjusted model (p < 0.001). Factors associated with operative management of ASBO included admission to a surgical service (OR 2.8 [95% CI: 2.4–3.2], p < 0.001), congenital intestinal and/or rotational anomaly (OR 2.5 [2.1–3.1], p < 0.001), diagnostic workup including advanced abdominal imaging (OR 1.7 [1.5–1.9], p < 0.001), non-emergent admission status (OR 1.5 [1.3–1.8], p < 0.001), and increasing number of complex chronic comorbidities (OR 1.3 [1.2–1.4], p < 0.001). Factors associated with nonoperative management of ASBO included increased hospital-specific annual ASBO volume (OR 0.98 [95% CI: 0.97–0.99], p = 0.002), older age (OR 0.97 [0.96–0.98], p < 0.001), public insurance (OR 0.87 [0.78–0.96], p = 0.008), and presence of coinciding non-intestinal congenital anomalies, neurologic/neuromuscular disease, and/or medical technology dependence (OR 0.57 [95% CI: 0.47–0.68], p < 0.001).
Conclusions: Rates of surgical intervention for ASBO vary significantly across tertiary children's hospitals in the United States. The variability was independent of patient and hospital characteristics and is likely due to practice variation.
Level of evidence: III
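A small sketch of how the unadjusted hospital-specific operative rates quoted above might be tabulated, with 95% Wilson confidence intervals per hospital. The admission-level file asbo_admissions.csv and the columns hospital_id and operative are hypothetical, and the adjusted mixed-effects model itself is not reproduced here.

```python
import pandas as pd
from statsmodels.stats.proportion import proportion_confint

asbo = pd.read_csv("asbo_admissions.csv")  # one row per admission (hypothetical extract)

# Operative count, admission count and crude rate per hospital
rates = asbo.groupby("hospital_id")["operative"].agg(n_ops="sum", n_admissions="count")
rates["rate"] = rates["n_ops"] / rates["n_admissions"]

# 95% Wilson confidence interval around each hospital's rate
low, high = proportion_confint(rates["n_ops"], rates["n_admissions"], method="wilson")
rates["ci_low"], rates["ci_high"] = low, high

print(rates.sort_values("rate"))  # spread of hospital-specific operative rates
```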

7.
Introduction: De novo donor-specific antibodies (DSAs) increase the risk of chronic lung allograft dysfunction (CLAD) in lung transplant recipients (LTRs). Both carfilzomib (CFZ) and rituximab (RTX) lower the mean fluorescent intensity (MFI) of DSAs, but comparative data are lacking. We compared CLAD-free survival and the degree and duration of DSA depletion after treatment of LTRs with CFZ or RTX.
Methods: LTRs who received CFZ or RTX for DSA depletion between 08/01/2015 and 08/31/2020 were included. The primary outcome was CLAD-free survival. Secondary outcomes were the change in MFI at corresponding loci within 6 months of treatment (ΔMFI), time to DSA rebound, and change in % predicted FEV1 6 months after treatment (ΔFEV1).
Results: Forty-four LTRs were identified, 7 of whom had ≥2 drug events; therefore, 53 drug events were divided into 2 groups, CFZ (n = 17) and RTX (n = 36). Use of plasmapheresis, immunoglobulin, and mycophenolate augmentation was equivalent in both groups. CLAD-free survival with a single RTX event was superior to that after ≥2 drug events (p = 0.001) but comparable to that with a single CFZ event (p = 0.399). Both drugs significantly lowered the MFI at the DQ locus, and the median ΔMFI was comparable. Compared with the RTX group, the CFZ group had a shorter median interval to DSA rebound (p = 0.015) and a lower ΔFEV1 at 6 months (p = 0.014).
Conclusion: Although both CFZ and RTX reduced the MFI of circulating DSAs, RTX prolonged the time to DSA rebound. Despite the more pronounced improvement in FEV1 with RTX, comparable CLAD-free survival between the 2 groups suggests that both drugs offer a reasonable treatment strategy for DSAs in LTRs.

8.
Objective: Angiotensin II type-1 receptor antibodies (AT1R-Ab) and endothelin-1 type-A receptor antibodies (ETAR-Ab) are non-human leukocyte antigen (HLA) antibodies that can elicit adverse effects on kidney transplantation (KT) outcomes. We investigated the correlation between levels of AT1R-Ab and ETAR-Ab and postoperative outcomes in KT recipients.
Methods: Pre-KT and post-KT serum from 79 patients was collected. Post-KT serum was collected within 1 year after KT or at the same time as the biopsy. Levels of AT1R-Ab and ETAR-Ab were measured using enzyme-linked immunosorbent assay kits. AT1R-Ab >17.0 U/mL and ETAR-Ab >10.0 U/mL were considered to denote positivity, according to manufacturer recommendations. We measured donor-specific antibodies against human leukocyte antigens (HLA-DSA) using LABScreen™ single-antigen kits.
Results: Seventy-nine patients (54 men, 25 women) formed the study cohort. Seven (8.7%) patients were positive for AT1R-Ab only, 25 (31.6%) were positive for both AT1R-Ab and ETAR-Ab, and 47 (59.5%) were negative for both antibodies at all time points. No patients died during the study period. Positivity for both AT1R-Ab and ETAR-Ab was associated with a higher prevalence of antibody-mediated rejection (AMR) and a lower estimated glomerular filtration rate, but not with allograft loss or delayed graft function. AT1R-Ab were associated with T-cell-mediated rejection, but the association was not significant. HLA-DSA were significantly associated with a higher serum creatinine level at 12 and 24 months in patients with AT1R-Ab and/or ETAR-Ab.
Conclusions: AT1R-Ab, ETAR-Ab, and HLA-DSA were associated with a higher prevalence of AMR and decline in graft function. Measurement of AT1R-Ab and ETAR-Ab levels in KT patients may be useful for stratification of immunological risk and identification of patients at high risk of adverse graft outcomes.

9.
Objectives: To assess the reliability and applicability of duplex ultrasound scanning (DUS) of lower limb arteries, compared with digital subtraction angiography (DSA), in patients with peripheral arterial disease (PAD).
Design: A prospective, blinded, comparative study.
Materials and methods: A total of 169 patients were examined by DUS and DSA. Intermittent claudication (IC) was present in 42 (25%) patients and critical limb ischaemia (CLI) in 127 (75%) patients. To allow segment-to-segment comparison, the arterial tree was divided into 15 segments. In total, 2535 segments were examined, using kappa (κ) statistics to test agreement.
Results: The agreement between DUS and DSA was very good (κ > 0.8) or good (0.8 ≥ κ > 0.6) in most segments, but moderate (0.6 ≥ κ > 0.4) in the tibio-peroneal trunk and the peroneal artery. Agreement between the two techniques was significantly better in the supragenicular segments (κ = 0.75 (95% confidence interval (CI): 0.70–0.80)) than in the infragenicular segments (κ = 0.63 (0.59–0.67)) (p < 0.001). Similarly, the technical success rate was significantly higher in the supragenicular segments (DUS: 100%; DSA: 99%) than in the infragenicular segments (both 93%) (p < 0.001). DUS was the better technique for imaging the distal crural arteries (92% vs. 97%; p < 0.001) and DSA was the better technique for imaging the proximal crural arteries (95% vs. 91%; p < 0.01). Neither the agreement nor the technical success rate was influenced by the severity of PAD, that is, IC versus CLI.
Conclusion: The agreement between DUS and DSA was generally good, irrespective of the severity of ischaemia. DUS performed better in the supragenicular arteries than in the infragenicular arteries. However, DUS compared favourably with DSA in both tibial vessels, particularly in the distal part, which makes DUS a useful non-invasive alternative to DSA.
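As a toy illustration of the κ-based agreement analysis above, the following sketch computes Cohen's kappa for per-segment stenosis grades from the two modalities and labels it with the same thresholds quoted in the abstract. The grade arrays are invented example data, not study measurements.

```python
from sklearn.metrics import cohen_kappa_score

# Per-segment stenosis grades from the two modalities (0 = <50%, 1 = 50-99%, 2 = occlusion)
dus_grades = [0, 1, 1, 2, 0, 2, 1, 0, 2, 1]
dsa_grades = [0, 1, 2, 2, 0, 2, 1, 0, 2, 0]

kappa = cohen_kappa_score(dus_grades, dsa_grades)

# Same qualitative thresholds as in the abstract
if kappa > 0.8:
    label = "very good"
elif kappa > 0.6:
    label = "good"
elif kappa > 0.4:
    label = "moderate"
else:
    label = "poor to fair"

print(f"kappa = {kappa:.2f} ({label})")
```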

10.
Background: Diagnostic tools to measure the response to individual immunosuppressive drugs in transplant patients are currently lacking. We previously developed the blood-based Immunobiogram bioassay for in vitro characterization of the pharmacodynamic response of patients' own immune cells to a range of immunosuppressants. We used the Immunobiogram to examine the association between patients' sensitivity to their prescribed immunosuppressants and clinical outcome.
Methods: We conducted an international, multicenter, observational study in a kidney transplant population undergoing maintenance immunosuppressive therapy. Patients were selected by clinical course: poor clinical course [PCC] (N = 53; renal dysfunction with rejection signs on biopsy and/or an increase in DSA strength in the last 12 months) versus good clinical course [GCC] (N = 50; stable renal function and treatment, no rejection and no DSA titers). Immunobiogram dose-response curve parameters were compared between the two subgroups in patients treated with mycophenolate, tacrolimus, corticosteroids, cyclosporine A or everolimus. Parameters for which significant inter-group differences were observed were further analyzed by univariate and subsequent multivariate logistic regression.
Results: Clinical outcome was associated with the following parameters: area over the curve (AOC), and 25% (ID25) and 50% (ID50) inhibitory response, in the mycophenolate-, tacrolimus-, and corticosteroid-treated subgroups, respectively. These statistically significant associations persisted in the mycophenolate (OR 0.003, 95% CI <0.001–0.258; p = 0.01) and tacrolimus (OR <0.0001, 95% CI <0.00001–0.202; p = 0.016) subgroups after adjusting for concomitant corticosteroid treatment, and in the corticosteroid subgroup after adjusting for concomitant mycophenolate or tacrolimus treatment (OR 0.003; 95% CI <0.0001–0.499; p = 0.026).
Conclusions: Our results highlight the potential of the Immunobiogram as a tool to test the pharmacodynamic response to individual immunosuppressive drugs.
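A hedged sketch of the dose-response summary parameters named above: it fits a four-parameter logistic curve to hypothetical Immunobiogram-style data with scipy and derives an ID50 and a crude area-over-the-curve (AOC) value. The data points, starting parameters, and integration choice are illustrative assumptions, not the assay's actual algorithm.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.integrate import trapezoid

def four_pl(dose, bottom, top, id50, hill):
    """Four-parameter logistic: response as a function of dose."""
    return bottom + (top - bottom) / (1.0 + (dose / id50) ** hill)

dose = np.array([0.1, 0.3, 1, 3, 10, 30, 100])                   # arbitrary dose units
response = np.array([0.98, 0.95, 0.85, 0.60, 0.35, 0.15, 0.08])  # fraction of baseline activity

params, _ = curve_fit(four_pl, dose, response, p0=[0.05, 1.0, 3.0, 1.0])
bottom, top, id50, hill = params

fitted = four_pl(dose, *params)
aoc = trapezoid(1.0 - fitted, dose)  # area over the fitted curve as a crude summary of inhibition
print(f"ID50 ~ {id50:.2f}, AOC ~ {aoc:.1f}")
```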

11.
Study objective: To evaluate the effectiveness of preoperative gabapentinoid administration.
Design: Retrospective hospital registry study.
Setting: Tertiary referral center (Boston, MA).
Patients: 111,008 adult non-emergency, non-cardiac surgical patients between 2014 and 2018.
Interventions: Preoperative administration of gabapentinoids (gabapentin or pregabalin).
Measurements: We tested the primary hypothesis that preoperative gabapentinoid use was associated with lower odds of hospital readmission within 30 days. Contingent on this hypothesis, we examined whether lower intraoperative opioid utilization mediated this effect. The secondary outcome was postoperative respiratory complications.
Main results: Gabapentinoid administration was associated with lower odds of readmission (adjusted odds ratio [ORadj] 0.80 [95% CI, 0.75–0.85]; p < 0.001). This effect was in part mediated by lower intraoperative opioid utilization in patients receiving gabapentinoids (8.2% [2.4–11.5%]; p = 0.012). Readmissions for gastrointestinal disorders (ORadj 0.74 [0.60–0.90]; p = 0.003), neuro-psychiatric complications (ORadj 0.66 [0.49–0.87]; p = 0.004), non-surgical-site infections (ORadj 0.68 [0.52–0.88]; p = 0.004) and trauma or poisoning (ORadj 0.25 [0.16–0.41]; p < 0.001) occurred less frequently in patients receiving gabapentinoids. The risk of postoperative respiratory complications was lower in patients receiving gabapentinoids (ORadj 0.77 [0.70–0.85]; p < 0.001). Lower doses of pregabalin (<75 mg) and gabapentin (<300 mg), compared with both no administration and high-dose administration of gabapentinoids, were associated with a lower risk of postoperative respiratory complications (ORadj 0.61 [0.50–0.75]; p < 0.001 and ORadj 0.70 [0.53–0.92]; p = 0.012, respectively). These lower gabapentinoid doses were also associated with reduced 30-day readmission (ORadj 0.74 [0.65–0.85]; p < 0.001). The results were robust in several sensitivity analyses, including surgical-procedure-defined subgroups and patients undergoing ambulatory surgery.
Conclusions: The preoperative use of pregabalin and gabapentin, at doses up to 75 and 300 mg respectively, mitigates the risks of hospital readmission and postoperative respiratory complications, which can in part be explained by lower intraoperative opioid use. Further research is warranted to elucidate mechanisms of the preventive action.

12.
Background: It remains challenging to manage antibody-mediated rejection (ABMR) associated with angiotensin II type 1 receptor antibodies (AT1R-Abs) in kidney transplant recipients, and the outcomes are not well defined. We describe the presentation, clinical course, and outcomes of this condition.
Methods: This retrospective study included kidney transplant recipients with AT1R-Ab levels ≥10 units/mL and biopsy-proven ABMR in the absence of significant HLA donor-specific antibodies at the time of rejection.
Results: We identified 13 recipients. Median creatinine (Cr) at rejection was significantly higher (2.05 mg/dL) compared with baseline (1.2 mg/dL), P = .006. After ABMR management, the difference in median Cr from baseline was no longer significant (1.5 mg/dL), P = .152. The median AT1R-Ab level was higher in the pretransplant sample (34.5 units/mL) compared with the level at rejection (19 units/mL) and after rejection treatment (13 units/mL); however, these differences were not significant, P = .129. Eight of the 13 recipients received antibody reduction therapy with plasmapheresis and intravenous immunoglobulin, and 5 of the 13 recipients received other therapies. After rejection management, 6 of the 13 recipients had improvement in Cr to baseline and 7 of the 13 recipients had >50% reduction in proteinuria.
Conclusions: Management and outcomes of AT1R-Ab-associated ABMR depend on the clinical presentation and may include antibody-reducing therapies among other therapies. Further prospective cohorts will improve recognition and management of this condition.

13.
Background: Cytomegalovirus (CMV) infection is a risk factor for bronchiolitis obliterans (BO), one form of chronic lung allograft dysfunction (CLAD). The viral chemokine receptor M33 is essential for successful spread of murine CMV to the host salivary glands. In the present study we investigated the impact of M33 on chronic airway rejection.
Methods: MHC I-mismatched tracheas of C.B10-H2b/LilMcdJ mice were transplanted into BALB/c (H2d) recipients and infected at different time points with wild-type (WT) or M33-deleted (delM33) MCMV, representing the clinical settings of donor (D)/recipient (R) viral serostatus: (D−/R+) or (D+/R−). Grafts were recovered for gene expression and histological/immunofluorescence analyses.
Results: Evaluations showed significantly increased signs of chronic rejection in WT-infected mice compared with uninfected allografts, reflected in a lower epithelium/lamina propria ratio (ELR) (0.46 ± 0.07 [WT post] vs. 0.66 ± 0.10 [non-infected]; p < 0.05). Rejection in the delM33-infected groups was significantly reduced compared with the WT-infected groups (ELR 0.67 ± 0.04 [delM33 post] vs. WT post; p < 0.05). Furthermore, decreased rejection was observed in the WT pre-infected group compared with the post-infected group (ELR 0.56 ± 0.08 [WT pre] vs. WT post; p < 0.05). CD8+ T-cell infiltration was significantly higher in WT post-infected grafts than in the delM33-infected or non-infected allografts.
Conclusions: These data support a role for CMV in accelerating CLAD. Deletion of the chemokine receptor M33 leads to attenuated rejection.

14.
Purpose: We examined cystic fibrosis (CF) patients and compared their clinical status at the time of primary versus double lung re-transplantation (re-DLTx) in order to better understand lung retransplant practice patterns.
Methods: We performed a retrospective analysis of the UNOS database, identifying CF patients ≥18 years old undergoing re-DLTx between 5/4/2005 and 12/4/2020. Baseline and clinical variables at the primary transplant and at re-DLTx were compared using the paired Student's t-test. Graft survival was defined as the time from surgery to retransplant and analyzed using Kaplan-Meier estimates.
Results: The 277 CF patients who underwent re-DLTx experienced significantly worse 5-year survival compared with the primary DLTx cohort (47.9% vs 58.8%, p = 0.00012). The following differences were observed comparing the CF re-DLTx group with their primary DLTx: a higher LAS score at the time of listing (50.66 vs 42.15, p < 0.001) and transplant (62.19 vs 48.20, p < 0.001), and a greater increase in LAS from listing to transplant (+12.22 vs +7.23, p = 0.002). While serum albumin and total bilirubin were similar, CF patients had higher creatinine (1.05 vs 0.74, p < 0.001), higher rates of dialysis (4.4% vs 0.6%, p < 0.001) and ECMO bridge to transplant (7.6% vs 4.0%, p < 0.001), and higher oxygen requirements (5.95 vs 3.93, p < 0.001) at the time of listing for re-DLTx.
Conclusion: Compared with their initial transplant, CF patients experience significant clinical decline in renal, cardiac, and pulmonary function at the time of lung retransplantation. This may indicate that an earlier evaluation and rehabilitation process is necessary to identify candidates for lung retransplantation prior to significant clinical decline.
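A minimal sketch of the survival-analysis framing above, assuming a hypothetical extract cf_dltx_cohort.csv with columns time_years, graft_failure and retransplant (none of these are UNOS field names): it fits Kaplan-Meier curves for primary versus repeat transplant and runs a log-rank comparison.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

cohort = pd.read_csv("cf_dltx_cohort.csv")  # hypothetical extract

primary = cohort[cohort["retransplant"] == 0]
redo = cohort[cohort["retransplant"] == 1]

# Kaplan-Meier estimates for each group, read off at 5 years
kmf = KaplanMeierFitter()
kmf.fit(primary["time_years"], event_observed=primary["graft_failure"], label="primary DLTx")
print(kmf.predict(5.0))
kmf.fit(redo["time_years"], event_observed=redo["graft_failure"], label="re-DLTx")
print(kmf.predict(5.0))

# Log-rank test comparing the two survival curves
result = logrank_test(primary["time_years"], redo["time_years"],
                      event_observed_A=primary["graft_failure"],
                      event_observed_B=redo["graft_failure"])
print(result.p_value)
```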

15.
Purpose: Induction immunosuppression has improved long-term outcomes after kidney transplant. This study explores the association of different induction immunosuppression medications (Basiliximab vs. Alemtuzumab vs. rabbit Antithymocyte Globulin) used at the time of kidney transplant with the development of de novo donor-specific HLA antibodies (DSA) in the first 12 months post-transplant.
Methods: A total of 390 consecutive kidney transplant recipients (KTR), transplanted between 2016 and 2018, were included in the analysis. Of these, 104 (26.6%) received Basiliximab, 186 (47.6%) received Alemtuzumab, and 100 (25.6%) received rabbit Antithymocyte Globulin (rATG) for induction. All recipients had a negative flow cytometry crossmatch before transplant. Serum samples at 4 and 12 months post-transplant were assessed for the presence of de novo HLA DSA. Kidney allograft function was compared among the three groups using creatinine clearance calculated from 24-hour urine collections.
Results: De novo HLA DSA were detected in a total of 81 (20.8%) patients within 12 months post-transplant: in 12/104 (11.5%), 43/186 (23.11%), and 26/100 (26%) of KTR that received Basiliximab, Alemtuzumab, and rATG, respectively (p = 0.006). KTR that received Basiliximab were significantly older, and their creatinine clearance at last follow-up was significantly lower, at 42 ml/min, compared with KTR that received Alemtuzumab or rATG (p = 0.006).
Conclusion: Induction immunosuppression with Basiliximab is associated with a significant reduction in the development of de novo DSA within the first 12 months post kidney transplant but with lower creatinine clearance at long-term follow-up.
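The allograft-function comparison above relies on creatinine clearance calculated from a 24-hour urine collection; a small worked sketch of that standard calculation follows. The example values are illustrative only and are not taken from the study.

```python
def creatinine_clearance(urine_cr_mg_dl: float,
                         urine_volume_ml: float,
                         serum_cr_mg_dl: float,
                         collection_minutes: float = 1440.0) -> float:
    """CrCl (mL/min) = (urine creatinine x urine volume) / (serum creatinine x collection time)."""
    return (urine_cr_mg_dl * urine_volume_ml) / (serum_cr_mg_dl * collection_minutes)

# Example: 24-h urine of 1500 mL, urine creatinine 80 mg/dL, serum creatinine 1.9 mg/dL
print(round(creatinine_clearance(80, 1500, 1.9), 1))  # ~43.9 mL/min, similar to the value reported above
```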

16.
Background: Lung transplantation is a lifesaving procedure, still marred by worse results than other solid organ transplants. The 1-year mortality is 10%, and within 5 years after the procedure half of patients develop chronic lung allograft dysfunction (CLAD), which is also the main limiting factor for long-term survival. Cardiac arrhythmias are common directly after lung transplantation, and one treatment for these is the drug amiodarone. Recent research suggests that amiodarone exposure leads to activation of fibroblasts, a cell type that synthesizes lung stroma and is associated with acute respiratory distress syndrome and CLAD. This study aims to retrospectively investigate the effect of posttransplant amiodarone treatment on survival and CLAD.
Material and methods: All patients transplanted at Sahlgrenska University Hospital between 2007 and 2018 were reviewed, and adult patients with follow-up within Sweden were included. For the 394 patients who met these inclusion criteria, retrospective data on postoperative complications and long-term outcomes were retrieved. A multivariable Cox proportional hazards model was applied to identify a set of independently significant predictors.
Results: Posttransplant use of amiodarone was associated with shorter survival (hazard ratio = 1.65; 95% confidence interval, 1.08-2.54; P = .02). Amiodarone exposure was not associated with CLAD (hazard ratio = 0.64; 95% confidence interval, 0.33-1.22; P = .17).
Conclusions: An increased risk of death, but not of CLAD, was observed in patients treated with amiodarone postoperatively after lung transplantation in the current cohort.
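A hedged sketch of a multivariable Cox proportional hazards model of the kind described above, with amiodarone exposure as one covariate. The file ltx_cohort.csv and the column names (survival_years, died, amiodarone, recipient_age, clad) are assumptions, and the covariate set is illustrative rather than the authors' final model.

```python
import pandas as pd
from lifelines import CoxPHFitter

ltx = pd.read_csv("ltx_cohort.csv")  # hypothetical follow-up data

# Multivariable Cox model for post-transplant survival
covariates = ["survival_years", "died", "amiodarone", "recipient_age", "clad"]
cph = CoxPHFitter()
cph.fit(ltx[covariates], duration_col="survival_years", event_col="died")

cph.print_summary()                      # hazard ratios with 95% CIs for each covariate
print(cph.hazard_ratios_["amiodarone"])  # HR for amiodarone exposure
```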

17.
Background: The impact of behavioral health disorders (BHDs) on pediatric injury is poorly understood. We investigated the relationship between BHDs and outcomes following pediatric trauma.
Methods: We analyzed injured children (ages 5–15) from 2014 to 2016 using the Pediatric Trauma Quality Improvement Program. The primary outcome was in-hospital mortality. Univariable and multivariable analyses compared children with and without a comorbid BHD.
Results: Of 69,305 injured children, 3,448 (5%) had a BHD. These 3,448 children had a median of 1 [IQR: 1, 1] BHD diagnosis: ADHD (n = 2491), major psychiatric disorder (n = 1037), drug use disorder (n = 250), and alcohol use disorder (n = 29). A higher proportion of injured children with BHDs suffered intentional and penetrating injury. Firearm injuries were more common among BHD patients (3% vs 1%, p<0.001). Children with BHDs were more likely to have an ISS >25 compared with children without (5% vs 3%, p<0.001). While median length of stay was longer for BHD patients (2 [1, 3] vs 2 [1, 4], p<0.001), mortality was similar (1% vs 1%, p = 0.76) and complications were less frequent (7% vs 8%, p = 0.002). BHD was associated with a lower risk of mortality (OR 0.45, 95% CI [0.30, 0.69]) after controlling for age, sex, race, trauma type, and injury intent and severity.
Conclusion: Children with BHDs experienced lower in-hospital mortality risk after traumatic injury despite more severe injury on presentation. Intentional and penetrating injuries are particularly concerning, and future work should assess prevention efforts in this vulnerable group.
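A rough sketch of a multivariable logistic regression adjusting for the covariates listed above (age, sex, race, trauma type, injury intent and severity) with BHD status as the exposure. The registry extract pediatric_trauma.csv and its column names are assumptions about the data layout, not TQIP field names.

```python
import pandas as pd
import numpy as np
import statsmodels.formula.api as smf

tqip = pd.read_csv("pediatric_trauma.csv")  # hypothetical registry extract

# Logistic regression for in-hospital mortality, adjusted for the listed covariates
logit = smf.logit(
    "died ~ bhd + age + C(sex) + C(race) + C(trauma_type) + C(intent) + iss",
    data=tqip,
).fit()

odds_ratios = np.exp(logit.params)       # adjusted odds ratios
or_ci = np.exp(logit.conf_int())         # 95% CIs on the OR scale
print(odds_ratios["bhd"], or_ci.loc["bhd"].tolist())
```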

18.
Total lymphoid irradiation (TLI) is an alternative treatment for chronic lung allograft dysfunction (CLAD). However, data regarding its efficacy and tolerance are scarce. This study included patients with CLAD treated with TLI at our center between 2011 and 2018. Clinical characteristics before and after TLI and related complications were analyzed. Forty patients with CLAD (twenty-nine with bronchiolitis obliterans syndrome [BOS], nine with restrictive allograft syndrome [RAS], and two mixed) were included. Significant attenuation of the decline slope of forced expiratory volume in 1 second (FEV1) was observed in all phenotypes, both BOS and RAS. The median FEV1 at 12, 6, and 3 months pre-TLI was 1980 (IQR 1720–2560), 1665 (IQR 1300–2340) and 1300 (IQR 1040–1740) ml, respectively (p < .001), while the median FEV1 at 3, 6, and 12 months post-TLI was 1110 (IQR 810–1440), 1130 (IQR 860–1470), and 1115 (IQR 865–1490) ml (p = .769). No dropouts due to radiation toxicity were observed. Mean survival for patients with a baseline Karnofsky Performance Status Scale (KPS) score >70 versus ≤70 was 1837 (IQR 259–2522) versus 298 (IQR 128–554) days, respectively (p < .0001). In conclusion, TLI may stop FEV1 decline in both BOS and RAS. Moreover, a good KPS score may be an important prognostic factor.

19.
Objective: Folic acid (FA) administration can reduce plasma total homocysteine (tHcy); however, it fails to decrease cardiovascular events and progression of peripheral artery disease (PAD). The Nε-homocysteinyl-lysine isopeptide (Nε-Hcy-Lys) is formed during catabolism of homocysteinylated proteins. We sought to investigate factors that determine the presence of Nε-Hcy-Lys in PAD patients with hyperhomocysteinemia receiving FA.
Patients and methods: We studied 131 consecutive PAD patients with tHcy > 15 μmol l−1 taking FA 0.4 mg d−1 for 12 months. Serum Nε-Hcy-Lys was determined by high-performance liquid chromatography (HPLC). We also measured interleukin-6 (IL-6), plasminogen activator inhibitor-1 (PAI-1), asymmetric dimethylarginine (ADMA) and 8-iso-prostaglandin F2α (8-iso-PGF2α).
Results: FA administration resulted in a 70.5% decrease in tHcy (p < 0.0001). However, serum Nε-Hcy-Lys was detectable in 28 (21.4%) patients on FA, who were more frequently current smokers and survivors of ischaemic stroke (p < 0.001). They had 46.0% higher tHcy, 51.7% higher PAI-1, 59.1% higher 8-iso-PGF2α and 26.4% higher ADMA (all p < 0.0001). The presence of Nε-Hcy-Lys was associated with lower ankle-brachial index (ABI) values (p < 0.001) and a higher prevalence of cardiovascular events (p < 0.001) following therapy.
Conclusion: The presence of Nε-Hcy-Lys in one-fifth of hyperhomocysteinemic individuals with PAD, despite FA treatment, is associated with progression of PAD and with increased ADMA formation, oxidative stress and hypofibrinolysis.

20.
Background: The cystic fibrosis transmembrane conductance regulator (CFTR) potentiator ivacaftor was first approved for people with CF and the G551D CFTR mutation. This study describes the long-term clinical effectiveness of ivacaftor in this population.
Methods: We conducted a multicenter, prospective, longitudinal, observational study of people with CF aged ≥6 years with at least one copy of the G551D CFTR mutation. Measurements of lung function, growth, quality of life, and sweat chloride were performed after ivacaftor initiation (baseline, 1 month, 3 months, 6 months, and annually thereafter until 5.5 years).
Results: Ninety-six participants were enrolled, with 81% completing all study measures through 5.5 years. This cohort experienced significant improvements in percent predicted forced expiratory volume in 1 second (ppFEV1) of 4.8 [2.6, 7.1] (p < 0.001) at 1.5 years, which diminished to 0.8 [-2.0, 3.6] (p = 0.57) at 5.5 years. Adults experienced larger improvements in ppFEV1 (7.4 [3.6, 11.3], p < 0.001 at 1.5 years and 4.3 [0.6, 8.1], p = 0.02 at 5.5 years) than children (2.8 [0.1, 5.6], p = 0.04 at 1.5 years and -2.0 [-5.9, 2.0], p = 0.32 at 5.5 years). The rate of lung function decline for the overall study cohort, from 1 month after ivacaftor initiation through 5.5 years, was estimated at -1.22 pp/year [-1.70, -0.73]. Significant improvements in growth, quality-of-life measures, sweat chloride, Pseudomonas aeruginosa detection, and rates of pulmonary exacerbations requiring antimicrobial therapy persisted through five years of therapy.
Conclusions: These findings demonstrate the long-term benefits and disease-modifying effects of ivacaftor in children and adults with CF and the G551D mutation.
