Similar Articles
20 similar articles found (search time: 944 ms)
1.
Background: Leukocyte infiltration into the graft has pivotal effects on kidney transplantation outcome. The present study sought to determine whether the expression of sequential chemokine receptors on CD4+ and CD8+ T cells in renal allograft recipients can predict clinical episodes. Methods: Blood samples from 52 consecutive renal transplant patients were evaluated at the time of transplantation and at three time points (2, 90 and 180 days) after transplantation to analyze the expression of CCR1 and CXCR3 on CD4+ and CD8+ T cells by flow cytometry. A total of 30 biopsies, including protocol biopsies (n = 24) and for-cause biopsies (n = 6), were assessed according to the Banff criteria. Results: The mean percentage of CD4+ and CD8+ T cells expressing CCR1 was significantly increased in patients with allograft dysfunction (n = 25) (p = 0.006 and p = 0.004, respectively). The mean fluorescence intensity of CXCR3 on CD4+ and CD8+ T cells was significantly higher in dysfunctional grafts than in well-functioning grafts (p < 0.001 and p = 0.007). Receiver operating characteristic (ROC) curve analysis yielded an AUC of 0.86 at the third month for CD4+CCR1+ and CD8+CCR1+ cells (p < 0.001). Multiple logistic regression showed that an increase in CD4+ cells expressing CXCR3 was associated with a lower risk of graft dysfunction (OR = 0.37), whereas an increase in CD8+ cells expressing CCR1 was associated with a higher risk (OR = 3.66). Conclusion: After renal transplantation, CD4+ and CD8+ T cells expressing CCR1 were increased in patients who developed graft dysfunction. These findings may prospectively predict allograft dysfunction and help elucidate the underlying pathogenic mechanisms.

2.
Purpose: Age negatively impacts the biologic features of mesenchymal stem cells (MSCs), including expansion kinetics and differentiation potential. Clinically, donor age spans a wide range; investigating its effect on immunoregulatory potential is therefore critical for translating stem cell therapies from bench to bedside. Methods: Adipose- and bone marrow-derived MSCs (ASCs and BMSCs) were isolated in parallel from Lewis and Brown Norway rats in young (less than 4 weeks old) and senior (older than 15 months old) groups. Cell morphology and the time required to reach 90% confluence were recorded. FACS sorting for CD90/CD29 double-positive, CD45/CD11 double-negative cells quantified the proportion of MSCs. After expansion, ASCs and BMSCs from the different age groups were co-cultured in mixed lymphocyte reaction (MLR; Lewis vs. Brown Norway) assays, and the suppression of CD3+CD4+ and CD3+CD8+ T cell populations by the different MSC sources was compared. Results: Cell growth was slower in old animals (17.3 ± 2 days) than in young animals (8.8 ± 3 days), and cell morphology was irregular and enlarged in the senior groups. The yield of MSCs by FACS sorting was significantly higher in the young groups (p < 0.02). With regard to immunoregulatory potential, senior ASCs failed to induce any CD3+CD4+ T cell suppression (p > 0.05), and suppression induced by young BMSCs was more prominent than that induced by senior BMSCs (p < 0.05). Conclusions: Donor age should be taken into consideration when using MSCs of either bone marrow or adipose origin in clinical applications.

3.
Objective: To test whether an abnormal venous-to-arterial CO2 difference (ΔPCO2) during the early postoperative phase after liver transplantation is related to multi-organ dysfunction and outcomes. Materials and methods: Prospective cohort study conducted in a mixed intensive care unit at a university hospital. We included 150 eligible patients who underwent liver transplantation between 2015 and 2018. Patients were classified into four predefined groups according to the evolution of ΔPCO2 during the first 6 h of resuscitation: 1) persistently normal ΔPCO2 (normal at T0 and T6); 2) decreasing ΔPCO2 (high at T0, normal at T6); 3) increasing ΔPCO2 (normal at T0, high at T6); and 4) persistently high ΔPCO2 (high at T0 and T6). Multi-organ dysfunction at day 3 was compared across the predefined groups, and Kaplan-Meier survival curves were compared with a log-rank test. Spearman's rho was used to test the correlation between cardiac output and ΔPCO2. Results: There were no significant differences between the study groups in SOFA scores at day 3 (P = 0.86), Δ-SOFA (P = 0.088), overall mortality (χ2 = 5.72; P = 0.126), or mortality at day 30 (χ2 = 2.23; P = 0.5252). A weak but significant inverse correlation between cardiac output and ΔPCO2 was observed (Spearman's rho = −0.17; P = 0.002) across the resuscitation time points. Conclusions: After liver transplantation, the central venous-to-arterial CO2 difference was not associated with survival or postoperative adverse outcomes in this critical care population.
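The four-group assignment described in the methods is a simple rule on the T0 and T6 measurements. The sketch below illustrates it; the 6 mmHg cut-off for a "high" ΔPCO2 is a common clinical convention, not a value stated in the abstract, so treat it as an illustrative assumption:

```python
def classify_dpco2(dpco2_t0: float, dpco2_t6: float, threshold: float = 6.0) -> str:
    """Assign a patient to one of the four predefined ΔPCO2 evolution groups.

    Values are in mmHg. The 6 mmHg threshold is an assumed illustrative
    cut-off; the abstract does not specify the value used in the study.
    """
    high_t0 = dpco2_t0 >= threshold
    high_t6 = dpco2_t6 >= threshold
    if not high_t0 and not high_t6:
        return "persistently normal"   # group 1
    if high_t0 and not high_t6:
        return "decreasing"            # group 2
    if not high_t0 and high_t6:
        return "increasing"            # group 3
    return "persistently high"         # group 4

# Example: normal at T0 (4 mmHg), high at T6 (8 mmHg) → "increasing"
group = classify_dpco2(4.0, 8.0)
```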

4.
Title: Prevalence and Risk Factors for Hypertrophic Scarring of Split-Thickness Autograft Donor Sites in a Pediatric Burn Population. Objective: The split-thickness autograft remains a fundamental treatment for burn injuries; however, donor sites may remain hypersensitive, hyperemic, and less pliable, and may develop hypertrophic scarring (HTS). This study assessed long-term donor-site scarring after pediatric burns. Methods: A retrospective review of pediatric burn patients treated at a single institution (2010-2016) was performed. Primary outcomes were the prevalence of donor-site hypertrophic scarring, the scarring time course, and associated risk factors. Results: 237 pediatric burn patients were identified. Mean age at burn was 7 years; mean %TBSA was 26%, of which 17% was full thickness. Mean follow-up was 2.4 years. Hypertrophic scarring was observed in 152 (64%) patients, with 81 (34%) having persistent hypertrophic scarring through long-term follow-up. Patient-specific risk factors for hypertrophic scarring were Hispanic ethnicity (P = 0.03), increased %TBSA (P = 0.03), %full-thickness burn (P = 0.02), and total autograft amount (P = 0.03). Donor-site risk factors were longer time to epithelialization (P < 0.0001), increased harvest depth (P < 0.0001), autografts harvested in the acute burn setting (P = 0.008), and thigh donor-site location (vs. all other sites; P < 0.0001). Scalp, arm, foot, and lower-leg donor sites (vs. all other sites) were less likely to develop HTS (P < 0.0001, 0.02, 0.005, and 0.002, respectively), as was a history of previous donor-site harvest (P = 0.04). Conclusions: Hypertrophic scarring is a prominent burden in donor-site wounds of pediatric burn patients. Knowledge of the pertinent risk factors can help guide management and expectations.

5.
Transplant Immunology, 2010, 22(4): 215-220
Background: B cell depletion has been employed to treat antibody-mediated rejection in organ transplantation, although its effects on cellular immune responses have not been extensively investigated. Methods: A model of B cell depletion used SCID/beige mice reconstituted with BALB/c splenocytes either depleted of B cells (BD) or not (BN). BD and BN mice received C57BL/6 skin grafts and were sacrificed after 6 weeks (BD-S6 and BN-S6). Results: Recall proliferative responses of BD-S6 splenocytes to C57BL/6 were significantly reduced compared with BN-S6, and the proportions of central memory T cells (CD4+CD44+CD62L+ or CD8+CD44+CD62L+) were significantly decreased in BD-S6 spleens. Recall IFN-γ production by BD-S6 splenocytes was also significantly reduced (p = 0.0028). Survival of C57BL/6 heart grafts was significantly longer in SCID/beige mice reconstituted with BD-S6 splenocytes (8.5 ± 1.1 days) than in those reconstituted with BN-S6 splenocytes (6.0 ± 1.1 days; p = 0.0006). Under cyclosporine therapy, C57BL/6 heart survival was likewise significantly longer with BD-S6 splenocytes (17.5 ± 6.4 days) than with BN-S6 splenocytes (6.2 ± 1.5 days; p < 0.0001). Conclusion: B cell depletion during allogeneic sensitization decreased memory T cells and recall IFN-γ production, and reduced second-set allograft rejection.

6.
Background: In lung transplantation (LT), the acceptable length of ischemia time is controversial, as it was established arbitrarily. We aimed to explore the impact of extended cold ischemia time (CIT) on ischemia-reperfusion injury in an experimental model. Methods: Experimental, randomized pilot trial with parallel groups and blinded final analysis using a swine LT model. Donor animals (n = 8) underwent organ procurement, and the lungs were subjected to 6 h (n = 4) or 12 h (n = 4) of aerobic hypothermic preservation. The left lung was transplanted and reperfused for 4 h. Lung biopsies were obtained at (i) the beginning of CIT, (ii) the end of CIT, (iii) 30 min after reperfusion, and (iv) 4 h after reperfusion. Grafts were assessed histologically with a microscopic lung injury score and the wet-to-dry ratio. The inflammatory response was measured by determination of inflammatory cytokines, and caspase-3 activity was determined as an apoptosis marker. Results: We observed no differences in lung injury score or wet-to-dry ratio at any time point between lungs subjected to 6 h or 12 h of CIT. IL-1β and IL-6 showed an upward trend during reperfusion in both groups; TNF-α peaked within 30 min of reperfusion; IFN-γ was hardly detected. Caspase-3 immunoexpression was graded semiquantitatively by the percentage of stained cells: 20% of cells were apoptotic 30 min after reperfusion. Conclusions: 6 h and 12 h of CIT were equivalent in terms of microscopic lung injury, inflammatory profile, and apoptosis in this swine LT model. The extent of lung injury, as measured by the microscopic lung injury score, proinflammatory cytokines, and caspase-3, was mild.

7.
Introduction: Ancillary hospital personnel represent an important body of opinion: because they work in hospitals, their views carry extra credibility with the general public. However, most have no healthcare training, so their attitude could be based on a lack of knowledge or unfounded fears. Objective: To analyze the attitude toward living kidney donation (LKD) among ancillary personnel in Spanish and Latin American hospitals, and the variables that might influence it. Patients and methods: Within the «International Collaborative Donor Project», a random sample of ancillary personnel was taken from hospitals in Spain, Mexico and Cuba. Attitude toward LKD was evaluated using a validated, anonymous, self-administered questionnaire. Results: 951 staff members were surveyed (Spain: 277; Mexico: 632; Cuba: 42). 89% (n = 850) were in favor of related kidney donation, falling to 31% (n = 289) for unrelated donation; of the remainder, 8% (n = 78) were against and 3% (n = 23) were unsure. By country, Cubans (98%) and Mexicans (91%) were more in favor than Spaniards (84%) (P = .001). The following variables were related to a favorable attitude toward LKD: female sex (P = .017), a university degree (P = .010), working in health services (P = .035), job stability (P = .016), personal experience of donation and transplantation (P = .001), a positive attitude toward cadaveric donation (P < .001), the belief that one might need a transplant in the future (P < .001), a positive attitude toward living liver donation (P < .001), willingness to receive a donated living liver if needed (P < .001), having discussed organ donation and transplantation within the family (P < .001), a partner's positive attitude toward the subject (P < .001), participation in voluntary pro-social activities (P = .002), and not being concerned about possible mutilation after donation (P < .001). Conclusions: The attitude toward living related kidney donation is favorable among ancillary personnel in Spanish and Latin American hospitals. Because living donation is a better source of organs than cadaveric donation, this favorable predisposition can be used to promote living donation in Spanish-speaking countries.

8.
Background: This study retrospectively investigated nutritional status, dietetic intervention and intake in cystic fibrosis (CF) patients before and after lung transplantation (LTX). Methods: Body mass index (BMI), fat-free mass index (FFMI) and nutritional intake were retrieved for 75 outpatients aged 15-53 years. Patients were seen every 3-4 months while on the waiting list (range 0-81 months) and for up to 116 months after LTX. Survival was measured in months. Results: Median baseline BMI was 19.2 kg/m2 (range 15.3-28.4 kg/m2), with 29 patients (39%) at or below 18.5 kg/m2. FFMI (measured in 65 patients) had a median of 15.2 kg/m2 (range 11.1-22.4 kg/m2), with 39 patients (60%) ≤ 16.7 kg/m2 (men) or ≤ 14.6 kg/m2 (women). Median energy intake was 2800 kcal, 239 kcal above the estimated requirement; however, 8 patients consumed ≥ 500 kcal less than recommended. Protein intake was 104 g (range 60-187 g), or 1.9 g/kg per day. Despite dietetic intervention with oral nutritional supplements (ONS) (36 patients), tube feeding (12 patients), or both (13 patients), BMI and FFMI hardly improved pre-LTX. LTX was performed in 51 patients (68%); 10 patients died during follow-up, and median survival was 41 months. A BMI ≤ 18.5 kg/m2 was more prevalent in patients who died before LTX (6/9) or after LTX (4/10) than in patients still alive on the waiting list (5/15) or who survived LTX (14/41); results for FFMI were comparable. From 6-12 months post-LTX, BMI and FFMI markedly improved, especially in underweight patients. Conclusion: A BMI ≤ 18.5 kg/m2 and an FFMI ≤ 16.7 kg/m2 (men) or ≤ 14.6 kg/m2 (women) appear to impair survival in LTX candidates with CF. Patients maintained a low body weight before LTX; after LTX, weight gain was achieved.

9.
Background: En bloc kidney transplantation from pediatric donors into adult recipients increases the donor pool; however, the procedure is not widely performed in many transplant centers. We evaluated the long-term outcomes of en bloc kidney transplantation from pediatric donors into adult recipients at a single center. Material and methods: Retrospective analysis of 42 patients who received pediatric cadaveric en bloc kidney transplants at our center since 1999. Median follow-up was 73 months (range 5-233), during which renal function tests were performed and complications recorded. Results: We performed 42 en bloc kidney transplantations from pediatric donors into adult recipients. Recipient age was 44.1 ± 11.8 years. Pediatric donors were 22.4 ± 14.7 months old and weighed 11.3 ± 3.6 kg. Cold ischemia time was 15.7 ± 4.5 hours. Over a median follow-up of 73 months, 35 patients (83.3%) had graft survival with excellent function (first-year serum creatinine 0.99 ± 0.25 mg/dl). There were 7 graft losses (16.7%), all in the immediate postoperative period (4 vascular thromboses, 1 anastomotic dehiscence, and 2 cases of cortical necrosis). Conclusions: Pediatric en bloc renal transplantation into adults is a safe technique with excellent medium- to long-term function. The vast majority of significant complications leading to graft loss occurred in the immediate postoperative period. Careful selection of donors and recipients, together with an adequate surgical technique, is essential to minimize adverse events.

10.
Introduction: There is an ever-increasing need for organ donation globally. Paediatric kidney transplantation into adult recipients is a well-recognised technique to expand the donor pool; it can be done either as an en bloc kidney transplant (EBKT) or as single kidney transplantation (SKT). Presentation of case: An EBKT from an 18-month-old (15 kg) male donor was transplanted into a 35-year-old, 85 kg male with end-stage renal failure (ESRF) secondary to focal segmental glomerulosclerosis (FSGS), on haemodialysis. Post-operative recovery was uneventful; the immunosuppressants used were tacrolimus, basiliximab and prednisolone. Post-operative Doppler ultrasound showed normal renal resistive indices in both kidneys. Serum creatinine decreased from 1200 to 170 μmol/L, with eGFR improving from 4 to 38 mL/min/1.73 m2 at four weeks post-transplant. Discussion: Given the low incidence of paediatric donors, EBKTs are relatively uncommon, and published series consequently tend to be centre-specific with small numbers. Reported graft survival rates indicate that paediatric kidney donors should not be considered marginal. The difficulty lies in determining when a paediatric EBKT is more appropriate than splitting the organs and performing two SKTs; unfortunately, there are no widely accepted guidelines to direct clinicians. Conclusion: This case report highlights the first EBKT performed at our institution. The current literature demonstrates that paediatric donors are an excellent resource that should be procured whenever available.

11.
Objective: Lung transplantation (LT) for pulmonary fibrosis carries higher mortality than other transplant indications. We aimed to assess whether the amount of anterior mediastinal fat (AMF) was associated with early and long-term outcomes in fibrotic patients undergoing LT. Methods: Retrospective analysis of 92 consecutive single lung transplants (SLT) for pulmonary fibrosis over a 10-year period. AMF dimensions were measured on the preoperative CT scan: anteroposterior axis (AP), transverse axis (T), and height (H). AMF volume (V) was calculated with the ellipsoid formula V = AP × T × H × 3.14/6. According to AMF volume, patients were divided into two groups, low-AMF (V < 20 cm3) and high-AMF (V > 20 cm3), and early and long-term outcomes were compared by univariable and multivariable analyses. Results: There were 92 SLTs: 73 M/19 F, aged 53 ± 11 [14-68] years. 30-day mortality (low-AMF vs. high-AMF) was 5 (5.4%) vs. 15 (16.3%), p = 0.014. Patients developing primary graft dysfunction within 72 h post-transplant, and those dying within 30 days, had higher AMF volumes: 21.1 ± 19.8 vs. 43.3 ± 24.7 cm3 (p = 0.03) and 24.4 ± 24.2 vs. 56.9 ± 63.6 cm3 (p < 0.01), respectively. Overall survival (low-AMF vs. high-AMF) at 1, 3 and 5 years was 85%, 81% and 78% vs. 55%, 40% and 33% (p < 0.001). Factors predicting 30-day mortality were BMI (HR = 0.77, p = 0.011), AMF volume (HR = 1.04, p = 0.018), CPB (HR = 1.42, p = 0.002) and ischaemic time (HR = 1.01, p = 0.009). Factors predicting survival were AMF volume (HR = 1.02, p < 0.001), CPB (HR = 3.17, p = 0.003) and ischaemic time (HR = 1.01, p = 0.001). Conclusion: Preoperative radiological assessment of mediastinal fat dimensions and volume may help identify fibrotic patients at higher risk of mortality after single lung transplantation.
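The volume formula above is the standard ellipsoid approximation (π/6 times the product of the three axes, with π rounded to 3.14 in the abstract). A minimal sketch of the calculation and the low-/high-AMF grouping, using hypothetical axis measurements for illustration:

```python
import math

def amf_volume(ap_cm: float, t_cm: float, h_cm: float) -> float:
    """Ellipsoid approximation of anterior mediastinal fat volume in cm³:
    V = AP × T × H × π/6, matching the abstract's AP × T × H × 3.14/6."""
    return ap_cm * t_cm * h_cm * math.pi / 6

# Hypothetical patient with AMF axes of 3 × 4 × 5 cm:
v = amf_volume(3, 4, 5)                      # ≈ 31.4 cm³
group = "high-AMF" if v > 20 else "low-AMF"  # 20 cm³ cut-off from the abstract
```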

12.
Objective: To evaluate the effect of low-dose butorphanol on hyperalgesia induced by high-dose remifentanil in patients undergoing laparoscopic cholecystectomy. Design: Randomized double-blind clinical trial. Setting: Intraoperative. Patients: Seventy-five patients scheduled for laparoscopic cholecystectomy were enrolled. Interventions: Patients were randomly allocated into 3 groups: the low-dose remifentanil (LR) and high-dose remifentanil (HR) groups received 0.1 or 0.3 μg·kg−1·min−1 of remifentanil, respectively, and the butorphanol plus remifentanil (BR) group received remifentanil (0.3 μg·kg−1·min−1) and butorphanol (0.2 μg/kg). Measurements: Visual analog scale (VAS) scores and cumulative fentanyl consumption were recorded. Main results: VAS scores were significantly higher in the HR group than in the LR and BR groups (P < .001), as was the dose of intravenous fentanyl (P < .001). In addition, the HR group showed significantly higher cumulative fentanyl consumption during 5 to 8 hours after the operation (P < .001). Conclusions: A high dose of remifentanil induces postoperative hyperalgesia, which can be prevented by continuous intravenous administration of a low dose of butorphanol.

13.
Purpose: To compare the capabilities of the apparent diffusion coefficient (ADC) and the normalized ADC, using the pancreatic parenchyma as reference organ, in the characterization of focal pancreatic lesions. Patients and methods: Thirty-six patients with focal pancreatic lesions (malignant tumors, n = 18; benign tumors, n = 10; focal pancreatitis, n = 8) underwent diffusion-weighted MR imaging (DWI) at 1.5 T using 3 b values (b = 0, 400, 800 s/mm2). Lesion ADC and normalized lesion ADC (defined as the ratio of lesion ADC to that of apparently normal adjacent pancreas) were compared between lesion types using nonparametric tests. Results: Significant differences in ADC values were found between malignant tumors (1.150 × 10−3 mm2/s) and benign tumors (2.493 × 10−3 mm2/s) (P = 0.004) and between benign tumors and mass-forming pancreatitis (1.160 × 10−3 mm2/s) (P = 0.0005), but not between malignant tumors and mass-forming pancreatitis (P = 0.1092). Using the normalized ADC, significant differences were found between malignant tumors (0.933), benign tumors (1.807) and mass-forming pancreatitis (0.839) (P < 0.0001). Conclusion: Our preliminary results suggest that normalizing the ADC of focal pancreatic lesions to that of apparently normal adjacent pancreatic parenchyma characterizes focal pancreatic lesions better than the conventional ADC does.
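The normalization described in the methods is a simple ratio: the lesion's ADC divided by the ADC of apparently normal adjacent pancreas, making the result dimensionless. A minimal sketch with hypothetical measurements (the reference pancreas ADC of 1.40 × 10⁻³ mm²/s below is an illustrative assumption, not a value from the study):

```python
def normalized_adc(lesion_adc: float, pancreas_adc: float) -> float:
    """Normalized ADC = lesion ADC / ADC of apparently normal adjacent pancreas.
    Both inputs are in the same units (e.g. ×10⁻³ mm²/s), so the ratio
    itself is dimensionless."""
    if pancreas_adc <= 0:
        raise ValueError("reference pancreas ADC must be positive")
    return lesion_adc / pancreas_adc

# Hypothetical lesion ADC of 1.15 with reference pancreas ADC of 1.40
# (both ×10⁻³ mm²/s):
ratio = normalized_adc(1.15, 1.40)  # ≈ 0.82
```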

14.
Transplant Immunology, 2014, 30(1-4): 109-113
Background: The incidence, characteristics, and risk factors for immune reconstitution syndrome (IRS) associated with invasive aspergillosis (IA) in lung transplant recipients are not known. Methods: The study comprised 68 lung transplant recipients with proven or probable IA followed for 12 months; IRS was defined according to previously proposed criteria. Results: In all, 7.3% (5/68) of the patients developed IRS, a median of 56 days after initiation of antifungal therapy. IRS was associated with heart-lung transplantation (p = 0.006), use of anti-T-cell agents (p = 0.003), discontinuation of the calcineurin inhibitor (p = 0.002), and disseminated IA (p = 0.069). In a risk assessment model, IRS developed in 0% (0/55) of patients with none of these factors, 28.6% (2/7) with one, 33.3% (1/3) with two, and 1/1 with three factors (χ2 for trend, p = 0.0001). Three of the 5 patients with IRS died, and 2 of the 3 deaths in this group were due to chronic rejection. Conclusions: Overall, about 7% of lung transplant recipients with IA appear to develop an IRS-like entity. Clinically assessable factors can identify patients at risk of post-transplant IA-associated IRS. Deaths due to chronic rejection were significantly more frequent in patients with IRS than in those without.

15.
Background: Antibody-mediated rejection (AMR) has recently been associated with a higher incidence of chronic lung allograft dysfunction (CLAD) and mortality after lung transplantation (LTx). We investigated markers related to AMR and matrix remodeling in CLAD, with special attention to its two phenotypes, bronchiolitis obliterans syndrome (BOS) and restrictive CLAD (rCLAD). Methods: Immunoglobulins (IgA, IgE, IgG1-IgG4, total IgG and IgM) and complement proteins (C4d and C1q) were quantified in lung lavage samples at the moment of BOS (n = 15) or rCLAD (n = 16) diagnosis, and compared with stable transplant patients serving as controls (n = 14). Airway remodeling and matrix metalloproteinases (MMPs) were investigated via zymography and gelatin degradation assays, and the presence of donor-specific antibodies (DSA) was additionally assessed in blood. Results: Total IgG, IgG1-IgG4 and IgM were increased in rCLAD versus controls (p < 0.001) and versus BOS patients (p < 0.01). IgA and IgE were increased in rCLAD compared with controls (p < 0.05 and p < 0.01, respectively) but not compared with BOS. Total IgG and IgE were increased in BOS versus controls (p < 0.01 and p < 0.05, respectively). Complement proteins were exclusively present in rCLAD and correlated positively with the immunoglobulins. In blood, DSA were more frequent in rCLAD (p = 0.041). MMP-9 levels were increased in rCLAD and BOS versus controls (p < 0.001), while MMP-9-induced gelatin degradation was increased only in BOS (p < 0.01). Conclusion: We demonstrated increased levels of immunoglobulins and complement proteins, dominantly present in rCLAD. This suggests that antibodies and AMR may play a more important role in rCLAD than in BOS; anti-B-cell therapy could therefore offer beneficial effects in patients diagnosed with rCLAD, which needs further research.

16.
Background: Alloimmunization remains a critical factor affecting the success of kidney transplantation. Patients awaiting solid organ transplantation may develop anti-HLA antibodies after pregnancies, transfusions and previous transplantations. Aim: We evaluated the effect of different sensitizing events on anti-HLA antibody production, and the potential role of the patient's own HLA alleles in antibody development, both in the overall cohort and in the pregnancy-sensitized group. Material and methods: We retrospectively stratified 411 women on the kidney transplant waiting list by route of sensitization. Anti-HLA antibodies were detected by solid-phase assay, and HLA typing was performed by serological and molecular methods. Results: In the study population, 54% of women had anti-HLA antibodies: 51.6% of women sensitized by pregnancy only, 44% by transfusion only, and 100% by previous transplantation only. Pregnancy only was significantly associated with the development of all anti-HLA antibody specificities: anti-A, -B, -C, -DR and -DP as well as anti-DQB and -DQA antibodies. We also investigated the influence of the patients' own HLA alleles on antibody development in the overall population: patients carrying HLA-A*32 (p = 0.024, OR = 0.42), B*14 (p = 0.035, OR = 0.44), B*44 (p = 0.026, OR = 0.51) or DRB1*01 (p = 0.029, OR = 0.55) produced anti-HLA antibodies less frequently than subjects with other alleles. In the pregnancy-only group, B*14 (p = 0.010, OR = 0.12) and B*51 (p = 0.005, OR = 0.24) were associated with a low risk of anti-HLA antibody development, while A*11 (p = 0.033, OR = 3.56) and DRB1*04 (p = 0.022, OR = 3.03) appeared to confer a higher risk. Conclusions: Pregnancy remains a strong sensitizing event in women awaiting kidney transplantation, and anti-HLA antibody development after pregnancy appears to be associated with the expression of particular HLA alleles.

17.
Foot and Ankle Surgery, 2019, 25(3): 264-271
Background: The aim of this study was to assess the 5-year follow-up after matrix-associated stem cell transplantation (MAST) for chondral lesions of the ankle as part of a complex surgical approach. Methods: In a prospective, consecutive, non-controlled clinical follow-up study, all patients with chondral lesions of the ankle treated with MAST between April 1, 2009 and May 31, 2012 were included. Size and location of the chondral lesions, method-associated problems, and the Visual Analogue Scale Foot and Ankle (VAS FA) score before treatment and at follow-up were analysed. Stem cell-rich blood was harvested from the ipsilateral pelvic bone marrow and centrifuged (10 min, 1500 rpm); the supernatant was used to impregnate a collagen I/III matrix (Chondro-Gide), which was fixed into the chondral lesion with fibrin glue. Results: One hundred and twenty patients with 124 chondral lesions were included. Mean age at surgery was 35 years (range 12-65 years); 74 patients (62%) were male. Mean VAS FA before surgery was 45.2 (range 16.4-73.5). Lesions were located at the medial talar shoulder (n = 55), lateral talar shoulder (n = 58; medial and lateral, n = 4) and tibia (n = 11); mean lesion size was 1.7 cm2 (range 0.8-6 cm2). One hundred patients (83%) completed the 5-year follow-up, by which time the mean VAS FA had improved to 84.4 (range 54.1-100; t-test, p < 0.01). Conclusions: MAST as part of a complex surgical approach led to improved, validated outcome scores at mid-term follow-up, and no method-related complications were registered. Even though a control group is missing, we conclude that MAST as part of a complex surgical approach is an effective treatment for chondral lesions of the ankle for at least five years.

18.
Introduction: There is growing evidence that the lectin pathway is significantly associated with acute rejection, but few studies have associated gene polymorphisms of both MBL2 and FCN2 with acute rejection after kidney transplantation. The aim of the present study was to investigate the role of the lectin gene profile and of clinical risk factors, such as panel-reactive antibody (PRA) level, in acute rejection in kidney transplant recipients. Methods: We prospectively analyzed 157 kidney transplant recipients with and without acute rejection. Six well-known functional single-nucleotide polymorphisms in the MBL2 gene and 5 in the FCN2 gene were determined by sequencing, and the genotypic variants were analyzed for association with the incidence of acute rejection within the first year after transplantation. Results: After adjusting for variables with P < 0.2, the incidence of acute rejection differed only according to panel-reactive antibodies (odds ratio (OR) = 6.468, 95% confidence interval (CI) = 2.017-20.740, P = 0.002) and the HH genotype of the MBL2 promoter at position −550 (OR = 2.448, 95% CI = 1.026-5.839, P = 0.044). Conclusion: Panel-reactive antibodies and the HH genotype of the MBL2 promoter −550 polymorphism have a significant impact on the risk of acute rejection after kidney transplantation.

19.
Transplant Immunology, 2009, 20(3-4): 173-177
Background: Electroporation has been shown to increase the efficacy of intramuscular injection of plasmid DNA, resulting in higher levels of foreign gene expression. Using this technique, we examined the effect of viral IL-10 gene transfer on the prevention of tracheal allograft stenosis in an animal model. Methods: On the day of tracheal transplantation, recipient Lewis rats were injected intramuscularly with either plasmid pCAGGS-LacZ or plasmid pCAGGS-viral IL-10, followed immediately by electroporation. Tracheas from Brown Norway donors were transplanted into the backs of the Lewis recipients, and graft histology was assessed 2 and 4 weeks after transplantation. Results: The serum level of IL-10 peaked at 2000 pg/ml one day after injection, then slowly decreased but remained above 1000 pg/ml until 8 days after injection. At day 28, the airway lumina of the tracheal allografts in the control pCAGGS-LacZ-treated rats were almost completely obliterated by fibroproliferative tissue. In rats injected once with pCAGGS-viral IL-10, luminal obliteration was significantly decreased compared with the controls (mean luminal opening 46.8% vs. 0%, p < 0.05), and the loss of airway epithelial cells was also decreased (mean epithelial coverage 42% vs. 5%, p < 0.05). Multiple injections of pCAGGS-viral IL-10 did not further improve the histological changes. Conclusion: IL-10 gene transfer by intramuscular injection with electroporation attenuated tracheal allograft stenosis, with only mild epithelial injury.

20.
Background: HIV infection is associated with high rates of acute rejection after kidney transplantation, and the underlying mechanisms for this predisposition are incompletely understood. Pathological immune activation is a hallmark of chronic HIV infection that persists despite effective antiretroviral therapy. We hypothesized that baseline levels of T cell activation in HIV+ candidates would correlate with their risk of acute rejection after kidney transplantation. Methods: Single-center retrospective cohort analysis of HIV+ adult kidney transplants performed between October 2006 and September 2013. The frequency of CD3+HLA-DR+ cells measured by flow cytometry served as a surrogate marker of immune activation; patients were categorized into tertiles of activation, and the rates of biopsy-proven acute rejection were compared across groups. Results: (1) Compared with matched HIV controls, the baseline number of CD3+HLA-DR+ cells was higher in HIV+ kidney transplant candidates. (2) These abnormally high levels of activation did not decrease with transplant-associated immunosuppression. (3) Patients in the lower and middle CD3+HLA-DR+ tertiles had a higher probability of rejection during the first 3 years post-transplant than those in the highest activation tertile (36.9% vs. 0%; log-rank P = 0.04). Conclusions: Pathological immune activation in HIV+ transplant candidates does not explain their increased susceptibility to allograft rejection; paradoxically, those with the highest levels of immune activation appear to be less prone to rejection.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号