Similar Articles (20 results)
1.
Development of arterial hypertension is to some extent related to decreased activity of the kallikrein-kinin system. This poorly understood hormonal system consists of blood proteins that play a role in inflammation, coagulation, blood pressure control, and pain conduction. The system comprises kallikreins (plasma and tissue), kallistatin, kininogens, kinins (bradykinin and kallidin [lysyl-bradykinin]), kininases (I and II), and membrane bradykinin receptors. The aim of the study was to assess kallistatin in relation to blood pressure values in heart transplant recipients.

Patients and Methods

Kallistatin level was estimated in 131 heart transplant recipients on a standard 3-drug immunosuppressive regimen (calcineurin inhibitor, mycophenolate mofetil/mycophenolic acid, and steroids) in relation to inflammation markers and blood pressure values. Additionally, 22 healthy volunteers served as controls. In this cross-sectional study, kallistatin and catecholamine concentrations were assessed using commercially available assays.

Results

Kallistatin concentration did not differ significantly between heart transplant recipients and controls; serum noradrenaline concentration was lower in the study group. In the orthotopic heart transplant group, kallistatin correlated with renal function (estimated glomerular filtration rate calculated by the Modification of Diet in Renal Disease formula; r = −0.28, P < .01), hemoglobin (r = −0.19, P < .05), cholesterol level (r = −0.23, P < .01), low-density lipoprotein (r = 0.25, P < .01), ferritin (r = 0.21, P < .05), and noradrenaline (r = −0.28, P < .01). No correlation with blood pressure values was revealed. In multivariate analysis, serum cholesterol level and age predicted 32% of the variability in kallistatin concentration.
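The eGFR used above comes from the Modification of Diet in Renal Disease formula named in the text; the abstract does not say which MDRD variant was applied, so as a hedged illustration the widely used re-expressed 4-variable version (serum creatinine in mg/dL) can be sketched as:

```python
def egfr_mdrd(scr_mg_dl: float, age_years: float, female: bool, black: bool = False) -> float:
    """4-variable MDRD estimate of GFR in mL/min/1.73 m^2.

    Coefficients follow the commonly cited re-expressed MDRD study equation;
    this is an assumption, as the paper does not state its exact variant.
    """
    egfr = 175.0 * (scr_mg_dl ** -1.154) * (age_years ** -0.203)
    if female:
        egfr *= 0.742  # sex adjustment factor
    if black:
        egfr *= 1.212  # race adjustment factor in the original equation
    return egfr
```

For a 50-year-old man with serum creatinine 1.0 mg/dL this yields roughly 79 mL/min/1.73 m².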

Conclusion

Kallistatin does not appear to be a pathogenetic factor in arterial hypertension among heart transplant recipients, but it may be involved in the development of the hyperlipidemia often present in this group of patients.

2.

Background

After successful lung transplantation, patients are monitored for chronic lung allograft dysfunction. Pulmonary function tests and 6-minute-walk tests are commonly used for functional graft monitoring. Because these methods require substantial effort, however, many patients are unable to complete testing fully. The impulse oscillometry system is a noninvasive method that requires minimal patient cooperation and is suitable for patients incapable of strenuous activity. We compared the impulse oscillometry system with pulmonary function tests and 6-minute-walk tests to determine whether it could serve as a substitute measure.

Methods

This prospective, observational study evaluated 25 consecutive patients (19 men, median age 54.5 years) admitted to a single institution from January to October 2016 (double-lung = 13, single-lung = 13). Patients were assessed using pulmonary function tests, impulse oscillometry system, and 6-minute-walk tests.

Results

Eighty-eight percent of patients reached a high resonance frequency (Fres), and in 84% of patients the area of reactance (Ax) increased above the norm (N < 0.33 kPa/L), indicating peripheral airway obstruction. High resistance of the small airways, measured as the R5 − R20 difference, accompanied higher Ax values. Increased resistance at 5 Hz (R5 > 150% of predicted value) in 31% of patients also indicated small airway obstruction. Airway obstruction in patients with elevated Ax and R5 was confirmed by decreased FEV1 (<75% of predicted value) and FEV1/FVC ratio in 38% of patients.

Conclusions

The results confirm that the impulse oscillometry system could serve as a substitute for pulmonary function tests in detecting chronic lung allograft dysfunction. The 6-minute-walk test showed neither strong correlations with impulse oscillometry or pulmonary function test results nor any basis for differentiation of results.

3.

Introduction

The aim of the study was to assess the impact of bacterial infection during the hospital stay on long-term outcomes.

Materials and Methods

This was a retrospective single-center study of 97 recipients of lung transplants performed between December 2004 and June 2016. Information about age, sex, underlying lung disease, and date and type of procedure was gathered from patients' charts. Immunosuppressive treatment was analyzed individually for each patient in the cohort. Microbiological evaluation included the presence of infection, bacterial species in recipients and donors, and the type of biological material.

Results

During a mean hospitalization time of 57 days (range, 4–398 days), 67 patients (69%) were diagnosed with bacterial infection. There were 120 episodes of infection caused by 32 bacterial species, the most common being Pseudomonas aeruginosa (27%), Acinetobacter baumannii (21%), Stenotrophomonas maltophilia (11%), and Klebsiella pneumoniae (10%). Thirty-nine patients (43%) developed bronchiolitis obliterans syndrome. Patients with A baumannii infection had a lower probability of survival than the rest of the population (P < .05), whereas patients treated with mammalian target of rapamycin inhibitors had a higher probability of survival.

Conclusions

Infection with A baumannii adversely affects lung transplant recipients' survival. Incorporating sirolimus could be beneficial for survival in this population.

4.

Background

Model for End-Stage Liver Disease (MELD) score predicts multisystem dysfunction and death in patients with heart failure (HF). Left ventricular assist devices (LVADs) have been used for the treatment of end-stage HF.

Aim of the study

We evaluated the prognostic value of the MELD, MELD-XI, and MELD-Na scores in patients supported with the POLVAD MEV LVAD.

Materials and methods

We retrospectively analyzed data from 25 consecutive patients (22 men and 3 women) supported with the pulsatile-flow POLVAD MEV LVAD, divided into 2 groups: Group S (survivors; 20 patients, 18 men and 2 women) and Group NS (nonsurvivors; 5 patients, 4 men and 1 woman). Patients were classified as INTERMACS class 1 (7 patients) or class 2 (18 patients). Clinical data and laboratory parameters for calculating the MELD, MELD-XI, and MELD-Na scores were obtained on postoperative days 1, 2, and 3. The study endpoint was mortality or survival at 30 days. MELD scores and complications were compared between Groups S and NS.
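The three scores are computed from routine laboratory values; the paper does not give its exact calculation conventions, but a sketch using commonly cited formulations (classic MELD, the INR-free MELD-XI of Heuman et al., and a UNOS-style MELD-Na sodium adjustment) looks like:

```python
import math

def meld(bili: float, inr: float, creat: float) -> float:
    """Classic MELD; labs floored at 1.0 and creatinine capped at 4.0 mg/dL (common convention)."""
    bili = max(bili, 1.0)
    inr = max(inr, 1.0)
    creat = min(max(creat, 1.0), 4.0)
    return 3.78 * math.log(bili) + 11.2 * math.log(inr) + 9.57 * math.log(creat) + 6.43

def meld_xi(bili: float, creat: float) -> float:
    """MELD excluding INR, often used for patients on anticoagulation."""
    bili = max(bili, 1.0)
    creat = min(max(creat, 1.0), 4.0)
    return 5.11 * math.log(bili) + 11.76 * math.log(creat) + 9.44

def meld_na(bili: float, inr: float, creat: float, na_mmol_l: float) -> float:
    """Sodium-adjusted MELD (UNOS-style adjustment; Na clamped to 125-137 mmol/L)."""
    m = meld(bili, inr, creat)
    na = min(max(na_mmol_l, 125.0), 137.0)
    return m + 1.32 * (137.0 - na) - 0.033 * m * (137.0 - na)
```

With all labs at their floors, `meld` reduces to its constant 6.43 and `meld_xi` to 9.44; hyponatremia raises MELD-Na above the base MELD.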

Results

Twenty patients survived and 5 (4 men and 1 woman) died during observation. Demographics did not differ between groups. MELD scores were nonsignificantly higher in nonsurvivors (Group NS): MELD preoperatively 21.71 vs 15.28 (P = .225), day 1 22.03 vs 17.14 (P = .126), day 2 20.52 vs 17.03 (P = .296); MELD-XI preoperatively 19.28 vs 16.39 (P = .48), day 1 21.55 vs 18.14 (P = .2662), day 2 20.45 vs 17.2 (P = .461); and MELD-Na preoperatively 20.78 vs 18.7 (P = .46), day 1 23.68 vs 18.12 (P = .083), day 2 22.00 vs 19.19 (P = .295), respectively.

Conclusions

The MELD scores did not identify patients with pulsatile LVADs at high risk for mortality in our series. Further investigation is needed.

5.

Background

Although the effectiveness of pulmonary rehabilitation in patients with chronic obstructive lung disease, cystic fibrosis, and interstitial lung disease is well documented, little is known about pulmonary rehabilitation in patients who are referred for lung transplantation. Nordic walking is a low-cost and accessible form of physical exercise with proven benefits. The purpose of this prospective study was to examine the effects of Nordic walking on lung function, perception of dyspnea, and health-related quality of life in patients referred for lung transplantation.

Methods

Twenty-two of 40 patients who were qualified for lung transplantation at the Department of Lung Diseases in Zabrze, Poland, completed a rehabilitation program consisting of 12 weeks of Nordic walking. Lung function, exercise tolerance, perception of dyspnea, and quality of life were assessed before and after completion of the program.

Results

No adverse events were observed during the rehabilitation program. After 12 weeks, there was a significant increase in mean 6-minute walk distance (374 meters vs 288 meters; P = .034) and a significant reduction in the perception of dyspnea. Assessment of general health and quality of life showed significant improvement (P < .05). No significant changes in lung function tests were noted.

Conclusion

Nordic walking is a safe and feasible physical activity for pulmonary rehabilitation in patients with end-stage lung disease who are referred for lung transplantation. This rehabilitation technique results in significant improvements in patient mobility and quality of life.

6.

Background

Serum N-terminal pro-brain natriuretic peptide (NT-proBNP) concentration is elevated in patients with pulmonary hypertension (PH); however, its role in the detection of PH associated with lung disease is not well established.

Aim

The aim of this study was to assess the value of NT-proBNP in the detection of PH in patients with end-stage lung disease (esLD) referred for lung transplantation.

Materials and methods

The study population consisted of 65 patients: 37 with idiopathic pulmonary fibrosis (IPF), 20 with chronic obstructive pulmonary disease, and 8 patients with other interstitial lung diseases (75% men, mean age 53.3 ± 9.5 years). Serum concentration of NT-proBNP was assessed with an immunoradiometric assay kit. The mean pulmonary artery pressure (mPAP) was measured using a Swan-Ganz catheter. PH was defined as mPAP ≥ 25 mm Hg.

Results

Median NT-proBNP concentrations were significantly higher in patients with PH than in patients without PH: 139 (49–1236) pg/mL vs 67 (38–116) pg/mL, respectively; P = .016. Receiver operating characteristic (ROC) analysis revealed that NT-proBNP concentration higher than 131.5 pg/mL was a predictor of PH with good specificity (81%) and positive predictive value (78.9%) but low sensitivity (55.6%) and negative predictive value (58.6%). The area under the ROC curve of serum NT-proBNP concentration for PH was 0.71 (95% confidence interval 0.57–0.85, P = .039).
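Sensitivity, specificity, and predictive values for a concentration cutoff such as 131.5 pg/mL follow directly from the 2×2 table of test result vs disease status; a minimal sketch with illustrative, made-up data (the values and labels below are hypothetical, not the study's):

```python
def threshold_metrics(values, labels, cutoff):
    """Diagnostic metrics for the rule 'value > cutoff predicts disease (label == 1)'."""
    tp = sum(1 for v, y in zip(values, labels) if v > cutoff and y == 1)
    fp = sum(1 for v, y in zip(values, labels) if v > cutoff and y == 0)
    fn = sum(1 for v, y in zip(values, labels) if v <= cutoff and y == 1)
    tn = sum(1 for v, y in zip(values, labels) if v <= cutoff and y == 0)
    return {
        "sensitivity": tp / (tp + fn),  # true-positive rate
        "specificity": tn / (tn + fp),  # true-negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical NT-proBNP values (pg/mL) and PH status (1 = PH present)
nt_probnp = [200.0, 100.0, 50.0, 300.0, 120.0, 90.0]
has_ph = [1, 0, 0, 1, 1, 0]
metrics = threshold_metrics(nt_probnp, has_ph, cutoff=131.5)
```

Sweeping the cutoff over all observed values and plotting sensitivity against 1 − specificity yields the ROC curve whose area is reported in the abstract.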

Conclusion

Serum concentration of NT-proBNP may be useful in the diagnosis of PH in patients with esLD referred for lung transplantation.

7.
The aim of the study was to investigate serum concentrations of visfatin, irisin, and omentin in patients diagnosed with end-stage lung disease who qualified for lung transplantation (LTx) and to examine the relationship between adipokine levels and clinical status.

Material and methods

The study population consisted of 23 consecutive patients who qualified for LTx (10 with cystic fibrosis, 6 with chronic obstructive pulmonary disease, and 7 with idiopathic pulmonary fibrosis). Patients underwent pulmonary function tests; visfatin, irisin, and omentin serum levels were assessed using commercially available enzyme-linked immunosorbent assay kits.

Results

Mean visfatin serum level was 4.99 ± 3.83 pg/mL; mean irisin serum level was 2.82 ± 0.24 ng/mL; and mean omentin serum level was 389.99 ± 320.85 ng/mL. Mean distance in the 6-minute walk test (6MWT) was 310.62 ± 147.09 m. Mean partial pressure of oxygen (pO2) was 55.79 ± 10.33 mm Hg, forced expiratory volume in 1 second (FEV1) was 26.25 ± 22.38%, and forced vital capacity (FVC) was 56.95 ± 21.91% of the predicted value. There was no statistically significant correlation between adipokine levels and 6MWT, pO2, FEV1, or FVC in patients awaiting LTx, regardless of underlying lung disease. Significant differences between patients were noted only in 6MWT, FEV1, and pO2 in relation to the underlying lung disease.

Conclusion

Our findings indicate that adipokines may not have a statistically significant effect on parameters of pulmonary function. The results require further investigation in a larger study group, especially comparison of adipokine serum levels among overweight, obese, and normal-weight patients who qualify for LTx.

8.

Background

The aim of the study was to assess the frequency of infections caused by Pneumocystis jiroveci, Chlamydophila pneumoniae, Legionella pneumophila, and Mycoplasma pneumoniae among lung transplant recipients in the context of immunosuppression.

Methods

The study group consisted of 94 patients (37 women and 57 men; mean age 42.03 years) transplanted between 2009 and 2016 at the Silesia Center for Heart Diseases (SCCS). Immunosuppressive treatment (induction and maintenance therapy) was assessed. Immunofluorescence methods were used to detect P. jiroveci, L. pneumophila, C. pneumoniae, and M. pneumoniae antigens in samples obtained from the respiratory tract.

Results

Thirty-two of 94 graft recipients developed an atypical or opportunistic infection; the median time to its occurrence was 178 days after transplantation. P. jiroveci was responsible for 84.38% of first infections. Five patients developed infection with both P. jiroveci and C. pneumoniae. None of the infections occurred during induction of immunosuppression. An opportunistic or atypical infection developed in 19.35% of patients treated with a tacrolimus-based regimen and in 43.33% of patients on a cyclosporine-based regimen.

Conclusion

Infection with P. jiroveci is a recognized problem after lung transplantation and should be monitored. The percentage of infected patients is higher among those treated with a cyclosporine-based regimen compared with those treated with tacrolimus.

9.

Background

Kidney transplantation is currently the best approach for renal replacement therapy. Compared with dialysis, it provides a better quality of life and improves patient prognosis. However, some evidence suggests that body composition could play a role in the complications observed in kidney transplant recipients (KTRs), and may influence survival. The purpose of this study was to assess the eating habits and body composition of KTRs.

Methods

Seventy KTRs were included in this study. Anthropometric and body composition measurements were performed using an electronic scale, a dynamometer, and bioimpedance analysis. Dietary habits were investigated using the Food Frequency Questionnaire (FFQ6). Biochemical parameters were also determined.

Results

Overweight and obesity were found in 33.8% and 21.1% of KTRs, respectively. High body mass index (BMI >25) correlated positively with high body fat (r = 0.8, P < .05) and waist circumference (r = 0.7, P < .05). The mean percentage of body fat was 30.8 ± 9.3% (range, 13%–52%), fat tissue index was 12.4 ± 4.9, and lean tissue index (LTI) was 13.2 ± 2.2. Sarcopenia was recognized, based on decreased LTI and decreased handgrip strength, in 33.3% of KTRs with excess body weight. Patients with excess body mass consumed significantly (P < .05) more sugar and fruits.
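The BMI >25 overweight cutoff used above is simple arithmetic; a minimal sketch of the standard definition (weight in kg, height in m) with the conventional WHO-style categories:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def weight_category(b: float) -> str:
    """Conventional BMI categories; the study groups patients by the BMI > 25 cutoff."""
    if b < 18.5:
        return "underweight"
    if b < 25.0:
        return "normal"
    if b < 30.0:
        return "overweight"
    return "obese"
```

For example, 80 kg at 1.75 m gives a BMI of about 26.1, falling in the overweight group analyzed in the study.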

Conclusion

A significant percentage of KTRs present with sarcopenic obesity. Excess body weight is associated with many factors, such as immunosuppressive therapy, low physical activity, and abnormal diet. Results based on the FFQ6 indicate a relationship between carbohydrate intake and excess body weight among those in the study group.

10.

Background

The diagnosis of acute cellular rejection (ACR) is a major objective in the management of heart transplant recipients. The aim of this study was to assess the utility of speckle-tracking derived parameters in identifying patients at risk of graft rejection.

Methods

A prospective, single-center study was carried out involving 45 consecutive heart transplant patients who underwent a total of 220 routine endomyocardial biopsies (EMBs) with correlative echocardiographic examination.

Results

No significant ACR (grade 0–1R) was seen in 190 biopsies (81.2% of the ACR-free group), and moderate ACR requiring specific treatment (grade 2R) was detected in 30 biopsies (13.6% of the ACR group); grade 3R was not observed. All longitudinal left ventricular (LV) and right ventricular (RV) strain parameters were greater in the ACR-free group than in patients with ACR, whereas no differences were observed in radial and circumferential strain parameters. In our analysis, we selected RV free wall longitudinal strain (RV FW) ≤ 16.8% and 4-chamber longitudinal strain (4CH LS) ≤ 13.8% as cutoffs related to the presence of ACR requiring treatment. We assigned 1 point for each parameter (minimum 0, maximum 2 points) and derived a new echocardiographic index, the Strain Rejection Score (SRS). Our proposed approach, a combination of the 2 abovementioned indices, for screening patients at risk of ACR ≥ 2R, when expressed as a score of 2 points, showed good specificity, a strong negative predictive value, and the highest area under the curve.

Conclusions

Our study demonstrated that the combination of 4CH LS and RV FW into a new echocardiographic index, the Strain Rejection Score, can be useful for noninvasive assessment of ACR during the first year of follow-up after heart transplantation.

11.

Background

Left ventricular assist devices (LVADs) are used for the treatment of end-stage heart failure. Outcomes are dependent on right ventricular (RV) function, so prediction of RV function after LVAD implantation is crucial for device selection and patient outcome. The aim of our study was to compare the early post-LVAD course in patients with optimal and borderline echocardiographic parameters of RV function.

Material and methods

We retrospectively reviewed 24 male patients after LVAD implantation. The following echocardiographic parameters of RV function were collected: fractional area change (FAC), with an optimal value >20%; tricuspid annulus plane systolic excursion >15 mm; RV diameter <50 mm; and right-to-left ventricle (RV/LV) ratio <0.57. Patients were divided into group 1 (12 patients), with transthoracic echocardiography parameters in optimal ranges, and group 2 (12 patients), with suboptimal transthoracic echocardiography findings. Study endpoints were mortality, discharge from the intensive care unit, and RV dysfunction. Demographics, postoperative clinical outcomes, comorbidities, complications, and results over a 30-day period were compared between groups.

Results

Echocardiographic parameters differed significantly between groups 1 and 2 in FAC (31.8% vs 24.08%; P = .005), RV4 (45.08 mm vs 51.69 mm; P = .02), and RV/LV ratio (0.6 vs 0.7; P = .009). Patients did not differ in course of disease, comorbidities before implantation, or complications. One patient from each group died. Patients in group 2 experienced more pulmonary hypertension, required higher doses of catecholamines, and stayed in the intensive care unit longer. No RV dysfunction was noted.

Conclusions

Borderline FAC, tricuspid annulus plane systolic excursion, RV4, and RV/LV ratio prolonged recovery after LVAD implantation even in the absence of RV failure. The parameters chosen for qualification remain within safe ranges.

12.

Background

Pulmonary hypertension (PH) is a common complication in end-stage lung disease (esLD). The aim of this study was to establish the best threshold values for mean, systolic, and diastolic pulmonary artery pressure (mPAP, sPAP, and dPAP, respectively) to identify patients with esLD referred for lung transplantation and to predict 1-year prognosis.

Methods

Sixty-five patients (75% men; mean age 53.3 ± 9.5 years) were enrolled in the study; 31% had chronic obstructive pulmonary disease (COPD), 57% had idiopathic pulmonary fibrosis (IPF), and 12% had other interstitial lung diseases (ILDs). The mean period of observation was 14.4 ± 5 months. We invasively assessed mPAP, dPAP, and sPAP, as well as pulmonary capillary wedge pressure (PCWP), using a Swan-Ganz catheter. Receiver-operating characteristic (ROC) curves were constructed to identify the best cutoff points for mPAP, dPAP, and sPAP to predict survival. The study endpoint was defined as 1-year mortality before transplantation. Survival analysis was completed according to the Kaplan-Meier method.

Results

During follow-up, 30 (46.1%) patients died and 19 (29%) underwent lung transplantation. Based on ROC curve analysis, we estimated mPAP ≥30 mm Hg, dPAP ≥20 mm Hg, and sPAP ≥44 mm Hg as the best threshold values with the highest sensitivity (70%, 70%, and 73%, respectively) and specificity (76%, 69%, and 72%, respectively) and the acceptable area under curve (0.67, 0.68, and 0.72, respectively). The negative predictive values for mPAP, dPAP, and sPAP were higher than the positive predictive values (79%, 77%, and 81% vs 67%, 61%, and 64%, respectively). We also constructed Kaplan-Meier curves for mPAP, dPAP, and sPAP threshold values. There were significant differences in 1-year survival between patients with and without PH for mPAP, dPAP, and sPAP threshold values (P = .005, P = .035, and P < .001; respectively).

Conclusion

Elevated mPAP, dPAP, and sPAP are related to worse prognosis in patients with esLD referred for lung transplantation.

13.

Background

Cardiovascular complications (CVCs) in patients with end-stage renal disease (ESRD) often require hospitalization and are associated with an increased risk of fatality. Although kidney transplantation (KTx) improves a patient's status, CVCs are still a serious risk factor, so early identification is very important for final therapeutic outcome.

Methods

This study included 5 post-KTx patients (age, 20.8 ± 1.16 years) who were dialyzed before KTx and followed up for 6.7 ± 1.71 years. Body surface potential mapping (BSPM) was performed 4 times: twice before and twice after KTx. Electrocardiographic data were processed into map plots to illustrate differences in ventricular activation times (VATs).

Results

A comparative analysis of difference maps, in both dialyzed patients and normal subjects, highlighted specific patterns in the distribution of VAT changes for left anterior fascicular block (LAFB). The maps clearly showed a significant correlation between the intensity of changes and the duration of dialysis before KTx. After KTx, VATs appeared similar to those in normal subjects; however, this was true only for patients dialyzed for <1 year. Patients dialyzed for >1 year showed persistent conduction abnormalities on their VAT maps.

Conclusion

Summary differences in VAT maps can enable diagnosis of initial activation propagation abnormalities in the heart. Short-term dialysis therapy before KTx has positive effects, with regression of cardiac conduction changes. These observations need to be verified in a larger study population.

14.

Background

The impact of dialysis modality before kidney transplantation (hemodialysis or peritoneal dialysis) on outcomes is not clear. In this study we retrospectively analyzed the impact of dialysis modality on posttransplant follow-up.

Methods

To minimize donor bias, a paired kidney analysis was applied. One hundred thirty-three pairs of peritoneal dialysis (PD) and hemodialysis (HD) patients were transplanted at our center between 1994 and 2016. Those who received kidneys from the same donor were included in the study. HD patients were significantly older (44 vs 48 years), but the Charlson Comorbidity Index was similar (3.12 vs 3.46) in both groups. The groups did not differ significantly with respect to immunosuppressive protocols and number of mismatches (2.96 vs 2.95).

Results

One-year patient (98% vs 96%) and graft (90% vs 93%) survival was similar in the PD and HD groups, and the Kaplan-Meier curves of patient and graft survival did not differ significantly. Delayed graft function (DGF) and acute rejection (AR) occurred significantly more often in HD recipients. Graft vessel thrombosis resulting in graft loss occurred in 9 PD (6.7%) and 4 HD (3%) patients (P > .05). Serum creatinine concentration and estimated glomerular filtration rate (using the Modification of Diet in Renal Disease formula) showed no difference at 1 month, 1 year, and the final visit. On multivariate analysis, factors significantly associated with graft loss were graft vessel thrombosis, DGF, and graft function 1 month after transplantation. On univariate analysis, age, coronary heart disease, and graft loss were associated with death; of these, only coronary heart disease (model 1) and graft loss remained significant predictors of death on multivariate analysis.

Conclusion

The long-term outcome of renal transplantation is similar in patients with PD and HD. These groups differ in some aspects, however, such as susceptibility to vascular thrombosis in PD patients, and to DGF and AR in HD patients.

15.

Background

Red blood cell markers (RBCM) have been found to be predictors of mortality in various populations. However, there is no information on the association between RBCM values and long-term outcomes after orthotopic heart transplantation (OHT). The aim of this study was to assess whether the values of inflammatory markers and RBCM obtained directly before OHT are associated with mortality in patients with end-stage heart failure undergoing OHT.

Methods

We retrospectively analyzed data of 173 nonanemic adult patients diagnosed as having end-stage heart failure undergoing primary OHT between 2007 and 2014. Clinical and laboratory data were obtained at the time of admission for the OHT. RBCM were analyzed using an automated blood counter (Sysmex XS-1000i and XE-2100, Sysmex Corporation, Kobe, Japan).

Results

Median age of the patients was 54 years (41–59), and 72% were male. During the observation period, the mortality rate was 32%. Multivariable Cox proportional hazards analysis confirmed that an elevated pretransplantation red blood cell distribution width (hazard ratio [HR], 1.38 [1.25–1.48]; P < .001) was the sole independent predictor of death during long-term follow-up. Other red blood cell indices, namely mean corpuscular volume, mean corpuscular hemoglobin concentration, and mean corpuscular hemoglobin (HR, 0.88 [0.84–0.91], P < .001; HR, 0.75 [0.53–1.05], P < .05; HR, 0.78 [0.64–0.96], P < .05, respectively), had predictive value only in univariable analysis.

Conclusions

In summary, we have demonstrated that elevated red blood cell distribution width immediately before OHT is an independent predictor of all-cause mortality in heart transplant recipients. Other factors associated with posttransplantation mortality include lower values of mean corpuscular volume, mean corpuscular hemoglobin, and mean corpuscular hemoglobin concentration.

16.
Pregnancy following renal or liver transplantation is safe for the mother, fetus, and allograft if standard practice guidelines are strictly followed. Cesarean delivery is often required for the safety of mother and child. The aim of this paper was to evaluate the delivery method in patients after liver (G1) and kidney (G2) transplantation in comparison with a population of healthy pregnant women (G0).

Materials

The retrospective analysis included 51 (G1) and 59 (G2) women who delivered between 2000 and 2016. The control group (G0) consisted of 170 nontransplanted patients who delivered between 2014 and 2016. Results were compared using nonparametric and parametric tests (Fisher exact test, t test). SAS 9.2 was used for the analysis.

Results

The rate of cesarean delivery was high in all pregnancies following liver (G1, 80.4%) or kidney (G2, 67.8%) transplantation compared with the control group (G0, 44.1%; P < .05). The most common indications for cesarean delivery in G1 were gestational hypertension/preeclampsia (n = 18; 43.9%), threatening intrauterine asphyxia (n = 12; 29.3%), and failure to progress (n = 2; 4.9%). The most common indications for cesarean delivery in G2 were threatening intrauterine asphyxia (n = 14; 35%), failure to progress (n = 9; 22.5%), and gestational hypertension/preeclampsia (n = 2; 5%).

Conclusion

Cesarean delivery in patients after kidney or liver transplantation is performed mainly for obstetric reasons. The reported incidence of cesarean delivery in pregnancy following transplant is high, reflecting the high degree of clinical caution exercised in these patients.

17.

Introduction

End-stage renal disease (ESRD) has a significant impact on a patient's quality of life (QoL). The optimal treatment for ESRD is kidney transplantation (KTx), which aims to extend and improve QoL. The aim of the study was to assess QoL in KTx recipients.

Methods

Our study included 118 post-KTx patients. The research tool employed for assessment was a questionnaire consisting of standardized instruments: the 36-item Short Form (SF-36); the Kidney Disease Quality of Life (KDQOL) instrument; and the Depression, Anxiety, and Stress (DASS) scale. In addition, patients provided information on their weight and height, from which body mass index was calculated.

Results

Correlation analysis showed a statistically significant influence of age on general health (R = 0.191, P = .039), physical functioning (R = −0.295, P = .001), and general physical health (R = −0.275, P = .003) assessments. The mean severity of depression, anxiety, and stress among subjects changed with time since KTx. For the post-KTx periods studied (ie, <1 year, 1–10 years, and >10 years), the following changes were observed: for depression, 14.0 vs 11.2 vs 13.1, respectively; for anxiety, 15.6 vs 9.8 vs 14.0, respectively; and for stress, 22.0 vs 13.5 vs 16.8, respectively.

Conclusion

In this study we found that: (1) QoL in patients after KTx was at a good level for everyday functioning; and (2) general health assessment, physical functioning, pain, sleep quality, occupational status, vitality, social activity, staff support, and quality of care were the major factors associated with QoL after KTx.

18.

Background

Numerous studies have shown that osteoporosis is common in kidney transplant recipients. However, the change in bone mineral density after kidney transplantation (KT) is not fully understood.

Methods

Thirty-nine kidney transplant recipients with bone densitometry at pretransplant and 24 months after KT were reviewed.

Results

The recipients' mean age (44.5 ± 10.7 years) and dialysis duration before KT (4.2 ± 3.4 years) were recorded. The T-scores of the lumbar spine and femur neck at 24 months after KT were positively associated with the respective pretransplant T-scores (P < .001 in the lumbar spine and P < .001 in the femur neck). However, the T-scores after KT did not show significant change (P = .680 in the lumbar spine, P = .093 in the femur neck). Changes in the T-scores of the lumbar spine and femur neck over 24 months (delta T-scores) were negatively associated with the respective pretransplant T-scores (P = .001 in the lumbar spine, P = .026 in the femur neck), and this association persisted after adjustment for other variables.

Conclusion

The change in bone mineral density was related to pretransplant bone mineral density. Careful follow-up with bone densitometry is needed for KT recipients.

19.

Introduction

Diseases of the cardiovascular system are the most common cause of death in patients after kidney transplantation (KTx). Pulse wave velocity (PWV) measurement is a simple, noninvasive, and increasingly popular method to assess arterial stiffness, and thus to assess cardiovascular risk. The aim of the study was to compare arterial stiffness and body composition in patients after KTx in the early and late postoperative periods.

Methods

This research was carried out from January to November 2017 at two locations: (1) the Department and Clinic of General and Transplant Surgery and (2) the Nephrology and Transplantology Clinic, Medical University of Warsaw, Infant Jesus Teaching Hospital, Warsaw, Poland. The study group consisted of 30 patients in the early postoperative period (2–7 days after surgery) and 151 patients in the late period (6 months to 27 years) after KTx. A single blood pressure and PWV measurement was performed using a Schiller BR-102 plus PWV device. Body composition analysis was performed using a Tanita MC-780 device.

Results

The average PWV was 8.02 ± 2.21 m/s in patients in the early period after KTx and 8.09 ± 1.68 m/s in the late period. A positive correlation was found between adipose tissue in the abdominal cavity and PWV (R = 0.444, P = .033). There was no correlation between PWV and time after transplantation (R = 0.034, P = .777). When patients after transplantation were analyzed by prior dialysis modality, lower systolic (142 ± 21 vs 156 ± 24 mm Hg) and diastolic (84 ± 13 vs 98 ± 11 mm Hg) blood pressure values were observed in patients treated with hemodialysis compared with those treated with peritoneal dialysis.

Conclusion

Using PWV measurement, we found that arterial stiffness levels were similar for early and late periods after transplantation.

20.

Introduction

Patients on long-term immunosuppressive therapy after organ and cell transplantation are more susceptible than healthy people to the development of pathologic changes in the oral cavity, including precancerous lesions, oral cancers, lesions following viral infections (herpes simplex virus, Epstein-Barr virus, and cytomegalovirus), fungal infections mainly caused by Candida albicans, drug-induced gingival overgrowth, stomatitis, and tongue disorders.

Material and methods

The clinical case material included 38 patients after kidney, liver, or blood-forming cell transplantation treated with various immunosuppressive regimens. The study comprised standard history taking and physical examination, including detailed intraoral and extraoral stomatological examinations.

Results

Extraoral examination confirmed 1 case of multifocal basal cell carcinoma in the auricular region and 1 case of systemic lupus erythematosus. Intraoral examination revealed gingivitis (60.5%), gingival recession (58%), periodontitis (55.26%), macroglossia (15.8%), lingual papillary atrophy (13.16%), leukoplakia, aphthae/ulcerations (10.5%), lichen planus, pallor of mucous membranes (7.9%), pathologic pigmentation of the oral mucosa, geographic tongue (5.26%), and erythroplakia (2.6%). On history taking, patients reported xerostomia (68.42%), halitosis (23.68%), gum bleeding while brushing teeth (18.42%), and dysgeusia (15.78%).

Discussion

Both patients after organ or hematopoietic stem cell transplantation and those qualified for a transplant should undergo multispecialty care, particularly dental treatment, to enable early detection of pathologies and prompt commencement of effective therapy. Cooperation between the attending physician and the dentist is crucial in the treatment of this group of patients.
