1.

Background and objectives

African Americans are disproportionately affected by ESRD, but few receive a living donor kidney transplant. Surveys assessing attitudes toward donation have shown that African Americans are less likely to express a willingness to donate their own organs. Studies aimed at understanding factors that may facilitate the willingness of African Americans to become organ donors are needed.

Design, setting, participants, & measurements

A novel formative research method was used (the nominal group technique) to identify and prioritize strategies for facilitating increases in organ donation among church-attending African Americans. Four nominal group technique panel interviews were convened (three community and one clergy). Each community panel represented a distinct local church; the clergy panel represented five distinct faith-based denominations. Before nominal group technique interviews, participants completed a questionnaire that assessed willingness to become a donor; 28 African-American adults (≥19 years old) participated in the study.

Results

In total, 66.7% of participants identified knowledge- or education-related strategies as the most important for facilitating willingness to become an organ donor, a view that was even more pronounced among clergy. Three of four nominal group technique panels rated a knowledge-based strategy as the most important; these included strategies such as providing information on donor involvement and donation-related risks. Overall, 29.6% of participants disagreed with deceased donation and 37% disagreed with living donation. Community participants’ reservations about becoming an organ donor were similar for living (38.1%) and deceased (33.4%) donation; in contrast, clergy participants were more likely to express reservations about living donation (33.3% versus 16.7%).

Conclusions

These data indicate greater opposition to living donation than to donation after one’s death among African Americans and suggest that improving knowledge about organ donation, particularly with regard to donor involvement and donation-related risks, may facilitate increases in organ donation. Existing educational campaigns may fall short of meeting the information needs of African Americans.

2.

Background

An important issue in the transplantation of livers procured from cardiac death donors (CDDs) concerns why some centres report equivalent outcomes and others report inferior outcomes in transplantations using CDD organs compared with standard criteria donor (SCD) organs. Resolving this discrepancy may increase the number of usable organs.

Objectives

This study aimed to test whether differences in cold ischaemic time (CIT) are critical during CDD organ transplantation and whether such differences might explain the disparate outcomes.

Methods

Results of CDD liver transplants in our own centre were compared retrospectively with results in a matched cohort of SCD liver recipients. Endpoints of primary non-function (PNF) and ischaemic cholangiopathy (IC) were used because these outcomes are clearly associated with CDD organ use.

Results

In 22 CDD organ transplants, CIT was a strong predictor of PNF or IC (P = 0.021). Minimising CIT in CDD organ transplants produced outcomes similar to those in a matched SCD organ transplant cohort at our centre and in SCD organ transplant results nationally (1- and 3-year graft and patient survival rates: 90.9% and 73.3% vs. 77.6% and 69.2% in CDD and SCD grafts, respectively). A review of the published literature demonstrated that centres with higher CITs tend to have higher rates of PNF or IC (correlation coefficient: 0.41).
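The centre-level relationship quoted last (correlation 0.41 between CIT and PNF/IC rates) is an ordinary Pearson correlation over paired centre values. A minimal sketch, using hypothetical centre-level data rather than the figures from the published reports:

```python
import math
import statistics

def pearson_r(xs, ys):
    """Pearson correlation between paired centre-level values."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical centre-level data: mean cold ischaemic time (h) vs PNF/IC rate
cit_hours = [5.1, 6.0, 7.2, 8.5, 9.8]
pnf_ic_rate = [0.05, 0.11, 0.08, 0.16, 0.14]
print(f"r = {pearson_r(cit_hours, pnf_ic_rate):.2f}")
```

A positive r, as in the review, indicates that centres with longer cold ischaemic times also tend to report more PNF or IC.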

Conclusions

These findings suggest that a targeted effort to minimise CIT might improve outcomes and allow the safer use of CDD organs.

3.

Background and objective

Chronic obstructive pulmonary disease (COPD) represents an increasing healthcare concern as a leading cause of morbidity and mortality worldwide. Our objective was to predict the outcome of COPD patients associated with multiple organ dysfunction syndrome (MODS) by scoring models.

Methods

A retrospective study was performed on severe COPD patients within 24 hours of the onset of MODS. The Acute Physiology and Chronic Health Evaluation (APACHE) II, APACHE III, Multiple Organ Dysfunction Score (MODS), Simplified Acute Physiology Score II (SAPS II), and Sepsis-related Organ Failure Assessment (SOFA) scores were calculated for patients.

Results

A total of 153 elderly patients were recruited. Compared to 30-day survivors, the number of failing organs and all of the scoring models were significantly higher in 30-day non-survivors. The SOFA showed the highest sensitivity and area under the curve (AUC) for predicting the prognosis of patients with MODS induced by acute exacerbation of COPD. The results of logistic regression indicated that factors that were correlated with the prognosis of COPD included the exacerbation history, SOFA score, number of failing organs, and duration of ICU stay. The value of exacerbation frequency for predicting the outcome of COPD was excellent (AUC: 0.892), with a sensitivity of 0.851 and a specificity of 0.797.

Conclusions

The SOFA score, determined at the onset of MODS in elderly patients with COPD, was a reliable predictor of prognosis. The exacerbation frequency, number of failing organs, and SOFA score were risk factors for a poor prognosis, and the exacerbation frequency alone could also effectively predict the outcome of COPD.

4.

Background and objectives

In December of 2014, the Organ Procurement and Transplant Network implemented a new Kidney Allocation System (KAS) for deceased donor transplant, with increased priority for highly sensitized candidates (calculated panel–reactive antibody [cPRA] >99%). We used a modified version of the new KAS to address issues of access and equity for these candidates.

Design, setting, participants, & measurements

In a simulation, 10,988 deceased donor kidneys transplanted into waitlisted recipients in 2010 were instead allocated to candidates with cPRA≥80% (n=18,004). Each candidate’s unacceptable donor HLA antigens had been entered into the allocation system by the transplant center. In simulated match runs, kidneys were allocated sequentially to adult ABO identical or permissible candidates with cPRA 100%, 99%, 98%, etc. to 80%. Allocations were restricted to donor/recipient pairs with negative virtual crossmatches.
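The match-run logic described above is essentially a greedy allocation over candidates in descending cPRA order, filtered by ABO compatibility and a negative virtual crossmatch. A minimal sketch with hypothetical toy records (field names such as `unacceptable_hla` are illustrative; this is not the actual OPTN/UNOS implementation):

```python
def run_match(donors, candidates):
    """Allocate each donor kidney to the highest-cPRA compatible candidate.

    Candidates are considered in descending cPRA order (100% down to 80%).
    A pairing requires ABO compatibility and a negative virtual crossmatch,
    i.e. the donor expresses none of the candidate's unacceptable HLA antigens.
    """
    allocations = []
    pool = sorted(candidates, key=lambda c: c["cpra"], reverse=True)
    for donor in donors:
        for cand in pool:
            abo_ok = donor["abo"] in cand["permissible_abo"]
            negative_vxm = not (donor["hla"] & cand["unacceptable_hla"])
            if abo_ok and negative_vxm:
                allocations.append((donor["id"], cand["id"]))
                pool.remove(cand)  # each candidate receives at most one kidney
                break
    return allocations

# Hypothetical toy data
donors = [{"id": "D1", "abo": "O", "hla": {"A1", "B8"}}]
candidates = [
    {"id": "C1", "cpra": 100, "permissible_abo": {"O"}, "unacceptable_hla": {"A1"}},
    {"id": "C2", "cpra": 99, "permissible_abo": {"O"}, "unacceptable_hla": {"B27"}},
]
# C1 fails the virtual crossmatch (donor carries A1), so D1 goes to C2
print(run_match(donors, candidates))
```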

Results

The simulation indicated that 2111 of 10,988 kidneys (19.2%) would have been allocated to patients with cPRA 100% versus 74 of 10,988 (0.7%) that were actually transplanted. Of cPRA 100% candidates, 74% were predicted to be compatible with an average of six deceased donors; the remaining 26% seemed to be incompatible with every deceased donor organ that entered the system. Of kidneys actually allocated to cPRA 100% candidates in 2010, 66% (49 of 74) were six–antigen HLA matched/zero–antigen mismatched (HLA-A, -B, and -DR) with their recipients versus only 11% (237 of 2111) in the simulation. The simulation predicted that 10,356 of 14,433 (72%) candidates with cPRA 90%–100% could be allocated an organ compared with 7.3% who actually underwent transplant.

Conclusions

Data in this simulation are consistent with early results of the new KAS; specifically, nearly 20% of deceased donor kidneys were (virtually) compatible with cPRA 100% candidates. Although most of these candidates were predicted to be compatible with multiple donors, approximately one-quarter are unlikely to receive a single offer.

5.

Background:

In the living donor liver transplant setting, careful preoperative assessment of potential donors is important to ensure donor safety.

Objectives:

The aim of this study was to identify the causes and costs of rejecting potential living liver donors during the donation process.

Materials and Methods:

From June 2010 to June 2012, all potential living liver donors for 66 liver transplant candidates were screened at the Ain Shams Center for Organ Transplantation. Potential donors were evaluated in 3 phases, and their data were reviewed to determine the causes and at which phase the donors were rejected.

Results:

One hundred and ninety-two potential living liver donors, including 157 (81.7%) males, were screened for 66 potential recipients. Of these, 126 (65.6%) were disqualified for donation. The causes of rejection were classified as surgical (9.5%) or medical (90.5%). Five donors (3.9%) were rejected due to multiple causes. Factor V Leiden mutation was detected in 29 (23%) rejected donors (P = 0.001), 25 (19.8%) donors had positive results for hepatitis serology (P = 0.005), and 16 (12.7%) tested positive for drug abuse. Portal vein trifurcation (n = 9, 7.1%) and a small liver graft size estimated by CT volumetric analysis (n = 6, 4.8%) were the main surgical causes precluding donation.

Conclusions:

Among potential Egyptian living liver donors, Factor V Leiden mutation was a significant cause of donor rejection. A stepwise approach to donor assessment was found to be cost-effective.

6.

Background and objectives

Data reported to the Organ Procurement and Transplantation Network (OPTN) are used in kidney transplant research, policy development, and assessment of center quality, but the accuracy of early post–transplant outcome measures is unknown.

Design, setting, participants, & measurements

The Deceased Donor Study (DDS) is a prospective cohort study at five transplant centers. Research coordinators manually abstracted data from electronic records for 557 adults who underwent deceased donor kidney transplantation between April of 2010 and November of 2013. We compared the post-transplant outcomes of delayed graft function (DGF; defined as dialysis in the first post–transplant week), acute rejection, and post–transplant serum creatinine reported to the OPTN with data collected for the DDS.

Results

Median kidney donor risk index was 1.22 (interquartile range [IQR], 0.97–1.53). Median recipient age was 55 (IQR, 46–63) years old, 63% were men, and 47% were black; 93% had received dialysis before transplant. Using DDS data as the gold standard, we found that pretransplant dialysis was not reported to the OPTN in only 11 (2%) instances. DGF in OPTN data had a sensitivity of 89% (95% confidence interval [95% CI], 84% to 93%) and specificity of 98% (95% CI, 96% to 99%). Surprisingly, the OPTN data accurately identified acute allograft rejection in only 20 of 47 instances (n=488; sensitivity of 43%; 95% CI, 17% to 73%). Across participating centers, sensitivity of acute rejection varied widely from 23% to 100%, whereas specificity was uniformly high (92%–100%). Six-month serum creatinine values in DDS and OPTN data had high concordance (n=490; Lin concordance correlation =0.90; 95% CI, 0.88 to 0.92).
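The validity figures above follow from a 2×2 table of registry-reported versus gold-standard outcomes. As a sketch of the arithmetic: the true-positive and false-negative counts below come from the abstract (20 of 47 rejections detected), while the true-negative and false-positive counts are hypothetical, chosen only to mimic a high specificity:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# TP/FN from the abstract; TN/FP are illustrative placeholders
sens, spec = sensitivity_specificity(tp=20, fn=27, tn=430, fp=11)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```

With these inputs, sensitivity reproduces the 43% reported for acute rejection in OPTN data.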

Conclusions

OPTN outcomes for recipients of deceased donor kidney transplants have high validity for DGF and 6-month allograft function but lack sensitivity in detecting rejection. Future studies using OPTN data may consider focusing on allograft function at 6 months as a useful outcome.

7.

Background

There is a worldwide need to expand the donor liver pool. We report a consecutive series of elective candidates for liver transplantation (LT) who received ‘livers that nobody wants’ (LNWs) in Argentina.

Methods

Between 2006 and 2009, outcomes for patients who received LNWs were analysed and compared with outcomes for a control group. To be defined as an LNW, an organ is required to fulfil two criteria. Firstly, each liver must be officially offered and refused more than 30 times; secondly, the liver must be refused by at least 50% of the LT programmes in our country before our programme can accept it. Principal endpoints were primary graft non-function (PNF), mortality, and graft and patient survival.

Results

We transplanted 26 LNWs that had been discarded by a median of 12 centres. A total of 2666 reasons for refusal had been registered. These included poor donor status (n = 1980), followed by LT centre (n = 398) or recipient (n = 288) conditions. Incidences of PNF (3.8% vs. 4.0%), in-hospital mortality (3.8% vs. 8.0%), 1-year patient (84% vs. 84%) and graft (84% vs. 80%) survival were equal in the LNW and control groups.

Conclusions

Transplantable livers are unnecessarily discarded by the transplant community. External and internal supervision of the activity of each LT programme is urgently needed to guarantee high standards of excellence.

8.
Kim IS, Lee H, Park JC, Shin SK, Lee SK, Lee YC. Gut and Liver. 2012;6(3):349-354.

Background/Aims

Solid organ transplant recipients frequently report gastrointestinal symptoms, especially heartburn or dyspepsia. However, the prevalence of endoscopic erosive esophagitis (EE) and associated risk factors after transplantation are unknown. The aim of this study was to determine whether there was a high incidence of endoscopic findings of EE in solid organ transplant recipients.

Methods

This retrospective case-control study included 256 of 3,152 solid organ transplant recipients who underwent sequential screening upper endoscopic examinations and an equal number of controls.

Results

Forty-four (17.2%) and 16 (6.2%) cases of EE were detected in the solid organ transplant and control groups, respectively (p<0.001). In the multivariate analysis, transplantation was significantly associated with EE (odds ratio [OR], 6.48; 95% confidence interval, 2.74 to 15.35). Factors such as old age (OR, 1.17), the presence of a hiatal hernia (OR, 5.84), an increased duration of immunosuppression (OR, 1.07), and the maintenance administration of mycophenolate mofetil (OR, 4.13) were independently associated with the occurrence of EE in the solid organ transplant recipients.

Conclusions

A significant increase in the incidence of endoscopically detected EE was observed in solid organ transplant recipients. This increased incidence was associated with the type and duration of the immunosuppressive therapy.

9.

Summary

Background and objectives

The profound organ shortage has resulted in longer waiting times and increased mortality for those awaiting kidney transplantation. Consequently, patients are turning to older living donors. It is unclear if an upper age limit for donation should exist, both in terms of recipient and donor outcomes.

Design, setting, participants, & measurements

In the United States, 219 healthy adults aged ≥70 have donated kidneys at 80 of 279 transplant centers. Competing risks models with matched controls were used to study the independent association between older donor age and allograft survival, accounting for the competing risk of recipient mortality as well as other transplant factors.

Results

Among recipients of older live donor allografts, graft loss was significantly higher than for matched 50- to 59-year-old live donor allografts (subhazard ratio [SHR] 1.62, 95% confidence interval [CI] 1.16 to 2.28, P = 0.005) but similar to matched nonextended criteria 50- to 59-year-old deceased donor allografts (SHR 1.19, 95% CI 0.87 to 1.63, P = 0.3). Mortality among living kidney donors aged ≥70 was no higher than in healthy matched controls drawn from the National Health and Nutrition Examination Survey III (NHANES-III) cohort; in fact, mortality was lower, probably reflecting higher selectivity among older live donors than could be captured in NHANES-III (HR 0.37, 95% CI 0.21 to 0.65, P < 0.001).

Conclusions

These findings support living donation among older adults but highlight the advantages of finding a younger donor, particularly for younger recipients.

10.

Objectives:

Cytomegalovirus (CMV) infection is responsible for significant morbidity and mortality among solid organ transplant recipients. Prophylaxis using valganciclovir (VGCV) in orthotopic liver transplant (OLT) recipients is not approved by the Food and Drug Administration and its use is controversial. This study aimed to evaluate the effectiveness of VGCV in CMV prophylaxis in OLT recipients.

Methods:

We carried out a retrospective, single-centre study including all OLT procedures performed during 2005–2008. Patients with early death (at ≤30 days), without CMV serology or prophylaxis, or with follow-up of <1 year were excluded.

Results:

The overall incidence of CMV disease was 6% (n = 9). The ganciclovir (GCV) and VGCV groups had similar incidences of CMV disease (4.6% vs. 7.0%; P = 0.4) and similar distributions of disease presentation (CMV syndrome vs. tissue-invasive CMV; P = 0.4). Incidences of CMV infection, as well as disease presentation, were similar between the high-risk (CMV D+/R−) and non-high-risk groups (P = 0.16). Although acute cellular rejection occurred more frequently in patients who developed CMV disease (P = 0.005), overall survival in these patients did not differ from that in patients who did not develop CMV infection (P = 0.5).

Conclusions:

Valganciclovir is an effective antiviral for the prevention of CMV disease in liver transplant recipients. Our data support its use in high-risk OLT patients.

11.

Objectives:

The aim of this study was to evaluate the cost-effectiveness in liver transplantation (LT) of utilizing organs donated after cardiac death (DCD) compared with organs donated after brain death (DBD).

Methods:

A Markov-based decision analytic model was created to compare two LT waitlist strategies distinguished by organ type: (i) DBD organs only, and (ii) DBD and DCD organs. The model simulated outcomes for patients over 10 years with annual cycles through one of four health states: survival; ischaemic cholangiopathy; retransplantation, and death. Baseline values and ranges were determined from an extensive literature review. Sensitivity analyses tested model strength and parameter variability.

Results:

Overall survival is decreased, and biliary complications and retransplantation are increased in recipients of DCD livers. Recipients of DBD livers gained 5.6 quality-adjusted life years (QALYs) at a cost of US$69 000/QALY, whereas recipients on the DBD + DCD LT waitlist gained 6.0 QALYs at a cost of US$61 000/QALY. The DBD + DCD organ strategy was superior to the DBD organ-only strategy.
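The cost-per-QALY comparison above can be sketched as a discrete-time Markov cohort model cycling through the four health states named in the Methods. This is a minimal illustration only: the transition probabilities, utilities, and costs below are hypothetical placeholders, not the study's calibrated parameters:

```python
import numpy as np

def markov_qalys(P, utility, cost, cycles=10):
    """Annual-cycle Markov cohort model over a 10-year horizon.

    P[i][j]   : probability of moving from state i to state j each cycle
    utility[i]: QALY weight accrued per cycle spent in state i
    cost[i]   : cost accrued per cycle spent in state i
    States: 0 survival, 1 ischaemic cholangiopathy,
            2 retransplantation, 3 death (absorbing).
    """
    dist = np.array([1.0, 0.0, 0.0, 0.0])  # whole cohort starts in 'survival'
    total_qalys = total_cost = 0.0
    for _ in range(cycles):
        total_qalys += dist @ utility
        total_cost += dist @ cost
        dist = dist @ P  # advance the cohort one annual cycle
    return total_qalys, total_cost

# Hypothetical transition matrix (rows sum to 1) and per-cycle payoffs
P = np.array([
    [0.90, 0.04, 0.02, 0.04],
    [0.30, 0.50, 0.10, 0.10],
    [0.60, 0.05, 0.15, 0.20],
    [0.00, 0.00, 0.00, 1.00],
])
utility = np.array([0.80, 0.55, 0.50, 0.0])
cost = np.array([10_000, 25_000, 90_000, 0.0])
qalys, total = markov_qalys(P, utility, cost)
print(f"{qalys:.2f} QALYs at ${total / qalys:,.0f}/QALY")
```

Running the same model under two waitlist strategies (DBD-only versus DBD + DCD parameter sets) and comparing QALYs and cost/QALY mirrors the comparison reported above.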

Conclusions:

The extension of life and quality of life provided by DCD LT to patients on the waiting list who might otherwise not receive a liver transplant makes the continued use of DCD livers cost-effective.

12.

Summary

Background and objectives

Type 2 diabetic patients with end-stage renal disease may receive a simultaneous pancreas-kidney (SPK) transplant. However, outcomes are not well described. Risks for death and graft failure were examined in SPK type 2 diabetic recipients.

Design, setting, participants, & measurements

Using the United Network for Organ Sharing database, outcomes of SPK transplants were compared between type 2 and type 1 diabetic recipients. All primary SPK adult recipients transplanted between 2000 and 2007 (n = 6756) were stratified according to end-stage pancreas disease diagnosis (type 1: n=6141, type 2: n=582). Posttransplant complications and risks for death and kidney/pancreas graft failure were compared.

Results

Of the 6756 SPK transplants, 8.6% were performed in recipients with a type 2 diabetes diagnosis. Rates of delayed kidney graft function and primary kidney nonfunction were higher in the type 2 diabetics. Five-year overall and death-censored kidney graft survival were inferior in type 2 diabetics. After adjustment for other risk factors, including recipient (age, race, body weight, dialysis time, and cardiovascular comorbidities), donor, and transplant immune characteristics, type 2 diabetes was not associated with increased risk for death or kidney or pancreas failure when compared with type 1 diabetic recipients.

Conclusions

After adjustment for other risk factors, SPK recipients with type 2 diabetes diagnosis were not at increased risk for death, kidney failure, or pancreas failure when compared with recipients with type 1 diabetes.

13.

Summary

Background and objectives

The choice of induction agent in the elderly kidney transplant recipient is unclear.

Design, setting, participants, & measurements

The risks of rejection at 1 year, functional graft loss, and death by induction agent (IL2 receptor antibodies [IL2RA], alemtuzumab, and rabbit antithymocyte globulin [rATG]) were compared among five groups of elderly (≥60 years) deceased-donor kidney transplant recipients on the basis of recipient risk and donor risk using United Network of Organ Sharing data from 2003 to 2008.

Results

In high-risk recipients with high-risk donors there was a higher risk of rejection and functional graft loss with IL2RA versus rATG. Among low-risk recipients with low-risk donors there was no difference in outcomes between IL2RA and rATG. In the two groups in which donor or recipient was high risk, there was a higher risk of rejection but not functional graft loss with IL2RA. Among low-risk recipients with high-risk donors, there was a trend toward a higher risk of death with IL2RA.

Conclusions

rATG may be preferable in high-risk recipients with high-risk donors and possibly low-risk recipients with high-risk donors. In the remaining groups, although rATG is associated with a lower risk of acute rejection, long-term outcomes do not appear to differ. Prospective comparison of these agents in an elderly cohort is warranted to compare the efficacy and adverse consequences of these agents to refine the use of induction immunosuppressive therapy in the elderly population.

14.

Background

Postoperative infections are frequent complications after liver resection and have significant impact on length of stay, morbidity and mortality. Surgical site infection (SSI) is the most common nosocomial infection in surgical patients, accounting for 38% of all such infections.

Objectives

This study aimed to identify predictors of SSI and organ space SSI after liver resection.

Methods

Data from the American College of Surgeons National Surgical Quality Improvement Program (ACS–NSQIP) database for patients who underwent liver resection in 2005, 2006 or 2007 in any of 173 hospitals throughout the USA were analysed. All patients who underwent a segmental resection, left hepatectomy, right hepatectomy or trisectionectomy were included.

Results

The ACS–NSQIP database contained 2332 patients who underwent hepatectomy during 2005–2007. Rates of SSI varied significantly across primary procedures, ranging from 9.7% in segmental resection patients to 18.3% in trisectionectomy patients. A preoperative open wound, hypernatraemia, hypoalbuminaemia, elevated serum bilirubin, dialysis and longer operative time were independent predictors for SSI and for organ space SSI.

Conclusions

These findings may contribute towards the identification of patients at risk for SSI and the development of strategies to reduce the incidence of SSI and subsequent costs after liver resection.

15.

Summary

Background and objectives

Recent interest has focused on wait listing patients without pretreating coronary artery disease to expedite transplantation. Our practice is to offer coronary revascularization before transplantation if indicated.

Design, setting, participants, & measurements

Between 2006 and 2009, 657 patients (427 men, 230 women; mean age 56.5 ± 9.94 years) underwent pretransplant assessment with coronary angiography. Of these, 573 (87.2%) were wait-listed, and 247 of the 573 (43.1%) were transplanted during the follow-up period (mean 30.09 ± 11.67 months).

Results

Patient survival for those not wait listed was poor, 83.2% and 45.7% at 1 and 3 years, respectively. In wait-listed patients, survival was 98.9% and 95.3% at 1 and 3 years, respectively. 184 of 657 (28.0%) patients were offered revascularization. Survival in patients (n = 16) declining revascularization was poor: 75% survived 1 year and 37.1% survived 3 years. Patients undergoing revascularization followed by transplantation (n = 51) had a 98.0% and 88.4% cardiac event–free survival at 1 and 3 years, respectively. Cardiac event–free survival for patients revascularized and awaiting deceased donor transplantation was similar: 94.0% and 90.0% at 1 and 3 years, respectively.

Conclusions

Our data suggest that pre-emptive coronary revascularization is associated with excellent survival rates not only in patients subsequently transplanted but also in those waiting on dialysis for a deceased donor transplant.

16.

Summary

Background and objectives

Pre-existing hepatitis B virus (HBV) infection has been associated with inferior renal transplant outcomes. We examined outcomes of HBV+ renal recipients in a more recent era, following the availability of oral antiviral agents.

Design, setting, participants, & measurements

Using the Organ Procurement Transplant Network/United Network for Organ Sharing database, we selected adult primary kidney recipients transplanted in the United States (2001 to 2007). The cohort was divided into HBV+ (surface antigen positive, n = 1346) and HBV− patients (surface antigen negative; n = 74,335). Five-year graft survival, patient survival, hepatic failure incidence, and associated adjusted risks were compared.

Results

HBV+ recipients were more frequently Asian, had a lower body mass index, and more often had glomerulonephritis as the etiology of ESRD. HBV+ recipients had less pretransplant diabetes and cardiovascular disease, were less likely to receive a living donor kidney, and were less likely to receive steroids at discharge. Five-year patient survival was 85.3% and 85.6%, and graft survival was 74.9% and 75.1%, for HBV+ and HBV− recipients, respectively. HBV infection was not a risk factor for death or kidney failure, although the 5-year cumulative incidence of hepatic failure was higher in HBV+ recipients (1.3% versus 0.2%; P < 0.001), and HBV+ status was associated with 5.5- and 5.2-fold increased risks of hepatic failure in living and deceased donor recipients, respectively, compared with HBV−.

Conclusions

In a recent era (2001 to 2007), HBV-infected renal recipients were not at higher risk for kidney failure or death; however, they remain at higher risk of liver failure compared with HBV− recipients.

17.

Summary

Background and objectives

Major predisposing risks for the development of SLE in the nontransplant setting have been reported to include female gender, ethnicity, and genetic factors among others. In the current study, we aimed to determine whether increasing haplotype match in living donor renal transplantation would have a negative impact on the long-term rates of graft loss due to lupus nephritis recurrence.

Design, setting, participants, and measurements

Data were provided by the Organ Procurement and Transplantation Network—United Network for Organ Sharing. Living-related primary kidney transplants performed between January 1, 1988, and December 31, 2007 with the native renal diagnosis of lupus nephritis for all patients alive and with functioning graft at discharge were included. The cumulative probability rates of allograft loss due to recurrence of lupus nephritis (RLN) stratified by haplotype match and immunosuppression were obtained.

Results

The cumulative probability rates of graft loss due to RLN in primary kidney transplant recipients receiving cyclosporine-based immunosuppression were 4.8% (n = 187), 2.9% (n = 602), and 0.7% (n = 192) for recipients with 0-, 1-, and 2-haplotype matches, respectively. Similarly, recipients receiving “all maintenance” immunosuppressive therapy with 0-, 1-, and 2-haplotype matches had graft loss rates of 4.3% (n = 433), 2.3% (n = 1049), and 0.5% (n = 303), respectively. Chi-squared analyses revealed no significant gender or ethnic background differences among haplotype groups. Compared with 0-haplotype, 1- and 2-haplotype matched recipients were generally younger.

Conclusions

Living-related kidney donation with increasing haplotype match is unexpectedly associated with lower rates of allograft loss due to RLN. Potential contributory factors to this positive effect are not known.

18.

BACKGROUND:

In a previous small retrospective study, the authors reported that hepatopulmonary syndrome was less common among liver transplant candidates at high-altitude centres compared with low-altitude centres.

OBJECTIVE:

To further explore the relationship between hepatopulmonary syndrome and altitude of residence in a larger patient cohort.

METHODS:

A cohort of 65,264 liver transplant candidates in the Organ Procurement and Transplantation Network liver database between 1988 and 2006 was analyzed. Hepatopulmonary syndrome diagnosis was determined during a comprehensive evaluation at a liver transplant centre by physicians who were experienced in the diagnosis and treatment of hepatopulmonary syndrome. The altitude of residence was determined for each patient by assigning the mean altitude of the zip code of residence at the time of entry on the wait list. Mean zip code elevation was calculated using the National Elevation Dataset of the United States Geological Survey, which provides exact elevation measurements across the entire country.

RESULTS:

Hepatopulmonary syndrome was significantly less common at higher resident altitudes (P=0.015). After adjusting for age, sex and Model for End-Stage Liver Disease score, there was a 46% decrease in the odds of hepatopulmonary syndrome with every increase of 1000 m of resident elevation (OR 0.54 [95% CI 0.33 to 0.89]).
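The "46% decrease per 1000 m" statement is the odds ratio restated: in a logistic model the OR for a 1000 m increase equals exp(beta), and the fractional change in the odds is OR − 1. A quick arithmetic check, taking the reported OR as given rather than refitting any model:

```python
import math

or_per_1000m = 0.54                    # adjusted odds ratio reported above
beta = math.log(or_per_1000m)          # implied logistic coefficient per 1000 m
decrease = 1 - or_per_1000m            # fractional drop in the odds per 1000 m
print(f"beta = {beta:.3f}, decrease in odds = {decrease:.0%}")

# Odds scale multiplicatively with elevation: a 2000 m rise implies 0.54**2
print(f"implied OR per 2000 m = {or_per_1000m ** 2:.2f}")
```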

CONCLUSION:

There was a negative association between altitude and hepatopulmonary syndrome. One plausible explanation is that the lower ambient oxygen found at higher elevation leads to pulmonary vasoconstriction, which mitigates the primary physiological lesion of hepatopulmonary syndrome, namely, pulmonary vasodilation.

19.

Background

Although liver transplantation is the last resort for treating end-stage liver disease, this procedure is not available to all patients in need because of the inadequate organ supply. Therefore, guidelines have been developed by medical experts to regulate the process. Some professionals believe that medical criteria alone are inadequate for organ allocation in all situations and may not ensure fairness of allocation.

Objectives

The current study has been designed to identify decision criteria about allocation of donated liver to potential recipients from public points of view.

Patients and Methods

This is a qualitative study that was conducted through individual interviews and Focus Group Discussions. Individual interviews were conducted among patients’ companions and nurses in one of the two liver transplant centers in Iran. Group discussions were conducted among groups of ordinary people who had not dealt previously with the subject. Data was analyzed by Thematic Analysis method.

Results

Most of the participants in this study believe that in equal medical conditions, some individual and societal criteria could be used to prioritize patients for receiving donated livers. The criteria include psychological acceptance, ability to pay post-operative care costs, being breadwinner of the family, family support, being socially valued, ability to be instructed, lack of mental disorders, young age of the recipient, being on waiting list for a long time, lack of patient’s role in causing the illness, first time transplant recipient, critical medical condition, high success rate of transplantation, lack of concurrent medical illnesses, not being an inmate at the time of receiving transplant, and bearing Iranian nationality.

Conclusions

Taking public opinion into consideration may smooth the process of allocating organs among patients in need who have equal medical conditions. Considering these viewpoints when drafting organ allocation guidelines may increase society's confidence in the equity of organ allocation in the country. This strategy may also encourage people to donate organs, particularly after death.

20.

BACKGROUND:

Donation after circulatory death is a novel method of expanding the pool of donor lungs available for transplantation, with the potential to increase the number of transplants performed.

METHODS:

Three bilateral lung transplants from donors after circulatory death were performed over a six-month period. Following organ retrieval, all sets of lungs were placed on a portable ex vivo lung perfusion device for evaluation and preservation.

RESULTS:

Lung function remained stable during portable ex vivo perfusion, with improvement in partial pressure of oxygen/fraction of inspired oxygen ratios. Mechanical ventilation was discontinued within 48 h for each recipient and no patient stayed in the intensive care unit longer than eight days. There was no postgraft dysfunction at 72 h in two of the three recipients. Ninety-day mortality for all recipients was 0% and all maintain excellent forced expiratory volume in 1 s and forced vital capacity values post-transplantation.

CONCLUSION:

The authors report excellent results from their initial experience using lungs from donors after circulatory death, assessed with portable ex vivo lung perfusion. It is hoped this will allow the most efficient use of available donor lungs, leading to more transplants and fewer deaths among potential recipients on wait lists.
