1.

Background

Children with Down syndrome have an increased risk of developing acute lymphoblastic leukemia and a poor tolerance of methotrexate. This latter problem is assumed to be caused by a higher cellular sensitivity of tissues in children with Down syndrome. However, whether differences in pharmacokinetics play a role is unknown.

Design and Methods

We compared methotrexate-induced toxicity and pharmacokinetics in a retrospective case-control study between patients with acute lymphoblastic leukemia who did or did not have Down syndrome. Population pharmacokinetic models were fitted to data from all individuals simultaneously, using non-linear mixed-effects modeling.

Results

Overall, 468 courses of methotrexate (1–5 g/m2) were given to 44 acute lymphoblastic leukemia patients with Down syndrome and to 87 acute lymphoblastic leukemia patients without Down syndrome. Grade 3–4 gastrointestinal toxicity was significantly more frequent in the children with Down syndrome than in those without (25.5% versus 3.9%; P=0.001). The occurrence of grade 3–4 gastrointestinal toxicity was not related to the plasma methotrexate area under the curve. Methotrexate clearance was 5% lower in the acute lymphoblastic leukemia patients with Down syndrome (P=0.001); however, this small difference is probably not clinically relevant, because no significant differences in methotrexate plasma levels were detected at 24 and 48 hours.
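The pharmacokinetic reasoning here follows from the linear-elimination identity AUC = Dose/CL: a 5% lower clearance can only raise exposure by about 5%. A minimal sketch, with invented dose and clearance values rather than the study's data:

```python
# Toy one-compartment illustration (not the authors' population PK model):
# with linear elimination, AUC = Dose / CL, so a 5% lower methotrexate
# clearance translates into only a ~5% higher exposure.

def auc_iv(dose_mg: float, clearance_l_per_h: float) -> float:
    """AUC (mg*h/L) after an IV dose, assuming linear elimination."""
    return dose_mg / clearance_l_per_h

dose = 5000.0                      # hypothetical 5 g dose
cl_reference = 10.0                # hypothetical clearance, L/h
cl_reduced = cl_reference * 0.95   # 5% lower clearance

auc_ref = auc_iv(dose, cl_reference)
auc_red = auc_iv(dose, cl_reduced)
relative_increase = auc_red / auc_ref - 1.0
print(f"AUC increase from 5% lower clearance: {relative_increase:.1%}")
```

The ratio is independent of the dose chosen, which is why the small clearance difference cannot explain the large toxicity difference.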

Conclusions

We did not find evidence of differences in the pharmacokinetics of methotrexate between patients with and without Down syndrome that could explain the higher frequency of gastrointestinal toxicity and the greater need for methotrexate dose reductions in patients with Down syndrome. Hence, these problems are most likely explained by differential pharmacodynamic effects in the tissues of children with and without Down syndrome. Although the number of patients was too small to draw firm conclusions, we feel that it may be safe in children with Down syndrome to start with intermediate dosages of methotrexate (1–3 g/m2) and to monitor the patients carefully.

2.

Background

Corticosteroids are a standard component of the treatment of acute lymphoblastic leukemia and lymphoblastic lymphoma. Our aim was to determine whether dexamethasone results in a better outcome than prednisolone.

Design and Methods

Adult patients with acute lymphoblastic leukemia or lymphoblastic lymphoma were randomized to receive, as part of their induction therapy on days 1–8 and 15–22, either dexamethasone 8 mg/m2 or prednisolone 60 mg/m2. Those who reached complete remission were given two courses of consolidation therapy with high-dose cytarabine and mitoxantrone and methotrexate and asparaginase. Subsequently patients younger than 50 years, with a suitable donor, were to undergo allogeneic stem cell transplantation, whereas the others were planned to receive either an autologous stem cell transplant or high-dose maintenance chemotherapy with prophylactic central nervous system irradiation. Randomization was done with a minimization technique. The primary endpoint was event-free survival and the analysis was conducted on an intention-to-treat basis.
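The "minimization technique" named for randomization can be sketched as follows; the two arms match the trial, but the stratification factors, counts and scoring rule below are illustrative assumptions, not the actual ALL-4 procedure:

```python
import random

# Hedged sketch of minimization randomization: each new patient goes to the
# arm that minimizes covariate imbalance over the patient's factor levels,
# with ties broken at random. This is a simplified Pocock-Simon-style rule.

def minimization_assign(arm_counts, patient_levels, rng):
    """arm_counts maps arm -> {factor_level: count}; returns the chosen arm.

    An arm's score is the total count, over this patient's factor levels,
    that the arm would hold after the assignment; the lowest score wins.
    """
    scores = {
        arm: sum(counts.get(level, 0) + 1 for level in patient_levels)
        for arm, counts in arm_counts.items()
    }
    best = min(scores.values())
    choice = rng.choice(sorted(a for a, s in scores.items() if s == best))
    for level in patient_levels:
        arm_counts[choice][level] = arm_counts[choice].get(level, 0) + 1
    return choice

rng = random.Random(42)
counts = {"dexamethasone": {}, "prednisolone": {}}
patients = [("age<50", "ALL"), ("age>=50", "ALL"),
            ("age<50", "LBL"), ("age<50", "ALL")]
assignments = [minimization_assign(counts, p, rng) for p in patients]
```

Unlike simple randomization, this keeps the arms balanced on the chosen factors even in small trials.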

Results

Between August 1995 and October 2003, 325 patients aged 15 to 72 years were randomized to receive either dexamethasone (163 patients) or prednisolone (162 patients). After induction and the first course of consolidation therapy, 131 (80.4%) patients in the dexamethasone group and 124 (76.5%) in the prednisolone group achieved complete remission. No significant difference was observed between the two treatment groups with regard to 6-year event-free survival rates (±SE), which were 25.9% (3.6%) and 28.7% (3.5%) in the dexamethasone and prednisolone groups, respectively (P=0.82; hazard ratio 0.97; 95% confidence interval, 0.75–1.25). Disease-free survival after complete remission was also similar in the dexamethasone and prednisolone groups, the 6-year rates being 32.3% and 37.5%, respectively (hazard ratio 1.03; 95% confidence interval, 0.76–1.40). The 6-year cumulative incidences of relapse were 49.8% and 53.5% (Gray’s test: P=0.30), while the 6-year cumulative incidences of death were 18% and 9% (Gray’s test: P=0.07).

Conclusions

In the ALL-4 trial in adult patients with acute lymphoblastic leukemia or lymphoblastic lymphoma, treatment with dexamethasone did not show any advantage over treatment with prednisolone.

3.

Background

Approximately 40% of adults with Philadelphia chromosome-negative acute lymphoblastic leukemia achieve long-term survival following unrelated donor hematopoietic stem cell transplantation in first complete remission but severe graft-versus-host disease remains a problem affecting survival. Although T-cell depletion abrogates graft-versus-host disease, the impact on disease-free survival in acute lymphoblastic leukemia is not known.

Design and Methods

We analyzed the outcome of 48 adults (median age, 26 years) with high-risk, Philadelphia chromosome-negative acute lymphoblastic leukemia undergoing T-cell-depleted unrelated donor hematopoietic stem cell transplantation (67% matched at 10 of 10 loci) in first complete remission, reported to the British Society of Blood and Marrow Transplantation Registry from 1993 to 2005.

Results

T-cell depletion was carried out by in vivo alemtuzumab administration. Additional ex vivo T-cell depletion was performed in 21% of patients. Overall survival, disease-free survival and non-relapse mortality rates at 5 years were 61% (95% CI 46–75), 59% (95% CI 45–74) and 13% (95% CI 3–25), respectively. The incidences of grades II–IV and III–IV acute graft-versus-host disease were 27% (95% CI 16–44) and 10% (95% CI 4–25), respectively. The actuarial estimate of extensive chronic graft-versus-host disease at 5 years was 22% (95% CI 13–38). High-risk cytogenetics at diagnosis was associated with a lower 5-year overall survival [47% (95% CI 27–71) vs. 68% (95% CI 44–84), P=0.045].

Conclusions

T-cell depleted hematopoietic stem cell transplantation from unrelated donors can result in good overall survival and low non-relapse mortality for adults with high-risk acute lymphoblastic leukemia in first complete remission and merits prospective evaluation.

4.

Aims/Introduction

To assess the current status of glycemic control in patients with type 2 diabetes treated with a combination of metformin and sulfonylurea for >3 months, as measured by glycosylated hemoglobin (HbA1c).

Materials and Methods

Data on patient demographics, diabetic complications, HbA1c, fasting plasma glucose (FPG) and type of treatment were collected in this multicenter, cross-sectional, non-interventional study.

Results

From April 2008 to February 2009, 5,628 patients were recruited from 299 centers in Korea. Patients' characteristics (mean ± SD) were as follows: age 58.4 ± 10.8 years, duration of diabetes 6.1 ± 4.7 years, body mass index 24.7 ± 2.9 kg/m2, HbA1c 7.77 ± 1.22%, FBG 147.4 ± 46.5 mg/dL and FPG 164.0 ± 54.3 mg/dL. The most common diabetic complication was neuropathy (22.5%), followed by retinopathy (18.3%) and microalbuminuria (16.1%). Only 1,524 (27.1%) patients achieved HbA1c ≤7%. A higher proportion of patients treated by endocrinologists (32.6%) achieved HbA1c ≤7% than of those treated by internists (24.4%) or primary care physicians (23.2%). In multivariate analyses, diabetic retinopathy (odds ratio 0.455, 95% confidence interval 0.341–0.606), nephropathy (odds ratio 0.639, 95% confidence interval 0.43–0.949), diabetes duration of ≥5 years (odds ratio 0.493, 95% confidence interval 0.4–0.606) and each additional year of age (odds ratio 1.019, 95% confidence interval 1.01–1.029) were significantly associated with achieving the target HbA1c. In addition, treatment by endocrinologists rather than internists significantly increased the chances of achieving the target HbA1c (odds ratio 1.417, 95% confidence interval 1.146–1.751).

Conclusions

The majority of patients with type 2 diabetes in Korea had inadequate glycemic control, despite receiving a combination of metformin and sulfonylurea.

5.

Background

Lymphopenia and tumor-associated macrophages are negative prognostic factors for survival in classical Hodgkin’s lymphoma. We, therefore, studied whether the peripheral blood absolute lymphocyte count/absolute monocyte count ratio at diagnosis affects survival in classical Hodgkin’s lymphoma.

Design and Methods

We studied 476 consecutive patients with classical Hodgkin’s lymphoma followed at the Mayo Clinic from 1974 to 2010. Receiver operating characteristic curves and area under the curve were used to determine cut-off values for the absolute lymphocyte count/absolute monocyte count ratio at diagnosis, while proportional hazards models were used to compare survival based on the absolute lymphocyte count/absolute monocyte count ratio at diagnosis.
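The cut-off selection described above can be illustrated with a small Youden-index search over candidate thresholds; the data below are synthetic toy values, not the Mayo Clinic cohort:

```python
# Hedged sketch: choosing a prognostic cut-off for the lymphocyte/monocyte
# ratio by maximizing Youden's J = sensitivity + specificity - 1, the usual
# criterion behind ROC-based cut-off selection.

def best_cutoff(ratios, survived):
    """Return (cutoff, J) maximizing sensitivity + specificity - 1,
    where ratio >= cutoff predicts the favourable outcome (survival)."""
    best = (None, -1.0)
    for c in sorted(set(ratios)):
        tp = sum(1 for r, s in zip(ratios, survived) if r >= c and s)
        fn = sum(1 for r, s in zip(ratios, survived) if r < c and s)
        tn = sum(1 for r, s in zip(ratios, survived) if r < c and not s)
        fp = sum(1 for r, s in zip(ratios, survived) if r >= c and not s)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best[1]:
            best = (c, j)
    return best

# Toy data: survivors tend to have higher ratios.
ratios = [0.4, 0.8, 0.9, 1.1, 1.3, 1.8, 2.5, 3.0]
survived = [False, False, True, True, True, True, True, True]
cutoff, j = best_cutoff(ratios, survived)
print(cutoff, round(j, 2))  # -> 0.9 1.0
```

On real data the maximal J is well below 1; the study's reported sensitivity (90%) and specificity (79%) correspond to J ≈ 0.69 at the 1.1 cut-off.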

Results

The median follow-up period was 5.6 years (range, 0.1–33.7 years). An absolute lymphocyte count/absolute monocyte count ratio at diagnosis of 1.1 or more was the best cut-off value for survival with an area under the curve of 0.91 (95% confidence interval, 0.86 to 0.96), a sensitivity of 90% (95% confidence interval, 85% to 96%) and specificity of 79% (95% confidence interval, 73% to 88%). Absolute lymphocyte count/absolute monocyte count ratio at diagnosis was an independent prognostic factor for overall survival (hazard ratio, 0.18; 95% confidence interval, 0.08 to 0.38, P<0.0001); lymphoma-specific survival (hazard ratio, 0.10; 95% confidence interval, 0.04 to 0.25, P<0.0001); progression-free survival (hazard ratio, 0.35; 95% confidence interval, 0.18 to 0.66, P<0.002) and time to progression (hazard ratio, 0.27; 95% confidence interval, 0.17 to 0.57, P<0.0006).

Conclusions

The ratio of absolute lymphocyte count/absolute monocyte count at diagnosis is an independent prognostic factor for survival and provides a single biomarker to predict clinical outcomes in patients with classical Hodgkin’s lymphoma.

6.

Objectives

For patients with unresectable pancreatic cancer (PC), the efficacy and safety of molecular targeted agents (MTAs) in combination with gemcitabine are still unclear. Published randomized controlled trials (RCTs) have reported conflicting results. This study aimed to conduct a systematic review of the literature and to perform a meta-analysis if appropriate.

Methods

Seven electronic databases were searched using a standard technique to November 2011 without restriction on publication status or language. The primary aim was to assess overall survival (OS). Secondary aims were to assess progression-free survival (PFS), overall response rates (ORRs) and grade 3, 4 and 5 toxicities. A random-effects model was used for the meta-analysis.
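The random-effects pooling named in the methods is commonly the DerSimonian-Laird estimator; here is a self-contained sketch on invented per-trial hazard ratios (illustrative values, not the seven RCTs analyzed here):

```python
import math

# Hedged sketch: DerSimonian-Laird random-effects pooling of study-level
# log hazard ratios, the standard approach for this kind of OS meta-analysis.

def dersimonian_laird(log_effects, ses):
    """Pooled log effect, its SE, and between-study variance tau^2 (DL)."""
    w = [1.0 / se**2 for se in ses]                      # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, log_effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_effects))
    df = len(log_effects) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # truncated at zero
    w_re = [1.0 / (se**2 + tau2) for se in ses]          # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, log_effects)) / sum(w_re)
    se_pooled = math.sqrt(1.0 / sum(w_re))
    return pooled, se_pooled, tau2

# Invented per-trial hazard ratios (MTA arm vs. gemcitabine alone) and SEs.
hrs = [0.85, 0.95, 1.05, 0.90, 0.98]
ses = [0.10, 0.08, 0.12, 0.09, 0.11]
log_hr, se, tau2 = dersimonian_laird([math.log(h) for h in hrs], ses)
lo, hi = math.exp(log_hr - 1.96 * se), math.exp(log_hr + 1.96 * se)
print(f"pooled HR {math.exp(log_hr):.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A confidence interval that straddles 1, as in the study's OS result (HR 0.93, 0.85–1.02), indicates no significant pooled survival benefit.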

Results

Seven Phase III RCTs were identified; 1981 patients were treated with MTAs and gemcitabine, and 1992 patients received gemcitabine with or without placebo. No statistically significant difference in OS was found between the two groups [hazard ratio (HR) = 0.93, 95% confidence interval (CI) 0.85–1.02; P = 0.13]. The addition of MTAs improved PFS (HR = 0.86, 95% CI 0.79–0.93; P < 0.001) and ORR (odds ratio 1.35, 95% CI 1.05–1.74; P = 0.01). However, these benefits were accompanied by significantly higher toxicity (P = 0.001).

Conclusions

The findings of this study suggest that the palliation of PC with gemcitabine and MTAs does not provide a significant survival benefit and is associated with increased grade 3 and 4 toxicities.

7.

BACKGROUND:

Single nucleotide polymorphisms in the 5,10-methylenetetrahydrofolate reductase (MTHFR), vascular endothelial growth factor (VEGF), endothelial nitric oxide synthase (eNOS), monocyte chemoattractant protein-1 (MCP-1) and apolipoprotein E (ApoE) genes appear to be a genetic risk factor for atherosclerosis. Common carotid intima-media thickness (cIMT) provides information on the severity of atherosclerosis.

OBJECTIVE:

To investigate the relationship between cIMT and gene polymorphisms associated with atherosclerosis in Turkish patients with coronary artery disease (CAD).

METHODS:

Sixty-two patients with angiographically diagnosed stable CAD were divided into two groups according to their cIMT values (group 1: n=35, cIMT of 1 mm or greater; group 2: n=27, cIMT of less than 1 mm). MTHFR 677 C/T, VEGF–460 C/T, eNOS 894 G/T, MCP-1–2518 A/G and ApoE (E2, E3 and E4) gene polymorphisms (where A is adenine, C is cytosine, G is guanine and T is thymine) were analyzed by polymerase chain reaction and restriction fragment length polymorphism. Evaluations of cardiovascular risk factors and coronary atherosclerotic lesions were performed in all patients. Serum homocysteine and high-sensitivity C-reactive protein were measured and compared between the two groups.

RESULTS:

Serum high-sensitivity C-reactive protein (P=0.04) and homocysteine (P=0.006) levels were higher in group 1 than in group 2. The rates of multivessel CAD and previous myocardial infarction were significantly higher in group 1 than in group 2 (P=0.014). In the study population, no significant difference in cIMT was observed according to the polymorphisms studied. Only hyperhomocysteinemia (OR 1.17 [95% CI 1.01 to 1.35], P=0.033) and previous myocardial infarction (OR 3.76 [95% CI 1.10 to 12.81], P=0.034) maintained a significant correlation with cIMT on multiple logistic regression analysis.

CONCLUSION:

cIMT is increased in patients with hyperhomocysteinemia, inflammation and extended CAD. MTHFR 677 C/T, VEGF–460 C/T, eNOS 894 G/T, MCP-1–2518 A/G and ApoE single nucleotide polymorphisms were not associated with increased cIMT.

8.

Background

Previous studies have shown differences in the impact of regular physical exercise on the risk of venous thromboembolism. The inconsistent findings may have depended on differences in study design and specific population cohorts (men only, women only and the elderly). We conducted a prospective, population-based cohort study to investigate the impact of regular physical exercise on the risk of venous thromboembolism.

Design and Methods

Risk factors, including self-reported moderate intensity physical exercise during leisure time, were recorded for 26,490 people aged 25–97 years old, who participated in a population health survey, the Tromsø study, in 1994–95. Incident venous thromboembolic events were registered during the follow-up until September 1, 2007.

Results

There were 460 validated incident venous thromboembolic events (1.61 per 1000 person-years) during a median of 12.5 years of follow-up. Age, body mass index, the proportion of daily smokers, total cholesterol, and serum triglycerides decreased (P<0.001), whereas high-density lipoprotein cholesterol increased (P<0.001), across increasing categories of physical exercise. Regular physical exercise of moderate to high intensity during leisure time did not significantly affect the risk of venous thromboembolism in the general population. However, compared to inactivity, high amounts of physical exercise (≥3 hours/week) tended to increase the risk of provoked venous thromboembolism (multivariable hazard ratio, 1.30; 95% confidence interval, 0.84–2.0), and of total venous thromboembolism in the elderly (multivariable hazard ratio, 1.33; 95% confidence interval, 0.80–2.21) and in the obese (multivariable hazard ratio, 1.49; 95% confidence interval, 0.63–3.50). Conversely, compared to inactivity, moderate physical activity (1.0–2.9 hours/week) was associated with a borderline-significant decreased risk of venous thromboembolism among subjects under 60 years old (multivariable hazard ratio, 0.72; 95% confidence interval, 0.48–1.08) and subjects with a body mass index of less than 25 kg/m2 (multivariable hazard ratio, 0.59; 95% confidence interval, 0.35–1.01).

Conclusions

Our study showed that regular, moderate intensity physical exercise did not have a significant impact on the risk of venous thromboembolism in a general population. Future studies are required to assess the impact of regular physical exercise on venous thromboembolism risk in different population subgroups.

9.

Objectives

The role of uric acid as a prognostic factor in patients with acute ST elevation myocardial infarction is controversial. The purpose of this study was to examine the relationship between serum uric acid level and mortality during the admission period and the 30-day period after admission.

Methods

We assessed the relation between serum uric acid level and in-hospital and short-term mortality rates in 184 patients admitted with acute ST elevation myocardial infarction. We divided the patients according to their gender and the uric acid level measured on admission into four groups: group A1, men with uric acid ≤7 mg/dl, versus group B1, men with uric acid >7 mg/dl; and group A2, women with uric acid ≤5.6 mg/dl, versus group B2, women with uric acid >5.6 mg/dl. The patients were followed for 30 days after admission.
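The group comparisons described above reduce to relative risks computed from 2×2 tables; a minimal sketch with invented counts (not the study's 184 patients):

```python
import math

# Hedged sketch: relative risk with a 95% CI from a 2x2 table, the kind of
# comparison reported for the hyperuricemic vs. normouricemic groups.
# The counts are invented for illustration.

def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """RR and 95% CI via the standard log-RR (Katz) variance formula."""
    r1 = events_exposed / n_exposed
    r0 = events_unexposed / n_unexposed
    rr = r1 / r0
    se = math.sqrt(1 / events_exposed - 1 / n_exposed
                   + 1 / events_unexposed - 1 / n_unexposed)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

rr, lo, hi = relative_risk(8, 60, 3, 124)   # invented event counts
print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A lower CI bound above 1, as in the study's in-hospital result (RR 13.33, 1.55–114.7), marks a significant excess risk despite the wide interval.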

Results

The in-hospital mortality rate in group B1 was higher than in group A1 [P=0.011; relative risk 13.33 (95% confidence interval: 1.55–114.7)]. Short-term all-cause mortality was also significantly higher in group B1 patients [P=0.037; relative risk 3.3 (95% confidence interval: 1.02–10.64)]. Multivariate logistic regression analysis showed an odds ratio of 15.23 for in-hospital mortality and an odds ratio of 3.76 for short-term mortality in male hyperuricemic patients.

Conclusions

Our data suggest that in the acute phase of ST elevation myocardial infarction, uric acid has a prognostic role for in-hospital and short-term (30-day) mortality in men.

10.

Background and objectives

AKI after coronary angiography is associated with poor long-term outcomes. The relationship between contrast-associated AKI and subsequent use of prognosis-modifying cardiovascular medications is unknown.

Design, setting, participants, & measurements

A cohort study was performed of 5911 participants 66 years of age or older with acute coronary syndrome who received a coronary angiogram in Alberta, Canada, between November 1, 2002, and November 30, 2008. AKI was identified according to Kidney Disease Improving Global Outcomes AKI criteria.

Results

In multivariable logistic regression models, compared with participants without AKI, those with stages 1 and 2–3 AKI had lower odds of subsequent use of angiotensin-converting enzyme inhibitors/angiotensin receptor blockers within 120 days of hospital discharge (adjusted odds ratio, 0.65; 95% confidence interval, 0.53 to 0.80 and odds ratio, 0.34; 95% confidence interval, 0.23 to 0.48, respectively). Subsequent statin and β-blocker use within 120 days of hospital discharge was significantly lower among those with stages 2–3 AKI (adjusted odds ratio, 0.44; 95% confidence interval, 0.31 to 0.64 and odds ratio, 0.46; 95% confidence interval, 0.31 to 0.66, respectively). These associations were consistently seen in patients with diabetes mellitus, heart failure, low baseline eGFR, and albuminuria. During subsequent follow-up after hospital discharge (mean, 3.1 years), 952 participants died. The use of each class of cardiovascular medication was associated with lower mortality, including among those who had experienced AKI.

Conclusions

Strategies to optimize the use of cardiac medications in people with AKI after coronary angiography might improve care.

11.

Background and objectives

Permanent hemodialysis vascular access is crucial for RRT in ESRD patients and patients with failed renal transplants, because central venous catheters are associated with greater risk of infection and mortality than arteriovenous fistulae or arteriovenous grafts. The objective of this study was to determine the types of vascular access used by patients initiating hemodialysis after a failed renal transplant.

Design, setting, participants, & measurements

Data from the US Renal Data System database on 16,728 patients with a failed renal transplant and 509,643 patients with native kidney failure who initiated dialysis between January 1, 2006, and September 30, 2011 were examined.

Results

At initiation of dialysis, of patients with a failed transplant, 27.7% (n=4636) used an arteriovenous fistula, 6.9% (n=1146) used an arteriovenous graft, and 65.4% (n=10,946) used a central venous catheter. In comparison, 80.8% (n=411,997) of patients with native kidney failure initiated dialysis with a central venous catheter (P<0.001). Among patients with a failed transplant, predictors of central venous catheter use included female sex (adjusted odds ratio, 1.75; 95% confidence interval, 1.63 to 1.87), lack of referral to a nephrologist (odds ratio, 2.00; 95% confidence interval, 1.72 to 2.33), diabetes (odds ratio, 1.14; 95% confidence interval, 1.06 to 1.22), peripheral vascular disease (odds ratio, 1.31; 95% confidence interval, 1.16 to 1.48), and being institutionalized (odds ratio, 1.53; 95% confidence interval, 1.23 to 1.89). Factors associated with lower odds of central venous catheter use included older age (odds ratio, 0.85 per 10 years; 95% confidence interval, 0.83 to 0.87), public insurance (odds ratio, 0.74; 95% confidence interval, 0.68 to 0.80), and current employment (odds ratio, 0.87; 95% confidence interval, 0.80 to 0.95).

Conclusions

Central venous catheters are used in nearly two thirds of failed renal transplant patients. These patients are usually followed closely by transplant physicians before developing ESRD after a failed transplant, but the relatively low prevalence of arteriovenous fistulae/arteriovenous grafts in this group at initiation of dialysis needs to be investigated more thoroughly.

12.

Aims/Introduction

We investigated the relationship between the frequency of self-monitoring of blood glucose (SMBG) and glycemic control in type 1 diabetes mellitus patients on continuous subcutaneous insulin infusion (CSII) or on multiple daily injections (MDI) using data management software.

Materials and Methods

We recruited 148 adult type 1 diabetes mellitus patients (CSII n = 42, MDI n = 106) and downloaded their SMBG records to the MEQNET™ SMBG Viewer software (Arkray Inc., Kyoto, Japan). The association between SMBG frequency and the patients' hemoglobin A1c (HbA1c) levels was analyzed using the χ2-test, and linear regression analysis was carried out to characterize their relationship.

Results

The odds ratio of achieving a target HbA1c level of <8% (63.9 mmol/mol) was significantly higher in subjects with SMBG frequencies of ≥3.5 times/day than in those with SMBG frequencies of <3.5 times/day in the CSII group (odds ratio 7.00, 95% confidence interval 1.72–28.54), but not in the MDI group (odds ratio 1.35, 95% confidence interval 0.62–2.93). A significant correlation between SMBG frequency and the HbA1c level was detected in the CSII group (HbA1c [%] = –0.24 × SMBG frequency [times/day] + 8.60; HbA1c [mmol/mol] = –2.61 × SMBG frequency [times/day] + 70.5; r = –0.384, P = 0.012), but not in the MDI group.
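A regression of the same form as the CSII-group equation can be reproduced with a plain least-squares fit; the data points below are invented toy values that happen to yield a similar line, not the study's measurements:

```python
# Hedged sketch: ordinary least-squares fit of HbA1c on SMBG frequency,
# the same form as the regression reported for the CSII group.

def ols(xs, ys):
    """Slope and intercept of the least-squares line y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

freq = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]    # SMBG tests/day (invented)
hba1c = [8.4, 8.1, 7.9, 7.6, 7.4, 7.2]   # HbA1c % (invented)
slope, intercept = ols(freq, hba1c)
print(f"HbA1c = {slope:.2f} * frequency + {intercept:.2f}")
```

The fitted slope here (about −0.24 %-points of HbA1c per additional daily test) matches the magnitude the study reports for its CSII patients.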

Conclusions

An SMBG frequency of <3.5 times per day appeared to be a risk factor for poor glycemic control (HbA1c ≥8%) in type 1 diabetes mellitus patients on CSII.

13.

Background

In order to improve the molecular response rate and prevent resistance to treatment, combination therapy with different dosages of imatinib and cytarabine was studied in newly diagnosed patients with chronic myeloid leukemia in the HOVON-51 study.

Design and Methods

Having reported feasibility previously, we hereby report the efficacy of escalated imatinib (200 mg, 400 mg, 600 mg or 800 mg) in combination with two cycles of intravenous cytarabine (200 mg/m2 or 1000 mg/m2 days 1 to 7) in 162 patients with chronic myeloid leukemia.

Results

With a median follow-up of 55 months, the 5-year cumulative incidences of complete cytogenetic response, major molecular response, and complete molecular response were 89%, 71%, and 53%, respectively. A higher Sokal risk score was inversely associated with complete cytogenetic response (hazard ratio of 0.63; 95% confidence interval, 0.50–0.79, P<0.001). A higher dose of imatinib and a higher dose of cytarabine were associated with increased complete molecular response with hazard ratios of 1.60 (95% confidence interval, 0.96–2.68, P=0.07) and 1.66 (95% confidence interval, 1.02–2.72, P=0.04), respectively. Progression-free survival and overall survival rates at 5 years were 92% and 96%, respectively. Achieving a major molecular response at 1 year was associated with complete absence of progression and a probability of achieving a complete molecular response of 89%.

Conclusions

The addition of intravenous cytarabine to imatinib as upfront therapy for patients with chronic myeloid leukemia is associated with a high rate of complete molecular responses (ClinicalTrials.gov identifier: NCT00028847).

14.

Background

A high prevalence of gallbladder diseases (GBD) in Northern India warranted a population survey into environmental risk factors.

Methods

In 60 villages of Uttar Pradesh and Bihar, 22,861 persons aged >30 years from 13,334 households were interviewed for symptoms of GBD, diet and environmental factors. Subsequently, ultrasonography (US) was performed in 5100 people with symptoms and 1448 without. Heavy metal and pesticide content in soil and water was estimated.

Results

US revealed a prevalence of GBD of 6.20%. GBD was more common in the 5100 persons with symptoms (7.12%) than in the 1448 without (2.99%) (P < 0.05). Adjusted odds ratios (OR) [95% confidence interval (CI)] revealed a significantly increased risk of GBD in females >50 years (1.703, CI 1.292–2.245), with multiparity (1.862, CI 1.306–2.655) and with a family history of GBD (1.564, CI 1.049–2.334). In males, an increased risk was noted with diabetes (4.271, CI 2.130–8.566), chickpea consumption (2.546, CI 1.563–4.146) and drinking unsafe water (3.835, CI 2.368–6.209). The prevalence of gallstones was 4.15%, higher in females (5.59%) than in males (1.99%) (P < 0.05). Cluster analysis identified a positive correlation of nickel, cadmium and chromium in water with a high prevalence of GBD in adjacent villages of Vaishali district, Bihar.

Conclusion

A high risk of GBD was observed in older, multiparous women; in men with diabetes, chickpea consumption or intake of unsafe water; and in villages with heavy-metal water pollution.

15.

Background

Because hepatitis C virus infection causes hepatic and immunological dysfunction, we hypothesized that seropositivity for this virus could be associated with increased non-relapse mortality after allogeneic hematopoietic stem cell transplantation.

Design and Methods

We performed a case-control study of the outcomes of patients who were hepatitis C virus seropositive at the time of allogeneic hematopoietic stem cell transplantation (N=31). Patients positive for hepatitis C virus were considered candidates for stem cell transplantation only if they had no significant evidence of hepatic dysfunction. Matched controls (N=31) were seronegative for viral hepatitides and were paired according to age, diagnosis, disease stage, conditioning regimen and donor type. We also compared the hepatitis C virus seropositive patients to all seronegative patients (all controls, N=1800) transplanted during the same period, to adjust for other confounding effects.

Results

The median age of the seropositive patients was 49 years (range, 26–72); 15 had acute myeloid leukemia/myelodysplastic syndrome, 6 had chronic myeloid leukemia/myeloproliferative disease, 6 non-Hodgkin’s lymphoma, 2 myeloma, 1 acute lymphocytic leukemia and 1 Hodgkin’s lymphoma; 61% had poor-risk disease; 68% had related donors; 68% received reduced intensity conditioning; 7 patients had mildly abnormal alanine transaminase levels (all less than three times the upper limit of normal) and 1 patient had minimally elevated bilirubin. These characteristics were similar to those of the matched control group. Median overall survival was 3, 18 and 20 months, and 1-year survival was 29%, 56% and 56%, in the hepatitis C virus, matched-control and all-controls groups, respectively (hazard ratio for death 3.1, 95% confidence interval 1.9–5.6, P<0.001 in multivariate analysis). Non-relapse mortality at 1 year was 43%, 24% and 23%, respectively (hazard ratio 3.3, 95% confidence interval 1.8–7.1, P<0.01). Disease progression and graft-versus-host disease rates were comparable.

Conclusions

Hepatitis C virus seropositivity is a significant risk factor for non-relapse mortality after allogeneic hematopoietic stem cell transplantation even in patients with normal or minimally abnormal liver function tests.

16.

Background and objectives

Preoperative anemia adversely affects outcomes of cardiothoracic surgery. However, in patients with CKD, treating anemia to a target of normal hemoglobin has been associated with increased risk of adverse cardiac and cerebrovascular events. We investigated the association between preoperative hemoglobin and outcomes of cardiac surgery in patients with CKD and assessed whether there was a level of preoperative hemoglobin below which the incidence of adverse surgical outcomes increases.

Design, setting, participants, & measurements

This prospective observational study included adult patients with CKD stages 3–5 (eGFR<60 ml/min per 1.73 m2) undergoing cardiac surgery from February 2000 to January 2010. Patients were classified into four groups stratified by preoperative hemoglobin level: <10, 10–11.9, 12–13.9, and ≥14 g/dl. The outcomes were postoperative AKI requiring dialysis, sepsis, cerebrovascular accident, and mortality.

Results

In total, 788 patients with a mean eGFR of 43.5±13.7 ml/min per 1.73 m2 were evaluated, of whom 22.5% had preoperative hemoglobin within the normal range (men: 14–18 g/dl; women: 12–16 g/dl). Univariate analysis revealed an inverse relationship between the incidence of all adverse postoperative outcomes and hemoglobin level. Using hemoglobin as a continuous variable, multivariate logistic regression analysis showed a proportionally greater frequency of all adverse postoperative outcomes per 1-g/dl decrement of preoperative hemoglobin (mortality: odds ratio, 1.38; 95% confidence interval, 1.23 to 1.57; P<0.001; sepsis: odds ratio, 1.31; 95% confidence interval, 1.14 to 1.49; P<0.001; cerebrovascular accident: odds ratio, 1.31; 95% confidence interval, 1.00 to 1.67; P=0.03; postoperative hemodialysis: odds ratio, 1.38; 95% confidence interval, 1.11 to 1.75; P<0.01). Moreover, preoperative hemoglobin<12 g/dl was an independent risk factor for postoperative mortality (odds ratio, 2.6; 95% confidence interval, 1.1 to 7.3; P=0.04).

Conclusions

Similar to the general population, preoperative anemia is associated with adverse postoperative outcomes in patients with CKD. Whether outcomes could be improved by therapeutically targeting higher preoperative hemoglobin levels before cardiac surgery in patients with underlying CKD remains to be determined.

17.

Background and objectives

Hispanics are the largest minority group in the United States. The leading cause of death in patients with chronic kidney disease (CKD) is cardiovascular disease (CVD), yet little is known about its prevalence among Hispanics with CKD.

Design, setting, participants, & measurements

We conducted cross-sectional analyses of prevalent self-reported clinical and subclinical measures of CVD among 497 Hispanics, 1638 non-Hispanic Caucasians, and 1650 non-Hispanic African Americans, aged 21 to 74 years, with mild-to-moderate CKD at enrollment in the Chronic Renal Insufficiency Cohort (CRIC) and Hispanic CRIC (HCRIC) studies. Measures of subclinical CVD included left ventricular hypertrophy (LVH), coronary artery calcification (CAC), and ankle-brachial index.

Results

Self-reported coronary heart disease (CHD) was lower in Hispanics compared with non-Hispanic Caucasians (18% versus 23%, P = 0.02). Compared with non-Hispanic Caucasians, Hispanics had a lower prevalence of CAC >100 (41% versus 34%, P = 0.03) and CAC >400 (26% versus 19%, P = 0.02). However, after adjusting for sociodemographic factors, these differences were no longer significant. In adjusted analyses, Hispanics had a higher odds of LVH compared with non-Hispanic Caucasians (odds ratio 1.97, 95% confidence interval, 1.22 to 3.17, P = 0.005), and a higher odds of CAC >400 compared with non-Hispanic African Americans (odds ratio, 2.49, 95% confidence interval, 1.11 to 5.58, P = 0.03). Hispanic ethnicity was not independently associated with any other CVD measures.

Conclusions

Prevalent LVH was more common among Hispanics than non-Hispanic Caucasians, and an elevated CAC score was more common among Hispanics than non-Hispanic African Americans. Understanding the reasons for these racial/ethnic differences and their association with long-term clinical outcomes is needed.

18.

Background

An explanation for the increased risk of myocardial infarction and stroke in patients with venous thrombosis is lacking. The objective of this study was to investigate whether risk factors for arterial cardiovascular disease also increase the risk of venous thrombosis.

Design and Methods

Cases who had a first venous thrombosis (n=515) and matched controls (n=1,505) were identified from a population-based, nested, case-cohort study (the HUNT 2 study) comprising 71% (n=66,140) of the adult residents of Nord-Trøndelag County in Norway.

Results

The age- and sex-adjusted odds ratio of venous thrombosis for subjects with C-reactive protein concentrations in the highest quintile was 1.6 (95% confidence interval: 1.2–2.2) compared to subjects in the lowest quintile. This association was strongest in subjects who experienced venous thrombosis within a year after blood sampling, with a three-fold increased risk for participants in the highest versus the lowest quintile. Having a first-degree relative with myocardial infarction before the age of 60 years was positively associated with venous thrombosis compared to having no such family history [odds ratio 1.3 (95% confidence interval: 1.1–1.6)]. Subjects with blood pressure in the highest quintile had half the risk of developing venous thrombosis compared to subjects with blood pressure in the lowest quintile. There were no associations between the risk of venous thrombosis and total cholesterol, low-density lipoprotein cholesterol, high-density lipoprotein cholesterol, triglycerides, glucose or smoking. We confirmed the positive association between obesity and venous thrombosis.

Conclusions

C-reactive protein and a family history of myocardial infarction were positively associated with subsequent venous thrombosis, whereas blood pressure was inversely associated with venous thrombosis. These findings should be confirmed by further investigations.

19.

Aims/Introduction

Data on hyperhomocysteinemia in relation to fractures in diabetes are limited. We aimed to explore the relationship between plasma total homocysteine concentrations and fractures in men and premenopausal women with type 2 diabetes.

Materials and Methods

Diabetic and control participants (n = 292) were enrolled in a cross-sectional hospital-based study. Bone mineral density and fractures were documented by dual-energy X-ray absorptiometry and X-ray film, respectively. Plasma total homocysteine concentrations were measured using a fluorescence polarization immunoassay. Risk factors for low bone mineral density or fractures and determinants of homocysteine were obtained from blood samples and an interviewer-administered questionnaire.

Results

Plasma total homocysteine levels were higher in diabetic participants with fractures than in those without (10.3 [3.0] μmol/L vs 8.6 [2.1] μmol/L, P < 0.001). Diabetic participants with fractures had bone mineral densities similar to those of control participants. The association of homocysteine with fracture was independent of possible risk factors for fractures (e.g., age, duration of diabetes, glycated hemoglobin, body mass index, thiazolidinediones and retinopathy) and of determinants of homocysteine concentration (e.g., age, sex, serum folate and vitamin B12, renal status and biguanide use; odds ratio 1.41, 95% confidence interval 1.05–2.03, P = 0.020). Furthermore, each 5.0-μmol/L increase in plasma homocysteine was associated with fracture after controlling for the other factors (odds ratio 1.42, 95% confidence interval 1.12–1.78, P = 0.013).

Conclusions

Plasma total homocysteine concentration is independently associated with the occurrence of fractures in men and premenopausal women with type 2 diabetes. Future prospective studies are warranted to clarify the relationship.

20.

Objective

Adiponectin may play a role in the development of type 2 diabetes and cardiovascular disease (CVD). However, little is known about the relationship between adiponectin and impaired glucose tolerance (IGT). We investigated the association between adiponectin and IGT and between adiponectin and cardiovascular risk factors among subjects with IGT.

Research Design and Methods

Subjects with normal glucose tolerance (NGT; n = 571) and impaired glucose tolerance (IGT; n = 167) were recruited from the Chennai Urban Rural Epidemiology Study in south India. Serum total adiponectin levels were measured using a radioimmunoassay (Linco Research, St. Charles, MO). High-sensitivity C-reactive protein (hsCRP) was estimated by nephelometry.

Results

In sex-stratified analyses, adiponectin was significantly associated with IGT in females [odds ratio (OR): 0.93, 95% confidence interval (CI): 0.872–0.991, p = 0.026] after controlling for age, waist circumference, blood pressure, alcohol consumption, smoking, lipid profile, and glycemic indices; in males there was no significant association (OR = 0.90, 95% CI: 0.798–1.012, p = 0.078). In females with IGT, adiponectin was not associated with any CVD risk factors (age, waist circumference, blood pressure, cholesterol, triglycerides, high-density lipoprotein, low-density lipoprotein, fasting glucose, fasting insulin, and insulin resistance level), but was negatively associated with 2-hour post-load plasma glucose (r = –0.243, p < 0.05) and hsCRP (r = –0.219, p < 0.05) after adjusting for demographic and biomedical indices. No associations with CVD risk factors were observed in males with IGT.

Conclusion

Serum total adiponectin levels are associated with IGT, 2-hour post-load plasma glucose, and hsCRP in Asian Indian females but not in males.
