Similar Articles
20 similar articles retrieved (search time: 31 ms)
1.
Freedman ND, Chow WH, Gao YT, Shu XO, Ji BT, Yang G, Lubin JH, Li HL, Rothman N, Zheng W, Abnet CC. Gut 2007;56(12):1671-1677

Background

Gastric cancer incidence rates are consistently lower in women than men in both high and low‐risk regions worldwide. Sex hormones, such as progesterone and estrogen, may protect women against gastric cancer.

Objective

To investigate the association of menstrual and reproductive factors and gastric cancer risk.

Methods

These associations were prospectively investigated in 73 442 Shanghai women. After 419 260 person‐years of follow‐up, 154 women were diagnosed with gastric cancer. Hazard ratios (HR) and 95% confidence intervals (CI) were calculated using Cox proportional hazards models adjusted for age, body mass index, education, income, and cigarette use.
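For readers who want to see what a model of this kind looks like in code, the following is a minimal, hypothetical sketch of a Cox proportional hazards fit in Python with the lifelines package; the data file and column names (e.g., followup_years, gastric_cancer, age_at_menopause) are invented for illustration and are not the study's actual variables.

```python
# Hypothetical sketch of a Cox proportional hazards model of the kind described
# above (not the authors' analysis). Assumes a pandas DataFrame with one row per
# woman, follow-up time in years, and an event indicator for gastric cancer;
# categorical adjustment variables are assumed to be numerically coded.
import pandas as pd
from lifelines import CoxPHFitter

cohort = pd.read_csv("shanghai_cohort.csv")  # hypothetical file

columns = ["age", "bmi", "education", "income", "ever_smoker",
           "age_at_menopause",            # exposure of interest
           "followup_years", "gastric_cancer"]

cph = CoxPHFitter()
cph.fit(cohort[columns], duration_col="followup_years", event_col="gastric_cancer")

# exp(coef) in the summary is the hazard ratio with its 95% confidence interval.
cph.print_summary()
```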

Results

No associations were observed between gastric cancer risk and age of menarche, number of children, breast feeding, or oral contraceptive use. In contrast, associations were observed with age of menopause (HR 0.80 per five‐year increase in menopausal age, 95% CI 0.66–0.97), years of fertility (participants with less than 30 years of fertility were at increased risk compared with those with 30–36 years of fertility, HR 1.90, 95% CI 1.25–2.90), years since menopause (HR 1.26 per five years, 95% CI 1.03–1.53), and intrauterine device use (HR for users 1.61, 95% CI 1.08–2.39).

Conclusions

These results support the hypothesis that female hormones play a protective role in gastric cancer risk.

2.

Background

The CoPanFlu-France household cohort was set up in 2009 to identify risk factors of infection by the pandemic A/H1N1 (H1N1pdm09) virus in the general population.

Objectives

To investigate the determinants of infection during the 2010–2011 season, the first complete influenza season of study follow-up for this cohort.

Patients/Methods

Pre- and post-epidemic blood samples were collected for all subjects, and nasal swabs were obtained in all subjects from households where an influenza-like illness was reported. Cases were defined as either a fourfold increase in the serological titer or a laboratory-confirmed H1N1pdm09 on a nasal swab, with either RT-PCR or multiplex PCR. Risk factors for H1N1pdm09 infections were explored, without any pre-specified hypothesis, among 167 individual, collective and environmental covariates via generalized estimating equations modeling. We adopted a multimodel selection procedure to control for model selection uncertainty.
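As a rough illustration of the household-clustered logistic GEE described above (a sketch under invented variable names, not the CoPanFlu analysis code), statsmodels can fit such a model as follows.

```python
# Illustrative GEE sketch: logistic model for H1N1pdm09 infection with an
# exchangeable working correlation within households. Column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("household_cohort.csv")  # hypothetical file

model = smf.gee(
    "infected ~ asthma + np.log2(pre_titer) + green_tea_2x_week + pct_cover_mouth",
    groups="household_id",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()

# Exponentiated coefficients correspond to the odds ratios quoted in such studies;
# the log2 term expresses the effect per doubling of the pre-epidemic titer.
print(np.exp(result.params))
```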

Results

This analysis is based on a sample size of 1121 subjects. The final multivariable model identified one risk factor (history of asthma, OR = 2·17; 95% CI: 1·02–4·62) and three protective factors: pre-epidemic serological titer (OR = 0·51 per doubling of the titer; 95% CI: 0·39–0·67), green tea consumption a minimum of two times a week (OR = 0·39; 95% CI: 0·18–0·84), and proportion of subjects in the household always covering their mouth while coughing/sneezing (OR = 0·93 per 10% increase; 95% CI: 0·86–1·00).

Conclusion

This exploratory study provides further support of previously reported risk factors and highlights the importance of collective protective behaviors in the household. Further analyses will be conducted to explore these findings.

3.

Background

Transplantation from an HLA-matched sibling is the treatment of choice for young patients with acquired severe aplastic anemia. For older patients, the acceptable upper age limit for transplantation as first-line treatment varies. The current analysis, therefore, sought to identify age or ages at transplantation at which survival differed.

Design and Methods

We studied the effect of patients’ age, adjusting for other significant factors affecting outcomes, in 1307 patients with severe aplastic anemia after HLA-matched sibling transplantation using logistic and Cox regression analysis. Age categories (<20 years, 20–40 years, >40 years) were determined using Martingale residual plots for overall survival and categories based on differences in survival.
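The Martingale-residual approach mentioned above can be outlined as follows; this is an illustrative Python sketch with invented file and column names, not the study's analysis code. It fits an age-free Cox model and plots residuals against age so that bends in the smoothed curve suggest candidate cutpoints.

```python
# Sketch: choosing age categories from Martingale residuals (illustrative only).
import pandas as pd
import statsmodels.api as sm
import matplotlib.pyplot as plt
from lifelines import CoxPHFitter

df = pd.read_csv("saa_transplants.csv")  # hypothetical file

# Fit overall survival on the other covariates, deliberately leaving age out.
covs = ["performance_score", "prior_ist", "interval_dx_to_tx_months"]
cph = CoxPHFitter().fit(df[covs + ["os_months", "died"]],
                        duration_col="os_months", event_col="died")

resid = cph.compute_residuals(df[covs + ["os_months", "died"]], kind="martingale")

# Smooth the residuals against age; changes in slope suggest candidate cutpoints.
smooth = sm.nonparametric.lowess(resid["martingale"], df["age"], frac=0.5)
plt.scatter(df["age"], resid["martingale"], s=5, alpha=0.3)
plt.plot(smooth[:, 0], smooth[:, 1], color="red")
plt.xlabel("Age at transplantation (years)")
plt.ylabel("Martingale residual")
plt.show()
```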

Results

Patients aged over 40 years were more likely to have had immunosuppressive therapy, a poor performance score and a longer interval between diagnosis and transplantation. Neutrophil recovery was similar in all age groups but patients aged over 40 years had a lower likelihood of platelet recovery compared to patients aged less than 20 years (OR 0.45, P=0.01) but not compared to those aged 20–40 years (OR 0.60, P=0.10). Compared to the risk of mortality in patients aged less than 20 years, mortality risks were higher in patients over 40 years old (RR 2.70, P<0.0001) and in those aged 20–40 years (RR 1.69, P<0.0001). The mortality risk was also higher in patients aged over 40 years than in those 20–40 years old (RR 1.60, P=0.008).

Conclusions

Mortality risks increased with age. Risks were also higher in patients with a poor performance score and when the interval between diagnosis and transplantation was longer than 3 months, implying earlier referral would be appropriate when this treatment option is being considered.

4.

Background

The present study is a meta-analysis of English articles comparing one-stage [laparoscopic common bile duct exploration or intra-operative endoscopic retrograde cholangiopancreatography (ERCP)] vs. two-stage (laparoscopic cholecystectomy preceded or followed by ERCP) management of common bile duct stones.

Methods

MEDLINE/PubMed and Science Citation Index databases (1990–2011) were searched for randomized, controlled trials that met the inclusion criteria for data extraction. Outcomes were calculated as odds ratios (ORs) with 95% confidence intervals (CIs) using RevMan 5.1.
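One common pooling approach available in RevMan is the fixed-effect Mantel–Haenszel method; below is a short, hypothetical Python sketch of that computation with made-up 2×2 counts (the abstract does not specify the review's actual pooling model, and these are not the included trials' data).

```python
# Mantel-Haenszel fixed-effect pooled odds ratio across trials.
# The 2x2 counts here are invented for illustration, not the trials' data.
import numpy as np

# rows: (events_one_stage, n_one_stage, events_two_stage, n_two_stage)
trials = np.array([
    [45, 50, 44, 50],
    [88, 95, 90, 96],
    [70, 80, 72, 79],
], dtype=float)

a = trials[:, 0]                  # events in group 1
b = trials[:, 1] - trials[:, 0]   # non-events in group 1
c = trials[:, 2]                  # events in group 2
d = trials[:, 3] - trials[:, 2]   # non-events in group 2
n = trials[:, 1] + trials[:, 3]   # total subjects per trial

or_mh = np.sum(a * d / n) / np.sum(b * c / n)
print(f"Mantel-Haenszel pooled OR = {or_mh:.2f}")  # RevMan additionally reports a 95% CI
```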

Results

Nine trials with 933 patients were studied. No significant differences were observed between the two groups with regard to bile duct clearance (OR, 0.89; 95% CI, 0.65–1.21), mortality (OR, 1.2; 95% CI, 0.32–4.52), total morbidity (OR, 0.75; 95% CI, 0.53–1.06), major morbidity (OR, 0.95; 95% CI, 0.60–1.52) and the need for additional procedures (OR, 1.58; 95% CI, 0.76–3.30).

Conclusions

Outcomes after one-stage laparoscopic/endoscopic management of bile duct stones are no different to the outcomes after two-stage management.

5.

Background

Due to increased rates of secondary solid organ cancer in patients with severe aplastic anemia who received an irradiation-based conditioning regimen, we decided some years ago to use the combination of cyclophosphamide and antithymocyte globulin. We report the long-term follow up of patients who underwent hematopoietic stem cell transplantation from an HLA-matched sibling donor after this conditioning regimen.

Design and Methods

We analyzed 61 consecutive patients transplanted from June 1991 to February 2010, following conditioning with cyclophosphamide (200 mg/kg) and antithymocyte globulin (2.5 mg/kg/day × 5 days).

Results

Median age was 21 years (range 4–43); 41 of the 61 patients were adults. Median duration of the disease before hematopoietic stem cell transplantation was 93 days. All but 2 patients received bone marrow as the source of stem cells and all but 2 engrafted. Cumulative incidence of acute grade II–IV graft-versus-host disease was 23% (95%CI 13–34) and 18 developed chronic graft-versus-host disease (cumulative incidence 32% at 72 months, 95% CI 20–46). In multivariate analysis, a higher number of infused CD3 cells was associated with an increased risk of developing chronic graft-versus-host disease (P=0.017). With a median follow up of 73 months (range 8–233), the estimated 6-year overall survival was 87% (95% CI 78–97). At 72 months, the cumulative incidence of avascular necrosis was 21% and 12 patients presented with endocrine dysfunction (cumulative incidence of 19%). Only one patient developed a secondary malignancy (Hodgkin’s lymphoma) during follow up.

Conclusions

Cyclophosphamide and antithymocyte globulin is an effective conditioning regimen for patients with severe aplastic anemia and is associated with low treatment-related mortality. Long-term complications include avascular necrosis and endocrine dysfunction.

6.

Background

School-located influenza vaccination (SLV) programs have the potential to mass-vaccinate all enrolled children, but parental consent is required.

Objective

To examine parental attitudes and determine predictors of parental consent for vaccination of schoolchildren through SLV programs.

Patients/Methods

Surveys were distributed to parents of 4517 children during 2009–2010 (year 1) and 4414 children during 2010–2011 (year 2) in eight elementary schools in conjunction with a SLV program.

Results

Participants included 1259 (27·9%) parents in year 1 and 1496 (33·9%) in year 2. Parental consent for 2009 H1N1, 2009 seasonal, and 2010 seasonal influenza vaccines was obtained from 738 (70·8%), 673 (64·5%), and 1151 (77·2%) respondents, respectively. During the 2009 pandemic, respondents concerned about influenza severity were twice as likely to consent for the 2009 H1N1 vaccination compared to unconcerned respondents (OR 2·04, 95% CI: 1·19–3·51). During year 2, factors that predicted parental consent were the perception of high susceptibility to influenza infection (OR 2·19, 95% CI: 1·50–3·19) and high benefit of vaccine (OR 2·23, 95% CI: 1·47–3·40). In both years, college-educated parents were more likely to perceive vaccine risks (year 1: 83·6% versus 61·5%, P < 0·001 and year 2: 81·1% versus 60·6%, P < 0·001) and less likely to consent for seasonal influenza vaccine (year 1: OR 0·69, 95% CI: 0·53–0·89 and year 2: OR 0·61, 95% CI: 0·47–0·78) compared to non-college-educated parents.

Conclusions

Parents who appreciate the risks of influenza and benefits of vaccination are more likely to consent for SLV. More research is needed to determine how to address heightened safety concerns among college-educated parents.

7.

Background

The impact of blood transfusion on the development of post-operative stroke after coronary artery bypass grafting (CABG) is not well established. We, therefore, investigated this issue.

Materials and Methods

Complete data on peri-operative blood transfusion were available for 2,226 patients who underwent CABG in three Finnish hospitals.

Results

Stroke occurred post-operatively in 53 patients (2.4%). Logistic regression showed that pre-operative creatinine (OR 1.003, 95% CI 1.000–1.006), extracardiac arteriopathy (OR 2.344, 95% CI 1.133–4.847), pre-operative atrial fibrillation (OR 2.409, 95% CI 1.149–5.052), and the number of packed red blood cell units transfused (OR 1.121, 95% CI 1.065–1.180) were significantly associated with post-operative stroke. When the various blood product transfusions, instead of transfused units, were included in the multivariable analysis, solvent/detergent-treated plasma (Octaplas®) transfusion (OR 2.149, 95% CI 1.141–4.047), but not red blood cell transfusion, was significantly associated with post-operative stroke. Use of blood products ranging from no transfusion (stroke rate 1.6%) to combined transfusion of red blood cells, platelets and Octaplas® was associated with a significant increase in post-operative stroke incidence (6.6%, adjusted analysis: OR 1.727, 95% CI 1.350–2.209). Patients who received >2 units of red blood cells, >4 units of Octaplas® and >8 units of platelets had the highest stroke rate, 21%. CART analysis showed that increasing amounts of transfused Octaplas® and platelets and a history of extracardiac arteriopathy were significantly associated with post-operative stroke.
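The CART analysis mentioned above can be sketched, purely illustratively, with scikit-learn's decision tree; the file, columns, and tree settings below are hypothetical, and the study's actual software and variable coding may differ.

```python
# Illustrative CART-style sketch: a shallow classification tree relating
# transfusion volumes and extracardiac arteriopathy to post-operative stroke.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

df = pd.read_csv("cabg_transfusions.csv")  # hypothetical file

features = ["octaplas_units", "platelet_units", "rbc_units", "extracardiac_arteriopathy"]
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50, random_state=0)
tree.fit(df[features], df["postop_stroke"])

# The printed split rules play the role of the cutpoints a CART analysis reports.
print(export_text(tree, feature_names=features))
```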

Conclusions

Transfusion of blood products after CABG has a strong, dose-dependent association with the risk of stroke. The use of Octaplas® and platelet transfusions seems to have an even larger impact on the development of stroke than red blood cell transfusions.

8.

Background

Limited data are available from Central and Eastern Europe on risk factors for severe complications of influenza. Such data are essential to prioritize prevention and treatment resources and to adapt influenza vaccination recommendations.

Objectives

To use sentinel surveillance data to identify risk factors for fatal outcomes among hospitalized patients with severe acute respiratory infections (SARI) and among hospitalized patients with laboratory-confirmed influenza.

Methods

Retrospective analysis of case-based surveillance data collected from sentinel hospitals in Romania during the 2009/2010 and 2010/2011 winter influenza seasons was performed to evaluate risk factors for fatal outcomes using multivariate logistic regression.

Results

During 2009/2010 and 2010/2011, sentinel hospitals reported 661 SARI patients, of which 230 (35%) tested positive for influenza. In the multivariate analyses, infection with influenza A(H1N1)pdm09 was the strongest risk factor for death among hospitalized SARI patients (OR: 6·6; 95% CI: 3·3–13·1). Among patients positive for influenza A(H1N1)pdm09 virus infection (n = 148), being pregnant (OR: 7·1; 95% CI: 1·6–31·2), clinically obese (OR: 2·9; 95% CI: 1·6–31·2), and having an immunocompromising condition (OR: 3·7; 95% CI: 1·1–13·4) were significantly associated with fatal outcomes.

Conclusion

These findings are consistent with several other investigations of risk factors associated with influenza A(H1N1)pdm09 virus infections. They also support the more recent 2012 recommendations by the WHO Strategic Advisory Group of Experts on Immunization (SAGE) that pregnant women are an important risk group for influenza vaccination. Ongoing sentinel surveillance can be a useful tool for monitoring risk factors for complications of influenza virus infections during each influenza season as well as during pandemics.

9.

Introduction

The risk of acute hepatitis B among adults with diabetes mellitus is unknown. We investigated the association between diagnosed diabetes and acute hepatitis B.

Methods

Confirmed acute hepatitis B cases were reported in 2009–2010 to eight Emerging Infections Program (EIP) sites; diagnosed diabetes status was determined. Behavioral Risk Factor Surveillance System respondents residing in EIP sites comprised the comparison group. Odds ratios (ORs) comparing acute hepatitis B among adults with diagnosed diabetes versus without diagnosed diabetes were determined by multivariate logistic regression, adjusting for age, sex, and race/ethnicity, and stratified by the presence or absence of risk behaviors for hepatitis B virus (HBV) infection.
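A minimal sketch of the adjusted, stratified logistic regression described above, using statsmodels with invented column names (not the EIP or BRFSS data files), would look roughly like this.

```python
# Illustrative logistic regression: adjusted odds ratio of acute hepatitis B by
# diagnosed diabetes status, restricted to adults without HBV risk behaviors.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cases_and_comparison.csv")  # hypothetical pooled file

sub = df[df["hbv_risk_behavior"] == 0]  # stratum without risk behaviors

fit = smf.logit("acute_hbv ~ diabetes + age_group + sex + race_ethnicity",
                data=sub).fit()

# exp(beta) for the diabetes term is the adjusted OR; conf_int() gives its 95% CI.
print(np.exp(fit.params["diabetes"]))
print(np.exp(fit.conf_int().loc["diabetes"]))
```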

Results

During 2009–2010, EIP sites reported 865 eligible acute hepatitis B cases among persons aged ≥23 years; 95 (11.0%) had diagnosed diabetes. Comparison group diabetes prevalence was 9.1%. Among adults without hepatitis B risk behaviors and with reported diabetes status, the OR for acute hepatitis B comparing adults with and without diabetes was 1.9 (95% confidence interval [CI] = 1.4, 2.6); ORs for adults aged 23–59 and ≥60 years were 2.1 (95% CI = 1.6, 2.8) and 1.5 (95% CI = 0.9, 2.5), respectively.

Conclusions

Diabetes was independently associated with an increased risk for acute hepatitis B among adults without HBV risk behaviors.

10.

Background

For more than a decade, the presence of diabetes has been considered a coronary heart disease (CHD) “risk equivalent”.

Objective

The objective of this study was to revisit the concept of risk equivalence by comparing the risk of subsequent CHD events among individuals with or without history of diabetes or CHD in a large contemporary real-world cohort over a period of 10 years (2002 to 2011).

Design

Population-based prospective cohort analysis.

Participants

We studied a cohort of 1,586,061 adult members (ages 30–90 years) of Kaiser Permanente Northern California, an integrated health care delivery system.

Main Measurements

We calculated hazard ratios (HRs) from Cox proportional hazard models for CHD among four fixed cohorts, defined by prevalent (baseline) risk group: no history of diabetes or CHD (None), prior CHD alone (CHD), diabetes alone (DM), and diabetes and prior CHD (DM + CHD).

Key Results

We observed 80,012 new CHD events over the follow-up period (~10,980,800 person-years). After multivariable adjustment, the HRs (reference: None) for new CHD events were as follows: CHD alone, 2.8 (95 % CI, 2.7–2.85); DM alone 1.7 (95 % CI, 1.66–1.74); DM + CHD, 3.9 (95 % CI, 3.8–4.0). Individuals with diabetes alone had significantly lower risk of CHD across all age and sex strata compared to those with CHD alone (12.2 versus 22.5 per 1000 person-years). The risk of future CHD for patients with a history of either DM or CHD was similar only among those with diabetes of long duration (≥10 years).

Conclusions

Not all individuals with diabetes should be unconditionally assumed to be a risk equivalent of those with prior CHD.
KEY WORDS: coronary heart disease, diabetes, epidemiology

11.

BACKGROUND:

Delay in the treatment of patients with tuberculosis (TB) increases the risk of poor clinical outcomes – including death and transmission of disease – and may be reducible.

OBJECTIVE:

To estimate delays in TB treatment in a Canadian, multicultural population and to examine factors associated with longer time to treatment.

METHODS:

Adult cases of active TB from January 1998 to December 2001 from the Ontario Reportable Disease Information System were included. Time to treatment was defined as the number of days between symptom onset and treatment.

RESULTS:

Data from 1753 TB patients (76% of eligible patients) were analyzed. Median time to treatment was 62 days (interquartile range 31 to 114 days). A time to treatment longer than the median was independently associated with middle age (OR 1.54, 95% CI 1.21 to 1.98), being foreign-born and having lived in Canada for more than 10 years (OR 1.47, 95% CI 1.02 to 2.12), nonpulmonary disease (OR 1.57, 95% CI 1.28 to 1.92) and management within certain health districts.

CONCLUSION:

A time to TB treatment of two months or more is common in Ontario, and associated with several factors. Future studies are needed to build on these findings to decrease delay and improve individual and public health outcomes.

12.

Background

Neutropenic patients are at risk of abdominal complications and yet the incidence and impact of these complications on patients’ morbidity and mortality have not been sufficiently evaluated. We aimed to assess a clinical rule for early detection of abdominal complications leading to death or transfer to intensive care in patients with chemotherapy-associated neutropenia.

Design and Methods

This observational multicenter study was carried out in seven German hematology-oncology departments. For inclusion, neutropenia of at least 5 consecutive days was required. Risk factors for “transfer to intensive care” and “death” were assessed by backward-stepwise binary logistic regression analyses. Chemotherapy-associated bowel syndrome was defined as a combination of fever (T ≥37.8 °C) and abdominal pain and/or lack of bowel movement for 72 hours or more. Five hundred and twenty-one neutropenic episodes were documented in 359 patients.

Results

The incidence of chemotherapy-associated bowel syndrome was 126/359 (35%) in first episodes of neutropenia. Transfer to intensive care occurred in 41/359 (11%) and death occurred in 17/359 (5%) first episodes. Chemotherapy-associated bowel syndrome and duration of neutropenia were identified as risk factors for transfer to intensive care (P<0.001; OR 4.753; 95% CI 2.297–9.833, and P=0.003; OR 1.061/d; 95% CI 1.021–1.103). Chemotherapy-associated bowel syndrome and mitoxantrone administration were identified as risk factors for death (P=0.005; OR 4.611; 95% CI 1.573–13.515 and P=0.026; OR 3.628; 95% CI 1.169–11.256).

Conclusions

The occurrence of chemotherapy-associated bowel syndrome has a significant impact on patients’ outcome. In future interventional clinical trials, this definition might be used as a selection criterion for early treatment of patients at risk of severe complications.

13.

Background

The risk of thromboembolic events in adults with primary immune thrombocytopenia has been little investigated despite findings of increased susceptibility in other thrombocytopenic autoimmune conditions. The objective of this study was to evaluate the risk of thromboembolic events among adult patients with and without primary immune thrombocytopenia in the UK General Practice Research Database.

Design and Methods

Using the General Practice Research Database, 1,070 adults (≥18 years) with coded records for primary immune thrombocytopenia first referenced between January 1st 1992 and November 30th 2007, and having at least one year of pre-diagnosis and three months of post-diagnosis medical history, were matched (1:4 ratio) with 4,280 primary immune thrombocytopenia disease-free patients by age, gender, primary care practice, and pre-diagnosis observation time. The baseline prevalence and incidence rate of thromboembolic events were quantified, with comparative risk modelled by Cox’s proportional hazards regression.

Results

Over a median 47.6 months of follow-up (range: 3.0–192.5 months), adjusted hazard ratios of 1.58 (95% CI, 1.01–2.48), 1.37 (95% CI, 0.94–2.00), and 1.41 (95% CI, 1.04–1.91) were found for venous, arterial, and combined (arterial and venous) thromboembolic events, respectively, when comparing the primary immune thrombocytopenia cohort with the disease-free cohort. Further event categorization revealed an elevated incidence rate for each venous thromboembolic subtype observed among the adult patients with primary immune thrombocytopenia.

Conclusions

Patients with primary immune thrombocytopenia are at increased risk for venous thromboembolic events compared with patients without primary immune thrombocytopenia.

14.
Objective

Uncertainty exists regarding the relative performance of drug-eluting stents (DES) versus bare-metal stents (BMS) in octogenarians undergoing percutaneous coronary intervention (PCI). We undertook a meta-analysis to assess outcomes for DES and BMS in octogenarians undergoing PCI.

Methods

Electronic databases (PubMed, Cochrane, and EMBASE) were searched. We included randomized controlled clinical trials (RCTs) and observational studies comparing DES and BMS in octogenarians receiving PCI. The methodological quality of eligible trials was assessed using a “risk of bias” tool. The endpoints included all-cause death, major adverse cardiac events (MACE), myocardial infarction (MI), target vessel revascularization (TVR), major bleeding, and stent thrombosis (ST). Odds ratios (OR) and 95% confidence intervals (95% CI) were calculated for each endpoint.

Results

A total of one RCT and six observational studies were included and analyzed in this meta-analysis. All trials were of acceptable quality. At 30 days, compared with DES-treated patients, BMS-treated patients had a higher incidence of mortality (OR: 3.91, 95% CI: 1.10–13.91; P = 0.03). The ORs for MACE (1.52, 95% CI: 0.56–4.17; P = 0.13), MI (0.81, 95% CI: 0.37–2.17; P = 0.23), TVR (0.75, 95% CI: 0.17–3.41; P = 0.41), major bleeding (0.77, 95% CI: 0.35–1.68; P = 0.43), and ST (1.44, 95% CI: 0.32–6.45; P = 0.33) did not reach statistical significance. At one-year follow-up, the ORs did not favor BMS for MACE (defined as the composite of death, myocardial infarction, and TVR) (1.87; 95% CI: 1.22–2.87; P < 0.01), MI (1.91, 95% CI: 1.22–2.99; P < 0.01), TVR (3.08, 95% CI: 1.80–5.26; P < 0.01) or ST (3.37, 95% CI: 1.12–10.13; P < 0.01). The ORs for mortality (1.51; 95% CI: 0.92–2.47; P = 0.10) and major bleeding (0.85, 95% CI: 0.47–1.55; P = 0.60) did not reach statistical significance. At more than one year of follow-up, the ORs for all endpoints, including mortality, MACE, MI, TVR, major bleeding, and ST, did not reach statistical significance.

Conclusions

Our meta-analysis suggests that DES is associated with favorable outcomes as compared with BMS in octogenarians receiving PCI.

15.

Background

Earlier work demonstrated that ACGME duty hour reform did not adversely affect mortality, with slight improvement noted among specific subgroups.

Objective

To determine whether resident duty hour reform differentially affected the mortality risk of high severity patients or patients who experienced post-operative complications (failure-to-rescue).

Design

Observational study using interrupted time series analysis with data from July 1, 2000 - June 30, 2005. Fixed effects logistic regression was used to examine the change in the odds of mortality or failure-to-rescue (FTR) in more versus less teaching-intensive hospitals before and after duty hour reform.

Participants

All unique Medicare patients (n = 8,529,595) admitted to short-term acute care non-federal hospitals and all unique VA patients (n = 318,636 patients) with principal diagnoses of acute myocardial infarction, congestive heart failure, gastrointestinal bleeding, stroke or a DRG classification of general, orthopedic or vascular surgery.

Measurements and Main Results

We measured mortality within 30 days of hospital admission and FTR, defined as death among patients who experienced a surgical complication. The odds of mortality and FTR generally changed at similar rates for higher- and lower-risk patients in more versus less teaching-intensive hospitals. For example, comparing the mortality risk of the 10% of Medicare patients at highest risk with that of the other 90% of patients in post-reform year 1 gave an OR of 1.01 [95% CI 0.90, 1.13] for combined medical conditions, an OR of 0.91 [95% CI 0.80, 1.04] for combined surgical conditions, and an OR of 0.94 [95% CI 0.80, 1.09] for FTR. Findings were similar in year 2 for both Medicare and VA patients. The two exceptions were a relative increase in mortality for the highest-risk medical patients (OR 1.63 [95% CI 1.08, 2.46]) and a relative decrease for the high-risk surgical patients within the VA in post-reform year 1 (OR 0.52 [95% CI 0.29, 0.96]).

Conclusions

ACGME duty hour reform was not associated with any consistent improvement or worsening in mortality or failure-to-rescue rates for high-risk medical or surgical patients.
KEY WORDS: medical errors, internship and residency, education, medical, graduate, personnel staffing and scheduling, continuity of patient care

16.

Background

Duty hour restrictions limit shift length to 16 hours during the 1st post-graduate year. Although many programs utilize a 16-hour “long call” admitting shift on inpatient services, compliance with the 16-hour shift length and factors responsible for extended shifts have not been well examined.

Objective

To identify the incidence of and operational factors associated with extended long call shifts and residents’ perceptions of the safety and educational value of the 16-hour long call shift in a large internal medicine residency program.

Design, Participants, and Main Measures

Between August and December of 2010, residents were sent an electronic survey immediately following 16-hour long call shifts, assessing departure time and shift characteristics. We used logistic regression to identify independent predictors of extended shifts. In mid-December, all residents received a second survey to assess perceptions of the long call admitting model.

Key Results

Two-hundred and thirty surveys were completed (95 %). Overall, 92 of 230 (40 %) shifts included ≥1 team member exceeding the 16-hour limit. Factors independently associated with extended shifts per 3-member team were 3–4 patients (adjusted OR 5.2, 95 % CI 1.9–14.3) and > 4 patients (OR 10.6, 95 % CI 3.3–34.6) admitted within 6 hours of scheduled departure and > 6 total admissions (adjusted OR 2.9, 95 % CI 1.05–8.3). Seventy-nine of 96 (82 %) residents completed the perceptions survey. Residents believed, on average, teams could admit 4.5 patients after 5 pm and 7 patients during long call shifts to ensure compliance. Regarding the long call shift, 73 % agreed it allows for safe patient care, 60 % disagreed/were neutral about working too many hours, and 53 % rated the educational value in the top 33 % of a 9-point scale.

Conclusions

Compliance with the 16-hour long call shift is sensitive to total workload and workload timing factors. Knowledge of such factors should guide systems redesign aimed at achieving compliance while ensuring patient care and educational opportunities.
KEY WORDS: medical education-graduate, medical education, systems-based practice, duty hours

17.

Background

Reduced growth is common in children with sickle cell anemia, but few data exist on associations with long-term clinical course. Our objective was to determine the prevalence of malnutrition at enrolment into a hospital-based cohort and whether poor nutritional status predicted morbidity and mortality within an urban cohort of Tanzanian sickle cell anemia patients.

Design and Methods

Anthropometry was conducted at enrolment into the sickle cell anemia cohort (n=1,618; ages 0.5–48 years) and in controls who attended screening (siblings, walk-ins and referrals) but who were found not to have sickle cell anemia (n=717; ages 0.5–64 years). Prospective surveillance recorded hospitalization at Muhimbili National Hospital and mortality between March 2004 and September 2009.

Results

Sickle cell anemia was associated with stunting (OR=1.92, P<0.001, 36.2%) and wasting (OR=1.66, P=0.002, 18.4%). The greatest growth deficits were observed in adolescents and in boys. Independent of age and sex, lower hemoglobin concentration was associated with increased odds of malnutrition in sickle cell patients. Of the 1,041 sickle cell anemia patients with a body mass index z-score at enrolment, 92% were followed up until September 2009 (n=908) or death (n=50). Body mass index z-score and weight-for-age z-score predicted hospitalization (hazard ratio [HZR]=0.90, P=0.04 and HZR=0.88, P=0.02) but height-for-age z-score did not (HZR=0.93, NS). Mortality (2.5 per 100 person-years) was not associated with any of the anthropometric measures.

Conclusions

In this non-birth-cohort of sickle cell anemia with significant associated undernutrition, wasting predicted an increased risk of hospital admission. Targeted nutritional interventions should prioritize treatment and prevention of wasting.

18.

Background

Clones of glycosylphosphatidylinositol-anchor protein-deficient cells are characteristic in paroxysmal nocturnal hemoglobinuria and are present in about 40–50% of patients with severe aplastic anemia. Flow cytometry has allowed for sensitive and precise measurement of glycosylphosphatidylinositol-anchor protein-deficient red blood cells and neutrophils in severe aplastic anemia.

Design and Methods

We conducted a retrospective analysis of paroxysmal nocturnal hemoglobinuria clones measured by flow cytometry in 207 consecutive severe aplastic anemia patients who received immunosuppressive therapy with a horse anti-thymocyte globulin plus cyclosporine regimen from 2000 to 2008.

Results

The presence of a glycosylphosphatidylinositol-anchor protein-deficient clone was detected in 83 (40%) patients pre-treatment, and the median clone size was 9.7% (interquartile range 3.5–29). In patients without a detectable clone pre-treatment, the appearance of a clone after immunosuppressive therapy was infrequent, and in most with a clone pre-treatment, clone size often decreased after immunosuppressive therapy. However, in 30 patients, an increase in clone size was observed after immunosuppressive therapy. The majority of patients with a paroxysmal nocturnal hemoglobinuria clone detected after immunosuppressive therapy did not have an elevated lactate dehydrogenase, nor did they experience hemolysis or thrombosis, and they did not require specific interventions with anticoagulation and/or eculizumab. Of the 7 patients who did require therapy for clinical paroxysmal nocturnal hemoglobinuria symptoms and signs, all had an elevated lactate dehydrogenase and a clone size greater than 50%. In all, 18 (8.6%) patients had a clone greater than 50% at any given time of sampling.

Conclusions

The presence of a paroxysmal nocturnal hemoglobinuria clone in severe aplastic anemia is associated with low morbidity and mortality, and specific measures to address clinical paroxysmal nocturnal hemoglobinuria are seldom required.

19.

Background

Studies evaluating risk factors for in-hospital venous thromboembolism in children are limited with respect to quality assurance of the case definition and/or by the lack of a controlled comparison. The objective of this study is to determine risk factors for the development of in-hospital venous thromboembolism in children.

Design and Methods

In a case-control study at The Children’s Hospital, Colorado, from 1st January 2003 to 31st December 2009 we employed diagnostic validation methods to determine pediatric in-hospital venous thromboembolism risk factors. Clinical data on putative risk factors were retrospectively collected from medical records of children with International Classification of Diseases, 9th edition codes of venous thromboembolism at discharge, in whom radiological reports confirmed venous thromboembolism and no signs/symptoms of venous thromboembolism were noted on admission.

Results

We verified 78 cases of in-hospital venous thromboembolism, yielding an average incidence of 5 per 10,000 hospitalized children per year. Logistic regression analyses revealed that mechanical ventilation, systemic infection, and hospitalization duration of five days or over were statistically significant, independent risk factors for in-hospital venous thromboembolism (OR=3.29, 95% CI=1.53–7.06, P=0.002; OR=3.05, 95% CI=1.57–5.94, P=0.001; and OR=1.03, 95% CI=1.01–1.04, P=0.001, respectively). Using these factors in a risk model, the post-test probability of venous thromboembolism was 3.6%.
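The post-test probability quoted above combines a baseline (pre-test) probability with the evidence carried by the risk-factor combination; a generic, hypothetical sketch of that arithmetic (not the authors' exact computation, and with an invented likelihood ratio) is shown below.

```python
# Generic pre-test -> post-test probability update via odds and a likelihood ratio.
# The likelihood ratio here is invented for illustration, not estimated from the study.
def post_test_probability(pre_test_prob: float, likelihood_ratio: float) -> float:
    """Convert a pre-test probability to a post-test probability."""
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# Example using the cohort-wide incidence quoted above (5 per 10,000 hospitalized
# children per year) as the pre-test probability.
print(post_test_probability(pre_test_prob=5 / 10_000, likelihood_ratio=50))
```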

Conclusions

These data indicate that the risk of in-hospital venous thromboembolism in children with this risk factor combination may exceed that of hospitalized adults in whom prophylactic anticoagulation is indicated. Substantiation of these findings via multicenter studies could provide the basis for future risk-stratified randomized controlled trials of pediatric venous thromboembolism prevention.

20.

BACKGROUND

Cigarette smoking is an important risk factor for adverse health events in HIV-infected populations. While recent US population-wide surveys report annual sustained smoking cessation rates of 3.4–8.5%, prospective data are lacking on cessation rates for HIV-infected smokers.

OBJECTIVE

To determine the sustained tobacco cessation rate and predictors of cessation among women with or at risk for HIV infection.

DESIGN

Prospective cohort study.

PARTICIPANTS

A total of 747 women (537 HIV-infected and 210 HIV-uninfected) who reported smoking at enrollment (1994–1995) in the Women’s Interagency HIV Study (WIHS) and remained in follow-up after 10 years. The participants were mostly minority (61% non-Hispanic Blacks and 22% Hispanics) and low income (68% with reported annual incomes of less than or equal to $12,000).

MEASUREMENTS AND MAIN RESULTS

The primary outcome was defined as greater than 12 months of continuous cessation at year 10. Multivariate logistic regression was used to identify independent baseline predictors of subsequent tobacco cessation. A total of 121 (16%) women reported tobacco cessation at year 10 (annual sustained cessation rate of 1.8%, 95% CI 1.6–2.1%). Annual sustained cessation rates were 1.8% among both HIV-positive and HIV-negative women (p = 0.82). In multivariate analysis, the odds of tobacco cessation were significantly higher in women with more years of education (p trend = 0.02) and of Hispanic origin (OR = 1.87, 95% CI = 1.4–2.9) compared to Black women. Cessation was significantly lower in current or former illicit drug users (OR = 0.42, 95% CI = 0.24–0.74 and OR = 0.65, 95% CI = 0.49–0.86, respectively, p trend = 0.03) and in women reporting a higher number of cigarettes per day at baseline (p trend < 0.001).
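The link between the 10-year quit proportion and the 1.8% annual sustained cessation rate follows from assuming a constant per-year cessation probability; a small sketch of that arithmetic (one plausible reading of the figures above, not necessarily the authors' exact method) is below.

```python
# Back-of-the-envelope annualization of a cumulative cessation proportion,
# assuming a constant, independent per-year probability of sustained cessation.
quitters, smokers, years = 121, 747, 10            # figures quoted above

cumulative = quitters / smokers                     # ~16% had quit by year 10
annual = 1 - (1 - cumulative) ** (1 / years)        # constant-rate assumption

print(f"cumulative 10-year cessation: {cumulative:.1%}")
print(f"implied annual sustained rate: {annual:.1%}")  # ~1.8%, matching the abstract
```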

CONCLUSIONS

HIV-infected and at-risk women in this cohort have lower smoking cessation rates than the general population. Given the high prevalence of smoking, the high risk of adverse health events from smoking, and low rates of cessation, it is imperative that we increase efforts and overcome barriers to help these women quit smoking.
KEY WORDS: smoking cessation, HIV/AIDS, clinical epidemiology, vulnerable populations
