Similar Articles
 A total of 20 similar articles were found (search time: 46 ms)
1.
Context  Information on the school-age functioning and special health care needs of extremely low-birth-weight (ELBW, <1000 g) children is necessary to plan for medical and educational services. Objective  To examine neurosensory, developmental, and medical conditions together with the associated functional limitations and special health care needs of ELBW children compared with normal-birth-weight (NBW) term-born children (controls). Design, Setting, and Participants  A follow-up study at age 8 years of a cohort of 219 ELBW children born 1992 to 1995 (92% of survivors) and 176 NBW controls of similar sociodemographic status conducted in Cleveland, Ohio. Main Outcome Measures  Parent Questionnaire for Identifying Children with Chronic Conditions of 12 months or more and categorization of specific medical diagnoses and developmental disabilities based on examination of the children. Results  In logistic regression analyses adjusting for sociodemographic status and sex, ELBW children had significantly more chronic conditions than NBW controls, including functional limitations (64% vs 20%, respectively; odds ratio [OR], 8.1; 95% confidence interval [CI], 5.0-13.1; P<.001), compensatory dependency needs (48% vs 23%, respectively; OR, 3.0; 95% CI, 1.9-4.7; P<.001), and services above those routinely required by children (65% vs 27%, respectively; OR, 5.4; 95% CI, 3.4-8.5; P<.001). These differences remained significant when the 36 ELBW children with neurosensory impairments were excluded. Specific diagnoses and disabilities for ELBW vs NBW children included cerebral palsy (14% vs 0%, respectively; P<.001), asthma (21% vs 9%; OR, 3.0; 95% CI, 1.6-5.6; P = .001), vision of less than 20/200 (10% vs 3%; OR, 3.1; 95% CI, 1.2-7.8; P = .02), low IQ of less than 85 (38% vs 14%; OR, 4.5; 95% CI, 2.7-7.7; P<.001), limited academic skills (37% vs 15%; OR, 4.2; 95% CI, 2.5-7.3; P<.001), poor motor skills (47% vs 10%; OR, 7.8; 95% CI, 4.5-13.6; P<.001), and poor adaptive functioning (69% vs 34%; OR, 6.5; 95% CI, 4.0-10.6; P<.001). Conclusion  The ELBW survivors in school at age 8 years who were born in the 1990s have considerable long-term health and educational needs.
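
The odds ratios above come from adjusted logistic regression, but the underlying calculation can be illustrated from the raw proportions. A minimal sketch in Python using the reported functional-limitation rates (64% of 219 ELBW children vs 20% of 176 controls) to reconstruct an approximate 2x2 table; the crude OR it yields (about 7.1) differs from the published 8.1 because the published estimate adjusts for sociodemographic status and sex.

    import math

    # Approximate 2x2 table reconstructed from the reported percentages
    # (cell counts are rounded, so the result is illustrative only).
    a = round(0.64 * 219)  # ELBW children with functional limitations
    b = 219 - a            # ELBW children without
    c = round(0.20 * 176)  # NBW controls with functional limitations
    d = 176 - c            # NBW controls without

    odds_ratio = (a * d) / (b * c)

    # Woolf (log) method for the 95% confidence interval.
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
    hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
    print(f"crude OR = {odds_ratio:.2f}, 95% CI {lo:.2f}-{hi:.2f}")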

2.
Context  Adult survivors of childhood cancer are at risk for medical and psychosocial sequelae that may adversely affect their health status. Objectives  To compare the health status of adult survivors of childhood cancer and siblings and to identify factors associated with adverse outcomes. Design, Setting, and Participants  Health status was assessed in 9535 adult participants of the Childhood Cancer Survivor Study, a cohort of long-term survivors of childhood cancer who were diagnosed between 1970 and 1986. A randomly selected cohort of the survivors' siblings (n = 2916) served as a comparison group. Main Outcome Measures  Six health status domains were assessed: general health, mental health, functional status, activity limitations, cancer-related pain, and cancer-related anxiety/fears. The first 4 domains were assessed in the control group. Results  Survivors were significantly more likely to report adverse general health (odds ratio [OR], 2.5; 95% confidence interval [CI], 2.1-3.0; P<.001), mental health (OR, 1.8; 95% CI, 1.6-2.1; P<.001), activity limitations (OR, 2.7; 95% CI, 2.3-3.3; P<.001), and functional impairment (OR, 5.2; 95% CI, 4.1-6.6; P<.001), compared with siblings. Forty-four percent of survivors reported at least 1 adversely affected health status domain. Sociodemographic factors associated with reporting at least 1 adverse health status domain included being female (OR, 1.4; 95% CI, 1.3-1.6; P<.001), lower level of educational attainment (OR, 2.0; 95% CI, 1.8-2.2; P<.001), and annual income less than $20 000 (OR, 1.8; 95% CI, 1.6-2.1; P<.001). Relative to those survivors with childhood leukemia, an increased risk was observed for at least 1 adverse health status domain among those with bone tumors (OR, 2.1; 95% CI, 1.8-2.5; P<.001), central nervous system tumors (OR, 1.7; 95% CI, 1.5-2.0; P<.001), and sarcomas (OR, 1.2; 95% CI, 1.1-1.5; P = .01). Conclusion  Clinicians caring for adult survivors of childhood cancer should be aware of the substantial risk for adverse health status, especially among females, those with low educational attainment, and those with low household incomes.

3.
Risk factors for typhoid and paratyphoid fever in Jakarta, Indonesia   (Total citations: 13; self-citations: 1; by others: 12)
Context  The proportion of paratyphoid fever cases to typhoid fever cases may change due to urbanization and increased dependency on food purchased from street vendors. For containment of paratyphoid a different strategy may be needed than for typhoid, because risk factors for disease may not coincide and current typhoid vaccines do not protect against paratyphoid fever. Objective  To determine risk factors for typhoid and paratyphoid fever in an endemic area. Design, Setting, and Participants  Community-based case-control study conducted from June 2001 to February 2003 in hospitals and outpatient health centers in Jatinegara district, Jakarta, Indonesia. Enrolled participants were 1019 consecutive patients with fever lasting 3 or more days, from which 69 blood culture–confirmed typhoid cases, 24 confirmed paratyphoid cases, and 289 control patients with fever but without Salmonella bacteremia were interviewed, plus 378 randomly selected community controls. Main Outcome Measures  Blood culture–confirmed typhoid or paratyphoid fever; risk factors for both diseases. Results  In 1019 fever patients we identified 88 (9%) Salmonella typhi and 26 (3%) Salmonella paratyphi A infections. Paratyphoid fever among cases was independently associated with consumption of food from street vendors (comparison with community controls: odds ratio [OR], 3.34; 95% confidence interval [CI], 1.41-7.91; with fever controls: OR, 5.17; 95% CI, 2.12-12.60) and flooding (comparison with community controls: OR, 4.52; 95% CI, 1.90-10.73; with fever controls: OR, 3.25; 95% CI, 1.31-8.02). By contrast, independent risk factors for typhoid fever using the community control group were mostly related to the household, ie, to recent typhoid fever in the household (OR, 2.38; 95% CI, 1.03-5.48); no use of soap for handwashing (OR, 1.91; 95% CI, 1.06-3.46); sharing food from the same plate (OR, 1.93; 95% CI, 1.10-3.37), and no toilet in the household (OR, 2.20; 95% CI, 1.06-4.55). Also, typhoid fever was associated with young age in years (OR, 0.96; 95% CI, 0.94-0.98). In comparison with fever controls, risk factors for typhoid fever were use of ice cubes (OR, 2.27; 95% CI, 1.31-3.93) and female sex (OR, 1.79; 95% CI, 1.04-3.06). Fecal contamination of drinking water was not associated with typhoid or paratyphoid fever. We did not detect fecal carriers among food handlers in the households. Conclusions  In Jakarta, typhoid and paratyphoid fever are associated with distinct routes of transmission, with the risk factors for disease either mainly within the household (typhoid) or outside the household (paratyphoid).

4.
Dietary phytoestrogens and lung cancer risk   (Total citations: 6; self-citations: 0; by others: 6)
Schabath MB, Hernandez LM, Wu X, Pillow PC, Spitz MR. JAMA. 2005;294(12):1493-1504.
Context  Despite lung-specific in vitro and in vivo studies that support a chemopreventive role for phytoestrogens, there has been little epidemiologic research focused on dietary intake of phytoestrogens and risk of lung cancer. Objective  To examine the relationship between dietary intake of phytoestrogens and risk of lung cancer. Design, Setting, and Participants  Ongoing US case-control study of 1674 patients with lung cancer (cases) and 1735 matched healthy controls. From July 1995 through October 2003, participants were personally interviewed with epidemiologic and food frequency questionnaires to collect demographic information and to quantify dietary intake of 12 individual phytoestrogens. Main Outcome Measure  Risk of lung cancer, estimated using unconditional multivariable logistic regression analyses stratified by sex and smoking status and adjusted for established and putative lung cancer risk factors. Results  Reductions in risk of lung cancer tended to increase with each increasing quartile of phytoestrogen intake. The highest quartiles of total phytosterols, isoflavones, lignans, and phytoestrogens were each associated with reductions in risk of lung cancer ranging from 21% for phytosterols (odds ratio [OR], 0.79; 95% confidence interval [CI], 0.64-0.97; P = .03 for trend) to 46% for total phytoestrogens from food sources only (OR, 0.54; 95% CI, 0.42-0.70; P<.001 for trend). Sex-specific effects were also apparent. For men, statistically significant trends for decreasing risk with increasing intake were noted for each phytoestrogen group, with protective effects for the highest quartile of intake ranging from 24% for phytosterols (OR, 0.76; 95% CI, 0.56-1.02; P = .04 for trend) to 44% for isoflavones (OR, 0.56; 95% CI, 0.41-0.76; P<.001 for trend), while in women, significant trends were only present for intake of total phytoestrogens from food sources only, with a 34% (OR, 0.66; 95% CI, 0.46-0.96; P = .01 for trend) protective effect for the highest quartile of intake. The apparent benefits of high phytoestrogen intake were evident in both never and current smokers but less apparent in former smokers. In women, statistically significant joint effects were evident between hormone therapy use and phytoestrogen intake. Specifically, high intake of the lignans enterolactone and enterodiol and use of hormone therapy were associated with a 50% (OR, 0.50; 95% CI, 0.31-0.68; P = .04 for interaction) reduction in risk of lung cancer. Conclusions  While there are limitations and concerns regarding case-control studies of diet and cancer, these data provide further support for the limited but growing epidemiologic evidence that phytoestrogens are associated with a decrease in risk of lung cancer. Confirmation of these findings is still required in large-scale, hypothesis-driven, prospective studies.

5.
Context.— Public health workers may work with clients whose behaviors are risks for both infectious disease and violence. Objective.— To assess frequency of violent threats and incidents experienced by public health workers and risk factors associated with incidents. Design.— Anonymous, self-administered questionnaires. Setting.— Texas sexually transmitted disease (STD), human immunodeficiency virus and acquired immunodeficiency syndrome (HIV/AIDS), and tuberculosis (TB) programs. Participants.— Questionnaires were completed by 364 (95.5%) of 381 public health workers assigned to the programs. The STD program employed 131 workers (36%), the HIV/AIDS program, 121 workers (33%), and the TB program, 112 workers (31%). Main Outcome Measures.— The frequencies with which workers had ever experienced (while on the job) verbal threats, weapon threats, physical attacks, and rape, and risk factors associated with those outcomes. Results.— A total of 139 (38%) of 364 workers reported 611 violent incidents. Verbal threats were reported by 136 workers (37%), weapon threats by 45 (12%), physical attacks by 14 (4%), and rape by 3 (1%). Five workers (1%) carried guns and/or knives while working. In multiple logistic regression, receipt of verbal threats was associated with worker's male sex (odds ratio [OR], 2.4; 95% confidence interval [CI], 1.5-4.0), white ethnicity (OR, 2.4; 95% CI, 1.4-4.1), experience of 5 years or longer (OR, 2.2; 95% CI, 1.3-3.8), weekend work (OR, 1.8; 95% CI, 1.1-3.1), and sexual remarks made to the worker by clients (OR, 2.0; 95% CI, 1.2-3.5). Receipt of weapon threats was associated with worker's male sex (OR, 5.7; 95% CI, 2.4-15.3), white ethnicity (OR, 4.0; 95% CI, 1.8-9.3), age of 40 years or older (OR, 2.5; 95% CI, 1.1-5.8), work experience of 5 years or longer (OR, 2.7; 95% CI, 1.2-6.0), rural work (OR, 3.6; 95% CI, 1.3-10.1), being alone with the opposite sex (OR, 3.7; 95% CI, 1.6-9.7), and interaction with homeless clients (OR, 5.2; 95% CI, 1.7-18.8). Physical attacks were associated with sexual remarks made to the worker by clients (OR, 4.2; 95% CI, 1.4-13.9). No risk factors predicting rape were identified. Conclusions.— Violence directed toward public field-workers is a common occupational hazard. An assessment of what situations, clients, and locations pose the risk of violence to public health workers is needed.
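
Several abstracts in this list report adjusted odds ratios from multiple logistic regression, where each predictor's OR is exp(coefficient) after controlling for the others. A minimal sketch in Python with statsmodels on simulated data; the variable names (male, white, years5) and effect sizes are hypothetical stand-ins for the risk factors above, not the study's actual dataset.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 364  # number of respondents in the abstract
    # Hypothetical binary predictors mirroring the reported risk factors.
    df = pd.DataFrame({
        "male": rng.integers(0, 2, n),
        "white": rng.integers(0, 2, n),
        "years5": rng.integers(0, 2, n),  # 5 or more years' experience
    })
    # True log-odds chosen so ORs land near the reported ~2.4, 2.4, 2.2.
    logit = -1.0 + 0.9*df["male"] + 0.9*df["white"] + 0.8*df["years5"]
    df["threatened"] = rng.random(n) < 1/(1 + np.exp(-logit))

    X = sm.add_constant(df[["male", "white", "years5"]])
    fit = sm.Logit(df["threatened"].astype(int), X.astype(float)).fit(disp=0)
    print(np.exp(fit.params))       # adjusted odds ratios
    print(np.exp(fit.conf_int()))   # 95% confidence intervals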

6.
Association between C-reactive protein and age-related macular degeneration   (Total citations: 9; self-citations: 0; by others: 9)
Seddon JM, Gensler G, Milton RC, Klein ML, Rifai N. JAMA. 2004;291(6):704-710.
Context  C-reactive protein (CRP) is a systemic inflammatory marker associated with risk for cardiovascular disease (CVD). Some risk factors for CVD are associated with age-related macular degeneration (AMD), but the association between CRP and AMD is unknown. Objective  To test the hypothesis that elevated CRP levels are associated with an increased risk for AMD. Design, Setting, and Participants  A total of 930 (91%) of 1026 participants at 2 centers in the Age-Related Eye Disease Study (AREDS), a multicenter randomized trial of antioxidant vitamins and minerals, were enrolled in this case-control study. There were 183 individuals without any maculopathy, 200 with mild maculopathy, 325 with intermediate disease, and 222 with advanced AMD (geographic atrophy or neovascular AMD). The AMD status was assessed by standardized grading of fundus photographs, and stored fasting blood specimens drawn between January 1996 and April 1997 were analyzed for high-sensitivity CRP levels. Main Outcome Measure  Association between CRP and AMD. Results  The CRP levels were significantly higher among participants with advanced AMD (case patients) than among those with no AMD (controls; median values, 3.4 vs 2.7 mg/L; P = .02). After adjustment for age, sex, and other variables, including smoking and body mass index, CRP levels were significantly associated with the presence of intermediate and advanced stages of AMD. The odds ratio (OR) for the highest vs the lowest quartile of CRP was 1.65 (95% confidence interval [CI], 1.07-2.55; P for trend = .02). The OR for CRP values at or above the 90th percentile (10.6 mg/L) was 1.92 (95% CI, 1.20-3.06), and the OR for CRP values at or above the mean plus 2 SDs (16.8 mg/L) was 2.03 (95% CI, 1.03-4.00). A trend for an increased risk for intermediate and advanced AMD with higher levels of CRP was seen for smokers (OR, 2.16; 95% CI, 1.33-3.49) and those who never smoked (OR, 2.03; 95% CI, 1.19-3.46) with the highest level of CRP. Conclusion  Our results suggest that elevated CRP level is an independent risk factor for AMD and may implicate the role of inflammation in the pathogenesis of AMD.

7.
Context  Aspirin therapy reduces the risk of cardiovascular disease in adults who are at increased risk. However, it is unclear if women derive the same benefit as men. Objective  To determine if the benefits and risks of aspirin treatment in the primary prevention of cardiovascular disease vary by sex. Data Sources and Study Selection  MEDLINE and the Cochrane Central Register of Controlled Trials databases (1966 to March 2005), bibliographies of retrieved trials, and reports presented at major scientific meetings. Eligible studies were prospective, randomized controlled trials of aspirin therapy in participants without cardiovascular disease that reported data on myocardial infarction (MI), stroke, and cardiovascular mortality. Six trials with a total of 95 456 individuals were identified; 3 trials included only men, 1 included only women, and 2 included both sexes. Data Extraction  Studies were reviewed to determine the number of patients randomized, mean duration of follow-up, and end points (a composite of cardiovascular events [nonfatal MI, nonfatal stroke, and cardiovascular mortality], each of these individual components separately, and major bleeding). Data Synthesis  Among 51 342 women, there were 1285 major cardiovascular events: 625 strokes, 469 MIs, and 364 cardiovascular deaths. Aspirin therapy was associated with a significant 12% reduction in cardiovascular events (odds ratio [OR], 0.88; 95% confidence interval [CI], 0.79-0.99; P = .03) and a 17% reduction in stroke (OR, 0.83; 95% CI, 0.70-0.97; P = .02), which was a reflection of reduced rates of ischemic stroke (OR, 0.76; 95% CI, 0.63-0.93; P = .008). There was no significant effect on MI or cardiovascular mortality. Among 44 114 men, there were 2047 major cardiovascular events: 597 strokes, 1023 MIs, and 776 cardiovascular deaths. Aspirin therapy was associated with a significant 14% reduction in cardiovascular events (OR, 0.86; 95% CI, 0.78-0.94; P = .01) and a 32% reduction in MI (OR, 0.68; 95% CI, 0.54-0.86; P = .001). There was no significant effect on stroke or cardiovascular mortality. Aspirin treatment increased the risk of bleeding in women (OR, 1.68; 95% CI, 1.13-2.52; P = .01) and in men (OR, 1.72; 95% CI, 1.35-2.20; P<.001). Conclusions  For women and men, aspirin therapy reduced the risk of a composite of cardiovascular events due to its effect on reducing the risk of ischemic stroke in women and MI in men. Aspirin significantly increased the risk of bleeding to a similar degree among women and men.

8.
Feikin DR, Lezotte DC, Hamman RF, Salmon DA, Chen RT, Hoffman RE. JAMA. 2000;284(24):3145-3150.
Context  The risk of vaccine-preventable diseases among children who have philosophical and religious exemptions from immunization has been understudied. Objectives  To evaluate whether personal exemption from immunization is associated with risk of measles and pertussis at individual and community levels. Design, Setting, and Participants  Population-based, retrospective cohort study using data collected on standardized forms regarding all reported measles and pertussis cases among children aged 3 to 18 years in Colorado during 1987-1998. Main Outcome Measures  Relative risk of measles and pertussis among exemptors and vaccinated children; association between incidence rates among vaccinated children and frequency of exemptors in Colorado counties; association between school outbreaks and frequency of exemptors in schools; and risk associated with exposure to an exemptor in measles outbreaks. Results  Exemptors were 22.2 times (95% confidence interval [CI], 15.9-31.1) more likely to acquire measles and 5.9 times (95% CI, 4.2-8.2) more likely to acquire pertussis than vaccinated children. After adjusting for confounders, the frequency of exemptors in a county was associated with the incidence rate of measles (relative risk [RR], 1.6; 95% CI, 1.0-2.4) and pertussis (RR, 1.9; 95% CI, 1.7-2.1) in vaccinated children. Schools with pertussis outbreaks had more exemptors (mean, 4.3% of students) than schools without outbreaks (1.5% of students; P = .001). At least 11% of vaccinated children in measles outbreaks acquired infection through contact with an exemptor. Conclusions  The risk of measles and pertussis is elevated in personal exemptors. Public health personnel should recognize the potential effect of exemptors in outbreaks in their communities, and parents should be made aware of the risks involved in not vaccinating their children.
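
The 22.2-fold figure above is a relative risk (a ratio of attack rates in a cohort), not an odds ratio. A minimal sketch in Python of an RR with its log-scale 95% CI (Katz method); the counts are hypothetical, chosen only to reproduce an RR near 22.2, since the abstract reports the estimates but not the underlying cells.

    import math

    def relative_risk(cases_exp, n_exp, cases_unexp, n_unexp):
        """RR with a 95% CI computed on the log scale (Katz method)."""
        rr = (cases_exp / n_exp) / (cases_unexp / n_unexp)
        se = math.sqrt(1/cases_exp - 1/n_exp + 1/cases_unexp - 1/n_unexp)
        lo = math.exp(math.log(rr) - 1.96 * se)
        hi = math.exp(math.log(rr) + 1.96 * se)
        return rr, lo, hi

    # Hypothetical counts for illustration only.
    print(relative_risk(cases_exp=30, n_exp=1000,
                        cases_unexp=27, n_unexp=20000))  # RR ~ 22.2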

9.
Context  While gluten ingestion is responsible for the signs and symptoms of celiac disease, it is not known what factors are associated with initial appearance of the disease. Objective  To examine whether the timing of gluten exposure in the infant diet was associated with the development of celiac disease autoimmunity (CDA). Design, Setting, and Patients  Prospective observational study conducted in Denver, Colo, from 1994-2004 of 1560 children at increased risk for celiac disease or type 1 diabetes, as defined by possession of either HLA-DR3 or DR4 alleles, or having a first-degree relative with type 1 diabetes. The mean follow-up was 4.8 years. Main Outcome Measure  Risk of CDA defined as being positive for tissue transglutaminase (tTG) autoantibody on 2 or more consecutive visits or being positive for tTG once and having a positive small bowel biopsy for celiac disease, by timing of introduction of gluten-containing foods into the diet. Results  Fifty-one children developed CDA. Findings adjusted for HLA-DR3 status indicated that children exposed to foods containing wheat, barley, or rye (gluten-containing foods) in the first 3 months of life (3 [6%] CDA positive vs 40 [3%] CDA negative) had a 5-fold increased risk of CDA compared with children exposed to gluten-containing foods at 4 to 6 months (12 [23%] CDA positive vs 574 [38%] CDA negative) (hazard ratio [HR], 5.17; 95% confidence interval [CI], 1.44-18.57). Children not exposed to gluten until the seventh month or later (36 [71%] CDA positive vs 895 [59%] CDA negative) had a marginally increased risk of CDA compared with those exposed at 4 to 6 months (HR, 1.87; 95% CI, 0.97-3.60). After restricting our case group to only the 25 CDA-positive children who had biopsy-diagnosed celiac disease, initial exposure to wheat, barley, or rye in the first 3 months (3 [12%] CDA positive vs 40 [3%] CDA negative) or in the seventh month or later (19 [76%] CDA positive vs 912 [59%] CDA negative) significantly increased risk of CDA compared with exposure at 4 to 6 months (3 [12%] CDA positive vs 583 [38%] CDA negative) (HR, 22.97; 95% CI, 4.55-115.93; P = .001; and HR, 3.98; 95% CI, 1.18-13.46; P = .04, respectively). Conclusion  Timing of introduction of gluten into the infant diet is associated with the appearance of CDA in children at increased risk for the disease.
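
Hazard ratios like those above come from Cox proportional-hazards models of time to the outcome, not from 2x2 tables. A minimal sketch in Python using the lifelines package on simulated data; the column names, effect size, and follow-up length are hypothetical, not the study's.

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(2)
    n = 1560
    early = rng.integers(0, 2, n)  # e.g., gluten exposure in first 3 months
    # Exponential event times with a higher hazard for early exposure.
    t = rng.exponential(1 / (0.01 * np.exp(1.6 * early)))
    event = t < 10                 # administrative censoring at 10 years
    df = pd.DataFrame({"early": early,
                       "time": np.minimum(t, 10),
                       "event": event.astype(int)})

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="event")
    print(np.exp(cph.params_))     # fitted hazard ratio, near exp(1.6) ~ 5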

10.
Self-reported Antiretroviral Therapy in Injection Drug Users   (Total citations: 15; self-citations: 2; by others: 13)
Context.— The US Public Health Service and the International AIDS Society–USA recently published recommendations for antiretroviral therapy (ART) for persons infected with human immunodeficiency virus (HIV); however, anecdotal evidence suggests that HIV-infected injection drug users (IDUs) may not be receiving optimal care as defined by the recommendations. Objective.— To assess ART use in HIV-infected IDUs. Design.— A cross-sectional survey of self-reported ART use between July 1996 and June 1997 in IDUs. Setting.— A community-based clinic affiliated with Johns Hopkins University, Baltimore, Md. Participants.— A total of 404 HIV-infected IDUs with CD4+ cell counts less than 0.50 × 10⁹/L recruited into a longitudinal study in 1988 and 1989. Main Outcome Measure.— Self-reported ART use was assessed: no current therapy, monotherapy, or combination therapy with or without a protease inhibitor. Results.— One half (199/404 [49%]) of patients reported no recent ART. A total of 14% (58/404) had monotherapy, 23% (90/404) were receiving combination therapy without a protease inhibitor, and 14% (57/404) had triple-combination therapy with a protease inhibitor. A multivariate analysis of factors associated with ART showed that care continuity and recent HIV-related outpatient visit (odds ratio [OR], 4.30; 95% confidence interval [CI], 2.36-7.81 and OR, 2.84; 95% CI, 1.66-4.88, respectively), CD4+ cell count of less than 0.20 × 10⁹/L (OR, 2.41; 95% CI, 1.51-3.84), no current drug use and being in drug treatment (OR, 2.16; 95% CI, 1.34-3.47; OR, 2.12; 95% CI, 1.23-3.66, respectively), and unemployment (OR, 2.31; 95% CI, 1.21-4.40) were associated with reporting ART use. In other analysis, less likely to receive protease inhibitors were current drug injectors (OR, 0.5; 95% CI, 0.3-1.0) and those recently incarcerated (OR, 0.2; 95% CI, 0.03-0.9), but patients with acquired immunodeficiency syndrome were more likely to receive protease inhibitors (OR, 2.0; 95% CI, 0.9-4.6). Protease inhibitor use doubled (P<.01) from July–December 1996 to January–June 1997 (7.7% and 14.8%, respectively). Conclusions.— Those IDUs infected with HIV who were not receiving ART tended to be active drug users without clinical disease who have less contact with health care providers. Although we do not have information on clinical judgment regarding treatment decisions or whether persons were prescribed therapy not taken, the proportion of subjects reporting receiving ART suggests that strategies for improving treatment in this population are indicated. Expanding simultaneous treatment services for HIV infection and substance abuse would enhance the response to these related epidemics.

11.
Context  Although acute renal failure (ARF) is believed to be common in the setting of critical illness and is associated with a high risk of death, little is known about its epidemiology and outcome or how these vary in different regions of the world. Objectives  To determine the period prevalence of ARF in intensive care unit (ICU) patients in multiple countries; to characterize differences in etiology, illness severity, and clinical practice; and to determine the impact of these differences on patient outcomes. Design, Setting, and Patients  Prospective observational study of ICU patients who either were treated with renal replacement therapy (RRT) or fulfilled at least 1 of the predefined criteria for ARF from September 2000 to December 2001 at 54 hospitals in 23 countries. Main Outcome Measures  Occurrence of ARF, factors contributing to etiology, illness severity, treatment, need for renal support after hospital discharge, and hospital mortality. Results  Of 29 269 critically ill patients admitted during the study period, 1738 (5.7%; 95% confidence interval [CI], 5.5%-6.0%) had ARF during their ICU stay, including 1260 who were treated with RRT. The most common contributing factor to ARF was septic shock (47.5%; 95% CI, 45.2%-49.5%). Approximately 30% of patients had preadmission renal dysfunction. Overall hospital mortality was 60.3% (95% CI, 58.0%-62.6%). Dialysis dependence at hospital discharge was 13.8% (95% CI, 11.2%-16.3%) for survivors. Independent risk factors for hospital mortality included use of vasopressors (odds ratio [OR], 1.95; 95% CI, 1.50-2.55; P<.001), mechanical ventilation (OR, 2.11; 95% CI, 1.58-2.82; P<.001), septic shock (OR, 1.36; 95% CI, 1.03-1.79; P = .03), cardiogenic shock (OR, 1.41; 95% CI, 1.05-1.90; P = .02), and hepatorenal syndrome (OR, 1.87; 95% CI, 1.07-3.28; P = .03). Conclusion  In this multinational study, the period prevalence of ARF requiring RRT in the ICU was between 5% and 6% and was associated with a high hospital mortality rate.

12.
Context  Infection with Epstein-Barr virus (EBV) has been associated with an increased risk of multiple sclerosis (MS), but the temporal relationship remains unclear. Objective  To determine whether antibodies to EBV are elevated before the onset of MS. Design, Setting, and Participants  Nested case-control study conducted among more than 3 million US military personnel with blood samples collected between 1988 and 2000 and stored in the Department of Defense Serum Repository. Cases were identified as individuals granted temporary or permanent disability because of MS. For each case (n = 83), 2 controls matched by age, sex, race/ethnicity, and dates of blood sample collection were selected. Serial samples collected before the onset of symptoms were available for 69 matched case-control sets. Main Outcome Measures  Antibodies including IgA against EBV viral capsid antigen (VCA), and IgG against VCA, nuclear antigens (EBNA complex, EBNA-1, and EBNA-2), diffuse and restricted early antigens, and cytomegalovirus. Results  The average time between blood collection and MS onset was 4 years (range, <1-11 years). The strongest predictors of MS were serum levels of IgG antibodies to EBNA complex or EBNA-1. Among individuals who developed MS, serum antibody titers to EBNA complex were similar to those of controls before the age of 20 years (geometric mean titers: cases = 245, controls = 265), but 2- to 3-fold higher at age 25 years and older (cases = 684, controls = 282; P<.001). The risk of MS increased with these antibody titers; the relative risk (RR) in persons with EBNA complex titers of at least 1280 compared with those with titers less than 80 was 9.4 (95% confidence interval [CI], 2.5-35.4; P for trend <.001). In longitudinal analyses, a 4-fold increase in anti-EBNA complex or anti–EBNA-1 titers during the follow-up was associated with a 3-fold increase in MS risk (EBNA complex: RR, 3.0; 95% CI, 1.3-6.5; EBNA-1: RR, 3.0; 95% CI, 1.2-7.3). No association was found between cytomegalovirus antibodies and MS. Conclusion  These results suggest an age-dependent relationship between EBV infection and development of MS.

13.
Multiple sclerosis and Epstein-Barr virus   (Total citations: 9; self-citations: 0; by others: 9)
Context  Infection with Epstein-Barr virus (EBV) has been associated with an increased risk of multiple sclerosis (MS), but the temporal relationship remains unclear. Objective  To determine whether antibodies to EBV are elevated before the onset of MS. Design, Setting, and Population  Nested case-control study conducted among more than 3 million US military personnel with blood samples collected between 1988 and 2000 and stored in the Department of Defense Serum Repository. Cases were identified as individuals granted temporary or permanent disability because of MS. For each case (n = 83), 2 controls matched by age, sex, race/ethnicity, and dates of blood sample collection were selected. Main Outcome Measures  Antibodies including IgA against EBV viral capsid antigen (VCA) and IgG against VCA, nuclear antigens (EBNA complex, EBNA-1, and EBNA-2), diffuse and restricted early antigens, and cytomegalovirus. Results  The average time between blood collection and MS onset was 4 years. The strongest predictors of MS were serum levels of IgG antibodies to VCA or EBNA complex. The risk of MS increased monotonically with these antibody titers; relative risk (RR) in persons in the highest category of VCA (≥2560) compared with those in the lowest (≤160) was 19.7 (95% confidence interval [CI], 2.2-174; P for trend = .004). For EBNA complex titers, the RR for those in the highest category (≥1280) was 33.9 (95% CI, 4.1-283; P for trend <.001) vs those in the lowest category (≤40). Similarly strong positive associations between EBV antibodies and risk of MS were already present in samples collected 5 or more years before MS onset. No association was found between cytomegalovirus antibodies and MS. Conclusion  These results suggest a relationship between EBV infection and development of MS.

14.
Context  Randomized trials have demonstrated that adding a drug to a single-agent or to a 2-agent regimen increased the tumor response rate in patients with advanced non–small-cell lung cancer (NSCLC), although its impact on survival remains controversial. Objective  To evaluate the clinical benefit of adding a drug to a single-agent or 2-agent chemotherapy regimen in terms of tumor response rate, survival, and toxicity in patients with advanced NSCLC. Data Sources and Study Selection  Data from all randomized controlled trials performed between 1980 and 2001 (published between January 1980 and October 2003) comparing a doublet regimen with a single-agent regimen or comparing a triplet regimen with a doublet regimen in patients with advanced NSCLC. There were no language restrictions. Searches of MEDLINE and EMBASE were performed using the search terms non–small-cell lung carcinoma/drug therapy, adenocarcinoma, large-cell carcinoma, squamous-cell carcinoma, lung, neoplasms, clinical trial phase III, and randomized trial. Manual searches were also performed to find conference proceedings published between January 1982 and October 2003. Data Extraction  Two independent investigators reviewed the publications and extracted the data. Pooled odds ratios (ORs) for the objective tumor response rate, 1-year survival rate, and toxicity rate were calculated using the fixed-effect model. Pooled median ratios (MRs) for median survival also were calculated using the fixed-effect model. ORs and MRs lower than unity (<1.0) indicate a benefit of a doublet regimen compared with a single-agent regimen (or a triplet regimen compared with a doublet regimen). Data Synthesis  Sixty-five trials (13 601 patients) were eligible. In the trials comparing a doublet regimen with a single-agent regimen, a significant increase was observed in tumor response (OR, 0.42; 95% confidence interval [CI], 0.37-0.47; P<.001) and 1-year survival (OR, 0.80; 95% CI, 0.70-0.91; P<.001) in favor of the doublet regimen. The median survival ratio was 0.83 (95% CI, 0.79-0.89; P<.001). An increase also was observed in the tumor response rate (OR, 0.66; 95% CI, 0.58-0.75; P<.001) in favor of the triplet regimen, but not for 1-year survival (OR, 1.01; 95% CI, 0.85-1.21; P = .88). The median survival ratio was 1.00 (95% CI, 0.94-1.06; P = .97). Conclusion  Adding a second drug improved tumor response and survival rate. Adding a third drug had a weaker effect on tumor response and no effect on survival.
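
The fixed-effect pooling named here combines each trial's log odds ratio, weighted by the inverse of its variance. A minimal sketch in Python; the per-trial ORs and CIs are hypothetical placeholders, not the 65 trials actually analyzed.

    import math

    # Hypothetical per-trial (OR, CI lower, CI upper) tuples for illustration.
    trials = [(0.45, 0.30, 0.68), (0.38, 0.28, 0.52), (0.50, 0.33, 0.76)]

    num = den = 0.0
    for or_, lo, hi in trials:
        log_or = math.log(or_)
        # Recover the SE of log(OR) from the width of the 95% CI.
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1 / se**2          # inverse-variance weight
        num += w * log_or
        den += w

    pooled = math.exp(num / den)
    se_pooled = math.sqrt(1 / den)
    ci = (math.exp(num/den - 1.96*se_pooled),
          math.exp(num/den + 1.96*se_pooled))
    print(f"pooled OR = {pooled:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")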

15.
Li S, Chen W, Srinivasan SR, Bond MG, Tang R, Urbina EM, Berenson GS. JAMA. 2003;290(17):2271-2276.
Context  Carotid artery intima-media thickness (IMT) is associated with cardiovascular risk factors and is recognized as an important predictive measure of clinical coronary atherosclerosis events in middle-aged and elderly populations. However, information on the association of carotid IMT in young adults with different risk factors measured in childhood, adulthood, or as a cumulative burden of each of the risk factors measured serially from childhood to adulthood is limited. Objective  To examine the association between carotid IMT in young adults and traditional cardiovascular risk factors measured since childhood. Design, Setting, and Participants  A cohort study of 486 adults aged 25 to 37 years from a semirural black and white community in Bogalusa, La (71% white, 39% men), who had at least 3 measurements of traditional risk factors since childhood, conducted between September 1973 and December 1996. Main Outcome Measure  Association of carotid IMT with risk factors, including systolic blood pressure, lipoprotein levels, and body mass index. Results  Male vs female (0.757 mm vs 0.719 mm) and black vs white (0.760 mm vs 0.723 mm) participants had increased carotid IMT (P<.001 for both). In multivariable analyses, significant predictors for being in top vs lower 3 quartiles of carotid IMT in young adults were childhood measures of low-density lipoprotein cholesterol (LDL-C) level (odds ratio [OR], 1.42, corresponding to 1-SD change specific for age, race, and sex; 95% confidence interval [CI], 1.14-1.78) and body mass index (BMI; OR, 1.25; 95% CI, 1.01-1.54); adulthood measures of LDL-C level (OR, 1.46; 95% CI, 1.16-1.82), high-density lipoprotein cholesterol (HDL-C) level (OR, 0.67; 95% CI, 0.51-0.88), and systolic blood pressure (OR, 1.36; 95% CI, 1.08-1.72); and long-term cumulative burden of LDL-C (OR, 1.58; 95% CI, 1.24-2.01) and HDL-C (OR, 0.75; 95% CI, 0.58-0.97) levels measured serially from childhood to adulthood. An increasing trend in carotid IMT across quartiles of LDL-C level measured in childhood was observed, with a mean value of 0.761 mm (95% CI, 0.743-0.780 mm) for those at the top quartile vs 0.724 mm (95% CI, 0.715-0.734 mm) for those in the lower 3 quartiles (P<.001). Conclusions  Childhood measures of LDL-C level and BMI predict carotid IMT in young adults. The prevention implications of these findings remain to be explored.

16.
Context  Patients with unstable angina or non–ST-segment elevation myocardial infarction (NSTEMI) can be cared for with a routine invasive strategy involving coronary angiography and revascularization or more conservatively with a selective invasive strategy in which only those with recurrent or inducible ischemia are referred for acute intervention. Objective  To conduct a meta-analysis that compares benefits and risks of routine invasive vs selective invasive strategies. Data Sources  Randomized controlled trials identified through search of MEDLINE and the Cochrane databases (1970 through June 2004) and hand searching of cross-references from original articles and reviews. Study Selection  Trials were included that involved patients with unstable angina or NSTEMI who received a routine invasive or a selective invasive strategy. Data Extraction  Major outcomes of death and myocardial infarction (MI) occurring from initial hospitalization to the end of follow-up were extracted from published results of eligible trials. Data Synthesis  A total of 7 trials (N = 9212 patients) were eligible. Overall, death or MI was reduced from 663 (14.4%) of 4604 patients in the selective invasive group to 561 (12.2%) of 4608 patients in the routine invasive group (odds ratio [OR], 0.82; 95% confidence interval [CI], 0.72-0.93; P = .001). There was a nonsignificant trend toward fewer deaths (6.0% vs 5.5%; OR, 0.92; 95% CI, 0.77-1.09; P = .33) and a significant reduction in MI alone (9.4% vs 7.3%; OR, 0.75; 95% CI, 0.65-0.88; P<.001). Higher-risk patients with elevated cardiac biomarker levels at baseline benefited more from routine intervention, with no significant benefit observed in lower-risk patients with negative baseline marker levels. During the initial hospitalization, a routine invasive strategy was associated with a significantly higher early mortality (1.1% vs 1.8% for selective vs routine, respectively; OR, 1.60; 95% CI, 1.14-2.25; P = .007) and the composite of death or MI (3.8% vs 5.2%; OR, 1.36; 95% CI, 1.12-1.66; P = .002). But after discharge, the routine invasive strategy was associated with fewer subsequent deaths (4.9% vs 3.8%; OR, 0.76; 95% CI, 0.62-0.94; P = .01) and the composite of death or MI (11.0% vs 7.4%; OR, 0.64; 95% CI, 0.56-0.75; P<.001). At the end of follow-up, there was a 33% reduction in severe angina (14.0% vs 11.2%; OR, 0.77; 95% CI, 0.68-0.87; P<.001) and a 34% reduction in rehospitalization (41.3% vs 32.5%; OR, 0.66; 95% CI, 0.60-0.72; P<.001) with a routine invasive strategy. Conclusions  A routine invasive strategy exceeded a selective invasive strategy in reducing MI, severe angina, and rehospitalization over a mean follow-up of 17 months. But routine intervention was associated with a higher early mortality hazard and a trend toward a mortality reduction at follow-up. Future strategies should explore ways to minimize the early hazard and enhance later benefits by focusing on higher-risk patients and optimizing timing of intervention and use of proven therapies.
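
As a quick check, the headline estimate can be reproduced approximately from the aggregate event counts given in the abstract; the published OR of 0.82 comes from trial-level pooling, but the crude calculation lands on the same value. A sketch in Python:

    # Death or MI: 561 of 4608 (routine invasive) vs 663 of 4604 (selective).
    a, n1 = 561, 4608   # routine invasive: events, total
    c, n0 = 663, 4604   # selective invasive: events, total

    crude_or = (a / (n1 - a)) / (c / (n0 - c))
    print(f"crude OR = {crude_or:.2f}")   # ~0.82, matching the pooled estimate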

17.
Context  Plasma human immunodeficiency virus (HIV) RNA level predicts HIV disease progression, but the extent to which it explains the variability in rate of CD4 cell depletion is poorly characterized. Objective  To estimate the proportion of variability in rate of CD4 cell loss predicted by presenting plasma HIV RNA levels in untreated HIV-infected persons. Design  Repeated-measures analyses of 2 multicenter cohorts, comprising observations beginning on May 12, 1984, and ending on August 26, 2004. Analyses were conducted between August 2004 and March 2006. Setting  Two cohorts of HIV-infected persons: patients followed up at 4 US teaching medical institutions or participating in either the Research in Access to Care for the Homeless Cohort (REACH) or the San Francisco Men's Health Study (SFMHS) cohorts and participants in the Multicenter AIDS Cohort Study (MACS) cohort. Participants  Antiretroviral treatment–naive, chronically HIV-infected persons (n = 1289 and n = 1512 for each of the 2 cohorts) untreated during the observation period (≥6 months) and with at least 1 HIV RNA level and 2 CD4 cell counts available. Approximately 35% were nonwhite, and 35% had risk factors other than male-to-male sexual contact. Main Outcome Measures  The extent to which presenting plasma HIV RNA level could explain the rate of model-derived yearly CD4 cell loss, as estimated by the coefficient of determination (R²). Results  In both cohorts, higher presenting HIV RNA levels were associated with greater subsequent CD4 cell decline. In the study cohort, median model–estimated CD4 cell decrease among participants with HIV RNA levels of 500 or less, 501 to 2000, 2001 to 10 000, 10 001 to 40 000, and more than 40 000 copies/mL were 20, 39, 48, 56, and 78 cells/µL, respectively. Despite this trend across broad categories of HIV RNA levels, only a small proportion of CD4 cell loss variability (4%-6%) could be explained by presenting plasma HIV RNA level. Analyses using multiple HIV RNA measurements or restricting to participants with high HIV RNA levels improved this correlation minimally (R², 0.09), and measurement error was estimated to attenuate these associations only marginally (deattenuated R² in the 2 cohorts, 0.05 and 0.08, respectively). Conclusions  Presenting HIV RNA level predicts the rate of CD4 cell decline only minimally in untreated persons. Other factors, as yet undefined, likely drive CD4 cell losses in HIV infection. These findings have implications for treatment decisions in HIV infection and for understanding the pathogenesis of progressive immune deficiency.
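
The key quantity here is the coefficient of determination: the share of variance in yearly CD4 loss explained by baseline HIV RNA. A minimal sketch in Python showing how a 4%-6% R² can coexist with a clear trend across RNA categories; the data are simulated, not the cohorts' (a weak slope buried in large between-person noise).

    import numpy as np

    rng = np.random.default_rng(1)
    n = 1289
    log_rna = rng.normal(4.0, 1.0, n)    # log10 HIV RNA at presentation
    # Simulated yearly CD4 decline: weak dependence on RNA, large noise.
    cd4_loss = 15 * log_rna + rng.normal(0, 70, n)

    # For simple linear regression, R^2 equals the squared correlation.
    r = np.corrcoef(log_rna, cd4_loss)[0, 1]
    print(f"R^2 = {r**2:.3f}")   # small, on the order of the reported 0.04-0.06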

18.
Migraine and risk of cardiovascular disease in women   (Total citations: 9; self-citations: 0; by others: 9)
Context  Migraine with aura has been associated with an adverse cardiovascular risk profile and prothrombotic factors that, along with migraine-specific physiology, may increase the risk of vascular events. Although migraine with aura has been associated with increased risk of ischemic stroke, an association with cardiovascular disease (CVD) and, specifically, coronary events remains unclear. Objective  To evaluate the association between migraine with and without aura and subsequent risk of overall and specific CVD. Design, Setting, and Participants  Prospective cohort study of 27 840 US women aged 45 years or older who were participating in the Women's Health Study, were free of CVD and angina at study entry (1992-1995), and who had information on self-reported migraine and aura status, and lipid measurements. This report is based on follow-up data through March 31, 2004. Main Outcome Measures  The primary outcome measure was the combined end point of major CVD (first instance of nonfatal ischemic stroke, nonfatal myocardial infarction, or death due to ischemic CVD); other measures were first ischemic stroke, myocardial infarction, coronary revascularization, angina, and death due to ischemic CVD. Results  At baseline, 5125 women (18.4%) reported any history of migraine; of the 3610 with active migraine (migraine in the prior year), 1434 (39.7%) indicated aura symptoms. During a mean of 10 years of follow-up, 580 major CVD events occurred. Compared with women with no migraine history, women who reported active migraine with aura had multivariable-adjusted hazard ratios of 2.15 (95% confidence interval [CI], 1.58-2.92; P<.001) for major CVD, 1.91 (95% CI, 1.17-3.10; P = .01) for ischemic stroke, 2.08 (95% CI, 1.30-3.31; P = .002) for myocardial infarction, 1.74 (95% CI, 1.23-2.46; P = .002) for coronary revascularization, 1.71 (95% CI, 1.16-2.53; P = .007) for angina, and 2.33 (95% CI, 1.21-4.51; P = .01) for ischemic CVD death. After adjusting for age, there were 18 additional major CVD events attributable to migraine with aura per 10 000 women per year. Women who reported active migraine without aura did not have increased risk of any vascular events or angina. Conclusions  In this large, prospective cohort of women, active migraine with aura was associated with increased risk of major CVD, myocardial infarction, ischemic stroke, and death due to ischemic CVD, as well as with coronary revascularization and angina. Active migraine without aura was not associated with increased risk of any CVD event.

19.
Context  Varenicline, a partial agonist at the α4β2 nicotinic acetylcholine receptor, has the potential to aid smoking cessation by relieving nicotine withdrawal symptoms and reducing the rewarding properties of nicotine. Objective  To determine the efficacy and safety of varenicline for smoking cessation compared with placebo or sustained-release bupropion (bupropion SR). Design, Setting, and Participants  A randomized, double-blind, placebo-controlled trial conducted between June 2003 and March 2005 at 14 research centers with a 12-week treatment period and follow-up of smoking status to week 52. Of 1413 adult smokers who volunteered for the study, 1027 were enrolled; 65% of randomized participants completed the study. Intervention  Varenicline titrated to 1 mg twice daily (n = 344) or bupropion SR titrated to 150 mg twice daily (n = 342) or placebo (n = 341) for 12 weeks, plus weekly brief smoking cessation counseling. Main Outcome Measures  Continuous abstinence from smoking during the last 4 weeks of treatment (weeks 9-12; primary end point) and through the follow-up period (weeks 9-24 and 9-52). Results  During the last 4 weeks of treatment (weeks 9-12), 43.9% of participants in the varenicline group were continuously abstinent from smoking compared with 17.6% in the placebo group (odds ratio [OR], 3.85; 95% confidence interval [CI], 2.69-5.50; P<.001) and 29.8% in the bupropion SR group (OR, 1.90; 95% CI, 1.38-2.62; P<.001). For weeks 9 through 24, 29.7% of participants in the varenicline group were continuously abstinent compared with 13.2% in the placebo group (OR, 2.83; 95% CI, 1.91-4.19; P<.001) and 20.2% in the bupropion group (OR, 1.69; 95% CI, 1.19-2.42; P = .003). For weeks 9 through 52, 23% of participants in the varenicline group were continuously abstinent compared with 10.3% in the placebo group (OR, 2.66; 95% CI, 1.72-4.11; P<.001) and 14.6% in the bupropion SR group (OR, 1.77; 95% CI, 1.19-2.63; P = .004). Treatment was discontinued due to adverse events by 10.5% of participants in the varenicline group, 12.6% in the bupropion SR group, and 7.3% in the placebo group. The most common adverse event with varenicline was nausea, which occurred in 101 participants (29.4%). Conclusions  Varenicline is an efficacious, safe, and well-tolerated smoking cessation pharmacotherapy. Varenicline's short-term and long-term efficacy exceeded that of both placebo and bupropion SR. Trial Registration  clinicaltrials.gov Identifier: NCT00143364
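
One way to read the week 9-12 abstinence rates is as a number needed to treat: the reciprocal of the absolute difference in abstinence proportions. The NNT is our added illustration from the reported percentages, not a statistic in the trial report. A sketch in Python:

    p_varenicline = 0.439  # continuous abstinence, weeks 9-12
    p_placebo = 0.176

    diff = p_varenicline - p_placebo   # absolute difference in abstinence
    nnt = 1 / diff
    print(f"risk difference = {diff:.3f}, NNT = {nnt:.1f}")  # about 4 smokers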

20.
Chorioamnionitis and cerebral palsy in term and near-term infants   (Total citations: 13; self-citations: 0; by others: 13)
Wu YW, Escobar GJ, Grether JK, Croen LA, Greene JD, Newman TB. JAMA. 2003;290(20):2677-2684.
Context  Half of all cases of cerebral palsy (CP) occur in term infants, for whom risk factors have not been clearly defined. Recent studies suggest a possible role of chorioamnionitis. Objective  To determine whether clinical chorioamnionitis increases the risk of CP in term and near-term infants. Design, Setting, and Patients  Case-control study nested within a cohort of 231 582 singleton infants born at 36 or more weeks' gestation between January 1, 1991, and December 31, 1998, in the Kaiser Permanente Medical Care Program, a managed care organization providing care for more than 3 million residents of northern California. Case patients were identified from electronic records and confirmed by chart review by a child neurologist, and comprised all children with moderate to severe spastic or dyskinetic CP not due to postnatal brain injury or developmental abnormalities (n = 109). Controls (n = 218) were randomly selected from the study population. Main Outcome Measure  Association between clinical chorioamnionitis and increased risk of CP in term and near-term infants. Results  Most CP cases had hemiparesis (40%) or quadriparesis (38%); 87% had been diagnosed by a neurologist and 83% had undergone neuroimaging. Chorioamnionitis, considered present if a treating physician made a diagnosis of chorioamnionitis or endometritis clinically, was noted in 14% of cases and 4% of controls (odds ratio [OR], 3.8; 95% confidence interval [CI], 1.5-10.1; P = .001). Independent risk factors identified in multiple logistic regression included chorioamnionitis (OR, 4.1; 95% CI, 1.6-10.1), intrauterine growth restriction (OR, 4.0; 95% CI, 1.3-12.0), maternal black ethnicity (OR, 3.6; 95% CI, 1.4-9.3), maternal age older than 25 years (OR, 2.6; 95% CI, 1.3-5.2), and nulliparity (OR, 1.8; 95% CI, 1.0-3.0). The population-attributable fraction of chorioamnionitis for CP is 11%. Conclusion  Our data suggest that chorioamnionitis is an independent risk factor for CP among term and near-term infants.
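
The 11% population-attributable fraction follows from the standard case-control formula PAF = p_c × (OR − 1) / OR, where p_c is the exposure prevalence among cases. A quick check in Python with the abstract's own numbers:

    p_cases_exposed = 0.14   # chorioamnionitis noted in 14% of CP cases
    or_adjusted = 4.1        # adjusted OR from the multiple logistic regression

    paf = p_cases_exposed * (or_adjusted - 1) / or_adjusted
    print(f"PAF = {paf:.1%}")   # ~10.6%, i.e., the reported 11%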
