Similar Articles
20 similar articles found
1.
BACKGROUND: Increased risk for tuberculosis (TB) disease has been identified in foreign-born persons in the United States, particularly during the first 5 years after their arrival in the United States. This could be explained by undetected TB disease at entry, increased prevalence of latent TB infection (LTBI), increased progression from LTBI to TB, or a combination of these factors. METHODS: We performed a cluster analysis of TB cases in Boston and a case-control study of risk factors for TB with an unclustered isolate among Boston residents with LTBI to determine whether such persons have an increased risk for reactivation of disease. RESULTS: Of 321 case patients with TB seen between 1996 and 2000, 133 isolates were clustered and 188 were not. In multivariate analysis, foreign birth was associated with an unclustered isolate (odds ratio [OR], 2.2; 95% confidence interval [CI], 1.2 to 3.8; p < 0.01), while being a close contact of a TB case was negatively associated (OR, 0.22; 95% CI, 0.07 to 0.73; p = 0.02). When 188 TB patients with unclustered isolates were compared to 188 age-matched control subjects with LTBI, there was no association between the occurrence of TB and foreign birth (OR, 0.71; 95% CI, 0.42 to 1.3); among foreign-born persons, there was no association between the occurrence of TB and being in the United States …

2.
Kwara A, Herold JS, Machan JT, Carter EJ. Chest 2008;133(4):862-868
BACKGROUND: The treatment of latent tuberculosis infection (LTBI) is essential for tuberculosis elimination in the United States, but the major limitation is poor adherence to therapy. To aid the design of targeted adherence interventions, we investigated the factors associated with noncompletion of isoniazid (INH) therapy for LTBI. METHODS: A retrospective analysis of patients who failed to complete vs those who completed 9 months of INH therapy at the RISE TB Clinic (Miriam Hospital; Providence, RI) in 2003 was performed. Factors associated with treatment noncompletion were examined using univariate and multiple logistic regression analysis. RESULTS: Of 845 patients with LTBI, 690 patients (81.6%) initiated INH therapy, of whom 426 patients (61.7%) completed therapy, and 246 patients (35.6%) were lost to follow-up. Treatment was discontinued in 18 patients (2.6%). Patients who failed to complete therapy were younger (mean age, 30.6 vs 33.8 years, respectively; p = 0.006), and were more likely to be uninsured (42.9% vs 29.8%, respectively; p = 0.0004), to be postpartum (66.7% vs 37.3%, respectively; p = 0.043), and to report treatment side effects (54.8% vs 30.1%, respectively; p < 0.0001). Reported treatment side effects (odds ratio [OR], 3.6; 95% confidence interval [CI], 2.2 to 6.2) and lack of medical insurance (OR, 1.7; 95% CI, 1.1 to 2.7) were each associated with treatment noncompletion in a model including both. Also, pregnant women were more likely than nonpregnant women to fail to initiate INH treatment (52.1% vs 14.7%, respectively; p < 0.0001). CONCLUSIONS: LTBI patients who are young, pregnant or postpartum, uninsured, and/or report treatment side effects may require additional case management to improve INH treatment completion rates.

3.
Eisner MD, Trupin L, Katz PP, Yelin EH, Earnest G, Balmes J, Blanc PD. Chest 2005;127(6):1890-1897
OBJECTIVE: To develop a comprehensive disease-specific COPD severity instrument for survey-based epidemiologic research. STUDY DESIGN AND SETTING: Using a population-based sample of 383 US adults with self-reported physician-diagnosed COPD, we developed a disease-specific COPD severity instrument. The severity score was based on structured telephone interview responses and included five overall aspects of COPD severity: respiratory symptoms, systemic corticosteroid use, other COPD medication use, previous hospitalization or intubation, and home oxygen use. We evaluated concurrent validity by examining the association between the COPD severity score and three health status domains: pulmonary function, physical health-related quality of life (HRQL), and physical disability. Pulmonary function was available for a subgroup of the sample (FEV1, n = 49; peak expiratory flow rate [PEFR], n = 93). RESULTS: The COPD severity score had high internal consistency reliability (Cronbach alpha = 0.80). Among the 49 subjects with FEV1 data, higher COPD severity scores were associated with poorer percentage of predicted FEV1 (r = -0.40, p = 0.005). In the 93 subjects with available PEFR measurements, greater COPD severity was also related to worse percentage of predicted PEFR (r = -0.35, p < 0.001). Higher COPD severity scores were strongly associated with poorer physical HRQL (r = -0.58, p < 0.0001) and greater restricted activity attributed to a respiratory condition (r = 0.59, p < 0.0001). Higher COPD severity scores were also associated with a greater risk of difficulty with activities of daily living (odds ratio [OR], 2.3; 95% confidence interval [CI], 1.8 to 3.0) and inability to work (OR, 4.2; 95% CI, 3.0 to 5.8). CONCLUSION: The COPD severity score is a reliable and valid measure of disease severity, making it a useful research tool. The severity score, which does not require pulmonary function measurement, can be used as a study outcome or to adjust for disease severity.
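The internal-consistency statistic quoted above (Cronbach alpha = 0.80) is simple to compute for any multi-item score. The sketch below is illustrative only: the function follows the standard Cronbach formula, but the five domain score lists and the 0-to-4 scoring are invented placeholders, not the authors' survey data.

```python
# Illustrative sketch (not the authors' instrument): Cronbach's alpha, the
# internal-consistency statistic reported for the COPD severity score.
from statistics import variance

def cronbach_alpha(items):
    """items: list of per-item score lists, all of equal length (one entry per respondent)."""
    k = len(items)
    item_vars = sum(variance(scores) for scores in items)
    totals = [sum(vals) for vals in zip(*items)]          # each respondent's summed score
    return (k / (k - 1)) * (1 - item_vars / variance(totals))

# Hypothetical responses for 6 subjects on 5 severity domains (symptoms, steroids,
# other meds, hospitalization/intubation, home oxygen), each scored 0-4.
domains = [
    [1, 2, 3, 4, 2, 1],
    [0, 2, 3, 4, 1, 1],
    [1, 1, 3, 3, 2, 0],
    [0, 1, 2, 4, 1, 0],
    [0, 0, 2, 3, 1, 0],
]
print(f"Cronbach's alpha = {cronbach_alpha(domains):.2f}")
```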

4.
Pulmonary impairment after tuberculosis
BACKGROUND: Pulmonary impairment subsequent to a cure of pulmonary tuberculosis has been described only in selected populations. METHODS: We compared pulmonary function in a case-control study of 107 prospectively identified patients with pulmonary tuberculosis who had completed at least 20 weeks of therapy and 210 patients with latent tuberculosis infection (LTBI). RESULTS: Both groups had similar risk factors for pulmonary impairment. Impairment was present in 59% of tuberculosis subjects and 20% of LTBI control subjects. FVC, FEV1, FEV1/FVC ratio, and the midexpiratory phase of forced expiratory flow were significantly lower in the treated pulmonary tuberculosis patients than in the comparison group. Ten patients with a history of pulmonary tuberculosis (9.4%) had less than half of their expected vital capacity vs one patient (0.53%) in the LTBI group. Another 42 patients (39%) with tuberculosis had between 20% and 50% of the expected vital capacity vs 36 patients with LTBI (17%). After adjusting for risk, survivors of tuberculosis were 5.4 times more likely to have abnormal pulmonary function test results than were LTBI patients (p < 0.001; 95% confidence interval, 2.98 to 9.68). Birth in the United States (odds ratio [OR], 2.64; p = 0.003) and age (OR, 1.03; p = 0.005) increased the odds of impairment. Pulmonary impairment was more common in cigarette smokers; however, after adjusting for demographic and other risk factors, the difference did not reach statistical significance (p = 0.074). CONCLUSIONS: These findings indicate that pulmonary impairment after tuberculosis is associated with disability worldwide and support more aggressive case prevention strategies and posttreatment evaluation. For many persons with tuberculosis, a microbiological cure is the beginning, not the end, of their illness.
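For readers who want to see where an odds ratio of this magnitude comes from, the sketch below computes a crude (unadjusted) OR with a Wald 95% CI from a 2 x 2 table whose counts are approximated from the reported percentages (59% of 107 tuberculosis survivors vs 20% of 210 LTBI controls with impairment); the paper's OR of 5.4 is covariate-adjusted, so the numbers will not match exactly.

```python
# Back-of-envelope sketch: a crude (unadjusted) odds ratio with a Wald 95% CI,
# using counts approximated from the reported percentages. Not the adjusted model.
import math

def odds_ratio_wald(a, b, c, d, z=1.96):
    """2x2 table: a/b = exposed with/without outcome, c/d = unexposed with/without."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo, hi = (math.exp(math.log(or_) + s * z * se_log) for s in (-1, 1))
    return or_, lo, hi

a, b = 63, 107 - 63     # ~59% of 107 treated-TB patients impaired
c, d = 42, 210 - 42     # ~20% of 210 LTBI controls impaired
print("crude OR %.2f (95%% CI %.2f to %.2f)" % odds_ratio_wald(a, b, c, d))
```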

5.
Agarwal R, Srinivas R, Nath A, Jindal SK. Chest 2008;133(6):1463-1473
BACKGROUND AND AIM: ARDS can occur from the following two pathogenetic pathways: a direct pulmonary injury (ARDSp); and an indirect injury (ARDSexp). The predisposing clinical factor can influence the pathogenesis and clinical outcome of ARDS. This metaanalysis was aimed at evaluating whether there is any difference in mortality between the two groups. METHODS: We searched the MEDLINE, EMBASE, and CINAHL databases for relevant studies published from 1987 to 2007, and included studies that have reported mortality in the two groups of ARDS. We calculated the odds ratio (OR) and 95% confidence interval (CI) to assess mortality in patients with ARDSp vs patients with ARDSexp and pooled the results using three different statistical models. RESULTS: Our search yielded 34 studies. In all, the studies involved 4,311 patients with 2,330 patients in the ARDSp group and 1,981 patients in the ARDSexp group. The OR of mortality in the ARDSp group compared to the ARDSexp group was 1.11 (95% CI, 0.88 to 1.39), as determined by the random-effects model; 1.04 (95% CI, 0.92 to 1.18), as determined by the fixed-effects model; and 1.04 (95% CI, 0.92 to 1.18), as determined by the exact method, indicating that mortality is similar in the two groups. The mortality was no different whether the studies were classified as prospective (OR, 1.15; 95% CI, 0.87 to 1.51) or retrospective (OR, 1.01; 95% CI, 0.61 to 1.69); small (OR, 1.11; 95% CI, 0.77 to 1.60) or large (OR, 1.1; 95% CI, 0.82 to 1.49); or observational (OR, 1.10; 95% CI, 0.82 to 1.49) or interventional (OR, 0.97; 95% CI, 0.79 to 1.19). There was methodological and statistical heterogeneity (I², 50.9%; 95% CI, 21.3 to 66.2%; χ² statistic, 67.22; p = 0.0004). CONCLUSIONS: The results of this study suggest that there is no difference in mortality between these two groups. Further studies should focus on specific etiologies within the subgroups rather than focusing on the broader division of ARDSp and ARDSexp.
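The pooling arithmetic behind such a metaanalysis can be illustrated with the inverse-variance fixed-effect estimator plus Cochran's Q and the I² statistic; the paper also used random-effects and exact methods, which are not shown here. The three (log OR, standard error) pairs below are made-up placeholders, not the 34 included studies.

```python
# Minimal sketch: inverse-variance fixed-effect pooled OR plus Cochran's Q and I^2.
# The study-level inputs are hypothetical placeholders, not the paper's data.
import math

studies = [(math.log(1.3), 0.20), (math.log(0.9), 0.15), (math.log(1.1), 0.25)]

weights = [1 / se**2 for _, se in studies]
pooled = sum(w * y for (y, _), w in zip(studies, weights)) / sum(weights)
se_pooled = math.sqrt(1 / sum(weights))
q = sum(w * (y - pooled) ** 2 for (y, _), w in zip(studies, weights))   # Cochran's Q
df = len(studies) - 1
i2 = max(0.0, (q - df) / q) * 100                                        # I^2 in percent

print(f"fixed-effect OR {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(pooled - 1.96*se_pooled):.2f} to {math.exp(pooled + 1.96*se_pooled):.2f}), "
      f"I^2 = {i2:.0f}%")
```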

6.
Diel R, Nienhaus A, Loddenkemper R. Chest 2007;131(5):1424-1434
OBJECTIVES: To assess the cost-effectiveness of the new QuantiFERON-TB Gold In-Tube (QFT-G) [Cellestis; Carnegie, VIC, Australia] assay for screening and treating persons who have had close contact with tuberculosis (TB) patients and are suspected of having latent tuberculosis infection (LTBI) [hereafter called close contacts] in Germany. METHODS: The health and economic outcomes of isoniazid treatment of 20-year-old close contacts were compared in a Markov model over a period of 20 years, using two different cutoff values for the tuberculin skin test (TST), the QFT-G assay alone, or the QFT-G assay as a confirmatory test for the TST results. RESULTS: QFT-G assay-based treatment led to cost savings of $542.9 and 3.8 life-days gained per LTBI case. TST-based treatment at a 10-mm induration size cutoff saved $177.4 and gained 2.0 life-days per test-positive contact. When the cutoff induration size for the TST was reduced to 5 mm, the incremental cost-effectiveness ratio fell below the willingness-to-pay threshold ($30,170 per life-year gained) but resulted in unnecessary treatment of 77% of contacts owing to false-positive TST results. Using the QFT-G assay to confirm TST results at the 5-mm induration size cutoff, compared with the QFT-G assay alone, reduced the total costs per 1,000 contacts by 1.8%, to $222,869. The number treated to prevent 1 TB case was 22 for the two QFT-G assay-based procedures, 40 for the TST at a cutoff induration size of 10 mm, and 96 for the TST at a cutoff induration size of 5 mm. When the sensitivity rates of the TST and the QFT-G assay were compounded, the QFT-G assay strategy alone was slightly less costly (0.6%) than the two-step approach. CONCLUSIONS: Using the QFT-G assay, and especially using it to confirm TST results at the 5-mm induration size cutoff in close contacts before LTBI treatment, is highly cost-effective in reducing the disease burden of TB.
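The decision rule behind the willingness-to-pay comparison mentioned above is the incremental cost-effectiveness ratio, ICER = (difference in cost) / (difference in effect). The sketch below uses hypothetical strategy costs and life-day gains; only the $30,170-per-life-year threshold is taken from the abstract.

```python
# Sketch of an ICER comparison. The strategy costs and life-day gains below are
# hypothetical placeholders, not outputs of the authors' Markov model; only the
# $30,170-per-life-year willingness-to-pay threshold comes from the abstract.

WTP_PER_LIFE_YEAR = 30_170.0

def icer(cost_new, effect_new_days, cost_old, effect_old_days):
    """Incremental cost per life-year gained (effects given in life-days)."""
    delta_cost = cost_new - cost_old
    delta_years = (effect_new_days - effect_old_days) / 365.25
    if delta_cost <= 0 and delta_years >= 0:
        return float("-inf")   # new strategy dominates: cheaper and at least as effective
    return delta_cost / delta_years

# Hypothetical screening strategies: (cost per contact, life-days gained per contact)
tst_10mm = (250.0, 2.0)
qft_alone = (300.0, 3.8)

ratio = icer(*qft_alone, *tst_10mm)
verdict = "cost-effective" if ratio <= WTP_PER_LIFE_YEAR else "not cost-effective"
print(f"ICER = {ratio:,.0f} $/life-year gained -> {verdict} at a ${WTP_PER_LIFE_YEAR:,.0f} threshold")
```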

7.
Jia X, Malhotra A, Saeed M, Mark RG, Talmor D. Chest 2008;133(4):853-861
BACKGROUND: Low tidal volume (Vt) ventilation for ARDS is a well-accepted concept. However, controversy persists regarding the optimal ventilator settings for patients without ARDS receiving mechanical ventilation. This study tested the hypothesis that ventilator settings influence the development of new ARDS. METHODS: Retrospective analysis of patients from the Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC-II) project database who received mechanical ventilation for ≥ 48 h between 2001 and 2005. RESULTS: A total of 2,583 patients required > 48 h of ventilation. Of 789 patients who did not have ARDS at hospital admission, ARDS developed in 152 patients (19%). Univariate analysis revealed high peak inspiratory pressure (odds ratio [OR], 1.53 per SD; 95% confidence interval [CI], 1.28 to 1.84), increasing positive end-expiratory pressure (OR, 1.35 per SD; 95% CI, 1.15 to 1.58), and Vt (OR, 1.36 per SD; 95% CI, 1.12 to 1.64) to be significant risk factors. Major nonventilator risk factors for ARDS included sepsis, low pH, elevated lactate, low albumin, transfusion of packed RBCs, transfusion of plasma, high net fluid balance, and low respiratory compliance. Multivariable logistic regression showed that peak pressure (OR, 1.31 per SD; 95% CI, 1.08 to 1.59), high net fluid balance (OR, 1.3 per SD; 95% CI, 1.09 to 1.56), transfusion of plasma (OR, 1.26 per SD; 95% CI, 1.07 to 1.49), sepsis (OR, 1.57; 95% CI, 1.00 to 2.45), and Vt (OR, 1.29 per SD; 95% CI, 1.02 to 1.52) were significantly associated with the development of ARDS. CONCLUSIONS: The associations between the development of ARDS and clinical interventions, including high airway pressures, high Vt, positive fluid balance, and transfusion of blood products, suggest that ARDS may be a preventable complication in some cases.
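The odds ratios above are reported per SD of each predictor. If a logistic-regression coefficient is estimated per unit of the predictor, the OR for a one-SD increase is exp(beta x SD); the sketch below shows that conversion with hypothetical values for tidal volume (both the coefficient and the SD are assumptions, chosen only to land near an OR of about 1.3 per SD).

```python
# How to read a "per SD" odds ratio: exp(beta * SD), where beta is the log-odds
# change per unit of the predictor. Both numbers below are hypothetical.
import math

beta_per_ml_kg = 0.13     # hypothetical log-odds increase per 1 mL/kg of tidal volume
sd_ml_kg = 2.0            # hypothetical standard deviation of tidal volume in the cohort

or_per_unit = math.exp(beta_per_ml_kg)
or_per_sd = math.exp(beta_per_ml_kg * sd_ml_kg)
print(f"OR per 1 mL/kg = {or_per_unit:.2f}, OR per SD ({sd_ml_kg} mL/kg) = {or_per_sd:.2f}")
```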

8.
Lalvani A. Chest 2007;131(6):1898-1906
The century-old tuberculin skin test (TST) was until recently the only means of diagnosing latent tuberculosis infection (LTBI). Recent advances in mycobacterial genomics and human cellular immunology have resulted in two new blood tests that detect tuberculosis infection by measuring in vitro T-cell interferon (IFN)-gamma release in response to two unique antigens that are highly specific for Mycobacterium tuberculosis but absent from bacille Calmette-Guérin (BCG) vaccine and most nontuberculous mycobacteria. One assay, the enzyme-linked immunospot (ELISpot) [T-SPOT.TB; Oxford Immunotec; Oxford, UK] enumerates IFN-gamma-secreting T cells, while the other assay measures IFN-gamma concentration in supernatant by enzyme-linked immunosorbent assay (ELISA) [QuantiFERON-TB Gold; Cellestis; Carnegie, Australia]. A large and growing clinical evidence base indicates that both tests are more specific than the skin test because they are not confounded by prior BCG vaccination. In active tuberculosis, ELISA has similar sensitivity to the skin test, while ELISpot is significantly more sensitive. Current cross-sectional evidence suggests that for diagnosis of LTBI, sensitivity of ELISA is similar to TST, while ELISpot appears more sensitive. High specificity will enable clinicians to avoid unnecessary preventive treatment in BCG-vaccinated persons without infection who commonly have false-positive TST results. High sensitivity could enable accurate targeting of preventive treatment to patients with infection at the highest risk of progression to active tuberculosis who frequently have false-negative TST results due to impaired cellular immunity. However, longitudinal studies are needed to define the predictive value of positive blood test results for progression to tuberculosis.

9.
Metersky ML, Ma A, Houck PM, Bratzler DW. Chest 2007;131(2):466-473
BACKGROUND: Whether antibiotics active against atypical organisms are beneficial in the treatment of community-acquired pneumonia, and by what mechanism, remains unresolved. Proposed mechanisms include activity against atypical organisms vs the immunomodulatory effects of these antibiotics. The study of outcomes of a large cohort of patients with bacteremic pneumonia provides a unique opportunity to address these questions by excluding patients with primary atypical infection. METHODS: We reviewed data from the charts of 2,209 Medicare patients who were admitted to hospitals across the United States from either home or a nursing facility with bacteremic pneumonia between 1998 and 2001. Patients were stratified according to the type of antibiotic treatment. Multivariate modeling was performed to assess the relationship between the class of antibiotic used and several outcome variables. RESULTS: The initial use of any antibiotic active against atypical organisms was independently associated with a decreased risk of 30-day mortality (odds ratio [OR], 0.76; 95% confidence interval [CI], 0.59 to 0.98; p = 0.03) and of hospital readmission within 30 days of discharge (OR, 0.67; 95% CI, 0.51 to 0.89; p = 0.02). Further analysis revealed that the benefits of atypical treatment were associated with the use of macrolides, but not the use of fluoroquinolones or tetracyclines, with macrolides conferring lower risks of in-hospital mortality (OR, 0.59; 95% CI, 0.40 to 0.88; p = 0.01), 30-day mortality (OR, 0.61; 95% CI, 0.43 to 0.87; p = 0.007), and hospital readmission within 30 days of discharge (OR, 0.59; 95% CI, 0.42 to 0.85; p = 0.004). CONCLUSIONS: Initial antibiotic treatment including a macrolide agent is associated with improved outcomes in Medicare patients hospitalized with bacteremic pneumonia. These results have implications regarding the mechanism by which the use of a macrolide for treatment of pneumonia is associated with improved outcomes.

10.
BACKGROUND: Low-risk patients with community-acquired pneumonia are often hospitalized despite guideline recommendations for outpatient treatment. METHODS: Using data from a randomized trial conducted in 32 emergency departments, we performed a propensity-adjusted analysis to compare 30-day mortality rates, time to return to work and to usual activities, and patient satisfaction with care between 944 outpatients and 549 inpatients in pneumonia severity index risk classes I to III who did not have evidence of arterial oxygen desaturation, or medical or psychosocial contraindications to outpatient treatment. RESULTS: After adjusting for quintile of propensity score for outpatient treatment, which eliminated all significant differences for baseline characteristics, outpatients were more likely to return to work (odds ratio [OR], 2.0; 95% confidence interval [CI], 1.5 to 2.6) or, for nonworkers, to usual activities (OR, 1.4; 95% CI, 1.1 to 1.8) than were inpatients. Satisfaction with the site-of-treatment decision (OR, 1.1; 95% CI, 0.7 to 1.8), with emergency department care (OR, 1.4; 95% CI, 0.9 to 1.9), and with overall medical care (OR, 1.1; 95% CI, 0.8 to 1.6) was not different between outpatients and inpatients. The overall mortality rate was higher for inpatients than outpatients (2.6% vs 0.1%, respectively; p < 0.01); the mortality rate was not different among the 242 outpatients and 242 inpatients matched by their propensity score (0.4% vs 0.8%, respectively; p = 0.99). CONCLUSIONS: After adjusting for the propensity of site of treatment, outpatient treatment was associated with a more rapid return to usual activities and to work, and with no increased risk of mortality. The higher observed mortality rate among all low-risk inpatients suggests that physician judgment is an important complement to objective risk stratification in the site-of-treatment decision for patients with pneumonia.
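"Adjusting for quintile of propensity score" amounts to a stratified analysis; one standard way to pool stratum-specific 2 x 2 tables is the Mantel-Haenszel odds ratio, sketched below. The per-quintile counts are invented for illustration and are not the trial's data.

```python
# Sketch of the stratified pooling behind "adjusting for quintile of propensity
# score": a Mantel-Haenszel odds ratio across five 2x2 tables with invented counts.
# Each table is (a, b, c, d) = (outpatients with event, outpatients without,
# inpatients with event, inpatients without).

def mantel_haenszel_or(tables):
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

quintile_tables = [
    (40, 60, 20, 80),
    (35, 65, 22, 78),
    (30, 70, 25, 75),
    (28, 72, 24, 76),
    (25, 75, 26, 74),
]
print(f"Mantel-Haenszel OR (return to work, outpatient vs inpatient) = "
      f"{mantel_haenszel_or(quintile_tables):.2f}")
```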

11.
BACKGROUND: Estimating the clinical probability of malignancy in patients with a solitary pulmonary nodule (SPN) can facilitate the selection and interpretation of subsequent diagnostic tests. METHODS: We used multiple logistic regression analysis to identify independent clinical predictors of malignancy and to develop a parsimonious clinical prediction model to estimate the pretest probability of malignancy in a geographically diverse sample of 375 veterans with SPNs. We used data from Department of Veterans Affairs (VA) administrative databases and a recently completed VA Cooperative Study that evaluated the accuracy of positron emission tomography (PET) scans for the diagnosis of SPNs. RESULTS: The mean (± SD) age of subjects in the sample was 65.9 ± 10.7 years. The prevalence of malignant SPNs was 54%. Most participants were either current smokers (n = 177) or former smokers (n = 177). Independent predictors of malignant SPNs included a positive smoking history (odds ratio [OR], 7.9; 95% confidence interval [CI], 2.6 to 23.6), older age (OR, 2.2 per 10-year increment; 95% CI, 1.7 to 2.8), larger nodule diameter (OR, 1.1 per 1-mm increment; 95% CI, 1.1 to 1.2), and time since quitting smoking (OR, 0.6 per 10-year increment; 95% CI, 0.5 to 0.7). Model accuracy was very good (area under the curve of the receiver operating characteristic, 0.79; 95% CI, 0.74 to 0.84), and there was excellent agreement between the predicted probability and the observed frequency of malignant SPNs. CONCLUSIONS: Our prediction rule can be used to estimate the pretest probability of malignancy in patients with SPNs, and thereby facilitate clinical decision making when selecting and interpreting the results of diagnostic tests such as PET imaging.
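A prediction rule of this kind combines the predictors linearly and passes the sum through the logistic function to yield a pretest probability. The sketch below shows only the shape of such a model: the intercept and coefficients are hypothetical placeholders, not the published VA model, and merely mirror the direction of the reported odds ratios.

```python
# Shape of a logistic prediction rule like the one described above. The intercept
# and coefficients are hypothetical placeholders, NOT the published VA model.
import math

def pretest_probability(ever_smoker, age_years, nodule_mm, years_quit):
    x = (-7.5                        # hypothetical intercept
         + 2.0 * ever_smoker         # ever vs never smoker (risk up)
         + 0.08 * age_years          # risk up with age
         + 0.10 * nodule_mm          # risk up with nodule diameter
         - 0.05 * years_quit)        # risk down with years since quitting
    return 1.0 / (1.0 + math.exp(-x))

# Example: 66-year-old former smoker who quit 10 years ago, 15-mm nodule.
p = pretest_probability(ever_smoker=1, age_years=66, nodule_mm=15, years_quit=10)
print(f"estimated pretest probability of malignancy ~ {p:.0%}")
```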

12.
Khan H, Belsher J, Yilmaz M, Afessa B, Winters JL, Moore SB, Hubmayr RD, Gajic O. Chest 2007;131(5):1308-1314
BACKGROUND: Transfusion has long been identified as a risk factor for acute lung injury (ALI)/ARDS. No study has formally evaluated the transfusion of specific blood products as a risk factor for ALI/ARDS in critically ill medical patients. METHODS: In this single-center retrospective cohort study, 841 consecutive critically ill patients were studied for the development of ALI/ARDS. Patients who received blood product transfusions were compared with those who did not, in univariate and multivariate propensity analyses. RESULTS: Two hundred ninety-eight patients (35%) received blood transfusions. Transfused patients were older (mean [± SD] age, 67 ± 17 years vs 62 ± 19 years; p < 0.001) and had higher Acute Physiology and Chronic Health Evaluation (APACHE) III scores (74 ± 32 vs 58 ± 23; p < 0.001) than those who had not received transfusions. ALI/ARDS developed more commonly (25% vs 18%; p = 0.025) in patients exposed to transfusion. Seventeen patients received massive RBC transfusions (ie, > 10 U of blood transfused within 24 h), of whom 13 also received fresh-frozen plasma (FFP) and 11 received platelet transfusions. When adjusted for the probability of transfusion and other ALI/ARDS risk factors, any transfusion was associated with the development of ALI/ARDS (odds ratio [OR], 2.14; 95% confidence interval [CI], 1.24 to 3.75). Among those patients receiving individual blood products, ALI/ARDS was more likely to develop in patients who received FFP transfusions (OR, 2.48; 95% CI, 1.29 to 4.74) and platelet transfusions (OR, 3.89; 95% CI, 1.36 to 11.52) than in those who received only RBC transfusions (OR, 1.39; 95% CI, 0.79 to 2.43). CONCLUSION: Transfusion is associated with an increased risk of the development of ALI/ARDS in critically ill medical patients. The risk is higher with transfusions of plasma-rich blood products, FFP, and platelets, than with RBCs.

13.
BACKGROUND: Pulmonary embolism (PE) is a potentially fatal disease with risks of recurrent venous thrombotic events (venous thromboembolism [VTE]) and major bleeding from anticoagulant therapy. Identifying risk factors for recurrent VTE, bleeding, and mortality may guide clinical decision making. OBJECTIVE: To evaluate the incidence of recurrent VTE, hemorrhagic complications, and mortality in patients with PE, and to identify risk factors and the time course of these events. DESIGN: We evaluated consecutive patients with PE derived from a prospective management study, who were followed for 3 months, treated with anticoagulants, and underwent objective diagnostic testing for suspected recurrent VTE or bleeding. RESULTS: Of 673 patients with complete follow-up, 20 patients (3.0%; 95% confidence interval [CI], 1.8 to 4.6%) had recurrent VTE. Eleven of 14 patients with recurrent PE had a fatal PE (79%; 95% CI, 49 to 95%), occurring mostly in the first week after diagnosis of initial PE. In 23 patients (3.4%; 95% CI, 2.2 to 5.1%), a hemorrhagic complication occurred, 10 of which were major bleeds (1.5%; 95% CI, 0.7 to 2.7%), and 2 were fatal (0.3%; 95% CI, 0.04 to 1.1%). During the 3-month follow-up, 55 patients died (8.2%; 95% CI, 6.2 to 10.5%). Risk factors for recurrent VTE were immobilization for > 3 days and being an inpatient; having COPD or malignancies were risk factors for bleeding. Higher age, immobilization, malignancy, and being an inpatient were risk factors for mortality. CONCLUSIONS: Recurrent VTE occurred in a small percentage of patients treated for an acute PE, and the majority of recurrent PEs were fatal. Immobilization, hospitalization, age, COPD, and malignancies were risk factors for recurrent VTE, bleeding, and mortality. Close monitoring may be indicated in these patients, precluding them from out-of-hospital start of treatment.
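The incidences quoted above are binomial proportions with 95% CIs. A Wilson score interval roughly reproduces the headline figure (20 of 673 = 3.0%; 95% CI, 1.8 to 4.6%); the authors may have used an exact Clopper-Pearson interval, which differs slightly.

```python
# Wilson score interval for a binomial proportion, applied to the reported
# 20-of-673 recurrence figure. An exact interval would differ slightly.
import math

def wilson_ci(events, n, z=1.96):
    p = events / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

lo, hi = wilson_ci(20, 673)                 # recurrent VTE: 20 of 673 patients
print(f"recurrent VTE: {20/673:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```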

14.
Objectives: This study sought to develop a scoring model predicting percutaneous coronary intervention (PCI) success in chronic total occlusions. Background: Coronary chronic total occlusion (CTO) is the lesion subtype in which angioplasty is most likely to fail. Failure of PCI for chronic total occlusion (CTO-PCI) is associated with higher 1-year mortality and major adverse cardiac events compared with successful CTO-PCI. Although several independent predictors of final procedural success have been identified, no study has yet produced a model predicting final procedural outcome. Methods: Data from 1,657 consecutive patients who underwent a first-attempt CTO-PCI were prospectively collected. The scoring model was developed in a derivation cohort of 1,143 patients (70%) using a multivariable stepwise analysis to identify independent predictors of CTO-PCI failure. The model was then validated in the remaining 514 (30%). Results: The overall procedural success rate was 72.5%. Independent predictors of CTO-PCI failure were identified and included in the clinical and lesion-related score (CL-score) as follows: previous coronary artery bypass graft surgery +1.5 (odds ratio [OR]: 2.49, 95% confidence interval [CI]: 1.56 to 3.96), previous myocardial infarction +1 (OR: 1.6, 95% CI: 1.17 to 2.2), severe lesion calcification +2 (OR: 2.72, 95% CI: 1.78 to 4.16), longer CTOs +1.5 (≥20 mm; OR: 2.04, 95% CI: 1.54 to 2.7), non–left anterior descending coronary artery location +1 (OR: 1.56, 95% CI: 1.14 to 2.15), and blunt stump morphology +1 (OR: 1.39, 95% CI: 1.05 to 1.81). Score values of 0 to 1, >1 and <3, ≥3 and <5, and ≥5 identified subgroups at high, intermediate, low, and very low probability, respectively, of CTO-PCI success (derivation cohort: 84.9%, 74.9%, 58%, and 31.9%; p < 0.0001; validation cohort: 88.3%, 73.1%, 59.4%, and 46.2%; p < 0.0001). Conclusions: This clinical and angiographic score predicted the final CTO-PCI procedural outcome of our study population.
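Because the abstract states the point weights and probability bands explicitly, the CL-score itself is simple arithmetic. The sketch below implements exactly those reported weights and cutoffs; the function and variable names are mine.

```python
# CL-score arithmetic as described in the abstract: sum the reported point values,
# then read off the reported probability band. Names are mine, weights are the paper's.

def cl_score(prior_cabg, prior_mi, severe_calcification,
             length_ge_20mm, non_lad_location, blunt_stump):
    return (1.5 * prior_cabg
            + 1.0 * prior_mi
            + 2.0 * severe_calcification
            + 1.5 * length_ge_20mm
            + 1.0 * non_lad_location
            + 1.0 * blunt_stump)

def success_band(score):
    if score <= 1:
        return "high probability of CTO-PCI success"
    if score < 3:
        return "intermediate probability"
    if score < 5:
        return "low probability"
    return "very low probability"

# Example: prior MI, calcified, 25-mm-long CTO of a non-LAD vessel with a blunt stump.
s = cl_score(prior_cabg=0, prior_mi=1, severe_calcification=1,
             length_ge_20mm=1, non_lad_location=1, blunt_stump=1)
print(f"CL-score = {s:.1f} -> {success_band(s)}")
```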

15.
Gould MK, Ghaus SJ, Olsson JK, Schultz EM. Chest 2008;133(5):1167-1173
BACKGROUND: Timeliness is an important dimension of quality of care for patients with lung cancer. METHODS: We reviewed the records of consecutive patients in whom non-small cell lung cancer (NSCLC) had been diagnosed between January 1, 2002, and December 31, 2003, at the Veterans Affairs Palo Alto Health Care System. We used multivariable statistical methods to identify independent predictors of timely care and examined the effect of timeliness on survival. RESULTS: We identified 129 veterans with NSCLC (mean age, 67 years; 98% men; 83% white), most of whom had adenocarcinoma (51%) or squamous cell carcinoma (30%). A minority of patients (18%) presented with a solitary pulmonary nodule (SPN). The median time from the initial suspicion of cancer to treatment was 84 days (interquartile range, 38 to 153 days). Independent predictors of treatment within 84 days included hospitalization within 7 days (odds ratio [OR], 8.2; 95% confidence interval [CI], 2.9 to 23), tumor size of > 3.0 cm (OR, 4.8; 95% CI, 1.8 to 12.4), the presence of additional chest radiographic abnormalities (OR, 3.0; 95% CI, 1.1 to 8.5), and the presence of one or more symptoms suggesting metastasis (OR, 2.6; 95% CI, 1.1 to 6.2). More timely care was not associated with better survival time (adjusted hazard ratio, 1.6; 95% CI, 1.3 to 1.9). However, in patients with SPNs, there was a trend toward better survival time when the time to treatment was < 84 days. CONCLUSIONS: The time to treatment for patients with NSCLC was often longer than recommended. Patients with larger tumors, symptoms, and other chest radiographic abnormalities received more timely care. In patients with malignant SPNs, survival may be better when treatment is initiated promptly.

16.
BACKGROUND: The glutathione S-transferase P1 (GSTP1) gene is involved in detoxification of electrophilic substances of tobacco smoke. A polymorphism at nucleotide 315 of this gene alters its enzymatic activity. OBJECTIVE: We analyzed the association between the variability in the GSTP1 gene and impairment in lung function in smokers with and without alpha-1 antitrypsin (AAT) deficiency and COPD. POPULATION AND METHOD: The study population consisted of 99 patients with smoking-related COPD and 69 patients with AAT deficiency; 198 healthy volunteers provided the frequency of the different polymorphisms in the general population. GSTP1 genotyping was performed by a real-time polymerase chain reaction amplification assay. RESULTS: The frequency (0.28) of the 105Val polymorphism was identical in COPD patients and the general population. However, the frequency was significantly increased (0.44) in patients with AAT deficiency (odds ratio [OR], 2.09; 95% confidence interval [CI], 1.17 to 3.72 compared to control subjects; and OR, 2.41; 95% CI, 1.27 to 4.59 compared to COPD). FEV1 percentage of predicted was significantly impaired in AAT-deficient carriers of 105Val. This effect was not observed in COPD patients. CONCLUSIONS: These findings suggest that the frequency of the GSTP1 105Val polymorphism is increased in patients with AAT deficiency. Globally, GSTP1 genotypes, age, and tobacco smoking explained 41% of total FEV1 percentage of predicted variability in patients with AAT deficiency. The modulatory role of GSTP1 in lung disease has only been observed in smokers lacking AAT.

17.
BACKGROUND: The tuberculin skin test (TST) has a low specificity in the setting of bacille Calmette-Guérin (BCG) vaccination. Interferon-gamma release assays (IGRAs) appear to be more specific but have not been validated in this population under routine clinical conditions. We sought to validate the routine clinical use of the T-SPOT.TB test (Oxford Immunotec; Oxford, UK), an IGRA, in a predominantly foreign-born population with a high rate of BCG vaccination. METHODS: We compared the TST and the T-SPOT.TB test in 96 subjects at a New York City Department of Health tuberculosis clinic. We aimed to determine which test better predicted being a close contact of a case of active tuberculosis, a surrogate for latent tuberculosis infection. RESULTS: A positive T-SPOT.TB test result was strongly associated with being a close contact of a case of active tuberculosis after adjustment for potential confounders (adjusted odds ratio, 2.9; 95% confidence interval, 1.1 to 7.3; p = 0.03). A positive TST result was associated with being a contact only in subjects without BCG vaccination (p = 0.02). The T-SPOT.TB test was more specific for being a close contact than the TST (p < 0.001). Specificity in BCG-vaccinated subjects was 3% for the TST compared with 70% for the T-SPOT.TB test (p < 0.001). CONCLUSIONS: The T-SPOT.TB test is superior in routine clinical use to the TST for identifying high-risk individuals among foreign-born populations with high rates of BCG vaccination.

18.
Objectives: This study sought to assess the frequency and clinical impact of dual antiplatelet therapy (DAPT) nonadherence. Background: There are limited data on the impact of DAPT nonadherence during the first year after a second-generation drug-eluting stent placement. Methods: After successful Endeavor zotarolimus-eluting stent implantation, 2,265 patients were enrolled in a registry with limited exclusions and monitored during 12 months of prescribed DAPT. Predictors of any nonadherence (ANA) at 6 months were analyzed by multivariable analysis, and the association between ANA at 6 or 12 months with the endpoints of death, myocardial infarction, and stent thrombosis was assessed. Results: The study population included 30% female patients, 34% with diabetes and 36% with acute coronary syndromes. ANA occurred in 208 patients (9.6%) before 6 months and 378 patients (18.5%) before 1 year. Major bleeding (odds ratio [OR]: 12.83, 95% confidence interval [CI]: 7.55 to 21.80, p < 0.001) was the only predictor of ANA at 6 months. In time-dependent analyses, ANA before 6 months was associated with an increased risk of death or myocardial infarction (7.6% vs. 3.0%, p < 0.001) and a numerical increase in stent thrombosis (2.0% vs. 0.9%, p = 0.12). After adjustment for baseline differences, ANA within 6 months remained associated with death or MI (OR: 1.95, 95% CI: 1.02 to 3.75). ANA occurring after 6 months did not increase the risk of subsequent ischemic events. Conclusions: DAPT ANA occurs frequently and is associated with increased risk for thrombotic complications if it occurs within the first 6 months. Major bleeding was a significant correlate of DAPT ANA within 6 months. (EDUCATE: The MEDTRONIC Endeavor Drug Eluting Stenting: Understanding Care, Antiplatelet Agents and Thrombotic Events; NCT01069003)

19.
BACKGROUND: It remains unknown whether pneumococcal bacteremia increases the risk of poor outcomes in hospitalized patients with community-acquired pneumonia (CAP). The objective of this study was to investigate whether the presence of pneumococcal bacteremia influences the clinical outcomes of hospitalized patients with CAP. METHODS: We performed secondary analyses of the Community-Acquired Pneumonia Organization database of hospitalized patients with CAP and pneumococcal bacteremia, and patients with CAP and negative blood culture findings. To identify the effect of pneumococcal bacteremia on patient outcomes, we modeled all-cause mortality and CAP-related mortality using logistic regression analysis, and time to clinical stability and length of hospital stay using Cox proportional hazards models. RESULTS: We studied 125 subjects with pneumococcal bacteremic CAP and 1,847 subjects with nonbacteremic CAP. The multivariable regression analysis revealed a lack of association of pneumococcal bacteremic CAP and time to clinical stability (hazard ratio, 0.87; 95% confidence interval [CI], 0.7 to 1.1; p = 0.25), length of hospital stay (hazard ratio, 1.14; 95% CI, 0.91 to 1.43; p = 0.25), all-cause mortality (odds ratio [OR], 0.68; 95% CI, 0.36 to 1.3; p = 0.25), and CAP-related mortality (OR, 0.86; 95% CI, 0.35 to 2.06; p = 0.73). CONCLUSIONS: Pneumococcal bacteremia does not increase the risk of poor outcomes in patients with CAP. Factors related to severity of disease are confounders of the association between pneumococcal bacteremia and poor outcomes. This study indicates that the presence of pneumococcal bacteremia by itself should not be a contraindication for deescalation of therapy in clinically stable hospitalized patients with CAP.

20.
Objectives: This study sought to determine the effect of radial access on outcomes in women undergoing percutaneous coronary intervention (PCI) using a registry-based randomized trial. Background: Women are at increased risk of bleeding and vascular complications after PCI. The role of radial access in women is unclear. Methods: Women undergoing cardiac catheterization or PCI were randomized to radial or femoral arterial access. Data from the CathPCI Registry and trial-specific data were merged into a final study database. The primary efficacy endpoint was Bleeding Academic Research Consortium type 2, 3, or 5 bleeding or vascular complications requiring intervention. The primary feasibility endpoint was access site crossover. The primary analysis cohort was the subgroup undergoing PCI; sensitivity analyses were conducted in the total randomized population. Results: The trial was stopped early for a lower than expected event rate. A total of 1,787 women (691 undergoing PCI) were randomized at 60 sites. There was no significant difference in the primary efficacy endpoint between radial or femoral access among women undergoing PCI (radial 1.2% vs. 2.9% femoral; odds ratio [OR]: 0.39; 95% confidence interval [CI]: 0.12 to 1.27); among women undergoing cardiac catheterization or PCI, radial access significantly reduced bleeding and vascular complications (0.6% vs. 1.7%; OR: 0.32; 95% CI: 0.12 to 0.90). Access site crossover was significantly higher among women assigned to radial access (PCI cohort: 6.1% vs. 1.7%; OR: 3.65; 95% CI: 1.45 to 9.17; total randomized cohort: 6.7% vs. 1.9%; OR: 3.70; 95% CI: 2.14 to 6.40). More women preferred radial access. Conclusions: In this pragmatic trial, which was terminated early, the radial approach did not significantly reduce bleeding or vascular complications in women undergoing PCI. Access site crossover occurred more often in women assigned to radial access. (SAFE-PCI for Women; NCT01406236)
