Similar literature
20 similar documents found
1.
As medical expenses continue to rise, methods to properly analyze cost outcomes are increasingly relevant when comparing average costs across treatments. Inverse probability weighted regression models have been developed to address the challenge of cost censoring in order to identify intent-to-treat effects (i.e., to compare mean costs between groups on the basis of their initial treatment assignment, irrespective of any subsequent changes to their treatment status). In this paper, we describe a nested g-computation procedure that can be used to compare mean costs between two or more time-varying treatment regimes. We highlight the relative advantages and limitations of this approach when compared with existing regression-based models. We illustrate the utility of this approach as a means to inform public policy by applying it to a simulated data example motivated by costs associated with cancer treatments. Simulations confirm that inference regarding intent-to-treat effects versus the joint causal effects estimated by the nested g-formula can lead to markedly different conclusions regarding differential costs. Therefore, it is essential to prespecify the desired target of inference when choosing between these two frameworks. The nested g-formula should be considered a useful, complementary tool to existing methods when analyzing cost outcomes.
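The weighted comparison of mean costs that motivates this line of work can be illustrated with a short simulation. The sketch below is a minimal example under assumed variable names and parameters, not the nested g-computation procedure itself: it uses a single, time-fixed treatment assignment, lets total cost accrue with follow-up time, and weights fully observed subjects by the inverse of a Kaplan-Meier estimate of the censoring survivor function.

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(0)
n = 2000
group = rng.integers(0, 2, n)            # hypothetical initial treatment assignment
t = rng.exponential(12 + 6 * group)      # time until cost accrual ends (e.g., death), in months
cost = 500 * t + rng.normal(0, 200, n)   # total cost grows with follow-up time
c = rng.exponential(20, n)               # censoring time
obs_t = np.minimum(t, c)
delta = (t <= c).astype(int)             # 1 = total cost fully observed

# Kaplan-Meier estimate of K(t) = P(C > t), the censoring survivor function
kmf = KaplanMeierFitter().fit(obs_t, event_observed=1 - delta)
K = kmf.survival_function_at_times(obs_t).to_numpy()

# Inverse-probability-of-censoring-weighted mean cost per treatment group:
# complete cases get weight 1 / K(T_i); censored cases get weight 0
w = delta / np.clip(K, 1e-6, None)
for g in (0, 1):
    m = group == g
    print(f"group {g}: weighted mean cost = {np.average(cost[m], weights=w[m]):.1f}")
# The intent-to-treat contrast is the difference between the two weighted means.
```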

2.
Marginal structural models (MSMs) allow estimating the causal effect of a time-varying exposure on an outcome in the presence of time-dependent confounding. The parameters of MSMs can be estimated with an inverse probability of treatment weight estimator under certain assumptions. One of these assumptions is that the proposed causal model relating the outcome to exposure history is correctly specified. However, in practice, the true model is unknown. We propose a test that uses the observed data to assess whether this specification assumption holds. The performance of the proposed test is investigated with a simulation study. We illustrate our approach by estimating the effect of repeated exposure to psychosocial stressors at work on ambulatory blood pressure in a large cohort of white-collar workers in Québec City, Canada. Code examples in SAS and R are provided to facilitate the implementation of the test.
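Although the paper provides its code in SAS and R, the underlying IPTW workflow for an MSM can be sketched in Python. The example below is a hypothetical two-interval simulation (L = time-dependent confounder such as a blood-pressure marker, A = exposure): stabilized weights are built from logistic treatment models, and the MSM is fitted as a weighted regression of the outcome on cumulative exposure. It illustrates the standard estimator whose correct specification the paper's test is designed to probe, not the test itself.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 5000
# Two exposure intervals with a time-dependent confounder L affected by past exposure
L1 = rng.normal(0, 1, n)
A1 = rng.binomial(1, 1 / (1 + np.exp(-0.4 * L1)))
L2 = 0.5 * L1 + 0.6 * A1 + rng.normal(0, 1, n)
A2 = rng.binomial(1, 1 / (1 + np.exp(-(0.4 * L2 + 0.8 * A1))))
Y = 2.0 * (A1 + A2) + 1.5 * L1 + rng.normal(0, 1, n)   # true effect: 2 per interval of exposure
df = pd.DataFrame(dict(L1=L1, A1=A1, L2=L2, A2=A2, Y=Y))

# Stabilized inverse probability of treatment weights, one factor per interval
num1 = smf.logit("A1 ~ 1", df).fit(disp=0).predict(df)
den1 = smf.logit("A1 ~ L1", df).fit(disp=0).predict(df)
num2 = smf.logit("A2 ~ A1", df).fit(disp=0).predict(df)
den2 = smf.logit("A2 ~ A1 + L2", df).fit(disp=0).predict(df)

def pr(p, a):
    # probability of the treatment actually received
    return np.where(a == 1, p, 1 - p)

df["sw"] = (pr(num1, df.A1) / pr(den1, df.A1)) * (pr(num2, df.A2) / pr(den2, df.A2))

# The MSM: weighted regression of the outcome on cumulative exposure
msm = smf.wls("Y ~ I(A1 + A2)", data=df, weights=df["sw"]).fit(cov_type="HC0")
print(msm.params)   # coefficient of cumulative exposure should be close to 2
```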

3.
Optimal timing of initiating antiretroviral therapy has been a controversial topic in HIV research. Two highly publicized studies applied different analytical approaches, a dynamic marginal structural model and a multiple imputation method, to different observational databases and came up with different conclusions. Discrepancies between the two studies' results could be due to differences between patient populations, fundamental differences between statistical methods, or differences between implementation details. For example, the two studies adjusted for different covariates, compared different thresholds, and had different criteria for qualifying measurements. If both analytical approaches were applied to the same cohort holding technical details constant, would their results be similar? In this study, we applied both statistical approaches using observational data from 12,708 HIV-infected persons throughout the USA. We held technical details constant between the two methods and then repeated analyses varying technical details to understand what impact they had on findings. We also present results applying both approaches to simulated data. Results were similar, although not identical, when technical details were held constant between the two statistical methods. Confidence intervals for the dynamic marginal structural model tended to be wider than those from the imputation approach, although this may have been due in part to additional external data used in the imputation analysis. We also consider differences in the estimands, required data, and assumptions of the two statistical methods. Our study provides insights into assessing optimal dynamic treatment regimes in the context of starting antiretroviral therapy and in more general settings. Copyright © 2016 John Wiley & Sons, Ltd.

4.
5.
In the presence of time-dependent confounding, there are several methods available to estimate treatment effects. With correctly specified models and appropriate structural assumptions, any of these methods could provide consistent effect estimates, but with real-world data, all models will be misspecified and it is difficult to know whether assumptions are violated. In this paper, we investigate five methods: inverse probability weighting of marginal structural models, history-adjusted marginal structural models, sequential conditional mean models, the g-computation formula, and g-estimation of structural nested models. This work is motivated by an investigation of the effects of treatments in cystic fibrosis using the UK Cystic Fibrosis Registry data, focussing on two outcomes: lung function (continuous outcome) and annual number of days receiving intravenous antibiotics (count outcome). We identified five features of these data that may affect the performance of the methods: misspecification of the causal null, long-term treatment effects, effect modification by time-varying covariates, misspecification of the direction of causal pathways, and censoring. In simulation studies, under ideal settings, all five methods provide consistent estimates of the treatment effect with little difference between methods. However, all methods performed poorly under some settings, highlighting the importance of using appropriate methods based on the data available. Furthermore, with the count outcome, the issue of non-collapsibility makes comparison between methods delivering marginal and conditional effects difficult. In many situations, we would recommend using more than one of the available methods for analysis: if the effect estimates are very different, this would indicate potential issues with the analyses.

6.
When estimating the effect of treatment on HIV using data from observational studies, standard methods may produce biased estimates due to the presence of time-dependent confounders. Such confounding can be present when a covariate, affected by past exposure, is a predictor of both the future exposure and the outcome. One example is the CD4 cell count, which is a marker for disease progression in HIV patients, but also a marker for treatment initiation and is itself influenced by treatment. Fitting a marginal structural model (MSM) using inverse probability weights is one way to adjust appropriately for this type of confounding. In this paper, we study a simple and intuitive approach to estimate similar treatment effects, using observational data to mimic several randomized controlled trials. Each 'trial' is constructed based on individuals starting treatment in a certain time interval. An overall effect estimate for all such trials is found using composite likelihood inference. The method offers an alternative to the use of inverse probability of treatment weights, which is unstable in certain situations. The estimated parameter is not identical to that of an MSM; it is conditional on covariate values at the start of each mimicked trial. This allows the study of questions that are not as easily addressed by fitting an MSM. The analysis can be performed as a stratified weighted Cox analysis on the joint data set of all the constructed trials, where each trial is one stratum. The model is applied to data from the Swiss HIV cohort study. Copyright © 2010 John Wiley & Sons, Ltd.

7.
Causal inference for non-censored response variables, such as binary or quantitative outcomes, is often based on either (1) direct standardization ('G-formula') or (2) inverse probability of treatment assignment weights ('propensity score'). To do causal inference in survival analysis, one needs to address right-censoring, and often, special techniques are required for that purpose. We will show how censoring can be dealt with 'once and for all' by means of so-called pseudo-observations when doing causal inference in survival analysis. The pseudo-observations can be used as a replacement for the outcomes without censoring when applying 'standard' causal inference methods, such as (1) or (2) earlier. We study this idea for estimating the average causal effect of a binary treatment on the survival probability, the restricted mean lifetime, and the cumulative incidence in a competing risks situation. The methods will be illustrated in a small simulation study and via a study of patients with acute myeloid leukemia who received either myeloablative or non-myeloablative conditioning before allogeneic hematopoietic cell transplantation. We will estimate the average causal effect of the conditioning regime on outcomes such as the 3-year overall survival probability and the 3-year risk of chronic graft-versus-host disease. Copyright © 2017 John Wiley & Sons, Ltd.
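The 'once and for all' treatment of censoring via pseudo-observations can be sketched as follows. This is a simplified illustration with hypothetical names and parameters: jackknife pseudo-observations for the survival probability at a fixed horizon are computed from the marginal Kaplan-Meier estimator, and the average causal effect of a binary treatment is then estimated by an ordinary regression of the pseudo-observations on treatment with a robust variance. A randomized (or unconfounded) treatment is assumed here; with confounding, one would plug the pseudo-observations into the G-formula or propensity-score weighting exactly as the abstract describes.

```python
import numpy as np
import statsmodels.api as sm
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(3)
n = 300
trt = rng.integers(0, 2, n)                  # hypothetical binary conditioning regimen
t = rng.exponential(4 + 2 * trt)             # event time, in years
c = rng.exponential(6, n)                    # censoring time
time, event = np.minimum(t, c), (t <= c).astype(int)
tau = 3.0                                    # horizon: 3-year survival probability

def km_surv(times, events, at):
    kmf = KaplanMeierFitter().fit(times, events)
    return float(kmf.survival_function_at_times(at).iloc[0])

# Jackknife pseudo-observations: theta_i = n*S_hat(tau) - (n-1)*S_hat_(-i)(tau)
S_full = km_surv(time, event, tau)
pseudo = np.array([
    n * S_full - (n - 1) * km_surv(np.delete(time, i), np.delete(event, i), tau)
    for i in range(n)
])

# With censoring handled, apply a "standard" method: regress pseudo-observations on treatment
fit = sm.OLS(pseudo, sm.add_constant(trt)).fit(cov_type="HC0")
print(fit.params[1])   # estimated causal difference in 3-year survival probability
```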

8.
9.
Cost-effectiveness analysis is an important tool that can be applied to the evaluation of a health treatment or policy. When the observed costs and outcomes result from a nonrandomized treatment, making causal inference about the effects of the treatment requires special care. The challenges are compounded when the observation period is truncated for some of the study subjects. This paper presents a method of unbiased estimation of cost-effectiveness using observational study data that are not fully observed. The method—twice-weighted multiple interval estimation of a marginal structural model—was developed in order to analyze the cost-effectiveness of treatment protocols for advanced dementia residents living in nursing homes when they become acutely ill. A key feature of this estimation approach is that it facilitates a sensitivity analysis that identifies the potential effects of unmeasured confounding on the conclusions concerning cost-effectiveness. Copyright © 2013 John Wiley & Sons, Ltd.

10.
Progression of a chronic disease can lead to the development of secondary illnesses. An example is the development of active tuberculosis (TB) in HIV-infected individuals. HIV disease progression, as indicated by declining CD4+ T-cell count (CD4), increases both the risk of TB and the risk of AIDS-related mortality. This means that CD4 is a time-dependent confounder for the effect of TB on AIDS-related mortality. Part of the effect of TB on AIDS-related mortality may be indirect, by causing a drop in CD4. Estimating the total causal effect of TB on AIDS-related mortality using standard statistical techniques, conditioning on CD4 to adjust for confounding, then gives an underestimate of the true effect. Marginal structural models (MSMs) can be used to obtain an unbiased estimate. We describe an easily implemented algorithm that uses G-computation to fit an MSM, as an alternative to inverse probability weighting (IPW). Our algorithm is simplified by utilizing individual baseline parameters that describe CD4 development. Simulation confirms that the algorithm can produce an unbiased estimate of the effect of a secondary illness when a marker for primary disease progression is both a confounder and an intermediary for the effect of the secondary illness. We used the algorithm to estimate the total causal effect of TB on AIDS-related mortality in HIV-infected individuals, and found a hazard ratio of 3.5 (95 per cent confidence interval 1.2–9.1). Copyright © 2009 John Wiley & Sons, Ltd.
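A stripped-down version of the G-computation idea (model the time-varying marker and the outcome, then standardize by simulating the marker under fixed exposure regimes) is sketched below. For brevity it uses a continuous outcome and a single exposure interval rather than the survival setting of the paper, and all names (L = CD4-like marker, A = secondary-illness indicator) and coefficients are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 4000
# L is a CD4-like marker: it confounds the effect of the secondary illness A and is lowered by A
L0 = rng.normal(500, 100, n)
A1 = rng.binomial(1, 1 / (1 + np.exp(0.01 * (L0 - 350))))   # low marker -> higher risk of A
L1 = L0 - 50 * A1 + rng.normal(0, 30, n)                     # A causes a drop in the marker
Y = 0.002 * (600 - L1) + 0.3 * A1 + rng.normal(0, 0.2, n)    # outcome worsens with A and low L
obs = pd.DataFrame(dict(L0=L0, A1=A1, L1=L1, Y=Y))

# Step 1: fit models for the time-varying marker and for the outcome
m_L1 = smf.ols("L1 ~ L0 + A1", obs).fit()
m_Y = smf.ols("Y ~ L1 + A1", obs).fit()

# Step 2: standardize by simulating the marker and outcome under a fixed regime A1 = a
def g_comp(a):
    sim = obs[["L0"]].copy()
    sim["A1"] = a
    sim["L1"] = m_L1.predict(sim)
    return m_Y.predict(sim).mean()

print(g_comp(1) - g_comp(0))   # total effect of A: ~0.4 (0.3 direct + 0.1 mediated via the marker)
```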

11.
Evaluating the impacts of clinical or policy interventions on health care utilization requires addressing methodological challenges for causal inference while also analyzing highly skewed data. We examine the impact of registering with a Family Medicine Group, an integrated primary care model in Quebec, on hospitalization and emergency department visits, using propensity scores to adjust for baseline characteristics and marginal structural models to account for time-varying exposures. We also evaluate the performance of different marginal structural generalized linear models in the presence of highly skewed data and conduct a simulation study to determine the robustness of alternative generalized linear models to distributional model misspecification. Although the simulations found that the zero-inflated Poisson likelihood performed the best overall, the negative binomial likelihood gave the best fit for both outcomes in the real dataset. Our results suggest that registration with a Family Medicine Group for all 3 years caused a small reduction in the number of emergency room visits and no significant change in the number of hospitalizations in the final year. Copyright © 2013 John Wiley & Sons, Ltd.
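A minimal sketch of a marginal structural negative binomial model for a skewed count outcome, under hypothetical names (A = exposure such as program registration, L = baseline confounder, Y = number of visits): stabilized inverse-probability-of-treatment weights come from a logistic propensity model and are passed to a weighted GLM. In practice the dispersion parameter would be estimated rather than fixed, and standard errors should come from a robust or bootstrap procedure.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 4000
L = rng.normal(0, 1, n)                        # baseline confounder
A = rng.binomial(1, 1 / (1 + np.exp(-L)))      # exposure, more likely for higher L
mu = np.exp(0.5 + 0.8 * L - 0.3 * A)           # exposure lowers the visit rate
Y = rng.negative_binomial(2, 2 / (2 + mu))     # over-dispersed count outcome
df = pd.DataFrame(dict(L=L, A=A, Y=Y))

# Stabilized inverse probability of treatment weights from a propensity model
ps = smf.logit("A ~ L", df).fit(disp=0).predict(df)
df["sw"] = np.where(df.A == 1, df.A.mean() / ps, (1 - df.A.mean()) / (1 - ps))

# Weighted negative binomial GLM as the marginal structural model for the count outcome
msm = smf.glm("Y ~ A", data=df, family=sm.families.NegativeBinomial(alpha=0.5),
              freq_weights=np.asarray(df["sw"])).fit()
print(np.exp(msm.params["A"]))   # marginal rate ratio, close to exp(-0.3) ~= 0.74 here
```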

12.
It is often the case that interest lies in the effect of an exposure on each of several distinct event types. For example, we are motivated to investigate the impact of recent injection drug use on deaths due to each of cancer, end-stage liver disease, and overdose in the Canadian Co-infection Cohort (CCC). We develop a marginal structural model that permits estimation of cause-specific hazards in situations where more than one cause of death is of interest. Marginal structural models allow the causal effect of treatment on outcome to be estimated using inverse-probability weighting under the assumption of no unmeasured confounding; these models are particularly useful in the presence of time-varying confounding variables, which may also mediate the effect of exposures. An asymptotic variance estimator is derived, and a cumulative incidence function estimator is given. We compare the performance of the proposed marginal structural model for multiple-outcome data to that of conventional competing risks models in simulated data and demonstrate the use of the proposed approach in the CCC. Copyright © 2013 John Wiley & Sons, Ltd.
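The competing-risks version of the weighted approach can be sketched by fitting one weighted cause-specific Cox model per cause, treating the other cause as censoring. The example below is a simplified illustration with a single time-fixed confounder and hypothetical names and coefficients; it omits the time-varying weighting, the asymptotic variance estimator, and the cumulative incidence estimator developed in the paper.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter

rng = np.random.default_rng(8)
n = 3000
L = rng.normal(0, 1, n)                                        # confounder (time-fixed for simplicity)
A = rng.binomial(1, 1 / (1 + np.exp(-L)))                      # exposure, e.g., recent injection drug use
t1 = rng.exponential(1 / (0.05 * np.exp(0.7 * A + 0.3 * L)))   # latent time to cause 1
t2 = rng.exponential(1 / (0.03 * np.exp(0.2 * L)))             # latent time to cause 2
c = rng.exponential(20, n)                                     # censoring time
T = np.minimum.reduce([t1, t2, c])
cause = np.select([T == t1, T == t2], [1, 2], default=0)       # 0 = censored
df = pd.DataFrame(dict(L=L, A=A, T=T, cause=cause))

# Stabilized inverse probability of treatment weights
ps = smf.logit("A ~ L", df).fit(disp=0).predict(df)
df["sw"] = np.where(df.A == 1, df.A.mean() / ps, (1 - df.A.mean()) / (1 - ps))

# One weighted cause-specific Cox model per cause, the other cause treated as censoring
for k in (1, 2):
    dk = df.assign(event=(df.cause == k).astype(int))[["A", "T", "event", "sw"]]
    fit = CoxPHFitter().fit(dk, duration_col="T", event_col="event",
                            weights_col="sw", robust=True)
    print(f"cause {k}: hazard ratio for A = {np.exp(fit.params_['A']):.2f}")
```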

13.
Correct specification of the inverse probability weighting (IPW) model is necessary for consistent inference from a marginal structural Cox model (MSCM). In practical applications, researchers are typically unaware of the true specification of the weight model. Nonetheless, IPWs are commonly estimated using parametric models, such as the main-effects logistic regression model. In practice, assumptions underlying such models may not hold, and data-adaptive statistical learning methods may provide an alternative. Many candidate statistical learning approaches are available in the literature. However, the optimal approach for a given dataset is impossible to predict. Super learner (SL) has been proposed as a tool for selecting an optimal learner from a set of candidates using cross-validation. In this study, we evaluate the usefulness of an SL in estimating IPW in four different MSCM simulation scenarios, in which we varied the true weight model specification (linear and/or additive). Our simulations show that, in the presence of weight model misspecification, with a rich and diverse set of candidate algorithms, SL can generally offer a better alternative to the commonly used statistical learning approaches in terms of MSE as well as the coverage probabilities of the estimated effect in an MSCM. The findings from the simulation studies guided the application of the MSCM in a multiple sclerosis cohort from British Columbia, Canada (1995–2008), to estimate the impact of beta-interferon treatment in delaying disability progression. Copyright © 2017 John Wiley & Sons, Ltd.
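As a rough illustration of the idea, the sketch below implements a discrete super learner with scikit-learn: each candidate algorithm for the treatment (weight) model is scored by cross-validated log-loss and the winner supplies the propensity scores for stabilized weights. The full super learner forms an optimal weighted combination of candidates rather than picking a single one, and the candidate library, data-generating mechanism, and names here are all hypothetical.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss
from sklearn.model_selection import cross_val_predict

# Hypothetical confounder matrix X and treatment indicator A with a non-linear propensity
X, A = make_classification(n_samples=3000, n_features=6, n_informative=4, random_state=5)

# Candidate learners for the treatment (weight) model
library = {
    "logit_main_effects": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=5),
    "gbm": GradientBoostingClassifier(random_state=5),
}

# Discrete super learner: choose the candidate with the best cross-validated log-loss
cv_preds = {name: cross_val_predict(est, X, A, cv=5, method="predict_proba")[:, 1]
            for name, est in library.items()}
risks = {name: log_loss(A, p) for name, p in cv_preds.items()}
best = min(risks, key=risks.get)
print(risks, "-> selected:", best)

# Stabilized inverse probability of treatment weights from the selected learner;
# these would then be passed to a weighted (marginal structural) Cox model
ps = np.clip(cv_preds[best], 0.01, 0.99)
sw = np.where(A == 1, A.mean() / ps, (1 - A.mean()) / (1 - ps))
```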

14.
Cox models are commonly used in the analysis of time-to-event data. One advantage of Cox models is the ability to include time-varying covariates, often a binary covariate that codes for the occurrence of an event that affects an individual subject. A common assumption in this case is that the effect of the event on the outcome of interest is constant and permanent for each subject. In this paper, we propose a modification to the Cox model that allows the influence of an event to decay exponentially over time. Methods for generating data using the inverse cumulative density function for the proposed model are developed. Likelihood ratio tests and AIC are investigated as methods for comparing the proposed model to the commonly used permanent exposure model. A simulation study is performed, and three different data sets are presented as examples.
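The data-generation step described above, drawing event times from a hazard in which a binary exposure's effect decays exponentially after the exposure time, can be sketched generically by numerically inverting the cumulative hazard. This is not the authors' closed-form inverse-CDF generator; it is a brute-force sketch with hypothetical parameter names and values.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

rng = np.random.default_rng(6)

# Hazard with a binary exposure occurring at t0 whose effect decays exponentially afterwards:
# h(t) = h0 * exp(beta * exp(-gamma * (t - t0)) * 1{t >= t0})
def hazard(t, h0, beta, gamma, t0):
    return h0 * np.exp(beta * np.exp(-gamma * (t - t0)) * (t >= t0))

def cum_hazard(t, *args):
    return quad(hazard, 0, t, args=args)[0]

def draw_event_time(h0=0.05, beta=1.0, gamma=0.5, t0=2.0, t_max=200.0):
    # Inverse-CDF method: with U ~ Uniform(0, 1), solve H(T) = -log(U) for T
    target = -np.log(rng.uniform())
    if cum_hazard(t_max, h0, beta, gamma, t0) < target:
        return t_max   # administratively censor at t_max
    return brentq(lambda t: cum_hazard(t, h0, beta, gamma, t0) - target, 1e-8, t_max)

times = np.array([draw_event_time() for _ in range(500)])
print(times.mean(), np.quantile(times, [0.25, 0.5, 0.75]))
```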

15.
Inverse probability of treatment weighted (IPTW) estimation for marginal structural models (MSMs) requires the specification of a nuisance model describing the conditional relationship between treatment allocation and confounders. However, there is still limited information on the best strategy for building these treatment models in practice. We developed a series of simulations to systematically determine the effect of including different types of candidate variables in such models. We explored the performance of IPTW estimators across several scenarios of increasing complexity, including one designed to mimic the complexity typically seen in large pharmacoepidemiologic studies. Our results show that including pure predictors of treatment (i.e., not confounders) in treatment models can lead to estimators that are biased and highly variable, particularly in the context of small samples. The bias and mean-squared error of the MSM-based IPTW estimator increase as the complexity of the problem increases. The performance of the estimator is improved by either increasing the sample size or using only variables related to the outcome to develop the treatment model. Estimates of treatment effect based on the true model for the probability of treatment are asymptotically unbiased. We recommend including only pure risk factors and confounders in the treatment model when developing an IPTW-based MSM.
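The paper's central finding, that putting pure predictors of treatment into the weight model inflates the variability (and finite-sample bias) of the IPTW estimator, is easy to reproduce in a toy simulation. The sketch below compares two weight models in a point-treatment setting, one using only the confounder and one that also includes an instrument-like pure predictor of treatment; all names and coefficients are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)

def ipw_estimate(n, include_instrument):
    C = rng.normal(0, 1, n)                      # confounder of treatment and outcome
    Z = rng.normal(0, 1, n)                      # pure predictor of treatment (not a confounder)
    A = rng.binomial(1, 1 / (1 + np.exp(-(1.5 * Z + 0.8 * C))))
    Y = 1.0 * A + 1.0 * C + rng.normal(0, 1, n)  # true marginal treatment effect = 1.0
    X = np.column_stack([C, Z]) if include_instrument else C.reshape(-1, 1)
    ps = sm.Logit(A, sm.add_constant(X)).fit(disp=0).predict(sm.add_constant(X))
    ps = np.clip(ps, 1e-3, 1 - 1e-3)
    w = np.where(A == 1, 1 / ps, 1 / (1 - ps))
    # Inverse-probability-weighted difference in means (Hajek form)
    return np.average(Y[A == 1], weights=w[A == 1]) - np.average(Y[A == 0], weights=w[A == 0])

for flag in (False, True):
    est = np.array([ipw_estimate(300, flag) for _ in range(500)])
    print(f"instrument in weight model: {flag}  bias={est.mean() - 1:+.3f}  sd={est.std():.3f}")
```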

16.
Joint models for longitudinal and time-to-event data are particularly relevant to many clinical studies where longitudinal biomarkers could be highly associated with a time-to-event outcome. A cutting-edge research direction in this area is dynamic prediction of patient prognosis (e.g., survival probabilities) given all available biomarker information, recently boosted by the stratified/personalized medicine initiative. As these dynamic predictions are individualized, flexible models are desirable in order to appropriately characterize each individual longitudinal trajectory. In this paper, we propose a new joint model using individual-level penalized splines (P-splines) to flexibly characterize the coevolution of the longitudinal and time-to-event processes. An important feature of our approach is that dynamic predictions of the survival probabilities are straightforward, as the posterior distribution of the random P-spline coefficients given the observed data is a multivariate skew-normal distribution. The proposed methods are illustrated with data from the HIV Epidemiology Research Study. Our simulation results demonstrate that our model has better dynamic prediction performance than other existing approaches. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons, Ltd.

17.
For the estimation of controlled direct effects (i.e., direct effects with the intermediates set to a fixed level for all members of the population) without bias, two fundamental assumptions must hold: the absence of unmeasured confounding factors for treatment and outcome, and for the intermediate variables and outcome. Even if these assumptions hold, one would nonetheless fail to estimate direct effects using standard methods, for example, stratification or regression modeling, when the treatment influences confounding factors. For such situations, the sequential g-estimation method for structural nested mean models has been developed for estimating controlled direct effects in point-treatment situations. In this study, we demonstrate that this method can be applied to longitudinal data with time-varying treatments and repeatedly measured intermediate variables. We sequentially estimate the parameters in two structural nested mean models: one for a repeatedly measured intermediate and the other for the direct effects of a time-varying treatment. The method was applied to data from a large primary prevention trial for coronary events, in which pravastatin was used to lower the cholesterol levels in patients with moderate hypercholesterolemia. Copyright © 2014 John Wiley & Sons, Ltd.
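The two-stage sequential g-estimation logic can be conveyed with a point-treatment linear example (the paper extends it to time-varying treatments and repeatedly measured intermediates). Everything below is hypothetical: stage one estimates the intermediate's effect on the outcome adjusting for all covariates, the outcome is then 'blipped down' by removing that effect, and stage two regresses the blipped outcome on treatment and pre-treatment covariates only. This avoids the bias of standard regression, which conditions on a confounder that the treatment itself influences.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 5000
# A = treatment (e.g., pravastatin), M = intermediate (e.g., cholesterol level),
# L = confounder of the M-Y relation that is itself affected by A, C = baseline covariate
C = rng.normal(0, 1, n)
A = rng.binomial(1, 1 / (1 + np.exp(-0.5 * C)))
L = 0.4 * A + 0.5 * C + rng.normal(0, 1, n)
M = -0.7 * A + 0.6 * L + rng.normal(0, 1, n)
Y = 1.0 * A + 0.8 * M + 0.5 * L + 0.3 * C + rng.normal(0, 1, n)
# Controlled direct effect of A with M held fixed: 1.0 + 0.5 * 0.4 = 1.2
# (the A -> L -> Y path is not through M, so it belongs to the direct effect)
df = pd.DataFrame(dict(C=C, A=A, L=L, M=M, Y=Y))

# Stage 1: effect of the intermediate on the outcome, adjusting for treatment and all confounders
gamma = smf.ols("Y ~ M + A + L + C", df).fit().params["M"]

# Stage 2: remove ("blip down") the intermediate's effect, then regress on treatment using
# pre-treatment covariates only; conditioning on L at this stage would bias the estimate
df["Y_blipped"] = df.Y - gamma * df.M
cde = smf.ols("Y_blipped ~ A + C", df).fit().params["A"]
print(cde)   # approximately 1.2
```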

18.
Marginal structural Cox models are used for quantifying marginal treatment effects on the hazard function of the outcome event. Such models are estimated using inverse probability of treatment and censoring (IPTC) weighting, which properly accounts for the impact of time-dependent confounders, avoiding conditioning on factors on the causal pathway. To estimate the IPTC weights, the treatment assignment mechanism is conventionally modeled in discrete time. While this is natural in situations where treatment information is recorded at scheduled follow-up visits, in other contexts the events specifying the treatment history can be modeled in continuous time using the tools of event history analysis. This is particularly the case for treatment procedures, such as surgeries. In this paper, we propose a novel approach for flexible parametric estimation of continuous-time IPTC weights and illustrate it by assessing the relationship between metastasectomy and mortality in metastatic renal cell carcinoma patients. Copyright © 2016 John Wiley & Sons, Ltd.

19.
The goal of mediation analysis is to identify and explicate the mechanism that underlies a relationship between a risk factor and an outcome via an intermediate variable (mediator). In this paper, we consider the estimation of mediation effects in zero-inflated (ZI) models intended to accommodate 'extra' zeros in count data. Focusing on the ZI negative binomial model, we provide a mediation formula approach to estimate the (overall) mediation effect in the standard two-stage mediation framework under a key sequential ignorability assumption. We also consider a novel decomposition of the overall mediation effect for the ZI context using a three-stage mediation model. Estimation of the components of the overall mediation effect requires an assumption involving the joint distribution of two counterfactuals. Simulation study results demonstrate low bias of mediation effect estimators and close-to-nominal coverage probability of confidence intervals. We also modify the mediation formula method by replacing 'exact' integration with a Monte Carlo integration method. The method is applied to a cohort study of dental caries in very low birth weight adolescents. For overall mediation effect estimation, sensitivity analysis was conducted to quantify the degree to which the key assumption must be violated to reverse the original conclusion. Copyright © 2012 John Wiley & Sons, Ltd.

20.
Mediators are intermediate variables in the causal pathway between an exposure and an outcome. Mediation analysis investigates the extent to which exposure effects occur through these variables, thus revealing causal mechanisms. In this paper, we consider the estimation of the mediation effect when the outcome is binary and multiple mediators of different types exist. We give a precise definition of the total mediation effect as well as decomposed mediation effects through individual or sets of mediators using the potential outcomes framework. We formulate a joint-distribution (probit-normal) model using continuous latent variables for any binary mediators to account for correlations among multiple mediators. A mediation formula approach is proposed to estimate the total mediation effect and decomposed mediation effects based on this parametric model. Estimation of mediation effects through individual or subsets of mediators requires an assumption involving the joint distribution of multiple counterfactuals. We conduct a simulation study that demonstrates low bias of mediation effect estimators for two-mediator models with various combinations of mediator types. The results also show that the power to detect a nonzero total mediation effect increases as the correlation coefficient between two mediators increases, whereas power for individual mediation effects reaches a maximum when the mediators are uncorrelated. We illustrate our approach by applying it to a retrospective cohort study of dental caries in adolescents with low and high socioeconomic status. Sensitivity analysis is performed to assess the robustness of conclusions regarding mediation effects when the assumption of no unmeasured mediator-outcome confounders is violated. Copyright © 2013 John Wiley & Sons, Ltd.
