Similar Articles
20 similar articles found (search time: 0 ms)
1.
Dynamic prediction uses longitudinal biomarkers for real‐time prediction of an individual patient's prognosis. This is critical for patients with an incurable disease such as cancer. Biomarker trajectories are usually not linear, nor even monotone, and vary greatly across individuals. Therefore, it is difficult to fit them with parametric models. With this consideration, we propose an approach for dynamic prediction that does not need to model the biomarker trajectories. Instead, as a trade‐off, we assume that the biomarker effects on the risk of disease recurrence are smooth functions over time. This approach turns out to be computationally easier. Simulation studies show that the proposed approach achieves stable estimation of biomarker effects over time, has good predictive performance, and is robust against model misspecification. It is a good compromise between two major approaches, namely, (i) joint modeling of longitudinal and survival data and (ii) landmark analysis. The proposed method is applied to patients with chronic myeloid leukemia. At any time following their treatment with tyrosine kinase inhibitors, longitudinally measured BCR‐ABL gene expression levels are used to predict the risk of disease progression. Copyright © 2016 John Wiley & Sons, Ltd.

2.
This article considers the problem of examining time‐varying causal effect moderation using observational, longitudinal data in which treatment, candidate moderators, and possible confounders are time varying. The structural nested mean model (SNMM) is used to specify the moderated time‐varying causal effects of interest in a conditional mean model for a continuous response given time‐varying treatments and moderators. We present an easy‐to‐use estimator of the SNMM that combines an existing regression‐with‐residuals (RR) approach with an inverse‐probability‐of‐treatment weighting (IPTW) strategy. The RR approach has been shown to identify the moderated time‐varying causal effects if the time‐varying moderators are also the sole time‐varying confounders. The proposed IPTW+RR approach provides estimators of the moderated time‐varying causal effects in the SNMM in the presence of an additional, auxiliary set of known and measured time‐varying confounders. We use a small simulation experiment to compare IPTW+RR versus the traditional regression approach and to compare small and large sample properties of asymptotic versus bootstrap estimators of the standard errors for the IPTW+RR approach. This article clarifies the distinction between time‐varying moderators and time‐varying confounders. We illustrate the methodology in a case study to assess if time‐varying substance use moderates treatment effects on future substance use. Copyright © 2013 John Wiley & Sons, Ltd.
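The weighting step can be sketched concretely. The toy below (all names and numbers are hypothetical; in practice the propensities would come from fitted treatment models such as pooled logistic regressions) accumulates a stabilized inverse-probability-of-treatment weight for one subject across time points:

```python
# Sketch: stabilized IPT weight for one subject over time.
# Hypothetical inputs; not the paper's implementation.

def stabilized_iptw(treatments, p_marginal, p_conditional):
    """Cumulative stabilized weight.

    treatments    : observed treatment A_k at each time point (0/1)
    p_marginal    : P(A_k = 1 | treatment history) under the numerator model
    p_conditional : P(A_k = 1 | treatment history, time-varying confounders)
    """
    w = 1.0
    for a, pn, pd in zip(treatments, p_marginal, p_conditional):
        num = pn if a == 1 else 1.0 - pn
        den = pd if a == 1 else 1.0 - pd
        w *= num / den
    return w

# A subject treated at both time points, whose confounders made treatment
# more likely than the marginal model predicts, is down-weighted:
w = stabilized_iptw([1, 1], [0.5, 0.6], [0.8, 0.9])  # roughly 0.417
```

Weighted estimation of the SNMM then proceeds with these weights attached to each subject-time record.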

3.
Relating time‐varying biomarkers of Alzheimer's disease to time‐to‐event using a Cox model is complicated by the fact that Alzheimer's disease biomarkers are sparsely collected, typically only at study entry; this is problematic since Cox regression with time‐varying covariates requires observation of the covariate process at all failure times. The analysis might be simplified by using study entry as the time origin and treating the time‐varying covariate measured at study entry as a fixed baseline covariate. In this paper, we first derive conditions under which using an incorrect time origin of study entry results in consistent estimation of regression parameters when the time‐varying covariate is continuous and fully observed. We then derive conditions under which treating the time‐varying covariate as fixed at study entry results in consistent estimation. We provide methods for estimating the regression parameter when a functional form can be assumed for the time‐varying biomarker, which is measured only at study entry. We demonstrate our analytical results in a simulation study and apply our methods to data from the Rush Religious Orders Study and Memory and Aging Project and data from the Alzheimer's Disease Neuroimaging Initiative.

4.
Cox models are commonly used in the analysis of time to event data. One advantage of Cox models is the ability to include time‐varying covariates, often a binary covariate that codes for the occurrence of an event that affects an individual subject. A common assumption in this case is that the effect of the event on the outcome of interest is constant and permanent for each subject. In this paper, we propose a modification to the Cox model to allow the influence of an event to exponentially decay over time. Methods for generating data using the inverse cumulative density function for the proposed model are developed. Likelihood ratio tests and AIC are investigated as methods for comparing the proposed model to the commonly used permanent exposure model. A simulation study is performed, and three data sets are presented as examples.
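A minimal sketch of the decaying-effect idea, assuming an exponential baseline hazard and illustrative parameter values (none are from the paper): after the intermediate event at t0, the hazard is lam0 * exp(beta * exp(-theta*(t - t0))), so the cumulative hazard has no simple closed form and is integrated numerically, then inverted by bisection. This is one way to implement inverse-CDF data generation for such a model:

```python
import math
import random

# Sketch only: lam0 (baseline rate), beta (initial log hazard ratio),
# theta (decay rate), t0 (time of the intermediate event) are illustrative.

def cum_hazard(t, lam0, beta, theta, t0, n=2000):
    """H(t) with h(u) = lam0 before t0 and
    h(u) = lam0 * exp(beta * exp(-theta*(u - t0))) after t0 (trapezoid rule)."""
    if t <= t0:
        return lam0 * t
    f = lambda u: lam0 * math.exp(beta * math.exp(-theta * (u - t0)))
    step = (t - t0) / n
    inner = sum(f(t0 + i * step) for i in range(1, n))
    return lam0 * t0 + step * (0.5 * f(t0) + inner + 0.5 * f(t))

def simulate_time(lam0, beta, theta, t0, u=None, hi=200.0):
    """Solve H(T) = -log(U) by bisection; returns ~hi if the target
    exceeds H(hi), so hi should be large relative to typical event times."""
    target = -math.log(u if u is not None else random.random())
    lo_t, hi_t = 0.0, hi
    for _ in range(80):
        mid = 0.5 * (lo_t + hi_t)
        if cum_hazard(mid, lam0, beta, theta, t0) < target:
            lo_t = mid
        else:
            hi_t = mid
    return 0.5 * (lo_t + hi_t)
```

With beta = 0 the model collapses to a plain exponential, which gives a quick sanity check on the inversion.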

5.
Simulations and Monte Carlo methods serve an important role in modern statistical research. They allow for an examination of the performance of statistical procedures in settings in which analytic and mathematical derivations may not be feasible. A key element in any statistical simulation is the existence of an appropriate data‐generating process: one must be able to simulate data from a specified statistical model. We describe data‐generating processes for the Cox proportional hazards model with time‐varying covariates when event times follow an exponential, Weibull, or Gompertz distribution. We consider three types of time‐varying covariates: first, a dichotomous time‐varying covariate that can change at most once from untreated to treated (e.g., organ transplant); second, a continuous time‐varying covariate such as cumulative exposure at a constant dose to radiation or to a pharmaceutical agent used for a chronic condition; third, a dichotomous time‐varying covariate with a subject being able to move repeatedly between treatment states (e.g., current compliance or use of a medication). In each setting, we derive closed‐form expressions that allow one to simulate survival times so that survival times are related to a vector of fixed or time‐invariant covariates and to a single time‐varying covariate. We illustrate the utility of our closed‐form expressions for simulating event times by using Monte Carlo simulations to estimate the statistical power to detect as statistically significant the effect of different types of binary time‐varying covariates. This is compared with the statistical power to detect as statistically significant a binary time‐invariant covariate. Copyright © 2012 John Wiley & Sons, Ltd.
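For the first covariate type (a single switch from untreated to treated) with an exponential baseline and no other covariates, the closed-form inversion is short. A sketch with illustrative parameter names (lam is the baseline rate, beta the log hazard ratio of treatment, t0 the switch time): the cumulative hazard is lam*t before t0 and lam*t0 + lam*exp(beta)*(t - t0) afterwards, and both pieces invert exactly.

```python
import math
import random

# Sketch of inverse-CDF simulation for an exponential baseline and a
# dichotomous covariate switching once from 0 to 1 at time t0.
# Parameter values in any call are illustrative, not from the paper.

def sim_event_time(lam, beta, t0, u=None):
    e = -math.log(u if u is not None else random.random())
    if e < lam * t0:                 # event occurs before treatment starts
        return e / lam
    return t0 + (e - lam * t0) / (lam * math.exp(beta))
```

Repeating the draw many times and fitting a Cox model with a time-varying treatment indicator is then the basis of the power comparisons the abstract describes.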

6.
Ordinal responses are very common in longitudinal data collected from substance abuse research or other behavioral research. This study develops a new statistical model with free SAS macros that can be applied to characterize time‐varying effects on ordinal responses. Our simulation study shows that the ordinal‐scale time‐varying effects model has very low estimation bias and sometimes offers considerably better performance when fitting data with ordinal responses than a model that treats the response as continuous. Contrary to a common assumption that an ordinal scale with several levels can be treated as continuous, our results indicate that it is not so much the number of levels on the ordinal scale but rather the skewness of the distribution that makes a difference in the relative performance of linear versus ordinal models. We use longitudinal data from a well‐known study on youth at high risk for substance abuse as a motivating example to demonstrate that the proposed model can characterize the time‐varying effect of negative peer influences on alcohol use in a way that is more consistent with the developmental theory and existing literature, in comparison with the linear time‐varying effect model. Copyright © 2014 John Wiley & Sons, Ltd.

7.
Recent studies found that infection‐related hospitalization was associated with increased risk of cardiovascular (CV) events, such as myocardial infarction and stroke in the dialysis population. In this work, we develop time‐varying effects modeling tools in order to examine the CV outcome risk trajectories during the time periods before and after an initial infection‐related hospitalization. For this, we propose partly conditional and fully conditional partially linear generalized varying coefficient models (PL‐GVCMs) for modeling time‐varying effects in longitudinal data with substantial follow‐up truncation by death. Unconditional models that implicitly target an immortal population are not a relevant target of inference in applications involving a population with high mortality, like the dialysis population. A partly conditional model characterizes the outcome trajectory for the dynamic cohort of survivors, where each point in the longitudinal trajectory represents a snapshot of the population relationships among subjects who are alive at that time point. In contrast, a fully conditional approach models the time‐varying effects of the population stratified by the actual time of death, where the mean response characterizes individual trends in each cohort stratum. We compare and contrast partly and fully conditional PL‐GVCMs in our aforementioned application using hospitalization data from the United States Renal Data System. For inference, we develop generalized likelihood ratio tests. Simulation studies examine the efficacy of estimation and inference procedures. Copyright © 2015 John Wiley & Sons, Ltd.

8.
Objective: To explore the application of mixed linear models in the analysis of repeated-measures data with time-dependent covariates. Methods: Using data from a clinical trial of treatment for mild-to-moderate essential hypertension as an example, and considering that the dosing regimen changed with disease status at each time point, the MIXED procedure in SAS was applied with an appropriately chosen covariance structure to analyze the repeated-measures data with a time-dependent covariate. Results: The time-dependent covariate (dosing regimen) had a statistically significant effect on the treatment of mild-to-moderate essential hypertension (P<0.05); the time effect was statistically significant (P<0.05); and there were statistically significant interactions between dosing regimen and time (P<0.05) and between dosing regimen and treatment factor (P<0.05). Conclusion: Using mixed linear models to analyze repeated-measures clinical trial data with time-dependent covariates allows a more objective evaluation of drug efficacy.

9.
This paper provides guidance for researchers with some mathematical background on the conduct of time‐to‐event analysis in observational studies based on intensity (hazard) models. Discussions of basic concepts like time axis, event definition and censoring are given. Hazard models are introduced, with special emphasis on the Cox proportional hazards regression model. We provide checklists that may be useful both when fitting the model and assessing its goodness of fit and when interpreting the results. Special attention is paid to how to avoid problems with immortal time bias by introducing time‐dependent covariates. We discuss prediction based on hazard models and difficulties when attempting to draw proper causal conclusions from such models. Finally, we present a series of examples where the methods and checklists are exemplified. Computational details and implementation using the freely available R software are documented in Supplementary Material. The paper was prepared as part of the STRATOS initiative.
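Coding exposure as a time-dependent covariate to avoid immortal time bias amounts to splitting each subject's follow-up into (start, stop] records, so that no person-time before treatment initiation is misclassified as treated. A minimal sketch with hypothetical field names:

```python
# Sketch: counting-process (start, stop, treated) records for one subject.
# followup_end and treatment_start are times since the study time origin;
# treatment_start is None for subjects never treated.

def split_followup(followup_end, treatment_start):
    if treatment_start is None or treatment_start >= followup_end:
        return [(0.0, followup_end, 0)]          # untreated for all of follow-up
    if treatment_start <= 0:
        return [(0.0, followup_end, 1)]          # treated from baseline
    return [(0.0, treatment_start, 0),           # would-be "immortal" time, untreated
            (treatment_start, followup_end, 1)]  # treated time

rows = split_followup(5.0, 2.0)
# → [(0.0, 2.0, 0), (2.0, 5.0, 1)]
```

A Cox model fitted to such start-stop records attributes the pre-treatment interval to the untreated state, which is exactly what naive "ever-treated" coding gets wrong.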

10.
We propose an approach to conduct mediation analysis for survival data with time‐varying exposures, mediators, and confounders. We identify certain interventional direct and indirect effects through a survival mediational g‐formula and describe the required assumptions. We also provide a feasible parametric approach along with an algorithm and software to estimate these effects. We apply this method to analyze the Framingham Heart Study data to investigate the causal mechanism of smoking on mortality through coronary artery disease. The estimated overall 10‐year all‐cause mortality risk difference comparing “always smoke 30 cigarettes per day” versus “never smoke” was 4.3 (95% CI: 1.37, 6.30). Of the overall effect, we estimated 7.91% (95% CI: 1.36%, 19.32%) was mediated by the incidence and timing of coronary artery disease. The survival mediational g‐formula constitutes a powerful tool for conducting mediation analysis with longitudinal data.

11.
We evaluate two‐phase designs to follow up findings from a genome‐wide association study (GWAS) when the cost of regional sequencing in the entire cohort is prohibitive. We develop novel expectation‐maximization‐based inference under a semiparametric maximum likelihood formulation tailored for post‐GWAS inference. A GWAS‐SNP (where SNP is single nucleotide polymorphism) serves as a surrogate covariate in inferring association between a sequence variant and a normally distributed quantitative trait (QT). We assess test validity and quantify efficiency and power of joint QT‐SNP‐dependent sampling and analysis under alternative sample allocations by simulations. Joint allocation balanced on SNP genotype and extreme‐QT strata yields significant power improvements compared to marginal QT‐ or SNP‐based allocations. We illustrate the proposed method and evaluate the sensitivity of sample allocation to sampling variation using data from a sequencing study of systolic blood pressure.

12.
We have developed a method, called Meta‐STEPP (subpopulation treatment effect pattern plot for meta‐analysis), to explore treatment effect heterogeneity across covariate values in the meta‐analysis setting for time‐to‐event data when the covariate of interest is continuous. Meta‐STEPP forms overlapping subpopulations from individual patient data containing similar numbers of events with increasing covariate values, estimates subpopulation treatment effects using standard fixed‐effects meta‐analysis methodology, displays the estimated subpopulation treatment effect as a function of the covariate values, and provides a statistical test to detect possibly complex treatment‐covariate interactions. Simulation studies show that this test maintains an adequate type‐I error rate and has good power when reasonable window sizes are chosen. When applied to eight breast cancer trials, Meta‐STEPP suggests that chemotherapy is less effective for tumors with high estrogen receptor expression compared with those with low expression. Copyright © 2016 John Wiley & Sons, Ltd.
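The windowing idea can be illustrated with a toy. Meta-STEPP balances event counts across windows; the simplified sketch below balances subject counts instead, forming windows of n2 subjects along the sorted covariate with roughly n1 subjects shared between consecutive windows (the parameter names follow the STEPP convention, but the code is an assumption, not the authors' implementation):

```python
# Sketch: overlapping STEPP-style subpopulations along a continuous covariate.
# Returns lists of subject indices, one list per subpopulation.

def stepp_windows(values, n2, n1):
    order = sorted(range(len(values)), key=lambda i: values[i])
    step = n2 - n1                      # how far the window advances each time
    windows, start = [], 0
    while start + n2 <= len(order):
        windows.append(order[start:start + n2])
        start += step
    # ensure the final window reaches the largest covariate value
    if not windows or windows[-1][-1] != order[-1]:
        windows.append(order[-min(n2, len(order)):])
    return windows
```

Within each window one would then compute a fixed-effects meta-analytic treatment effect and plot it against, say, the window's median covariate value.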

13.
For censored survival outcomes, it can be of great interest to evaluate the predictive power of individual markers or their functions. Compared with alternative evaluation approaches, approaches based on the time‐dependent receiver operating characteristics (ROC) rely on much weaker assumptions, can be more robust, and hence are preferred. In this article, we examine evaluation of markers' predictive power using the time‐dependent ROC curve and a concordance measure that can be viewed as a weighted area under the time‐dependent AUC profile. This study significantly advances from existing time‐dependent ROC studies by developing nonparametric estimators of the summary indexes and, more importantly, rigorously establishing their asymptotic properties. It reinforces the statistical foundation of the time‐dependent ROC‐based evaluation approaches for censored survival outcomes. Numerical studies, including simulations and application to an HIV clinical trial, demonstrate the satisfactory finite‐sample performance of the proposed approaches. Copyright © 2012 John Wiley & Sons, Ltd.
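For intuition, the cumulative/dynamic time-dependent AUC at time t compares marker values of subjects who fail by t (cases) with those still event-free at t (controls). The sketch below handles only the fully observed (uncensored) case; the paper's nonparametric estimators additionally account for censoring, which this toy ignores:

```python
# Sketch: cumulative/dynamic AUC(t) without censoring.
# times  : event times (all observed), markers : baseline marker values.

def auc_at_t(times, markers, t):
    cases = [m for T, m in zip(times, markers) if T <= t]
    controls = [m for T, m in zip(times, markers) if T > t]
    pairs = concordant = 0.0
    for mc in cases:
        for mn in controls:
            pairs += 1
            if mc > mn:            # case ranked above control: concordant
                concordant += 1
            elif mc == mn:         # ties count half
                concordant += 0.5
    return concordant / pairs if pairs else float("nan")
```

Integrating AUC(t) against a weight function over a follow-up window yields a concordance-type summary of the kind the abstract describes.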

14.
Observational comparative effectiveness and safety studies are often subject to immortal person‐time, a period of follow‐up during which outcomes cannot occur because of the treatment definition. Common approaches, like excluding immortal time from the analysis or naïvely including immortal time in the analysis, are known to result in biased estimates of treatment effect. Other approaches, such as the Mantel–Byar and landmark methods, have been proposed to handle immortal time. Little is known about the performance of the landmark method in different scenarios. We conducted extensive Monte Carlo simulations to assess the performance of the landmark method compared with other methods in settings that reflect realistic scenarios. We considered four landmark times for the landmark method. We found that the Mantel–Byar method provided unbiased estimates in all scenarios, whereas the exclusion and naïve methods resulted in substantial bias when the hazard of the event was constant or decreased over time. The landmark method performed well in correcting immortal person‐time bias in all scenarios when the treatment effect was small, and provided unbiased estimates when there was no treatment effect. The bias associated with the landmark method tended to be small when the treatment rate was higher in the early follow‐up period than it was later. These findings were confirmed in a case study of chronic obstructive pulmonary disease. Copyright © 2016 John Wiley & Sons, Ltd.

15.
The self‐controlled case series method is a statistical approach to investigating associations between acute outcomes and transient exposures. The method uses cases only and compares time at risk after the transient exposure with time at risk outside the exposure period within an individual, using conditional Poisson regression. The risk of outcome and exposure often varies over time, for example, with age, and it is important to allow for such time dependence within the analysis. The standard approach for modelling time‐varying covariates is to split observation periods into blocks according to categories of the covariate and then to model the relationship using indicators for each category. However, this can be inefficient and can lead to problems with collinearity if the exposure occurs at approximately the same time in all individuals. As an alternative, we propose using fractional polynomials to model the relationship between the time‐varying covariate and incidence of the outcome. We present the results from an analysis exploring the association between rotavirus vaccination and intussusception risk as well as a simulation study. We conclude that fractional polynomials provide a useful approach to adjusting for time‐varying covariates but that it is important to explore the sensitivity of the results to the number of categories and the method of adjustment. Copyright © 2013 John Wiley & Sons, Ltd.
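A first-order fractional polynomial evaluates a positive covariate (such as age) at powers drawn from the conventional set {-2, -1, -0.5, 0, 0.5, 1, 2, 3}, with power 0 read as log(x); model selection then picks the power(s) that best fit. A minimal sketch of the basis terms:

```python
import math

# Sketch: fractional-polynomial basis terms for a positive covariate x.
# The power set is the conventional one from the FP literature.

POWERS = [-2, -1, -0.5, 0, 0.5, 1, 2, 3]

def fp_terms(x):
    """One transformed value of x per candidate power (x**0 read as log x)."""
    return [math.log(x) if p == 0 else x ** p for p in POWERS]
```

In the self-controlled case series setting, the chosen term(s) would enter the conditional Poisson regression in place of the block-indicator age categories.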

16.
Meta‐analysis of time‐to‐event outcomes using the hazard ratio as a treatment effect measure has an underlying assumption that hazards are proportional. The between‐arm difference in the restricted mean survival time is a measure that avoids this assumption and allows the treatment effect to vary with time. We describe and evaluate meta‐analysis based on the restricted mean survival time for dealing with non‐proportional hazards and present a diagnostic method for the overall proportional hazards assumption. The methods are illustrated with the application to two individual participant meta‐analyses in cancer. The examples were chosen because they differ in disease severity and the patterns of follow‐up, in order to understand the potential impacts on the hazards and the overall effect estimates. We further investigate the estimation methods for restricted mean survival time by a simulation study. Copyright © 2015 John Wiley & Sons, Ltd.
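The restricted mean survival time up to a horizon tau is the area under the survival curve on [0, tau]. A minimal sketch estimating it as the area under a Kaplan-Meier curve (data are hypothetical; ties between events and censorings at the same time are handled naively here, so this is a toy rather than a production estimator):

```python
# Sketch: RMST up to tau from (time, event) pairs, where event = 1 is a
# failure and event = 0 a censoring.

def rmst(data, tau):
    data = sorted(data)
    s, t_prev, area, at_risk = 1.0, 0.0, 0.0, len(data)
    for t, event in data:
        if t > tau:
            break
        area += s * (t - t_prev)          # survival is flat on (t_prev, t]
        if event:
            s *= 1.0 - 1.0 / at_risk      # Kaplan-Meier drop at an event
        at_risk -= 1
        t_prev = t
    return area + s * (tau - t_prev)      # flat tail up to the horizon
```

The between-arm difference of two such areas, with a variance estimate, is the effect measure pooled across trials in the meta-analysis the abstract describes.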

17.
In many biomedical and epidemiologic studies, estimating the hazard function is of interest. Breslow's estimator is commonly used for estimating the integrated baseline hazard, but it requires the functional form of covariate effects to be correctly specified. It is generally difficult to identify the true functional form of covariate effects in the presence of time-dependent covariates. To provide a complementary method to the traditional proportional hazards model, we propose a tree-type method that enables simultaneous estimation of both the baseline hazard function and the effects of time-dependent covariates. Our interest is focused on exploring potential data structures rather than formal hypothesis testing. The proposed method approximates the baseline hazard and covariate effects with step functions. The jump points in time and in covariate space are searched for via an algorithm based on the improvement of the full log-likelihood function. In contrast to most other estimation methods, the proposed method estimates the hazard function rather than the integrated hazard. The method is applied to model the risk of withdrawal in a clinical trial that evaluates an anti-depression treatment in preventing the development of clinical depression. Finally, the performance of the method is evaluated by several simulation studies.

18.
In clinical trials, patients with different biomarker features may respond differently to the new treatments or drugs. In personalized medicine, it is important to study the interaction between treatment and biomarkers in order to clearly identify patients that benefit from the treatment. With the local partial‐likelihood estimation (LPLE) method proposed by Fan J, Lin H, Zhou Y. Local partial‐likelihood estimation for lifetime data. The Annals of Statistics 2006; 34(1): 290–325, the treatment effect can be modeled as a flexible function of the biomarker. In this paper, we propose a bootstrap test method for survival outcome data based on the LPLE, for assessing whether the treatment effect is a constant among all patients or varies as a function of the biomarker. The test method is called local partial‐likelihood bootstrap (LPLB) and is developed by bootstrapping the martingale residuals. The test statistic measures the amount of change in treatment effects across the entire range of the biomarker and is derived based on asymptotic theories for martingales. The LPLB method is nonparametric and is shown in simulations and data analysis examples to be flexible enough to identify treatment effects in a biomarker‐defined subset and more powerful to detect treatment‐biomarker interaction of complex forms than the Cox regression model with a simple interaction. We use data from a breast cancer and a prostate cancer clinical trial to illustrate the proposed LPLB test. Copyright © 2015 John Wiley & Sons, Ltd.

19.
We examine the relationship between total mortality, deaths due to motor vehicle accidents, cardiovascular disease and measures of business cycles for the USA, using a time‐varying parameter model for the period 1961–2010. We first present a theoretical model to outline the transmission mechanism from business cycles to health status, to motivate our empirical framework and to explain why the relationship between mortality and the economy may have changed over time. We find overwhelming evidence of structural breaks in the relationship between mortality and business cycles over the sample period. Overall, the relationship between total mortality, cardiovascular mortality and the economy has become less procyclical over time and even countercyclical in recent times for certain age groups. Deaths due to motor vehicle accidents have remained strongly procyclical. Using drugs and medical patent data and data on hours worked, we argue that important advances in medical technology and changes in the effects that working hours have on health are important reasons for this time‐varying relationship. Copyright © 2015 John Wiley & Sons, Ltd.

20.
Flexible survival models are needed when modelling data from long‐term follow‐up studies. In many cases, the assumption of proportionality imposed by a Cox model will not be valid. Instead, a model that can identify time‐varying effects of fixed covariates can be used. Although there are several approaches that deal with this problem, it is not always straightforward how to choose which covariates should be modelled as having time‐varying effects and which not. At the same time, it is up to the researcher to define appropriate time functions that describe the dynamic pattern of the effects. In this work, we suggest a model that can deal with both fixed and time‐varying effects and uses simple hypothesis tests to distinguish which covariates do have dynamic effects. The model is an extension of the parsimonious reduced rank model of rank 1. As such, the number of parameters is kept low, and thus, a flexible set of time functions, such as B‐splines, can be used. The basic theory is illustrated along with an efficient fitting algorithm. The proposed method is applied to a dataset of breast cancer patients and compared with a multivariate fractional polynomials approach for modelling time‐varying effects. Copyright © 2016 John Wiley & Sons, Ltd.
