Penalization, bias reduction, and default priors in logistic and related categorical and survival regressions
Authors: Sander Greenland (1, 2), Mohammad Ali Mansournia (3)
Affiliation: 1. Department of Epidemiology, Fielding School of Public Health, University of California, Los Angeles, CA, U.S.A.; 2. Department of Statistics, College of Letters and Science, University of California, Los Angeles, CA, U.S.A.; 3. Department of Epidemiology and Biostatistics, School of Public Health, Tehran University of Medical Sciences, Tehran, Iran
Abstract: Penalization is a very general method of stabilizing or regularizing estimates, with both frequentist and Bayesian rationales. We consider some questions that arise when choosing among alternative penalties for logistic regression and related models. The most widely programmed penalty appears to be the Firth small-sample bias-reduction method (albeit with small differences among implementations and the results they provide), which corresponds to using the log density of the Jeffreys invariant prior distribution as a penalty function. The latter representation raises some serious contextual objections to the Firth reduction, which also apply to alternative penalties based on t-distributions (including Cauchy priors). Taking simplicity of implementation and interpretation as our chief criteria, we propose that the log-F(1,1) prior provides a better default penalty than other proposals. Penalization based on more general log-F priors is trivial to implement and facilitates mean-squared-error reduction and sensitivity analyses of penalty strength by varying the number of prior degrees of freedom. We caution, however, against penalization of intercepts, which are unduly sensitive to covariate coding and design idiosyncrasies. Copyright © 2015 John Wiley & Sons, Ltd.
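As the abstract notes, log-F penalization is simple to implement: a log-F(m,m) prior on a coefficient b contributes (m/2)*b - m*log(1 + e^b) to the log-likelihood, and varying m tunes the penalty strength. A minimal sketch of this idea follows (not code from the paper; the toy data, function name, and choice of BFGS optimizer are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import minimize

def neg_penalized_loglik(beta, X, y, m, pen_idx):
    """Negative Bernoulli log-likelihood plus a log-F(m,m) penalty on
    the coefficients in pen_idx; the intercept is left unpenalized,
    per the abstract's caution against penalizing intercepts."""
    eta = X @ beta
    # Bernoulli log-likelihood; log(1 + e^eta) computed stably via logaddexp
    loglik = np.sum(y * eta - np.logaddexp(0.0, eta))
    b = beta[pen_idx]
    # log-F(m,m) log-density up to a constant: (m/2)*b - m*log(1 + e^b)
    penalty = np.sum((m / 2.0) * b - m * np.logaddexp(0.0, b))
    return -(loglik + penalty)

# Toy data (hypothetical): intercept in column 0, one covariate in column 1
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=50)])
y = rng.binomial(1, 0.5, size=50).astype(float)

# log-F(1,1) penalty on the slope only; rerun with larger m
# (e.g., m = 2, 4, ...) as a sensitivity analysis of penalty strength
fit = minimize(neg_penalized_loglik, np.zeros(2),
               args=(X, y, 1, [1]), method="BFGS")
print(fit.x)  # penalized estimates of (intercept, slope)
```

Equivalently, the same penalty can be imposed by data augmentation in any standard logistic-regression routine: add a pseudo-record with the covariate of interest set to 1 (all other columns, including the constant, set to 0) carrying m/2 successes and m/2 failures.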
Keywords: Bayes estimators; bias correction; Firth bias reduction; Jeffreys prior; logistic regression; maximum likelihood; penalized likelihood; regularization; shrinkage; sparse data; stabilization