Robust Estimation in Generalized Linear Model Using Density Power Divergence
- The Korean Data Analysis Society
- Journal of The Korean Data Analysis Society (JKDAS)
- Vol.21 No.1
- KCI-indexed
- 2019.02
- 1 - 10 (10 pages)
In parametric density estimation, the maximum likelihood estimator (MLE) is known to be asymptotically efficient but not robust with respect to either model misspecification or outliers. Basu et al. (1998) suggest a family of divergence measures called density power divergences. Each measure in this family depends on a single tuning parameter α, which controls the trade-off between the efficiency and the robustness of the estimator. The Kullback-Leibler divergence (Kullback and Leibler, 1951) and the L2 distance belong to this family. With a selected tuning parameter, the density power divergence can be used as a criterion for estimation. The minimizer of this divergence is called the minimum density power divergence estimator (MDPDE). In this paper, we apply the density power divergence idea to regression data. We generalize the definition of the density power divergence to the generalized linear model and derive the estimating criterion for the parameters. The resulting minimum density power divergence estimators (MDPDEs) of the regression parameters are expected to be robust with respect to outliers. The robustness of the MDPDE is investigated via a simulation study.
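For a true density g and a model density f, the density power divergence of Basu et al. (1998) with tuning parameter α > 0 is d_α(g, f) = ∫ { f^{1+α} − (1 + 1/α) g f^α + (1/α) g^{1+α} } dz; letting α → 0 recovers the Kullback-Leibler divergence (the MLE), and α = 1 gives the L2 distance. The sketch below illustrates the estimation idea on a normal linear regression working model, where the ∫ f^{1+α} term has a closed form; it is a minimal illustration under that assumed model, not the paper's GLM derivation, and names such as mdpde_objective are invented here.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def mdpde_objective(theta, X, y, alpha):
    """Empirical DPD objective for a normal linear regression working model.

    theta = (beta_1, ..., beta_p, log_sigma); alpha > 0 is the tuning
    parameter trading efficiency for robustness. The g^{1+alpha} term of
    the divergence is dropped since it does not depend on theta.
    """
    beta, sigma = theta[:-1], np.exp(theta[-1])
    mu = X @ beta
    # Closed form of the integral of f^(1+alpha) for the normal density:
    # (2*pi*sigma^2)^(-alpha/2) / sqrt(1+alpha), the same for every x_i.
    int_term = (2 * np.pi * sigma**2) ** (-alpha / 2) / np.sqrt(1 + alpha)
    dens = norm.pdf(y, loc=mu, scale=sigma)
    return np.mean(int_term - (1 + 1 / alpha) * dens**alpha)

# Simulated regression data with 5% of responses contaminated by outliers.
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=n)
y[:10] += 10.0

theta0 = np.zeros(3)
fit = minimize(mdpde_objective, theta0, args=(X, y, 0.5), method="BFGS")
print("MDPDE (alpha=0.5) beta:", fit.x[:2], "sigma:", np.exp(fit.x[-1]))
```

With contaminated responses as above, a moderate choice such as α = 0.5 typically keeps the coefficient estimates close to the true values, whereas the MLE (the α → 0 limit of the same criterion) is pulled toward the outliers.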
1. Introduction
2. The minimum density power divergence estimator
3. Robust estimation in generalized linear model using density power divergence
4. Simulation study
5. Concluding remarks