KCI-indexed academic journal

Robust Estimation in Generalized Linear Model Using Density Power Divergence

DOI: 10.37727/jkdas.2019.21.1.1

In parametric density estimation, the maximum likelihood estimator (MLE) is known to be asymptotically efficient but not robust with respect to either model misspecification or outliers. Basu et al. (1998) suggest a family of divergence measures called density power divergences. Each measure in this family depends on a single tuning parameter α, which controls the trade-off between the efficiency and the robustness of the resulting estimator. The Kullback-Leibler divergence (Kullback and Leibler, 1951) and the L₂ distance belong to this family. With a selected tuning parameter, the density power divergence can be used as a criterion for estimation, and its minimizer is called the minimum density power divergence estimator (MDPDE). In this paper, we apply the power divergence idea to regression data: we generalize the definition of the density power divergence to the generalized linear model and derive the estimating criterion for the regression parameters. The resulting minimum density power divergence estimators of the regression parameters are expected to be robust against outliers. The robustness of the MDPDE is investigated via a simulation study.
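For reference, the density power divergence family and the estimating criterion it induces, as given in Basu et al. (1998), can be sketched as follows (written for the i.i.d. case; the paper's contribution is to extend this criterion to the generalized linear model, where the model density is conditioned on covariates):

```latex
% Density power divergence between the data density g and a model density
% f_\theta, indexed by a tuning parameter \alpha > 0 (Basu et al., 1998):
d_\alpha(g, f_\theta) = \int \Big\{ f_\theta^{1+\alpha}(z)
  - \Big(1 + \tfrac{1}{\alpha}\Big)\, g(z)\, f_\theta^{\alpha}(z)
  + \tfrac{1}{\alpha}\, g^{1+\alpha}(z) \Big\}\, dz .

% \alpha \to 0 recovers the Kullback-Leibler divergence, and \alpha = 1 gives
% the L_2 distance \int (g - f_\theta)^2 dz. Dropping the term free of \theta
% and replacing g by the empirical distribution of X_1, \dots, X_n gives the
% MDPDE criterion, minimized over \theta:
H_n(\theta) = \int f_\theta^{1+\alpha}(z)\, dz
  - \Big(1 + \tfrac{1}{\alpha}\Big) \frac{1}{n} \sum_{i=1}^{n} f_\theta^{\alpha}(X_i).
```

To illustrate the robustness claim numerically, the following is a minimal sketch of the MDPDE in the simplest possible setting, a normal location-scale model with injected outliers. It is not the paper's GLM formulation; the contamination setup and function names are illustrative assumptions only.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def mdpde_objective(params, x, alpha):
    """Empirical MDPDE criterion H_n for a normal location-scale model.

    H_n = int f^(1+alpha) dz - (1 + 1/alpha) * mean(f^alpha(x_i)); the
    integral has a closed form for the normal density:
    (2*pi*sigma^2)^(-alpha/2) / sqrt(1 + alpha).
    """
    mu, log_sigma = params          # log-parametrize sigma to keep it positive
    sigma = np.exp(log_sigma)
    integral = (2.0 * np.pi * sigma**2) ** (-alpha / 2.0) / np.sqrt(1.0 + alpha)
    dens = norm.pdf(x, loc=mu, scale=sigma)
    return integral - (1.0 + 1.0 / alpha) * np.mean(dens**alpha)

rng = np.random.default_rng(0)
# 95 clean observations from N(0, 1) plus 5 gross outliers near 10
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(10.0, 1.0, 5)])

for alpha in (0.1, 0.5, 1.0):
    fit = minimize(mdpde_objective, x0=[np.median(x), 0.0], args=(x, alpha))
    print(f"alpha={alpha}: mu_hat={fit.x[0]:.3f}, sigma_hat={np.exp(fit.x[1]):.3f}")

print(f"MLE (sample mean): {x.mean():.3f}")  # pulled toward the outliers
```

Because observations with low model density contribute little to the second term of H_n, larger values of α downweight the outliers: the location estimate stays near 0, while the MLE (the sample mean) is pulled toward the contamination.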

1. Introduction

2. The minimum density power divergence estimator

3. Robust estimation in generalized linear model using density power divergence

4. Simulation study

5. Concluding Remarks
