Conference Paper

Robust Interval Estimation Using Density Power Divergence

It is well known that the maximum likelihood estimator (MLE) is asymptotically efficient but not robust with respect to either model misspecification or outliers. Basu et al. (1998) suggest a family of density-based divergence measures called density power divergences. Each measure in this family is indexed by a single tuning parameter α, which controls the trade-off between robustness and asymptotic efficiency of the estimators. The Kullback-Leibler divergence (Kullback and Leibler, 1951) and the L₂ distance are members of this family. With a suitably chosen tuning parameter, a minimum density power divergence estimator (MDPDE) can be obtained. For 0 < α < 1, the estimator lies between the MLE (efficient but nonrobust) and the minimum L₂-distance estimator L₂E (robust but inefficient). Hong and Kim (2001) suggest a data-driven selection of α. In this paper we suggest a confidence interval based on the MDPDE when the data set is contaminated. Bootstrap resampling is used to obtain the confidence interval. The resulting confidence intervals (called MDPDE bootstrap confidence intervals) are expected to be robust with respect to outliers. The performance of the MDPDE bootstrap confidence intervals is investigated via a simulation study.
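To make the procedure in the abstract concrete, the following is a minimal sketch of an MDPDE percentile-bootstrap confidence interval, assuming a normal location-scale model and a fixed tuning parameter α; the data-driven selection of α proposed by Hong and Kim (2001) is not implemented here, and names such as `mdpd_bootstrap_ci` are illustrative rather than taken from the paper. The objective follows Basu et al. (1998): dropping the term that does not involve θ, the MDPDE minimizes ∫ f_θ^{1+α}(z) dz − (1 + 1/α) · (1/n) Σ f_θ^α(X_i).

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

def dpd_objective(params, x, alpha):
    """Empirical density power divergence objective for a normal model.

    Minimizing over (mu, log_sigma) gives the MDPDE for fixed alpha;
    alpha -> 0 corresponds to the MLE (Kullback-Leibler), alpha = 1 to
    the minimum L2-distance estimator.
    """
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # keep sigma positive during optimization
    f = norm.pdf(x, loc=mu, scale=sigma)
    # closed-form value of the integral of f_theta^(1+alpha) for a normal density
    int_term = (2 * np.pi * sigma**2) ** (-alpha / 2) / np.sqrt(1 + alpha)
    return int_term - (1 + 1 / alpha) * np.mean(f ** alpha)

def mdpde(x, alpha):
    """Minimum density power divergence estimate of (mu, sigma)."""
    start = np.array([np.median(x), np.log(x.std())])
    res = minimize(dpd_objective, start, args=(x, alpha), method="Nelder-Mead")
    mu_hat, log_sigma_hat = res.x
    return mu_hat, np.exp(log_sigma_hat)

def mdpd_bootstrap_ci(x, alpha=0.5, B=500, level=0.95):
    """Percentile bootstrap confidence interval for the mean based on the MDPDE."""
    boot = np.empty(B)
    for b in range(B):
        xb = rng.choice(x, size=x.size, replace=True)
        boot[b], _ = mdpde(xb, alpha)
    lo, hi = np.quantile(boot, [(1 - level) / 2, (1 + level) / 2])
    return lo, hi

# Contaminated sample: 90% observations from N(0, 1), 10% outliers from N(10, 1).
x = np.concatenate([rng.normal(0, 1, 90), rng.normal(10, 1, 10)])
print("MDPDE bootstrap CI for the mean:", mdpd_bootstrap_ci(x, alpha=0.5))
```

Because the outliers are heavily downweighted through f^α, the resulting interval should stay near the uncontaminated center, whereas an MLE-based (α → 0) interval would be pulled toward the contaminating component.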

Ⅰ. Introduction

Ⅱ. Robust Confidence Intervals Using Density Power Divergence

Ⅲ. Simulation Study

Ⅳ. Concluding Remarks