Conference paper

Uniform Asymptotic Normality in Stationary and Unit Root Autoregression

While differencing transformations can eliminate nonstationarity, they typically reduce signal strength and correspondingly reduce rates of convergence in unit root autoregressions. The present paper shows that aggregating moment conditions that are formulated in differences provides an orderly mechanism for preserving information and signal strength in autoregressions with some very desirable properties. In first order autoregression, a partially aggregated estimator based on moment conditions in differences is shown to have a limiting normal distribution which holds uniformly in the autoregressive coefficient ρ, including stationary and unit root cases. The rate of convergence is √n when |ρ| < 1 and the limit distribution is the same as that of the Gaussian maximum likelihood estimator (MLE), but when ρ = 1 the rate of convergence to the normal distribution is within a slowly varying factor of n. A fully aggregated estimator is shown to have the same limit behavior in the stationary case and to have nonstandard limit distributions in unit root and near integrated cases, which reduce both the bias and the variance of the MLE in the vicinity of unity. This result shows that it is possible to improve on the asymptotic behavior of the MLE without using an artificial shrinkage technique or otherwise accelerating convergence at unity at the cost of performance in the neighborhood of unity.
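The contrast the abstract draws can be seen in a quick Monte Carlo for the ordinary least-squares (Gaussian MLE) estimator of ρ in an AR(1): in the stationary case √n(ρ̂ − ρ) is approximately normal, while at ρ = 1 the n-rate limit of n(ρ̂ − 1) is non-normal and downward biased. This is an illustrative sketch of the standard MLE baseline only, not of the paper's aggregated estimators; sample sizes and replication counts are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def ols_ar1(rho, n, reps=2000):
    """Monte Carlo draws of the OLS estimator in y_t = rho * y_{t-1} + u_t."""
    est = np.empty(reps)
    for r in range(reps):
        u = rng.standard_normal(n)
        y = np.empty(n)
        y[0] = u[0]
        for t in range(1, n):
            y[t] = rho * y[t - 1] + u[t]
        # OLS / Gaussian MLE slope estimate
        est[r] = (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])
    return est

# Stationary case: sqrt(n)*(rho_hat - rho) has limit N(0, 1 - rho^2),
# so its standard deviation should be near sqrt(0.75) for rho = 0.5.
d_stat = np.sqrt(400) * (ols_ar1(0.5, 400) - 0.5)
print("stationary std:", d_stat.std())

# Unit root case: n*(rho_hat - 1) has a skewed, non-normal limit
# with negative mean (the MLE's downward bias near unity).
d_unit = 400 * (ols_ar1(1.0, 400) - 1.0)
print("unit root mean:", d_unit.mean())
```

The paper's point is that the partially aggregated estimator avoids this dichotomy, remaining asymptotically normal uniformly in ρ across both regimes simulated above.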

1 Introduction

2 New Moment Conditions and Information Aggregation

3 Partial Information Using a Single-Lag Difference

4 Full Information Aggregation Using All Lag Differences

5 Partial Aggregation and Uniform Asymptotic Normality

6 Conclusion
