KCI-indexed academic journal

Stacking Based Ensemble with Instance Selection in Random KNN

DOI : 10.37727/jkdas.2020.22.4.1303

Nearest neighbor classification is a simple and quite effective method, but it suffers from both the so-called curse of dimensionality and a heavy computational burden when there are many examples. In high-dimensional data, noisy and redundant variables degrade the performance of KNN, so some kind of feature selection mechanism is essential. Random KNN simplifies the feature selection problem by building an ensemble of KNNs, each constructed on a randomly selected subset of features. This paper proposes two enhancements to the random KNN methodology: one adds an instance selection procedure, and the other determines the combining weights by training another model (stacking) instead of the simple voting used in random KNN. Data analyses on a real data set verify that these two additional procedures improve the classification performance of the original random KNN.
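The two ideas in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the instance selection step here is a simple edited-nearest-neighbour style filter standing in for whatever procedure the paper uses, and the stacking step fits a logistic regression on base-KNN probabilities rather than any specific combiner from the paper (a faithful version would use out-of-fold meta-features).

```python
# Sketch of random KNN with instance selection and a stacked combiner.
# All names and parameter choices (subset sizes, k, the meta-model) are
# illustrative assumptions, not taken from the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=400, n_features=30, n_informative=8,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Instance selection (ENN-style stand-in): drop training points that
# their own neighbourhood misclassifies.
base = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
keep = base.predict(X_tr) == y_tr
X_tr, y_tr = X_tr[keep], y_tr[keep]

# Random KNN: one base KNN per random feature subset.
n_models, subset_size = 15, 10
subsets = [rng.choice(X.shape[1], subset_size, replace=False)
           for _ in range(n_models)]
knns = [KNeighborsClassifier(n_neighbors=5).fit(X_tr[:, s], y_tr)
        for s in subsets]

def meta_features(X):
    """Stack each base KNN's class-1 probability into a feature matrix."""
    return np.column_stack([m.predict_proba(X[:, s])[:, 1]
                            for m, s in zip(knns, subsets)])

# Stacking: a trained meta-model replaces simple voting, learning how
# much weight each base KNN deserves.
meta = LogisticRegression().fit(meta_features(X_tr), y_tr)
acc = meta.score(meta_features(X_te), y_te)
print(f"stacked random-KNN test accuracy: {acc:.3f}")
```

The meta-model's coefficients play the role of the learned combining weights; base KNNs built on uninformative feature subsets receive small weights instead of an equal vote.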

1. Introduction

2. KNN and Ensemble of KNNs

3. Proposed Schemes to Improve Random KNNs

4. Data Analysis

5. Conclusion

References
