KCI-registered academic journal

On-Line LS-SVM Regression with Pruned Support Vectors - based on cross validatory choice of hyper-parameters

LS-SVM regression is known to be a good substitute for the traditional statistical regression method. LS-SVM is a version of the SVM (support vector machine) that uses equality constraints instead of inequality constraints and works with a squared loss function, so the solution is obtained from a linear system of Karush-Kuhn-Tucker conditions rather than from a quadratic programming problem. Computational difficulties remain, however, because the method requires inverting a matrix whose size grows with the data set. For the analysis of on-line or large data sets, we propose an on-line LS-SVM regression that prunes support vectors and updates the hyper-parameters at each step. In numerical studies we show that, with a relatively small number of retained support vectors, almost the same prediction performance can be obtained as with batch LS-SVM regression in terms of MSE.
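For reference, the linear KKT system mentioned in the abstract can be written as follows. This is the standard LS-SVM regression formulation (notation assumed here, following the usual presentation by Suykens et al.; it may differ from the paper's own symbols): the primal problem

\min_{w, b, e} \; \frac{1}{2}\|w\|^{2} + \frac{\gamma}{2}\sum_{i=1}^{n} e_i^{2}
\quad \text{subject to} \quad y_i = w^{\top}\varphi(x_i) + b + e_i, \; i = 1, \dots, n,

leads, after eliminating $w$ and $e$, to the linear system

\begin{bmatrix} 0 & \mathbf{1}_n^{\top} \\ \mathbf{1}_n & \Omega + \gamma^{-1} I_n \end{bmatrix}
\begin{bmatrix} b \\ \alpha \end{bmatrix}
=
\begin{bmatrix} 0 \\ y \end{bmatrix},
\qquad \Omega_{ij} = K(x_i, x_j),

with the resulting predictor $\hat{f}(x) = \sum_{i=1}^{n} \alpha_i K(x, x_i) + b$. Solving this $(n+1)\times(n+1)$ system is the matrix-inversion burden the abstract refers to, which motivates the on-line update with pruned support vectors.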

1. Introduction

2. LS-SVM Regression

3. On-Line LS-SVM Regression with Pruning SVs

4. Numerical Study

5. Concluding Remarks

References