Abstract
This paper is concerned with the theoretical foundation of support vector machines (SVMs). The purpose is to further develop an exact relationship between SVMs and statistical learning theory (SLT). As a representative case, the standard C-support vector classification (C-SVC) is considered here. More precisely, we show that the decision function obtained by C-SVC is exactly one of the decision functions obtained by solving the optimization problem derived directly from the structural risk minimization principle. In addition, an interesting interpretation of the parameter C in C-SVC is given by showing that C corresponds to the size of the decision function candidate set in the structural risk minimization principle.
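For concreteness, the standard C-SVC discussed in the abstract is usually stated as the following primal problem (a well-known textbook formulation, reproduced here for reference; the notation is generic and not necessarily that of the paper):

```latex
\min_{w,\,b,\,\xi}\quad \frac{1}{2}\|w\|^{2} + C\sum_{i=1}^{l}\xi_{i}
\qquad\text{s.t.}\quad
y_{i}\bigl(w^{\top}\phi(x_{i}) + b\bigr) \ge 1 - \xi_{i},\quad
\xi_{i} \ge 0,\quad i = 1,\dots,l,
```

where the penalty parameter C trades off the margin term against the slack variables; it is this C whose meaning the paper reinterprets as the size of the candidate set of decision functions in the structural risk minimization principle.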
Authors
ZHANG ChunHua 1, TIAN YingJie 2 & DENG NaiYang 3
1 School of Information, Renmin University of China, Beijing 100872, China
2 Research Center on Fictitious Economy and Data Science, Chinese Academy of Sciences, Beijing 100080, China
3 College of Science, China Agricultural University, Beijing 100083, China
Funding
Supported by the National Natural Science Foundation of China (Grant Nos. 10971223, 10601064), the Key Project of the National Natural Science Foundation of China (Grant Nos. 10631070, 70531040), and the Science Foundation of Renmin University of China (Grant No. 06XNB055).