June 2008
Global convergence of SMO algorithm for support vector regression
IEEE TRANSACTIONS ON NEURAL NETWORKS
- Volume: 19
- Issue: 6
- First page: 971
- Last page: 982
- Language: English
- Publication type: Research paper (academic journal)
- DOI: 10.1109/TNN.2007.915116
- Publisher: IEEE (Institute of Electrical and Electronics Engineers Inc.)
Global convergence of the sequential minimal optimization (SMO) algorithm for support vector regression (SVR) is studied in this paper. Given l training samples, SVR is formulated as a convex quadratic programming (QP) problem with l pairs of variables. We prove that if, at each step, two pairs of variables violating the optimality condition are chosen for update and the subproblems are solved in a certain way, then the SMO algorithm always stops within a finite number of iterations, having found an optimal solution. Efficient implementation techniques for the SMO algorithm are also presented and compared experimentally with other SMO algorithms.
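To make the QP structure in the abstract concrete, the sketch below sets up the ε-SVR dual in the difference variables β_i = α_i − α_i* and applies a closed-form coordinate-wise update. This is a deliberately simplified illustration, not the paper's algorithm: the paper's SMO updates two *pairs* of variables per step under the equality constraint induced by the bias term, whereas here the bias is dropped (a constant is folded into the kernel), so no equality constraint remains and each β_i can be updated analytically. All names and parameter values are assumptions.

```python
# Sketch of the bias-free epsilon-SVR dual:
#   min_beta  0.5 * beta^T K beta - y^T beta + eps * ||beta||_1
#   s.t.      -C <= beta_i <= C
# solved by cyclic coordinate descent with a soft-threshold update.
import numpy as np

def svr_coordinate_descent(X, y, C=10.0, eps=0.01, sweeps=200):
    K = X @ X.T + 1.0          # linear kernel; the +1 stands in for the bias term
    n = len(y)
    beta = np.zeros(n)
    for _ in range(sweeps):
        for i in range(n):
            # gradient of the smooth part with the beta_i contribution removed
            g = K[i] @ beta - K[i, i] * beta[i] - y[i]
            # closed-form minimizer of 0.5*K_ii*b^2 + g*b + eps*|b|, clipped to the box
            b = np.sign(-g) * max(abs(g) - eps, 0.0) / K[i, i]
            beta[i] = np.clip(b, -C, C)
    return beta, K

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0.0, 2.0, 4.0, 6.0])     # y = 2x, easily fit within the eps-tube
beta, K = svr_coordinate_descent(X, y)
pred = K @ beta                        # predictions on the training points
```

The paper's contribution is precisely what this toy version sidesteps: with the bias term present, the dual carries an equality constraint, variables must be updated in (pairs of) pairs, and finite termination at an optimum is what has to be proved.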
- Links
- IDs
- DOI: 10.1109/TNN.2007.915116
- ISSN: 1045-9227
- Web of Science ID: WOS:000256670500005