Papers

Peer-reviewed
November 2006

Global convergence of decomposition learning methods for support vector machines

IEEE TRANSACTIONS ON NEURAL NETWORKS
  • Norikazu Takahashi
  • Tetsuo Nishi

Volume
17
Number
6
First page
1362
Last page
1369
Language
English
Publishing type
Research paper (scientific journal)
DOI
10.1109/TNN.2006.880584
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC

Decomposition methods are well-known techniques for solving quadratic programming (QP) problems arising in support vector machines (SVMs). In each iteration of a decomposition method, a small number of variables are selected and a QP problem restricted to those variables is solved. Since no large matrix computations are required, decomposition methods are applicable to large QP problems. In this paper, we rigorously analyze the global convergence of general decomposition methods for SVMs. We first introduce a relaxed version of the optimality condition for the QP problems and then prove that a decomposition method reaches a solution satisfying this relaxed optimality condition within a finite number of iterations, under a very mild condition on how the variables are selected.
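To illustrate the class of methods the abstract describes, below is a minimal sketch of a working-set-of-two decomposition method (simplified SMO with a linear kernel) for the SVM dual QP. This is not the paper's analysis or its selection rule — the second-variable choice here is a deliberately crude placeholder, and all names and tolerances are illustrative assumptions.

```python
import numpy as np

def smo_decomposition(X, y, C=1.0, tol=1e-3, max_iter=200):
    """Simplified SMO: each iteration selects two variables and solves
    the restricted QP over that pair analytically (illustrative sketch)."""
    n = X.shape[0]
    K = X @ X.T                  # linear-kernel Gram matrix
    alpha = np.zeros(n)
    b = 0.0
    for _ in range(max_iter):
        changed = 0
        for i in range(n):
            Ei = (alpha * y) @ K[:, i] + b - y[i]
            # select i only if it violates the KKT conditions within tol
            if (y[i] * Ei < -tol and alpha[i] < C) or (y[i] * Ei > tol and alpha[i] > 0):
                j = (i + 1) % n  # crude second choice; real solvers use heuristics
                Ej = (alpha * y) @ K[:, j] + b - y[j]
                ai_old, aj_old = alpha[i], alpha[j]
                # box bounds keeping 0 <= alpha <= C and sum(alpha*y) = 0
                if y[i] != y[j]:
                    L, H = max(0.0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0.0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                if L == H:
                    continue
                eta = 2 * K[i, j] - K[i, i] - K[j, j]  # curvature along the pair
                if eta >= 0:
                    continue
                alpha[j] = np.clip(aj_old - y[j] * (Ei - Ej) / eta, L, H)
                if abs(alpha[j] - aj_old) < 1e-5:
                    continue
                # adjust alpha[i] so the equality constraint is preserved
                alpha[i] = ai_old + y[i] * y[j] * (aj_old - alpha[j])
                # update the bias from the two changed variables
                b1 = b - Ei - y[i]*(alpha[i]-ai_old)*K[i, i] - y[j]*(alpha[j]-aj_old)*K[i, j]
                b2 = b - Ej - y[i]*(alpha[i]-ai_old)*K[i, j] - y[j]*(alpha[j]-aj_old)*K[j, j]
                b = b1 if 0 < alpha[i] < C else (b2 if 0 < alpha[j] < C else (b1 + b2) / 2)
                changed += 1
        if changed == 0:         # no KKT violations left: relaxed optimality reached
            break
    w = (alpha * y) @ X          # primal weights (linear kernel only)
    return w, b, alpha

# toy usage on a linearly separable set
X = np.array([[2., 2.], [3., 3.], [-2., -2.], [-3., -3.]])
y = np.array([1., 1., -1., -1.])
w, b, alpha = smo_decomposition(X, y)
```

Each pairwise update touches only a 2x2 submatrix of the Gram matrix, which is exactly why decomposition methods scale to QP problems too large for full matrix factorization.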

Link information
DOI
https://doi.org/10.1109/TNN.2006.880584
Web of Science
https://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcAuth=JSTA_CEL&SrcApp=J_Gate_JST&DestLinkType=FullRecord&KeyUT=WOS:000241933100002&DestApp=WOS_CPL
ID information
  • DOI : 10.1109/TNN.2006.880584
  • ISSN : 1045-9227
  • Web of Science ID : WOS:000241933100002
