Zaki, N.M. and Deris, S. and Chin, K.K. (2003) Extending the decomposition algorithm for support vector machines training. Journal of Information and Communication Technology, 1 (2). pp. 17-29. ISSN 2180-3862
Full text: PDF (984kB)
Abstract
The Support Vector Machine (SVM) has proven to be a capable learning machine. It can handle difficult pattern recognition tasks such as speech recognition and has demonstrated reasonable performance. The SVM formulation is elegant in that it reduces to a convex Quadratic Programming (QP) problem, so in theory the training is guaranteed to converge to a global optimum. In practice, however, training an SVM is not as straightforward as it seems: numerical problems can cause the training to produce non-optimal decision boundaries, and a conventional off-the-shelf optimizer is not the ideal solution. One can instead design a dedicated optimizer that takes full advantage of the specific structure of the QP problem in SVM training. The decomposition algorithm developed by Osuna et al. (1997a) reduces the training cost to an acceptable level. In this paper we analyze and develop an extension to Osuna's method in order to achieve better performance. The modified method can be used to train practical SVMs for which the training might not otherwise converge.
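To illustrate the idea behind decomposition training referred to in the abstract, the sketch below solves the SVM dual QP by repeatedly optimizing a small working set of Lagrange multipliers. It is a generic, simplified illustration only: it uses the extreme case of a two-variable working set (in the style of SMO) with a linear kernel, not Osuna's subproblem formulation or the extension proposed in the paper, and the function and variable names are the writer's own.

```python
# Minimal sketch of decomposition-style SVM training (assumed example,
# not the paper's method): optimize two dual variables at a time while
# keeping the rest fixed, so only a small subproblem is solved per step.
import numpy as np

def train_svm_smo(X, y, C=1.0, tol=1e-3, max_passes=20):
    """Soft-margin linear SVM trained with a simplified two-variable
    decomposition (SMO-style). y must contain labels in {-1, +1}."""
    n = X.shape[0]
    K = X @ X.T                      # Gram matrix for a linear kernel
    alpha = np.zeros(n)
    b = 0.0
    rng = np.random.default_rng(0)
    passes = 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            # Prediction error on example i under the current model.
            E_i = (alpha * y) @ K[:, i] + b - y[i]
            # Only touch multipliers that violate the KKT conditions.
            if (y[i] * E_i < -tol and alpha[i] < C) or (y[i] * E_i > tol and alpha[i] > 0):
                j = rng.integers(n - 1)
                j = j + 1 if j >= i else j          # second index j != i
                E_j = (alpha * y) @ K[:, j] + b - y[j]
                a_i_old, a_j_old = alpha[i], alpha[j]
                # Box constraints keep the pair feasible for the dual QP.
                if y[i] != y[j]:
                    L, H = max(0.0, a_j_old - a_i_old), min(C, C + a_j_old - a_i_old)
                else:
                    L, H = max(0.0, a_i_old + a_j_old - C), min(C, a_i_old + a_j_old)
                if L == H:
                    continue
                eta = 2 * K[i, j] - K[i, i] - K[j, j]
                if eta >= 0:
                    continue
                # Analytic solution of the two-variable subproblem.
                alpha[j] = np.clip(a_j_old - y[j] * (E_i - E_j) / eta, L, H)
                if abs(alpha[j] - a_j_old) < 1e-5:
                    continue
                alpha[i] = a_i_old + y[i] * y[j] * (a_j_old - alpha[j])
                # Update the bias so KKT holds for the changed multipliers.
                b1 = b - E_i - y[i] * (alpha[i] - a_i_old) * K[i, i] \
                     - y[j] * (alpha[j] - a_j_old) * K[i, j]
                b2 = b - E_j - y[i] * (alpha[i] - a_i_old) * K[i, j] \
                     - y[j] * (alpha[j] - a_j_old) * K[j, j]
                if 0 < alpha[i] < C:
                    b = b1
                elif 0 < alpha[j] < C:
                    b = b2
                else:
                    b = (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    w = (alpha * y) @ X              # primal weights (linear kernel only)
    return w, b

if __name__ == "__main__":
    # Tiny linearly separable toy problem.
    X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
    y = np.array([1.0, 1.0, -1.0, -1.0])
    w, b = train_svm_smo(X, y, C=1.0)
    print("predictions:", np.sign(X @ w + b))
```

Osuna-style decomposition generalizes this pattern to larger working sets, solving a small dense QP per iteration while the multipliers outside the working set stay fixed; the paper's contribution concerns improving that scheme's convergence behavior.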
| Item Type: | Article |
|---|---|
| Uncontrolled Keywords: | Support vector machines, Decomposition, Pattern recognition, Learning |
| Subjects: | Q Science > QA Mathematics |
| Divisions: | UNSPECIFIED |
| Depositing User: | Mrs. Norazmilah Yaakub |
| Date Deposited: | 28 Jul 2010 13:41 |
| Last Modified: | 28 Jul 2010 13:41 |
| URI: | https://repo.uum.edu.my/id/eprint/345 |