Research Article

Constructing Support Vector Machines with Reduced Classifier Complexity

Published on June 2015 by Ankur M. Bobade, N. N. Khalsa, S. M. Deshmukh
National Conference on Emerging Trends in Advanced Communication Technologies
Foundation of Computer Science USA
NCETACT2015 - Number 2

Ankur M. Bobade, N. N. Khalsa, S. M. Deshmukh. Constructing Support Vector Machines with Reduced Classifier Complexity. National Conference on Emerging Trends in Advanced Communication Technologies. NCETACT2015, 2 (June 2015), 1-5.

@article{bobade2015constructing,
author = { Ankur M. Bobade, N. N. Khalsa, S. M. Deshmukh },
title = { Constructing Support Vector Machines with Reduced Classifier Complexity },
journal = { National Conference on Emerging Trends in Advanced Communication Technologies },
issue_date = { June 2015 },
volume = { NCETACT2015 },
number = { 2 },
month = { June },
year = { 2015 },
issn = { 0975-8887 },
pages = { 1-5 },
numpages = { 5 },
url = { /proceedings/ncetact2015/number2/20984-2018/ },
publisher = { Foundation of Computer Science (FCS), NY, USA },
address = { New York, USA }
}
%0 Proceeding Article
%1 National Conference on Emerging Trends in Advanced Communication Technologies
%A Ankur M. Bobade
%A N. N. Khalsa
%A S. M. Deshmukh
%T Constructing Support Vector Machines with Reduced Classifier Complexity
%J National Conference on Emerging Trends in Advanced Communication Technologies
%@ 0975-8887
%V NCETACT2015
%N 2
%P 1-5
%D 2015
%I International Journal of Computer Applications
Abstract

Support vector machines (SVMs), though accurate, are not preferred in applications requiring high classification speed, because the number of support vectors is large. To overcome this problem we devise a primal method with the following properties: (1) it decouples the idea of basis functions from the concept of support vectors; (2) it greedily finds a set of kernel basis functions of a specified maximum size (dmax) to approximate the SVM primal cost function well; (3) it is efficient and roughly scales as O(n dmax^2), where n is the number of training examples; and (4) the number of basis functions it requires to achieve an accuracy close to the SVM accuracy is usually far less than the number of SVM support vectors.
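The greedy basis-selection idea in the abstract can be sketched roughly as follows. This is a minimal illustration, not the authors' algorithm: it substitutes a regularized least-squares surrogate for the SVM primal cost (which uses a hinge-type loss), and it naively re-solves the subproblem for every candidate at every step rather than using the incremental updates that would give the O(n dmax^2) scaling claimed above. The function names and parameters (`greedy_basis_selection`, `lam`, `gamma`) are assumptions for illustration.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between row sets A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def greedy_basis_selection(X, y, dmax, lam=1e-2, gamma=1.0):
    """Greedily pick up to dmax kernel basis centers from X to fit y.

    A sketch of the abstract's idea: basis functions are chosen
    independently of which points would be support vectors, growing
    the set one center at a time until dmax is reached or no
    candidate improves the (surrogate) primal cost.
    """
    n = X.shape[0]
    chosen = []          # indices of selected basis centers
    w_final = None       # weights for the selected basis functions
    residual_cost = np.inf
    for _ in range(dmax):
        best_j, best_cost, best_w = None, residual_cost, None
        for j in range(n):
            if j in chosen:
                continue
            idx = chosen + [j]
            K = rbf_kernel(X, X[idx], gamma)         # n x d basis matrix
            Kdd = rbf_kernel(X[idx], X[idx], gamma)  # d x d regularizer
            # solve min_w ||K w - y||^2 + lam * w^T Kdd w  (closed form)
            w = np.linalg.solve(K.T @ K + lam * Kdd, K.T @ y)
            cost = np.sum((K @ w - y) ** 2) + lam * w @ Kdd @ w
            if cost < best_cost:
                best_j, best_cost, best_w = j, cost, w
        if best_j is None:   # no candidate lowered the cost; stop early
            break
        chosen.append(best_j)
        residual_cost = best_cost
        w_final = best_w
    return chosen, w_final
```

Because the basis set is capped at dmax, the resulting classifier evaluates at most dmax kernel computations per test point, regardless of how many support vectors the exact SVM would have.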

References
  1. J. Adler, B. D. Rao, and K. Kreutz-Delgado. Comparison of basis selection methods,1996.
  2. F. Bach and M. Jordan. Predictive low-rank decomposition for kernel methods, 2005.
  3. K. P. Bennett, M. Momma, and M. J. Embrechts. MARK: A boosting algorithm for heterogeneous kernel models,2002.
  4. J. Bi, T. Zhang, and K. P. Bennet. Column generation boosting methods for mixture of kernels. 2004.
  5. C. J. C. Burges and B. Schölkopf. Improving the accuracy and speed of support vector learning machines, 1997.
  6. O. Chapelle. Training a support vector machine in the primal. Journal of Machine Learning Research, 2005.
  7. D. DeCoste and B. Schölkopf. Training invariant support vector machines. Machine Learning, 2002.
  8. T. Downs, K. E. Gates, and A. Masters. Exact simplification of support vector solutions, 2001.
  9. J. H. Friedman. Greedy function approximation: a gradient boosting machine. Annals of Statistics, 2001.
  10. T. Joachims. Making large-scale SVM learning practical. In Advances in Kernel Methods – Support Vector Learning. MIT Press, Cambridge, Massachussetts, 1999.
  11. S. S. Keerthi and W. Chu. A matching pursuit approach to sparse Gaussian process regression.
  12. S. S. Keerthi and D. DeCoste. A modified finite Newton method for fast solution of large scale linear SVMs, 2005.
Index Terms

Computer Science
Information Sciences

Keywords

Support Vectors (SVs), SVMs, Classification, Sparse Design.