Unsupervised optimization of support vector machine parameters

Mary Cassabaum, Donald Waagen, Jeffrey J. Rodríguez, Harry Schmitt

Research output: Contribution to journal › Conference article › peer-review

3 Scopus citations

Abstract

Selection of the kernel parameters is critical to the performance of Support Vector Machines (SVMs), directly impacting the generalization and classification efficacy of the SVM. Given the intractability of exhaustive search, an automated procedure for parameter selection is clearly desirable. The authors' previous work in this area analyzed the SVM training-data margin distributions for a Gaussian kernel to guide the kernel parameter selection process. That approach required several iterations of SVM training to minimize the number of support vectors. Our continued investigation of unsupervised kernel parameter selection has led to a scheme that selects the parameters before training occurs: statistical methods are applied to the Gram matrix to optimize the kernel in an unsupervised fashion. This preprocessing framework removes the requirement for iterative SVM training. Empirical results are presented for the "toy" checkerboard and quadboard problems.
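The abstract does not specify which Gram-matrix statistic the authors use, so the following is only an illustrative sketch of the general idea: compute the Gaussian-kernel Gram matrix for candidate bandwidths and score each candidate with a simple dispersion statistic (here, the variance of the off-diagonal entries), all before any SVM training. The scoring rule is an assumption for illustration, not the paper's method.

```python
import numpy as np

def gaussian_gram(X, sigma):
    """Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 * sigma^2))."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def select_sigma(X, candidates):
    """Pick the candidate sigma whose off-diagonal Gram entries have the
    largest variance (an assumed, illustrative criterion): a degenerate
    sigma drives every entry toward 0 or 1, collapsing the variance,
    while a useful sigma spreads the kernel values out."""
    best_sigma, best_score = None, -np.inf
    for sigma in candidates:
        K = gaussian_gram(X, sigma)
        off_diag = K[~np.eye(len(X), dtype=bool)]
        score = off_diag.var()
        if score > best_score:
            best_sigma, best_score = sigma, score
    return best_sigma
```

For checkerboard-style data in the unit square, a tiny or huge bandwidth makes the Gram matrix nearly constant, so a mid-range sigma wins under this statistic; the selected sigma would then be passed to ordinary SVM training once.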

Original language: English (US)
Pages (from-to): 316-325
Number of pages: 10
Journal: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 5426
DOIs
State: Published - 2004
Event: Automatic Target Recognition XIV - Orlando, FL, United States
Duration: Apr 13, 2004 - Apr 15, 2004

Keywords

  • Classification
  • Parameter optimization
  • Support vector machines

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Condensed Matter Physics
  • Computer Science Applications
  • Applied Mathematics
  • Electrical and Electronic Engineering

