Abstract
Selection of kernel parameters is critical to the performance of Support Vector Machines (SVMs), directly impacting their generalization and classification efficacy. Because exhaustive search over the parameter space is intractable, an automated selection procedure is clearly desirable. The authors' previous work in this area analyzed the margin distributions of the SVM training data for a Gaussian kernel to guide kernel parameter selection; that approach required several iterations of SVM training in order to minimize the number of support vectors. Our continued investigation of unsupervised kernel parameter selection has led to a scheme that selects the parameters before any training occurs: statistical methods are applied to the Gram matrix to optimize the kernel in an unsupervised fashion. This preprocessing framework removes the requirement for iterative SVM training. Empirical results are presented for the "toy" checkerboard and quadboard problems.
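The abstract does not specify which statistics of the Gram matrix are used, so the following is only a minimal sketch of the general idea: score candidate Gaussian-kernel widths by a statistic computed directly on the Gram matrix, with no SVM training in the loop. The particular criterion used here (variance of the off-diagonal kernel values, which penalizes both a near-identity matrix from an overly narrow kernel and a near-all-ones matrix from an overly wide one) is an illustrative assumption, not the authors' published method; the function and parameter names are likewise hypothetical.

```python
import numpy as np

def rbf_gram(X, gamma):
    """Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2) for a Gaussian (RBF) kernel."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    # Clip tiny negative values caused by floating-point round-off.
    return np.exp(-gamma * np.clip(d2, 0.0, None))

def select_gamma(X, gammas):
    """Pick the candidate gamma whose Gram matrix has the largest variance
    among off-diagonal entries.  A very large gamma drives K toward the
    identity and a very small gamma toward the all-ones matrix; in both
    extremes the off-diagonal variance collapses, so an intermediate,
    more informative kernel width is preferred -- all without training
    a single SVM."""
    n = X.shape[0]
    off_diag = ~np.eye(n, dtype=bool)
    best_gamma, best_score = None, -np.inf
    for g in gammas:
        score = rbf_gram(X, g)[off_diag].var()
        if score > best_score:
            best_gamma, best_score = g, score
    return best_gamma
```

Because the score is a pure function of the unlabeled inputs, this kind of criterion can run as a preprocessing step, consistent with the paper's goal of avoiding iterative SVM retraining.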
Original language | English (US)
---|---
Pages (from-to) | 316-325
Number of pages | 10
Journal | Proceedings of SPIE - The International Society for Optical Engineering
Volume | 5426
State | Published - 2004
Event | Automatic Target Recognition XIV, Orlando, FL, United States (Apr 13 2004 → Apr 15 2004)
Keywords
- Classification
- Parameter optimization
- Support vector machines
ASJC Scopus subject areas
- Electronic, Optical and Magnetic Materials
- Condensed Matter Physics
- Computer Science Applications
- Applied Mathematics
- Electrical and Electronic Engineering