Computing Reviews
Support vector machines for pattern classification (Advances in Pattern Recognition)
Abe S., Springer-Verlag New York, Inc., Secaucus, NJ, 2005. 343 pp. Type: Book (9781852339296)
Date Reviewed: Feb 28 2006

The use of support vector machines (SVMs) is a relatively new and very promising approach to classification, developed by Vapnik and his group at AT&T Bell Laboratories as an alternative training technique for polynomial, radial basis function, and multilayer perceptron classifiers. While traditional pattern recognition techniques attempt to optimize performance on the training set, that is, to minimize the empirical risk, SVMs minimize the structural risk: the probability of misclassifying yet-to-be-seen patterns drawn from a fixed but unknown probability distribution. This classification paradigm, which relies on the theory of uniform convergence in probability, proves to be equivalent to minimizing an upper bound on the generalization error.
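The bound alluded to here can be stated concretely. In Vapnik's standard formulation (a textbook statement, not quoted from the book under review), with probability at least 1 - \eta over a training set of n samples, a classifier drawn from a hypothesis class of VC dimension h satisfies:

```latex
R(\alpha) \;\le\; R_{\mathrm{emp}}(\alpha)
  \;+\; \sqrt{\frac{h\left(\ln\frac{2n}{h}+1\right) - \ln\frac{\eta}{4}}{n}}
```

Minimizing the right-hand side, rather than the empirical risk R_emp alone, is what distinguishes structural risk minimization from empirical risk minimization.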

Since structural risk minimization (SRM) is an inductive principle that seeks to minimize a bound on the generalization error of a model, rather than the mean square error (MSE) over the data set, training an SVM to obtain the maximum-margin classifier amounts to optimizing an objective function whose solution requires a large-scale quadratic programming problem with linear and box constraints. The most distinctive and attractive features of this classification paradigm are its ability to condense the information contained in the training set, and its use of families of decision surfaces of relatively low Vapnik-Chervonenkis (VC) dimension.
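To make the dual formulation concrete, the sketch below trains a linear soft-margin SVM by projected gradient ascent on the dual objective. This is an illustrative toy solver, not one of the efficient decomposition methods the book covers; absorbing the bias into an appended constant feature removes the linear equality constraint, leaving only the box constraint 0 <= alpha_i <= C:

```python
import numpy as np

def train_svm_dual(X, y, C=1.0, lr=0.01, iters=2000):
    """Linear soft-margin SVM via projected gradient ascent on the dual.

    The bias is absorbed into the weight vector by appending a constant
    feature, which eliminates the equality constraint sum(alpha_i * y_i) = 0
    and leaves only the box constraint 0 <= alpha_i <= C.
    """
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])   # embed the bias term
    A = y[:, None] * Xb                             # rows: y_i * x_i
    Q = A @ A.T                                     # Q_ij = y_i y_j x_i . x_j
    alpha = np.zeros(len(y))
    for _ in range(iters):
        grad = 1.0 - Q @ alpha                      # gradient of the dual objective
        alpha = np.clip(alpha + lr * grad, 0.0, C)  # ascent step + box projection
    w = (alpha[:, None] * A).sum(axis=0)            # recover primal weights
    return w, alpha

# Linearly separable toy problem
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, alpha = train_svm_dual(X, y)
preds = np.sign(np.hstack([X, np.ones((4, 1))]) @ w)
```

Note that embedding the bias mildly regularizes the bias term as well, a common simplification; nonzero entries of `alpha` identify the support vectors, illustrating how the solution condenses the training set.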

This book is an excellent and comprehensive monograph on SVMs, supplying, in its 11 chapters, an extensive analysis of various useful two-class and multiclass classification techniques, as well as function approximation topics. Following a brief introduction, the first three chapters present two-class and multiclass SVMs in detail. A series of SVM variants--for instance, the least squares SVM, linear programming SVM, robust SVM, Bayesian SVM, and committee machines--is analyzed in the fourth chapter. The next chapters focus on the preselection of support vector candidates, training methods that use decomposition techniques to speed up training, and feature selection based on SVMs. The major drawbacks of three-layer neural networks as universal approximators are that the training process is, in general, quite slow, and that their generalization ability depends on the initial synaptic weights. The ninth chapter proposes several strategies for maximizing the margins of a three-layer neural network classifier trained layer by layer, and compares their recognition performance with that of SVMs. Next, the book presents the architecture of a kernel version of a fuzzy classifier with ellipsoidal regions, an improvement of generalization ability through transductive training on unlabeled data, and the architecture of a fuzzy classifier with polyhedral regions together with an efficient rule-generation method. The final chapter is devoted to extensions of various SVMs to function approximation.

This book provides practitioners with a large toolkit of new algorithms, kernels, and solutions that are ready to implement and suitable for a broad class of pattern recognition tasks. It offers students and researchers an accessible, yet mathematically rigorous, presentation of the rapidly expanding field of SVMs. The numerous examples and computer experiments are welcome and helpful for understanding this class of approaches to pattern classification and recognition. In my opinion, the book will prove invaluable for a wide audience working in pattern recognition, machine learning, and neural networks.

Reviewer:  L. State Review #: CR132501 (0701-0048)
Classifier Design And Evaluation (I.5.2 ...)
Heuristic Methods (I.2.8 ...)
Pattern Analysis (I.5.2 ...)
Design Methodology (I.5.2)
Problem Solving, Control Methods, And Search (I.2.8)
Other reviews under "Classifier Design And Evaluation":
Linear discrimination with symmetrical models
Bobrowski L. Pattern Recognition 19(1): 101-109, 1986. Type: Article
Feb 1 1988
An application of a graph distance measure to the classification of muscle tissue patterns
Sanfeliu A. (ed), Fu K., Prewitt J. International Journal of Pattern Recognition and Artificial Intelligence 1(1): 17-42, 1987. Type: Article
Dec 1 1989
Selective networks and recognition automata
George N. J., Edelman G.  Computer culture: the scientific, intellectual, and social impact of the computer, New York, 1984. Type: Proceedings
May 1 1987
