Computing Reviews
Search and optimization by metaheuristics: techniques and algorithms inspired by nature
Du K., Swamy M., Birkhäuser Basel, New York, NY, 2016. 434 pp. Type: Book (978-3-319-41191-0)
Date Reviewed: Jun 27 2017

Search and optimization algorithms form the backbone of most modern artificial intelligence (AI) techniques. Many of the algorithms that follow the traditional path of AI use search to find optimal (or near-optimal) solutions to a given problem. Machine learning algorithms, in particular deep learning [1], use optimization techniques to train complex models that can generalize well; indeed, much of the generalization power of deep learning models can be traced to the use of stochastic gradient descent [2]. There is, however, a class of problems to which gradient-based and other iterative methods (such as stochastic gradient descent) cannot be applied. In such cases, metaheuristics offer an alternative. A metaheuristic is a high-level strategy that guides the search and optimization process so as to efficiently find a near-optimal solution to the problem. Search and optimization by metaheuristics aims to be a guide to this area of research and contains details of hundreds of different variants of metaheuristic approaches, all of which are inspired by nature.
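To make the idea concrete, the following minimal sketch (my own illustration, not taken from the book) shows one classic nature-inspired metaheuristic, simulated annealing, applied to a simple one-dimensional function; the step size, cooling rate, and iteration count are arbitrary choices for the example.

# Illustrative sketch: simulated annealing minimizing a multimodal 1-D function.
import math
import random

def simulated_annealing(objective, x0, step=0.5, t0=1.0, cooling=0.995, iters=5000):
    """Return an approximate minimizer of `objective`, starting from x0."""
    current = best = x0
    f_current = f_best = objective(x0)
    t = t0
    for _ in range(iters):
        # Propose a random neighbor of the current solution.
        candidate = current + random.uniform(-step, step)
        f_candidate = objective(candidate)
        # Always accept improvements; accept worse moves with a
        # temperature-dependent probability so the search can escape local optima.
        if f_candidate < f_current or random.random() < math.exp((f_current - f_candidate) / t):
            current, f_current = candidate, f_candidate
            if f_current < f_best:
                best, f_best = current, f_current
        t *= cooling  # geometric cooling schedule
    return best, f_best

if __name__ == "__main__":
    f = lambda x: x * x + 10 * math.sin(x)
    print(simulated_annealing(f, x0=random.uniform(-10, 10)))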

The book is loosely organized into five sections, each of which focuses on a different flavor of metaheuristic: evolution-based approaches (six chapters), swarm intelligence approaches (seven chapters), sciences-based approaches (four chapters), human-based approaches (three chapters), and general optimization problems (two chapters). Each chapter covers a specific approach, such as genetic programming, ant colony optimization, or harmony search. It is worth saying at the outset that each chapter covers a lot of ground and is accompanied by a meticulous set of references that can be used as a springboard for further reading and research. It is also worth stating that each of the main algorithms is accompanied by example pseudocode; according to the introduction, MATLAB code is available on the book’s website. These two points do, however, come with two caveats. First, the pseudocode is often left undiscussed and uncommented, so it can at times be difficult to reconcile what is discussed in the main text with what is shown in the pseudocode. Second, I could find no address for the book’s website in the book, and an Internet search produced no results.

Although the book is well researched and well referenced, and contains a truly impressive breadth of information, there are a few problems that need highlighting, the first and most problematic of which is the lack of consistency between chapters. I think that this book could have benefited from some form of coherent framework within which each metaheuristic is explained. For example, in [3] four metaheuristic algorithms are compared on a common problem and are described in terms of their common structure (search landscape, initial solution, solution representation, and neighbor solution). If the authors had employed such a structure for each of the methods they describe, readers could quickly compare the similarities and differences between approaches and gain a better understanding of them both individually and as a whole.
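To illustrate the kind of coherent framework I have in mind, here is a minimal sketch (my own, not the authors') of an interface built around the four elements from [3]; the solution representation is whatever type a concrete subclass chooses, and the default search loop is just simple local search.

# Illustrative sketch: a common skeleton for describing metaheuristics.
from abc import ABC, abstractmethod
from typing import Any

class Metaheuristic(ABC):
    @abstractmethod
    def initial_solution(self) -> Any:
        """Produce a starting point in the search space."""

    @abstractmethod
    def neighbor(self, solution: Any) -> Any:
        """Generate a candidate solution near the given one."""

    @abstractmethod
    def evaluate(self, solution: Any) -> float:
        """Score a solution on the search landscape (lower is better)."""

    def search(self, iterations: int = 1000) -> Any:
        """Default local-search loop; concrete methods may override it."""
        best = self.initial_solution()
        best_score = self.evaluate(best)
        for _ in range(iterations):
            candidate = self.neighbor(best)
            score = self.evaluate(candidate)
            if score < best_score:
                best, best_score = candidate, score
        return best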

The second problem is that because the range of topics covered within each chapter is so broad, important information relating to the fundamentals is often skimmed over. For example, when discussing particle swarm optimization (PSO) in chapter 9, a little over two pages are given to explaining the basic PSO algorithm, with the remaining 14 pages given to variants and extensions. I think that if this book is to be used as an accessible introduction, as the authors state it is intended to be, then it needs to be read in conjunction with the appropriate references (of which there are many per topic) to fill in the gaps in explanation.
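For readers who want those fundamentals spelled out, the following is a bare-bones sketch of the basic global-best PSO update (my own illustration, not the book's MATLAB code); the parameter names w, c1, and c2 follow common convention, and the bounds and swarm size are arbitrary.

# Illustrative sketch: global-best particle swarm optimization.
import random

def pso(objective, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, bound=5.0):
    """Minimize `objective` over [-bound, bound]^dim with global-best PSO."""
    pos = [[random.uniform(-bound, bound) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                   # each particle's best position
    pbest_val = [objective(p) for p in pos]
    g = pbest_val.index(min(pbest_val))
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm's best position so far

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity update: inertia plus pulls toward personal and global bests.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

if __name__ == "__main__":
    # Example: minimize the sphere function in three dimensions.
    print(pso(lambda x: sum(v * v for v in x), dim=3))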

This book will find its main audience among postgraduate students and researchers in the field of search and optimization who are looking for a broad reference text from which to quickly obtain key ideas and references. I could also see it being used as a textbook to accompany a final-year undergraduate or postgraduate module; however, instructors would need to flesh out a lot of the concepts, and the exercises at the end of each chapter are only partially useful, as I could not find any corresponding solutions. Overall, this is an impressively detailed book that just misses the mark of being a one-stop shop for those interested in metaheuristics for search and optimization.

Reviewer: Harry Strange. Review #: CR145383 (1709-0596)
1) Goodfellow, I.; Bengio, Y.; Courville, A. Deep learning. MIT Press, Cambridge, MA, 2016.
2) Keskar, N. S.; Mudigere, D.; Nocedal, J.; Smelyanskiy, M.; Tang, P. T. P. On large-batch training for deep learning: generalization gap and sharp minima. In Proc. of the International Conference on Learning Representations (ICLR). ICLR, 2017, 1–16.
3) Azimi, Z. N. Comparison of metaheuristic algorithms for examination timetabling problem. Journal of Applied Mathematics and Computing 16, 1-2 (2004), 337–354.
Problem Solving, Control Methods, And Search (I.2.8); Sorting And Searching (F.2.2); Optimization (G.1.6)
Other reviews under "Problem Solving, Control Methods, And Search":
The use of a commercial microcomputer database management system as the basis for bibliographic information retrieval
Armstrong C. Journal of Information Science 8(5): 197-201, 1984. Type: Article
Jun 1 1985
Naive algorithm design techniques--a case study
Kant E., Newell A. (ed) Progress in artificial intelligence, Orsay, France, 1985. Type: Proceedings
Mar 1 1986
SOAR: an architecture for general intelligence
Laird J. (ed), Newell A., Rosenbloom P. Artificial Intelligence 33(1): 1-64, 1987. Type: Article
Aug 1 1988
