Computing Reviews
Markov chains and stochastic stability (2nd ed.)
Meyn S., Tweedie R., Cambridge University Press, New York, NY, 2009. 622 pp. Type: Book (9780521731829)
Date Reviewed: Sep 14 2009

First published in 1993, this book is a classic on Markov chains. As Sean Meyn notes in the preface, this second edition is also a way to honor the second author, Richard Tweedie, who passed away in 2001 after leaving significant contributions to the field. The new edition contains updated material that follows the developments in the field of Markov chains since the original edition; these developments mainly concern computational aspects, such as the construction of efficient optimization and simulation algorithms.

The book is regarded as a classic because it presents, in a comprehensive and holistic manner, both the theoretical concepts of Markov chain stability and their practical importance in various applications. The elegance and depth of the mathematical treatment of all the topics covered make the book an invaluable tool for studying the laws of Markov chain behavior and their applications. It is a mathematically advanced book that focuses on the mathematics used for the representation and study of Markov chains, revealing their beauty. Despite the mathematical formality, researchers from various disciplines and practitioners interested in applications will appreciate the intuitive way in which the various concepts are presented and the way the theory is connected to applications, through numerous examples. Overall, the book distills the authors’ research expertise and presents the philosophy of Markov chains through diachronic ideas.

The well-organized material successfully merges a wide variety of approaches from the literature, a wise selection of application domains for explaining the fundamental ideas, a collection and classification of Markov models, and, finally, a presentation of stability analysis as the “cord binding together the theory and the application of Markov chains.”

The theory concerns Markov chains on general state spaces. Potential readers include theoreticians and practitioners from various disciplines, including engineers, statisticians, operations researchers, and computer scientists. Computer scientists, in particular, will be interested in the results related to the construction of simulation algorithms.

The material is organized into 20 chapters, grouped into three parts that study stability from three complementary perspectives. Part 1, “Communication and Regeneration,” contains chapters 1 to 7 and is devoted to the study of the key concept of irreducibility. The fundamental concepts of Markov models are first presented through various examples of random processes from engineering, queueing systems, financial models, storage systems, biology, simulation, human endeavor, and telecommunications. Stability is discussed intuitively, as a practical notion, in all of these applications. The authors then discuss the basic assumptions of a Markov model, emphasizing that Markovian representations can be used for non-Markovian processes, such as state space models and regenerative models. Simple formulas introduce several Markov models (from time series, control theory, and queueing systems), together with their deterministic analogues; this is very useful for making numerical calculations and experimenting with graphs of the models. The concept of stability is also initially presented heuristically, identifying three levels that are then studied in detail: irreducibility, recurrence, and ergodic behavior. After this heuristic presentation, Markov chains are defined from a more formal perspective, using the transition probabilities that govern the evolution of the chain. Irreducibility is then studied in great detail, due to its relation to stability. One of the most important highlights of Part 1 is the introduction of topological considerations for identifying conditions on open and compact sets related to irreducibility and stability.
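
To give a flavor of these simple models and their deterministic analogues, here is a minimal Python sketch (not taken from the book; the coefficient, noise level, and function name are illustrative assumptions). It simulates the scalar linear model X_{n+1} = a*X_n + W_{n+1} next to its noise-free counterpart x_{n+1} = a*x_n:

import numpy as np

def simulate_linear_model(a=0.8, x0=5.0, n_steps=200, noise_std=1.0, seed=0):
    """Simulate the stochastic chain and its deterministic (noise-free) analogue."""
    rng = np.random.default_rng(seed)
    x_stoch = np.empty(n_steps + 1)
    x_det = np.empty(n_steps + 1)
    x_stoch[0] = x_det[0] = x0
    for n in range(n_steps):
        x_stoch[n + 1] = a * x_stoch[n] + noise_std * rng.standard_normal()
        x_det[n + 1] = a * x_det[n]  # deterministic analogue: same recursion, noise removed
    return x_stoch, x_det

stoch, det = simulate_linear_model()
# With |a| < 1 the deterministic path decays to zero, while the stochastic path
# fluctuates around it -- the heuristic picture of stability described above.
print(stoch[-5:], det[-5:])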

Part 2, “Stability Structures,” includes chapters 8 to 12 and is devoted to stability in terms of recurrence, that is, a return to the center of the space. Recurrent chains with stable properties are studied, and stability is linked to invariant measures and stationary probabilities that represent the stable situation of interest.
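
A minimal finite-state sketch (the three-state transition matrix is a hypothetical example, not taken from the book) shows what an invariant probability vector looks like in practice: one solves pi*P = pi together with the normalization sum(pi) = 1.

import numpy as np

P = np.array([[0.5, 0.4, 0.1],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])  # hypothetical ergodic transition matrix

# Stack the stationarity equations pi (P - I) = 0 with the constraint sum(pi) = 1.
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print("invariant distribution:", pi)
print("stationarity check (pi P == pi):", np.allclose(pi @ P, pi))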

Part 3, “Convergence,” comprises chapters 13 to 20 and deals with the convergence of a chain to a stationary regime. The convergence of the transition probabilities to a stationary probability measure is studied, and ergodic chains are defined and studied. The rate of convergence plays a key role in Part 3; results on geometric and uniform ergodicity are provided. Several limit theorems are given for recurrent chains. Part 3 concludes with a very interesting chapter that discusses additional topics on Markov chains, including spectral theory, simulation, and continuous-time models.
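
The convergence results have a direct computational counterpart. The following sketch (reusing the hypothetical three-state chain from the previous sketch) watches the n-step transition probabilities P^n(x, .) approach the invariant distribution pi, with the total variation distance shrinking by roughly a constant factor per step, the signature of geometric ergodicity:

import numpy as np

P = np.array([[0.5, 0.4, 0.1],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])  # hypothetical ergodic transition matrix

# Invariant distribution, computed as in the previous sketch.
A = np.vstack([P.T - np.eye(3), np.ones(3)])
pi, *_ = np.linalg.lstsq(A, np.array([0.0, 0.0, 0.0, 1.0]), rcond=None)

Pn = np.eye(3)
for n in range(1, 11):
    Pn = Pn @ P
    tv = 0.5 * np.abs(Pn[0] - pi).sum()  # total variation distance from state 0 after n steps
    print(f"n={n:2d}  ||P^n(0, .) - pi||_TV = {tv:.3e}")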

The appendices at the end of the book are very useful, as they assemble the definitions, the assumptions, and the results of all the chapters and provide the most important prerequisite mathematical concepts.

In conclusion, this is an advanced book that requires a serious mathematical background. It is recommended as essential reading for all scientists working with Markov chains, especially those concerned with their stability, whether theoretically or practically.

Reviewer: Lefteris Angelis | Review #: CR137297 (1008-0770)
Markov Processes (G.3 ...) | Monte Carlo (I.6.8 ...) | Stochastic Analysis (D.4.8 ...)
