Computing Reviews
Deep learning illustrated
Krohn J., Beyleveld G., Bassens A., Addison-Wesley, Boston, MA, 2020. 416 pp. Type: Book (978-1-351166-94-2)
Date Reviewed: Mar 10 2021

Deep learning is an emerging topic. This book aims to cover deep learning and its applications through programming examples and illustrations. It is organized into four parts comprising 14 chapters.

The first part (four chapters) introduces the fundamental concepts of deep learning. The first chapter compares machine vision with biological vision; this resemblance is a key factor in understanding how consecutive layers in a deep neural network (DNN) can be useful for machine vision. Chapter 2 draws an analogy between natural language processing (NLP) using DNNs and human language understanding, and describes how words can be converted into numbers using vectors. Chapter 3 focuses on machine vision, for example, image processing and n-dimensional spaces for image arithmetic, whereas chapter 4 elaborates on fundamental machine learning concepts, including deep learning, artificial neural networks (ANNs), and NLP.
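The conversion of words into vectors that chapter 2 describes can be illustrated with a minimal sketch (not the book's code; the vocabulary and function names here are hypothetical). One-hot encoding is the simplest such representation and the usual starting point before learned embeddings:

```python
# Hypothetical sketch (not from the book): representing words as vectors
# via one-hot encoding over a tiny vocabulary.
vocab = ["deep", "learning", "is", "fun"]
word_to_index = {word: i for i, word in enumerate(vocab)}

def one_hot(word):
    """Return a one-hot vector for `word` over the vocabulary."""
    vec = [0] * len(vocab)
    vec[word_to_index[word]] = 1
    return vec

print(one_hot("learning"))  # [0, 1, 0, 0]
```

Dense word embeddings such as word2vec (covered later in the book) replace these sparse vectors with short, learned real-valued vectors that capture word similarity.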

Part 2 explains the theoretical foundations. These include shallow networks in chapter 5 and the basic components of an ANN, such as the perceptron and activation functions, in chapter 6. Chapter 7 reinforces DNN concepts such as dense layers, forward propagation, and softmax. Chapter 8 describes concepts related to DNN training, including cost functions, batch size, and gradient descent, and presents an implementation using the Keras library. Chapter 9, the last chapter of this part, explains various topics such as initializing weights, analyzing cost, and dealing with the vanishing gradient problem, with implementation details using TensorFlow and Keras.
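The core ideas of chapters 7 and 8, forward propagation through a dense layer followed by a softmax output, can be sketched in a few lines of NumPy. This is a minimal illustration, not the book's Keras code; the layer sizes and variable names are assumptions:

```python
import numpy as np

# Hypothetical sketch (not the book's code): forward propagation through
# one dense layer followed by a softmax over three output classes.
def dense(x, W, b):
    # Affine transformation: each output unit is a weighted sum plus a bias.
    return x @ W + b

def softmax(z):
    # Subtract the max before exponentiating for numerical stability.
    e = np.exp(z - np.max(z))
    return e / e.sum()

rng = np.random.default_rng(0)
x = rng.standard_normal(4)          # a 4-dimensional input
W = rng.standard_normal((4, 3))     # weights: 4 inputs -> 3 classes
b = np.zeros(3)                     # biases initialized to zero

probs = softmax(dense(x, W, b))
print(probs)  # three class probabilities summing to 1
```

Training then amounts to measuring a cost on these probabilities and nudging `W` and `b` in the opposite direction of the cost's gradient, the gradient descent procedure chapter 8 describes.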

Part 3 focuses on applications of deep learning. Chapter 10 explains machine vision concepts: convolutional neural networks (CNNs), VGGNet, and applications using Fast R-CNN, Faster R-CNN, and YOLO. Chapter 11 elaborates on NLP techniques, explaining data pre-processing, vectorization and word embeddings, n-grams, and word2vec; code-based examples are included for enhanced learning. The chapter also discusses various types of DNNs, such as recurrent neural networks (RNNs), long short-term memory (LSTM), and sequence- and attention-based models. Chapter 12 explains generative adversarial networks (GANs), which are useful for generating data. Chapter 13 explains reinforcement learning. Code-based examples supporting implementation are a strong focus of this part.
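Among the pre-processing steps chapter 11 covers, n-gram extraction is simple enough to sketch directly. This is a hypothetical illustration (not taken from the book); the function name and sentence are assumptions:

```python
# Hypothetical sketch (not from the book): extracting n-grams from a
# tokenized sentence, a common NLP pre-processing step.
def ngrams(tokens, n):
    """Return the list of n-grams (as tuples) from a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

tokens = "natural language processing with deep learning".split()
print(ngrams(tokens, 2))
# [('natural', 'language'), ('language', 'processing'),
#  ('processing', 'with'), ('with', 'deep'), ('deep', 'learning')]
```

Counting such n-grams over a corpus yields features that, like word2vec vectors, can feed the RNN and attention-based models the chapter goes on to discuss.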

Part 4 contains chapter 14, which discusses topics related to developing DNN-based projects. It is a brief chapter with little explanation or discussion.

The book is easy to read. It is useful for readers and learners seeking an intermediate-level treatment of deep learning. The book's strengths include its discussions of various applications of deep learning and its use of examples and programming code.

The writing style mixes reader-centric and author-centric approaches. Overall, this is a nice effort that is easy to comprehend, with useful illustrations.

Reviewer: Jawwad Shamsi | Review #: CR147209 (2107-0172)
Category: Learning (I.2.6)
 
