Computing Reviews
Deep neural networks classification over encrypted data
Hesamifard E., Takabi H., Ghasemi M. CODASPY 2019 (Proceedings of the Ninth ACM Conference on Data and Application Security and Privacy, Richardson, TX, Mar 25-27, 2019), 97-108. 2019. Type: Proceedings
Date Reviewed: Feb 13 2020

Convolutional neural networks (CNNs), as a more complex class of deep learning models, raise privacy-preserving issues that any study on the topic could address. In deep learning, CNNs are used to analyze complex cloud data, where privacy preservation is a major challenge for machine learning classification services. The authors have therefore developed a technique that provides both functionality and efficiency for privacy-preserving classification in machine learning as a service (MLaaS), where cloud services are used to analyze big data. Furthermore, deep learning models, and CNNs in particular, are more effective than traditional algorithms, achieving higher accuracy than conventional machine learning approaches in big data classification.

CNNs, as deep artificial neural networks (ANNs), are used in many cases to classify complex big data. However, it is hard to achieve an acceptable level of privacy preservation within CNNs that process sensitive data over the cloud. This issue makes the study all the more valuable, because it focuses squarely on privacy preservation with a well-defined model.

Clearly, “preserving the privacy of sensitive data in different machine learning algorithms” matters both in theory and in real-life deployment. However, when confronted with huge datasets located in the cloud, keeping data anonymous poses many challenges. After a brief introduction to deep learning and CNNs, the authors describe the model and training environment, focusing on a client-server structure in which the data remains encrypted throughout the communication process, thus keeping the input data invisible to the server. In this way, the proposed model preserves privacy during the classification process that produces the prediction outputs.

Literature on deep learning exists in many fields; however, research on training deep neural networks, and especially CNNs, over encrypted data is rare. Thus, this study deserves particular attention from readers interested in system privacy.

“Protect[ing] the privacy of the data during the learning process,” using an appropriate “encryption scheme to support the secure computation of the high-order back-propagation algorithm efficiently for deep computation model training on the cloud,” is only the starting point of the study. Because the operations in a neural network can in principle be carried out over encrypted data but are not practical with standard neuron functions, the authors propose complementary functions that operate efficiently over encrypted data while still ensuring privacy preservation.
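The usual reason standard neuron functions are impractical here is that homomorphic encryption schemes support only additions and multiplications on ciphertexts, so non-polynomial activations such as the sigmoid must be replaced by low-degree polynomial surrogates. The sketch below illustrates that idea in plain (unencrypted) Python; the degree-3 fit and the interval [-4, 4] are illustrative assumptions, not the exact choices made in the paper.

```python
import numpy as np

# A minimal sketch of an HE-friendly activation: fit a degree-3 polynomial
# to the sigmoid over [-4, 4] by least squares. The degree and interval are
# illustrative assumptions; the paper's own approximations may differ.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

xs = np.linspace(-4.0, 4.0, 401)
coeffs = np.polyfit(xs, sigmoid(xs), deg=3)  # highest-degree coefficient first
poly_sigmoid = np.poly1d(coeffs)

# The surrogate uses only * and +, so a homomorphic encryption scheme could
# evaluate it directly on ciphertexts; accuracy degrades outside the fitted
# interval, which is why the input range matters in practice.
max_err = float(np.max(np.abs(poly_sigmoid(xs) - sigmoid(xs))))
print(f"max approximation error on [-4, 4]: {max_err:.4f}")
```

The same replace-then-evaluate pattern applies to other non-polynomial operations (for example, max pooling is often swapped for average pooling, which is linear and therefore HE-compatible).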

The approach is presented very clearly, making the privacy-preserving classification process through CNNs accessible to many readers. It is definitely recommended reading for computer science (CS) students and for professionals in artificial intelligence (AI)-supported machine learning development.

Reviewer:  F. J. Ruzic Review #: CR146889 (2007-0165)
Neural Nets (C.1.3 ...)
Privacy (K.4.1 ...)
Security and Protection (C.2.0 ...)
Learning (I.2.6)
Other reviews under "Neural Nets": Date
Neural networks: an introduction
Müller B., Reinhardt J., Springer-Verlag New York, Inc., New York, NY, 1990. Type: Book (9780387523804)
May 1 1993
The computing neuron
Durbin R. (ed), Miall C., Mitchison G., Addison-Wesley Longman Publishing Co., Inc., Boston, MA, 1989. Type: Book (9780201183481)
May 1 1993
A practical guide to neural nets
McCord-Nelson M., Illingworth W., Addison-Wesley Longman Publishing Co., Inc., Boston, MA, 1991. Type: Book (9780201523768)
May 1 1993

Reproduction in whole or in part without permission is prohibited.   Copyright 1999-2024 ThinkLoud®