Computing Reviews
Comparative study of computational algorithms for the Lasso with high-dimensional, highly correlated data
Kim B., Yu D., Won J.  Applied Intelligence 48 (8): 1933-1952, 2018. Type: Article
Date Reviewed: Oct 24 2018

High-dimensional data analysis often boils down to variable selection in statistical regression: choosing the variables that yield the most stable results across a wide variety of situations. As the number of variables increases, so does the number of possible models, and the interactions among the variables become harder to predict. The least absolute shrinkage and selection operator (LASSO) addresses this by shrinking the contributions of many variables to negligible levels, effectively excluding them from the solution, which makes it particularly useful in such settings.
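The shrinkage just described can be illustrated with the soft-thresholding operator, the basic building block of coordinate-wise LASSO updates. The following is a minimal sketch, not code from the paper under review:

```python
import numpy as np

def soft_threshold(z, lam):
    """Proximal operator of the L1 penalty: shrinks z toward zero by lam."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

# Entries smaller in magnitude than the threshold are zeroed out entirely;
# the remaining entries are shrunk toward zero by the threshold amount.
print(soft_threshold(np.array([3.0, -0.5, 1.2]), 1.0))
```

Zeroing out small coefficients in this way is what gives the LASSO its variable-selection behavior.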

This paper is a comparative study of the strengths and weaknesses of algorithms for high-dimensional data analysis. It analyzes five algorithms that implement the LASSO: coordinate descent (CD), CD with active shooting, majorization-minimization using local quadratic approximation (MM-LQA), the alternating direction method of multipliers (ADMM), and the fast iterative shrinkage-thresholding algorithm (FISTA). The paper highlights the factors affecting each algorithm's performance and its convergence toward stable results. It begins by describing the algorithms, both mathematically and in pseudocode. It then presents a method, developed by the authors, to compare their sensitivity; this method measures each algorithm's performance in terms of the number of iterations required to converge to a stable solution, the corresponding computation time, and the value of the objective function at convergence. The results follow, with detailed tables and plots tracing the convergence of each algorithm for different parameter settings and numbers of iterations; no clear winner emerges, as each algorithm performs best in different situations. The same methodology is then applied to a real-world scenario (cancer biomarker discovery), with comparable results presented with the same rigor. A final table summarizes the strengths and weaknesses of each algorithm.
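To make the comparison concrete, the simplest of the five algorithms, plain cyclic coordinate descent, can be sketched as follows. This is an illustrative implementation under assumed conventions (objective (1/2n)||y - Xb||^2 + lam*||b||_1, a fixed number of sweeps rather than a convergence test), not the paper's exact setup:

```python
import numpy as np

def soft_threshold(z, lam):
    """Proximal operator of the L1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def lasso_cd(X, y, lam, n_sweeps=200):
    """Cyclic coordinate descent for min_b (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    r = y - X @ b                       # running residual
    col_sq = (X ** 2).sum(axis=0) / n   # per-coordinate curvature
    for _ in range(n_sweeps):           # fixed sweep count (assumption)
        for j in range(p):
            # partial correlation of feature j with the residual,
            # adding back feature j's current contribution
            rho = X[:, j] @ r / n + col_sq[j] * b[j]
            b_new = soft_threshold(rho, lam) / col_sq[j]
            r += X[:, j] * (b[j] - b_new)  # keep the residual current
            b[j] = b_new
    return b
```

Active shooting, MM-LQA, ADMM, and FISTA all minimize the same objective but organize the updates differently, which is why the paper's iteration-count and timing comparisons can come out differently for different data shapes.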

The paper is objective: as a comparative study, it does not need to prove the merits of one algorithm over the others, and this objectivity constitutes its strength.

Reviewer:  Andrea Paramithiotti Review #: CR146294 (1902-0041)
Categories: Algorithm Design And Analysis (G.4); Optimization (B.1.4)
