Computing Reviews
An analytical framework for particle and volume data of large-scale combustion simulations
Sauer F., Yu H., Ma K. UltraVis 2013 (Proceedings of the 8th International Workshop on Ultrascale Visualization, Denver, CO, Nov 17-21, 2013), 1-8, 2013. Type: Proceedings
Date Reviewed: Jan 24 2014

The study of computational fluid dynamics applied to numerical combustion modeling is complex and requires advanced hardware and software resources as well as both numerical and nonnumerical research tools. This paper offers an interesting approach: an integrated framework for parallel data analysis and visualization that combines Lagrangian particle data with Eulerian field (volume) data. The approach is designed for complex application areas such as combustion modeling. The associated tools let users focus their studies through query-based examination, selecting regions of the volume based on particle properties; queries can also be run in the reverse direction, selecting particles based on volumetric properties.
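To make the cross-referencing of the two data classes concrete, here is a minimal Python/NumPy sketch of such a two-way query. It is an illustration under my own assumptions (a uniform grid, synthetic data, and invented array names and thresholds), not the authors' implementation: it first selects voxels by a field condition and returns the particles they contain, then runs the reverse query from particle attributes back to voxels.

import numpy as np

rng = np.random.default_rng(0)
field = rng.random((64, 64, 64))                  # Eulerian field data, one value per voxel
particles_pos = rng.random((10_000, 3)) * 64.0    # Lagrangian particle positions in grid units
particles_attr = rng.random(10_000)               # e.g., a per-particle scalar such as mass fraction

# Volume-to-particle query: select voxels satisfying a field condition,
# then keep the particles that fall inside those voxels.
voxel_mask = field > 0.9
ijk = particles_pos.astype(int)                   # voxel index containing each particle
particles_in_region = voxel_mask[ijk[:, 0], ijk[:, 1], ijk[:, 2]]

# Particle-to-volume query (the reverse direction): select particles by an
# attribute, then flag the voxels that contain at least one selected particle.
hot = particles_attr > 0.95
hit_voxels = np.zeros(field.shape, dtype=bool)
hit_voxels[ijk[hot, 0], ijk[hot, 1], ijk[hot, 2]] = True

print(particles_in_region.sum(), "particles matched;", hit_voxels.sum(), "voxels flagged")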

The paper consists of seven sections: an introduction, related work, the approach (which also provides an overview of the paper), results, conclusions and future work, acknowledgments, and references. The key sections are, of course, the approach and the results.

A key step in this approach is representing the field data as a set of voxels and assigning a topological classification to each voxel. This classification enables the construction of segment features, which in turn can be tracked as the flow field evolves. The authors use a workflow that combines the two classes of data, iterating over the field data until it is consistent with the particle data. Through a series of figures and descriptions, they illustrate segment analysis and feature tracking, which let the user explore segments of the combustion simulation as it evolves.
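The authors' topological classification and tracking scheme is more involved than this, but a generic connected-component stand-in conveys the flavor of segment extraction and overlap-based tracking. The following Python sketch uses scipy.ndimage on synthetic data; the thresholds, names, and the simple maximum-overlap matching rule are my assumptions, not the paper's algorithm.

import numpy as np
from scipy import ndimage

def segment(field, threshold):
    """Label connected regions of voxels whose value exceeds a threshold."""
    labels, count = ndimage.label(field > threshold)
    return labels, count

def track(labels_t0, labels_t1):
    """Match segments between two time steps by maximum voxel overlap."""
    matches = {}
    for seg_id in range(1, labels_t0.max() + 1):
        overlap = labels_t1[labels_t0 == seg_id]   # labels at t1 under segment seg_id
        overlap = overlap[overlap > 0]
        matches[seg_id] = np.bincount(overlap).argmax() if overlap.size else None
    return matches

rng = np.random.default_rng(1)
f0 = ndimage.gaussian_filter(rng.random((32, 32, 32)), sigma=2)
f1 = np.roll(f0, shift=1, axis=0)                  # pretend the features advected by one voxel
l0, _ = segment(f0, f0.mean() + f0.std())
l1, _ = segment(f1, f1.mean() + f1.std())
print(track(l0, l1))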

The paper also addresses the key performance aspects of the computation: region growing and particle extraction. The test data for the performance study focuses on a typical feature extraction with a volumetric size of roughly 10,000 voxels, a size recommended by combustion scientists at Sandia National Laboratories. The study was conducted on Hopper, a 6,384-node Cray XE6 system at the National Energy Research Scientific Computing Center (NERSC) in Berkeley. Performance graphs illustrate the parallel performance of the test problem. For the two key steps, particle extraction and region growing, the analysis shows that the optimal processor count for this data is around 128.
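For readers unfamiliar with how such performance graphs are derived, the small Python sketch below computes speedup and parallel efficiency from per-step wall-clock timings. The timing values are illustrative placeholders of my own, not measurements from the paper, chosen only to show efficiency flattening beyond roughly 128 processes.

def strong_scaling(timings):
    """Compute speedup and parallel efficiency relative to the smallest run."""
    base_procs = min(timings)
    base_time = timings[base_procs]
    return {
        p: {
            "speedup": base_time / t,
            "efficiency": (base_time / t) * (base_procs / p),
        }
        for p, t in timings.items()
    }

# Hypothetical wall-clock seconds for one step (e.g., region growing) vs. process count.
example = {16: 40.0, 32: 21.0, 64: 11.5, 128: 7.0, 256: 6.5}
for procs, stats in strong_scaling(example).items():
    print(f"{procs:4d} procs: speedup {stats['speedup']:.1f}, efficiency {stats['efficiency']:.2f}")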

This is a well-done paper that represents a significant contribution to an important research area.

Reviewer: Mike Minkoff | Review #: CR141931 (1404-0285)
Categories: Miscellaneous (E.m); Physics (J.2 ...)