TY - GEN
T1 - Scalable Probabilistic Modeling and Machine Learning with Dimensionality Reduction for Expensive High-Dimensional Problems
AU - Luan, Lele
AU - Ramachandra, Nesar
AU - Ravi, Sandipp Krishnan
AU - Bhaduri, Anindya
AU - Pandita, Piyush
AU - Balaprakash, Prasanna
AU - Anitescu, Mihai
AU - Sun, Changjie
AU - Wang, Liping
N1 - Publisher Copyright:
Copyright © 2023 by The United States Government.
PY - 2023
Y1 - 2023
N2 - Modern computational methods involving highly sophisticated mathematical formulations enable several tasks like modeling complex physical phenomena, predicting key properties, and optimizing design. The higher fidelity of these computer models makes it computationally intensive to query them hundreds of times for optimization. One usually relies on a simplified model, albeit at the cost of losing predictive accuracy and precision. To address this, data-driven surrogate modeling methods have shown much promise in emulating the behavior of expensive computer models. However, a major bottleneck in such methods is the inability to deal with high input dimensionality and the need for relatively large datasets. In certain cases, the high dimensionality of the input space can be attributed to its image-like characteristics, for example, the stress and displacement fields of continua. In such problems, the input and the output quantities of interest are high-dimensional tensors. Commonly used surrogate modeling methods for such problems require many computational evaluations, which precludes one from performing other numerical tasks like uncertainty quantification and statistical analysis. This work proposes an end-to-end approach that maps a high-dimensional image-like input to an output of high dimensionality or its key statistics. Our approach uses two main frameworks that perform three steps: a) reduce the input and output from a high-dimensional space to a low-dimensional space, b) model the input-output relationship in the low-dimensional space, and c) enable the incorporation of domain-specific physical constraints as masks. To reduce input dimensionality, we leverage principal component analysis, coupled with two surrogate modeling methods: a) Bayesian hybrid modeling and b) DeepHyper’s deep neural networks. We demonstrate the approach’s applicability to a linear elastic stress field data problem. We perform numerical studies to examine the effect of the two end-to-end workflows and the effect of data size. Key insights and conclusions are provided, which can aid such efforts in surrogate modeling and engineering optimization.
AB - Modern computational methods involving highly sophisticated mathematical formulations enable several tasks like modeling complex physical phenomena, predicting key properties, and optimizing design. The higher fidelity of these computer models makes it computationally intensive to query them hundreds of times for optimization. One usually relies on a simplified model, albeit at the cost of losing predictive accuracy and precision. To address this, data-driven surrogate modeling methods have shown much promise in emulating the behavior of expensive computer models. However, a major bottleneck in such methods is the inability to deal with high input dimensionality and the need for relatively large datasets. In certain cases, the high dimensionality of the input space can be attributed to its image-like characteristics, for example, the stress and displacement fields of continua. In such problems, the input and the output quantities of interest are high-dimensional tensors. Commonly used surrogate modeling methods for such problems require many computational evaluations, which precludes one from performing other numerical tasks like uncertainty quantification and statistical analysis. This work proposes an end-to-end approach that maps a high-dimensional image-like input to an output of high dimensionality or its key statistics. Our approach uses two main frameworks that perform three steps: a) reduce the input and output from a high-dimensional space to a low-dimensional space, b) model the input-output relationship in the low-dimensional space, and c) enable the incorporation of domain-specific physical constraints as masks. To reduce input dimensionality, we leverage principal component analysis, coupled with two surrogate modeling methods: a) Bayesian hybrid modeling and b) DeepHyper’s deep neural networks. We demonstrate the approach’s applicability to a linear elastic stress field data problem. We perform numerical studies to examine the effect of the two end-to-end workflows and the effect of data size. Key insights and conclusions are provided, which can aid such efforts in surrogate modeling and engineering optimization.
KW - Bayesian hybrid modeling
KW - Deep neural networks
KW - Dimension reduction
KW - Image-based models
KW - Surrogate modeling
UR - http://www.scopus.com/inward/record.url?scp=85178512157&partnerID=8YFLogxK
U2 - 10.1115/DETC2023110704
DO - 10.1115/DETC2023110704
M3 - Conference contribution
AN - SCOPUS:85178512157
T3 - Proceedings of the ASME Design Engineering Technical Conference
BT - 43rd Computers and Information in Engineering Conference (CIE)
PB - American Society of Mechanical Engineers (ASME)
T2 - ASME 2023 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, IDETC-CIE 2023
Y2 - 20 August 2023 through 23 August 2023
ER -