Abstract
We present an approach to analyze C¹(ℝᵐ) functions that addresses limitations present in the Active Subspaces (AS) method of Constantine et al. (2015; 2014). Under appropriate hypotheses, our Active Manifolds (AM) method identifies a 1-D curve in the domain (the active manifold) on which nearly all values of the unknown function are attained, and which can be exploited for approximation or analysis, especially when m is large (high-dimensional input space). We provide theorems justifying our AM technique and an algorithm permitting functional approximation and sensitivity analysis. Using accessible, low-dimensional functions as initial examples, we show AM reduces approximation error by an order of magnitude compared to AS, at the expense of more computation. Following this, we revisit the sensitivity analysis by Glaws et al. (2017), who apply AS to analyze a magnetohydrodynamic power generator model, and compare the performance of AM on the same data. Our analysis provides detailed information not captured by AS, exhibiting the influence of each parameter individually along an active manifold. Overall, AM represents a novel technique for analyzing functional models with benefits including: reducing m-dimensional analysis to a 1-D analogue, permitting more accurate regression than AS (at more computational expense), enabling more informative sensitivity analysis, and granting accessible visualizations (2-D plots) of parameter sensitivity along the AM.
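As a rough illustration of the idea summarized above, the sketch below traces a candidate active manifold by stepping along the normalized gradient of f in both directions from a seed point, then fits a 1-D interpolant of f against arc length along the traced curve. The function names (`trace_active_manifold`, `fit_along_manifold`), the fixed step size, and the stopping rule are illustrative assumptions for a toy 2-D example, not the algorithm given in the paper.

```python
import numpy as np
from scipy.interpolate import interp1d

def trace_active_manifold(f, grad_f, x0, step=1e-2, n_steps=500):
    """Trace a candidate active manifold by marching along the normalized
    gradient of f in both directions from the seed point x0 (illustrative)."""
    def march(sign):
        pts = []
        x = np.array(x0, dtype=float)
        for _ in range(n_steps):
            g = grad_f(x)
            norm = np.linalg.norm(g)
            if norm < 1e-10:          # stop near critical points
                break
            x = x + sign * step * g / norm
            pts.append(x.copy())
        return pts

    down = march(-1.0)[::-1]          # descend, then reverse to order the curve
    up = march(+1.0)                  # ascend
    pts = np.array(down + [np.array(x0, dtype=float)] + up)
    vals = np.array([f(p) for p in pts])
    return pts, vals

def fit_along_manifold(pts, vals):
    """Fit a 1-D interpolant of f against normalized arc length along the curve."""
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    s = s / s[-1]                      # normalize arc length to [0, 1]
    return interp1d(s, vals, kind="cubic")

# Toy 2-D example (m = 2): trace the curve and build a 1-D surrogate of f.
f = lambda x: np.exp(x[0] - x[1] ** 2)
grad_f = lambda x: np.array([np.exp(x[0] - x[1] ** 2),
                             -2.0 * x[1] * np.exp(x[0] - x[1] ** 2)])
pts, vals = trace_active_manifold(f, grad_f, x0=[0.0, 0.0])
surrogate = fit_along_manifold(pts, vals)
```

The resulting 1-D surrogate is what makes the 2-D plots of parameter sensitivity along the manifold possible: each coordinate of the traced points can be plotted against arc length alongside the fitted values of f.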
Original language | English |
---|---|
Title of host publication | 36th International Conference on Machine Learning, ICML 2019 |
Publisher | International Machine Learning Society (IMLS) |
Pages | 1204-1212 |
Number of pages | 9 |
ISBN (Electronic) | 9781510886988 |
State | Published - 2019 |
Event | 36th International Conference on Machine Learning, ICML 2019 - Long Beach, United States (Jun 9 2019 → Jun 15 2019) |
Publication series
Name | 36th International Conference on Machine Learning, ICML 2019 |
---|---|
Volume | 2019-June |
Conference
Conference | 36th International Conference on Machine Learning, ICML 2019 |
---|---|
Country/Territory | United States |
City | Long Beach |
Period | 06/9/19 → 06/15/19 |
Funding
This manuscript has been co-authored by UT-Battelle, LLC, under contract DE-AC05-00OR22725 with the US Department of Energy (DOE). The US government retains, and the publisher, by accepting the article for publication, acknowledges that the US government retains, a nonexclusive, paid-up, irrevocable, worldwide license to publish or reproduce the published form of this manuscript, or allow others to do so, for US government purposes. DOE will provide public access to these results of federally sponsored research in accordance with the DOE Public Access Plan (http://energy.gov/downloads/doe-public-access-plan). Special thanks to Guannan Zhang and the reviewers whose comments helped polish this paper. This work was supported in part by the U.S. Department of Energy, Office of Science, Office of Workforce Development for Teachers and Scientists (WDTS) under the SULI program and the National Science Foundation's Mathematical Sciences Graduate Internship.