Active manifolds: A non-linear analogue to Active Subspaces

Robert A. Bridges, Anthony D. Gruber, Christopher R. Felder, Miki E. Verma, Chelsey Hoff

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

7 Scopus citations

Abstract

We present an approach to analyze C¹(ℝᵐ) functions that addresses limitations present in the Active Subspaces (AS) method of Constantine et al. (2015; 2014). Under appropriate hypotheses, our Active Manifolds (AM) method identifies a 1-D curve in the domain (the active manifold) on which nearly all values of the unknown function are attained, and which can be exploited for approximation or analysis, especially when m is large (high-dimensional input space). We provide theorems justifying our AM technique and an algorithm permitting functional approximation and sensitivity analysis. Using accessible, low-dimensional functions as initial examples, we show AM reduces approximation error by an order of magnitude compared to AS, at the expense of more computation. Following this, we revisit the sensitivity analysis by Glaws et al. (2017), who apply AS to analyze a magnetohydrodynamic power generator model, and compare the performance of AM on the same data. Our analysis provides detailed information not captured by AS, exhibiting the influence of each parameter individually along an active manifold. Overall, AM represents a novel technique for analyzing functional models with benefits including: reducing m-dimensional analysis to a 1-D analogue, permitting more accurate regression than AS (at more computational expense), enabling more informative sensitivity analysis, and granting accessible visualizations (2-D plots) of parameter sensitivity along the AM.
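The abstract describes the AM pipeline only at a high level. The sketch below illustrates one plausible reading of it: trace a curve through a seed point by following the normalized gradient of f in both the ascent and descent directions, then interpolate f along the arc length of that curve to obtain a 1-D surrogate. The function names (`trace_active_manifold`, `build_surrogate`), the step size, the stopping rules, and the toy test function are all illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np
from scipy.interpolate import interp1d

def trace_active_manifold(f, grad_f, seed, step=1e-2, n_steps=2000, bounds=(-1.0, 1.0)):
    """Trace an approximate active manifold through `seed` by following the
    normalized gradient of f in both directions (assumed scheme, not the
    paper's exact algorithm). Marching stops when the path leaves the
    hypercube [bounds[0], bounds[1]]^m or the gradient nearly vanishes."""
    def march(direction):
        pts, x = [], np.array(seed, dtype=float)
        for _ in range(n_steps):
            g = grad_f(x)
            norm = np.linalg.norm(g)
            if norm < 1e-10:                 # stationary point: stop
                break
            x = x + direction * step * g / norm
            if np.any(x < bounds[0]) or np.any(x > bounds[1]):
                break                        # left the domain of interest
            pts.append(x.copy())
        return pts

    descent = march(-1.0)[::-1]              # points "below" the seed, reversed
    ascent = march(+1.0)                     # points "above" the seed
    return np.array(descent + [np.array(seed, dtype=float)] + ascent)

def build_surrogate(f, manifold):
    """Parameterize the traced curve by cumulative arc length and
    interpolate f along it, giving a 1-D surrogate of the m-D function."""
    seg = np.linalg.norm(np.diff(manifold, axis=0), axis=1)
    t = np.concatenate([[0.0], np.cumsum(seg)])
    vals = np.array([f(x) for x in manifold])
    return t, interp1d(t, vals, kind="cubic")

if __name__ == "__main__":
    # Toy example: a smooth function of 3 variables.
    f = lambda x: np.exp(x[0]) + 0.5 * x[1] ** 2 + np.sin(x[2])
    grad_f = lambda x: np.array([np.exp(x[0]), x[1], np.cos(x[2])])
    manifold = trace_active_manifold(f, grad_f, seed=np.zeros(3))
    t, surrogate = build_surrogate(f, manifold)
    print(f"traced {len(manifold)} points; f spans "
          f"[{surrogate(t).min():.3f}, {surrogate(t).max():.3f}] along the curve")
```

In this reading, sensitivity information comes from watching how each coordinate of the traced curve varies with arc length, and a new query point would be evaluated by mapping it to a nearby point on the curve; both steps are sketched here only conceptually and would need the paper's precise construction for faithful results.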

Original language: English
Title of host publication: 36th International Conference on Machine Learning, ICML 2019
Publisher: International Machine Learning Society (IMLS)
Pages: 1204-1212
Number of pages: 9
ISBN (Electronic): 9781510886988
State: Published - 2019
Event: 36th International Conference on Machine Learning, ICML 2019 - Long Beach, United States
Duration: Jun 9, 2019 - Jun 15, 2019

Publication series

Name: 36th International Conference on Machine Learning, ICML 2019
Volume: 2019-June

Conference

Conference: 36th International Conference on Machine Learning, ICML 2019
Country/Territory: United States
City: Long Beach
Period: 06/09/19 - 06/15/19

Funding

This manuscript has been co-authored by UT-Battelle, LLC, under contract DE-AC05-00OR22725 with the US Department of Energy (DOE). The US government retains and the publisher, by accepting the article for publication, acknowledges that the US government retains a nonexclusive, paid-up, irrevocable, worldwide license to publish or reproduce the published form of this manuscript, or allow others to do so, for US government purposes. DOE will provide public access to these results of federally sponsored research in accordance with the DOE Public Access Plan (http://energy.gov/downloads/doe-public-access-plan). Special thanks to Guannan Zhang and the reviewers whose comments helped polish this paper. This work was supported in part by the U.S. Department of Energy, Office of Science, Office of Workforce Development for Teachers and Scientists (WDTS) under the SULI program and the National Science Foundation's Math Science Graduate Research Internship.

Funders (Funder number)
DOE Public Access Plan
LLC (DE-AC05-00OR22725)
National Science Foundation's Math Science
U.S. Department of Energy
Office of Science
Workforce Development for Teachers and Scientists
UT-Battelle
