TY - GEN
T1 - HyperSpace: Distributed Bayesian Hyperparameter Optimization
T2 - 30th International Symposium on Computer Architecture and High Performance Computing, SBAC-PAD 2018
AU - Young, M. Todd
AU - Hinkle, Jacob
AU - Ramanathan, Arvind
AU - Kannan, Ramakrishnan
N1 - Publisher Copyright:
© 2018 IEEE.
PY - 2018/7/2
Y1 - 2018/7/2
AB - As machine learning models continue to increase in complexity, so does the potential number of free model parameters, commonly known as hyperparameters. While there has been considerable progress toward finding optimal configurations of these hyperparameters, many optimization procedures are treated as black boxes. We believe optimization methods should not only return a set of optimized hyperparameters, but also give insight into the effects of model hyperparameter settings. To this end, we present HyperSpace, a parallel implementation of Bayesian sequential model-based optimization. HyperSpace leverages high performance computing (HPC) resources to better understand unknown, potentially non-convex hyperparameter search spaces. We show that it is possible to learn the dependencies between model hyperparameters through the optimization process. By partitioning large search spaces and running many optimization procedures in parallel, we also show that it is possible to discover families of good hyperparameter settings across a variety of models, including unsupervised clustering, regression, and classification tasks.
KW - Bayesian optimization
KW - HPC
KW - SMBO
KW - parallel computing
UR - http://www.scopus.com/inward/record.url?scp=85063127019&partnerID=8YFLogxK
DO - 10.1109/CAHPC.2018.8645954
M3 - Conference contribution
AN - SCOPUS:85063127019
T3 - Proceedings - 2018 30th International Symposium on Computer Architecture and High Performance Computing, SBAC-PAD 2018
SP - 339
EP - 347
BT - Proceedings - 2018 30th International Symposium on Computer Architecture and High Performance Computing, SBAC-PAD 2018
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 24 September 2018 through 27 September 2018
ER -