TY - GEN
T1 - Graph Pyramid Autoformer for Long-Term Traffic Forecasting
AU - Zhong, Weiheng
AU - Mallick, Tanwi
AU - MacFarlane, Jane
AU - Meidani, Hadi
AU - Balaprakash, Prasanna
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - Accurate traffic forecasting is vital to an intelligent transportation system. Although many deep learning models have achieved state-of-the-art performance for short-term traffic forecasting of up to 1 hour, long-term traffic forecasting that spans multiple hours remains a major challenge. To that end, we develop Graph Pyramid Autoformer (GPA), an attention-based spatial-temporal graph neural network that uses a novel pyramid autocorrelation attention mechanism. It enables learning from long temporal sequences on graphs and improves long-term traffic forecasting accuracy. We demonstrate the efficacy of GPA on two benchmark traffic datasets: Los Angeles' METR-LA and the Bay Area's PEMS-BAY. Notably, our model outperforms a range of existing state-of-the-art methods, delivering up to a 25% improvement in long-term traffic forecasting accuracy. Our code is available at: https://github.com/WeihengZ/Explainable-Graph-Autoformer.
AB - Accurate traffic forecasting is vital to an intelligent transportation system. Although many deep learning models have achieved state-of-the-art performance for short-term traffic forecasting of up to 1 hour, long-term traffic forecasting that spans multiple hours remains a major challenge. To that end, we develop Graph Pyramid Autoformer (GPA), an attention-based spatial-temporal graph neural network that uses a novel pyramid autocorrelation attention mechanism. It enables learning from long temporal sequences on graphs and improves long-term traffic forecasting accuracy. We demonstrate the efficacy of GPA on two benchmark traffic datasets: Los Angeles' METR-LA and the Bay Area's PEMS-BAY. Notably, our model outperforms a range of existing state-of-the-art methods, delivering up to a 25% improvement in long-term traffic forecasting accuracy. Our code is available at: https://github.com/WeihengZ/Explainable-Graph-Autoformer.
KW - Graph Neural Network
KW - Long-term forecasting
KW - Pyramid autocorrelation attention
UR - http://www.scopus.com/inward/record.url?scp=85190158197&partnerID=8YFLogxK
U2 - 10.1109/ICMLA58977.2023.00060
DO - 10.1109/ICMLA58977.2023.00060
M3 - Conference contribution
AN - SCOPUS:85190158197
T3 - Proceedings - 22nd IEEE International Conference on Machine Learning and Applications, ICMLA 2023
SP - 384
EP - 391
BT - Proceedings - 22nd IEEE International Conference on Machine Learning and Applications, ICMLA 2023
A2 - Arif Wani, M.
A2 - Boicu, Mihai
A2 - Sayed-Mouchaweh, Moamar
A2 - Abreu, Pedro Henriques
A2 - Gama, Joao
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 22nd IEEE International Conference on Machine Learning and Applications, ICMLA 2023
Y2 - 15 December 2023 through 17 December 2023
ER -