Abstract
Accurate traffic forecasting is vital to intelligent transportation systems. Although many deep learning models have achieved state-of-the-art performance for short-term traffic forecasting of up to 1 hour, long-term traffic forecasting that spans multiple hours remains a major challenge. To that end, we develop the Graph Pyramid Autoformer (GPA), an attention-based spatial-temporal graph neural network that uses a novel pyramid autocorrelation attention mechanism. It enables learning from long temporal sequences on graphs and improves long-term traffic forecasting accuracy. We demonstrate the efficacy of the GPA on two benchmark traffic datasets: Los Angeles' METR-LA and the Bay Area's PEMS-BAY. Notably, our model outperforms a range of existing state-of-the-art methods, delivering up to a 25% improvement in the accuracy of long-term traffic forecasts. Our code is available at: https://github.com/WeiheneZlExplainable-Graph-Autoformer.
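To give a flavor of the autocorrelation-attention idea the abstract refers to, the sketch below implements a minimal single-head, Autoformer-style autocorrelation attention in NumPy: period-based dependencies are scored by correlating queries against keys at every lag via FFT (Wiener-Khinchin theorem), and the values are then aggregated by rolling them to the top-scoring lags. This is an illustrative assumption about the general mechanism, not code from the GPA repository; all names are hypothetical.

```python
import numpy as np

def autocorrelation_attention(q, k, v, top_k=3):
    """Single-head autocorrelation attention sketch (Autoformer-style).

    q, k, v: arrays of shape (L, d) for a length-L series with d channels.
    """
    L, _ = q.shape
    # Correlation of q against k at every lag, per channel, computed in O(L log L) via FFT.
    q_fft = np.fft.rfft(q, axis=0)
    k_fft = np.fft.rfft(k, axis=0)
    corr = np.fft.irfft(q_fft * np.conj(k_fft), n=L, axis=0)  # (L, d)
    corr = corr.mean(axis=1)                                  # average channels -> (L,)
    # Keep only the top-k lags (the dominant periods).
    lags = np.argsort(corr)[-top_k:]
    weights = np.exp(corr[lags] - corr[lags].max())
    weights /= weights.sum()                                  # softmax over selected lags
    # Time-delay aggregation: roll v by each selected lag and blend.
    out = np.zeros_like(v)
    for lag, w in zip(lags, weights):
        out += w * np.roll(v, -int(lag), axis=0)
    return out
```

In the full model this per-series operation would be applied per attention head and combined with the graph structure across sensors; the "pyramid" aspect of the paper's mechanism (multi-resolution temporal scales) is not shown here.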
| Original language | English |
|---|---|
| Title of host publication | Proceedings - 22nd IEEE International Conference on Machine Learning and Applications, ICMLA 2023 |
| Editors | M. Arif Wani, Mihai Boicu, Moamar Sayed-Mouchaweh, Pedro Henriques Abreu, Joao Gama |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| Pages | 384-391 |
| Number of pages | 8 |
| ISBN (Electronic) | 9798350345346 |
| State | Published - 2023 |
| Event | 22nd IEEE International Conference on Machine Learning and Applications, ICMLA 2023 |
| Location | Jacksonville, United States |
| Duration | Dec 15 2023 → Dec 17 2023 |
Publication series
| Name | Proceedings - 22nd IEEE International Conference on Machine Learning and Applications, ICMLA 2023 |
|---|
Conference
| Conference | 22nd IEEE International Conference on Machine Learning and Applications, ICMLA 2023 |
|---|---|
| Country/Territory | United States |
| City | Jacksonville |
| Period | 12/15/23 → 12/17/23 |
Funding
This material is based upon work supported by the U.S. Department of Energy, Office of Science, under contract DE-AC02-06CH11357. This research used resources of the Argonne Leadership Computing Facility, which is a DOE Office of Science User Facility under contract DE-AC02-06CH11357.
Keywords
- Graph Neural Network
- Long-term forecasting
- Pyramid Autocorrelation attention
Fingerprint
Dive into the research topics of 'Graph Pyramid Autoformer for Long-Term Traffic Forecasting'.