University of Hertfordshire

A General Traffic Flow Prediction Approach Based on Spatial-Temporal Graph Attention

Research output: Contribution to journal › Article › peer-review

Standard

A General Traffic Flow Prediction Approach Based on Spatial-Temporal Graph Attention. / Tang, Cong; Sun, Jingru; Sun, Yichuang; Peng, Mu; Gan, Nianfei.

In: IEEE Access, Vol. 8, 9173702, 2020, p. 153731-153741.

Harvard

Tang, C, Sun, J, Sun, Y, Peng, M & Gan, N 2020, 'A General Traffic Flow Prediction Approach Based on Spatial-Temporal Graph Attention', IEEE Access, vol. 8, 9173702, pp. 153731-153741. https://doi.org/10.1109/ACCESS.2020.3018452

APA

Tang, C., Sun, J., Sun, Y., Peng, M., & Gan, N. (2020). A General Traffic Flow Prediction Approach Based on Spatial-Temporal Graph Attention. IEEE Access, 8, 153731-153741. Article 9173702. https://doi.org/10.1109/ACCESS.2020.3018452

Vancouver

Tang C, Sun J, Sun Y, Peng M, Gan N. A General Traffic Flow Prediction Approach Based on Spatial-Temporal Graph Attention. IEEE Access. 2020;8:153731-153741. 9173702. doi: 10.1109/ACCESS.2020.3018452

Author

Tang, Cong; Sun, Jingru; Sun, Yichuang; Peng, Mu; Gan, Nianfei. / A General Traffic Flow Prediction Approach Based on Spatial-Temporal Graph Attention. In: IEEE Access. 2020; Vol. 8. pp. 153731-153741.

Bibtex

@article{e168db2ca8234ea59d3b4a171f8e955f,
title = "A General Traffic Flow Prediction Approach Based on Spatial-Temporal Graph Attention",
abstract = "Accurate and reliable traffic flow prediction is critical to the safe and stable deployment of intelligent transportation systems. However, it is very challenging due to the complex spatial and temporal dependence of traffic flows. Most existing works require the information of the traffic network structure and human intervention to model the spatial-temporal association of traffic data, resulting in low generality of the model and unsatisfactory prediction performance. In this paper, we propose a general spatial-temporal graph attention based dynamic graph convolutional network (GAGCN) model to predict traffic flow. GAGCN uses the graph attention networks to extract the spatial associations among nodes hidden in the traffic feature data automatically which can be dynamically adjusted over time. And then the graph convolution network is adjusted based on the spatial associations to extract the spatial features of the road network. Notably, the information of road network structure and human intervention are not required in GAGCN. The forecasting accuracy and the generality are evaluated with two real-world traffic datasets. Experimental results indicate that our GAGCN surpasses the state-of-the-art baselines on one of the two datasets.",
keywords = "Traffic flow forecasting, dynamic spatial-temporal, graph attention networks, graph convolutional network",
author = "Cong Tang and Jingru Sun and Yichuang Sun and Mu Peng and Nianfei Gan",
note = "Funding Information: This work was supported in part by the Science and Technology Project of Hunan Provincial Communications Department, China, under Grant 2018037, and in part by the National Nature Science Foundation of China under Grant 61674054. Publisher Copyright: {\textcopyright} 2013 IEEE.",
year = "2020",
doi = "10.1109/ACCESS.2020.3018452",
language = "English",
volume = "8",
pages = "153731--153741",
journal = "IEEE Access",
issn = "2169-3536",
publisher = "IEEE",

}

RIS

TY - JOUR

T1 - A General Traffic Flow Prediction Approach Based on Spatial-Temporal Graph Attention

AU - Tang, Cong

AU - Sun, Jingru

AU - Sun, Yichuang

AU - Peng, Mu

AU - Gan, Nianfei

N1 - Funding Information: This work was supported in part by the Science and Technology Project of Hunan Provincial Communications Department, China, under Grant 2018037, and in part by the National Nature Science Foundation of China under Grant 61674054. Publisher Copyright: © 2013 IEEE.

PY - 2020

Y1 - 2020

N2 - Accurate and reliable traffic flow prediction is critical to the safe and stable deployment of intelligent transportation systems. However, it is very challenging due to the complex spatial and temporal dependence of traffic flows. Most existing works require the information of the traffic network structure and human intervention to model the spatial-temporal association of traffic data, resulting in low generality of the model and unsatisfactory prediction performance. In this paper, we propose a general spatial-temporal graph attention based dynamic graph convolutional network (GAGCN) model to predict traffic flow. GAGCN uses the graph attention networks to extract the spatial associations among nodes hidden in the traffic feature data automatically which can be dynamically adjusted over time. And then the graph convolution network is adjusted based on the spatial associations to extract the spatial features of the road network. Notably, the information of road network structure and human intervention are not required in GAGCN. The forecasting accuracy and the generality are evaluated with two real-world traffic datasets. Experimental results indicate that our GAGCN surpasses the state-of-the-art baselines on one of the two datasets.

AB - Accurate and reliable traffic flow prediction is critical to the safe and stable deployment of intelligent transportation systems. However, it is very challenging due to the complex spatial and temporal dependence of traffic flows. Most existing works require the information of the traffic network structure and human intervention to model the spatial-temporal association of traffic data, resulting in low generality of the model and unsatisfactory prediction performance. In this paper, we propose a general spatial-temporal graph attention based dynamic graph convolutional network (GAGCN) model to predict traffic flow. GAGCN uses the graph attention networks to extract the spatial associations among nodes hidden in the traffic feature data automatically which can be dynamically adjusted over time. And then the graph convolution network is adjusted based on the spatial associations to extract the spatial features of the road network. Notably, the information of road network structure and human intervention are not required in GAGCN. The forecasting accuracy and the generality are evaluated with two real-world traffic datasets. Experimental results indicate that our GAGCN surpasses the state-of-the-art baselines on one of the two datasets.

KW - Traffic flow forecasting

KW - dynamic spatial-temporal

KW - graph attention networks

KW - graph convolutional network

UR - http://www.scopus.com/inward/record.url?scp=85090551802&partnerID=8YFLogxK

U2 - 10.1109/ACCESS.2020.3018452

DO - 10.1109/ACCESS.2020.3018452

M3 - Article

VL - 8

SP - 153731

EP - 153741

JO - IEEE Access

JF - IEEE Access

SN - 2169-3536

M1 - 9173702

ER -
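
The abstract above describes the core mechanism: graph attention infers spatial associations among sensor nodes directly from the traffic feature data (no road-network structure or human intervention required), and that attention-derived adjacency then drives a graph convolution. The sketch below is not the authors' code; it is a minimal PyTorch illustration of that idea under assumed layer sizes, with a simple linear read-out standing in for the paper's temporal component. The names AttentionAdjacency, DynamicGraphConv, and GAGCNSketch are hypothetical.

# Minimal sketch (not the authors' implementation) of the mechanism described in
# the abstract: attention scores computed from node features yield a dynamic
# adjacency matrix, which then parameterises a graph convolution.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionAdjacency(nn.Module):
    """Infer a soft adjacency matrix from node features via dot-product attention."""

    def __init__(self, in_dim: int, attn_dim: int):
        super().__init__()
        self.query = nn.Linear(in_dim, attn_dim)
        self.key = nn.Linear(in_dim, attn_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, nodes, in_dim) -> adjacency: (batch, nodes, nodes)
        q, k = self.query(x), self.key(x)
        scores = q @ k.transpose(-1, -2) / k.shape[-1] ** 0.5
        return F.softmax(scores, dim=-1)  # row-normalised; no road map needed


class DynamicGraphConv(nn.Module):
    """Graph convolution whose mixing matrix is supplied at run time."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Aggregate neighbour features with the inferred adjacency, then project.
        return F.relu(self.proj(adj @ x))


class GAGCNSketch(nn.Module):
    """Attention-derived adjacency feeding a graph convolution (illustrative only)."""

    def __init__(self, in_dim: int, hidden: int, horizon: int):
        super().__init__()
        self.attn = AttentionAdjacency(in_dim, hidden)
        self.gconv = DynamicGraphConv(in_dim, hidden)
        self.head = nn.Linear(hidden, horizon)  # simple read-out; an assumption, not the paper's temporal module

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        adj = self.attn(x)      # spatial associations inferred from the data
        h = self.gconv(x, adj)  # spatial features via the dynamic graph convolution
        return self.head(h)     # predicted flow for the next `horizon` steps


if __name__ == "__main__":
    batch, nodes, features = 4, 30, 12  # e.g. 12 past readings per sensor (illustrative sizes)
    model = GAGCNSketch(in_dim=features, hidden=32, horizon=3)
    out = model(torch.randn(batch, nodes, features))
    print(out.shape)  # torch.Size([4, 30, 3])

Because the adjacency is recomputed from each new window of features, the learned graph changes over time, which is what the abstract means by spatial associations that "can be dynamically adjusted over time"; in this sketch that simply corresponds to calling the model on each successive input window.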