Prediction of BLEVE-induced response of road tunnel using Transformer network with modified self-attention (SAMT)
Road tunnels might be exposed to Boiling Liquid Expansion Vapor Explosion (BLEVE) due to the transportation of liquefied gas by tankers passing through them. Efficient and accurate prediction of the response of road tunnels under internal BLEVEs can facilitate reliable BLEVE-resistant design and risk assessment of road tunnels. This study introduces an advanced deep-learning model that employs a Transformer-based architecture with a modified self-attention mechanism, termed the Self-Attention Modified Transformer (SAMT), to predict the BLEVE-induced support rotation of tunnel structures, a common criterion for assessing the damage of reinforced concrete structures under blast loads. Unlike a Transformer with the traditional self-attention mechanism, the proposed SAMT effectively aggregates global information across all variables while mitigating undue dependencies among uncorrelated variables. Consequently, the proposed SAMT is better suited to processing tabular data with uncorrelated variables. The feasibility and advantages of the proposed SAMT are verified with extensive data generated using calibrated numerical models of box-shaped road tunnels subjected to internal BLEVEs. Comparing the performance of the proposed SAMT with the non-modified Transformer network (FT-Transformer) and two other typical deep learning networks, i.e., the Multi-layer Perceptron (MLP) and the Residual Network (ResNet), shows that the SAMT offers higher prediction accuracy and robustness than the other three models in predicting BLEVE-induced support rotations of box-shaped road tunnels. The study demonstrates that the proposed SAMT is an effective tool for predicting BLEVE-induced support rotations of road tunnels.
| Main Authors: | Cheng, Ruishan, Chen, Wensu, Hao, Hong, Li, Qilin |
|---|---|
| Format: | Journal Article |
| Published: | 2024 |
| DOI: | 10.1016/j.engstruct.2024.118415 |
| License: | https://creativecommons.org/licenses/by/4.0/ |
| Online Access: | http://purl.org/au-research/grants/arc/FL180100196 http://hdl.handle.net/20.500.11937/96052 |
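The abstract describes the SAMT only at a conceptual level and does not specify how the self-attention mechanism is modified. The sketch below is therefore a generic, hypothetical illustration of the broader idea: a Transformer-style regressor over tabular inputs in which each numeric feature is embedded as a token and weak attention weights are suppressed so that uncorrelated variables exert less influence on one another. Class names such as `SparseSelfAttention` and `TabularTransformerRegressor`, the thresholding rule, and all hyperparameters are assumptions for illustration, not the paper's method.

```python
# Minimal sketch (not the paper's implementation): a Transformer-style encoder
# over tabular features. Each scalar input is embedded as a token, attention
# aggregates information across features, and a linear head regresses a single
# response (e.g., support rotation). The thresholding "modification" is only an
# illustrative stand-in for the unspecified SAMT mechanism.
import torch
import torch.nn as nn


class SparseSelfAttention(nn.Module):
    """Single-head self-attention that zeroes out weak attention weights,
    loosely illustrating 'mitigating dependencies among uncorrelated variables'."""

    def __init__(self, dim: int, threshold: float = 0.05):
        super().__init__()
        self.qkv = nn.Linear(dim, 3 * dim)
        self.out = nn.Linear(dim, dim)
        self.threshold = threshold  # hypothetical cut-off, not from the paper

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, tokens, dim)
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        attn = torch.softmax(q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5, dim=-1)
        attn = torch.where(attn < self.threshold, torch.zeros_like(attn), attn)
        attn = attn / attn.sum(dim=-1, keepdim=True).clamp_min(1e-8)  # renormalise
        return self.out(attn @ v)


class TabularTransformerRegressor(nn.Module):
    """Embeds each numeric feature as a token, applies attention blocks with
    residual connections, and regresses a single scalar response."""

    def __init__(self, n_features: int, dim: int = 32, n_blocks: int = 2):
        super().__init__()
        self.embed_w = nn.Parameter(torch.randn(n_features, dim) * 0.02)
        self.embed_b = nn.Parameter(torch.zeros(n_features, dim))
        self.blocks = nn.ModuleList(SparseSelfAttention(dim) for _ in range(n_blocks))
        self.norms = nn.ModuleList(nn.LayerNorm(dim) for _ in range(n_blocks))
        self.head = nn.Linear(dim, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, n_features)
        tokens = x.unsqueeze(-1) * self.embed_w + self.embed_b  # feature tokens
        for attn, norm in zip(self.blocks, self.norms):
            tokens = norm(tokens + attn(tokens))
        return self.head(tokens.mean(dim=1)).squeeze(-1)


if __name__ == "__main__":
    model = TabularTransformerRegressor(n_features=8)
    dummy = torch.randn(4, 8)  # 4 samples, 8 placeholder input variables
    print(model(dummy).shape)  # torch.Size([4])
```

The choice of 8 input features and random inputs is purely to show the tensor shapes; the record does not enumerate the paper's BLEVE and tunnel input variables, and the actual model was trained on data from calibrated numerical simulations.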