Frontiers of Structural and Civil Engineering

Front. Struct. Civ. Eng.    2024, Vol. 18 Issue (7) : 1084-1102    https://doi.org/10.1007/s11709-024-1077-z
Bayesian Optimized LightGBM model for predicting the fundamental vibrational period of masonry infilled RC frames
Taimur RAHMAN1, Pengfei ZHENG2, Shamima SULTANA3
1. Department of Civil Engineering, World University of Bangladesh, Dhaka 1230, Bangladesh
2. School of Civil Engineering, Zhengzhou University, Zhengzhou 450001, China
3. Department of Computer Science & Engineering, University of Asia Pacific, Dhaka 1205, Bangladesh
Abstract

Precise prediction of the fundamental vibrational period of reinforced concrete (RC) buildings with infilled walls is essential for structural design, especially earthquake-resistant design. Machine learning models from previous studies, while achieving commendable accuracy in predicting the fundamental period, are hampered by lengthy training times and dependence on pre-trained models, especially when working with continually evolving data sets. This underscores the need for a model that balances predictive accuracy with robust adaptability and fast training, including the ability to be re-trained consistently on real-time, continuously updated data. This research implements an optimized Light Gradient Boosting Machine (LightGBM) model, whose predictive capability is enhanced through Bayesian Optimization of its hyperparameters on the FP4026 research data set, and demonstrates its adaptability and efficiency in predictive modeling. The results show that the LightGBM model attains an R2 score of 0.9995 and an RMSE of 0.0178 on the test set, while training 23.2 times faster than XGBoost and 45.5 times faster than Gradient Boosting. Furthermore, this study introduces a practical application through a Streamlit-powered, web-based dashboard that enables engineers to use and augment the model, contribute data, and obtain precise fundamental period predictions, effectively bridging scholarly research and practical application.
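As a rough illustration of the data-driven workflow described above, the sketch below fits a baseline LightGBM regressor to an FP4026-style table. This is not the authors' code; the file name and column names are hypothetical placeholders.

```python
# Rough sketch only (not the authors' code): a baseline LightGBM fit on an
# FP4026-style table. File name and column names are hypothetical placeholders.
import pandas as pd
from lightgbm import LGBMRegressor
from sklearn.model_selection import train_test_split

df = pd.read_csv("fp4026.csv")  # hypothetical CSV export of the data set
X = df[["storeys", "spans", "span_length", "opening_pct", "wall_stiffness"]]
y = df["period"]

# 80/20 split, matching the 3220/806 partition reported in Tab. 1
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

model = LGBMRegressor()  # default hyperparameters; tuning is sketched later
model.fit(X_train, y_train)
print("Held-out R2:", model.score(X_test, y_test))
```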

Keywords: masonry-infilled RC frame; fundamental period; LightGBM; FP4026 research dataset; machine learning; data-driven approach; Bayesian Optimization
Corresponding Author(s): Pengfei ZHENG   
Just Accepted Date: 13 June 2024   Online First Date: 11 July 2024    Issue Date: 06 August 2024
 Cite this article:   
Taimur RAHMAN, Pengfei ZHENG, Shamima SULTANA. Bayesian Optimized LightGBM model for predicting the fundamental vibrational period of masonry infilled RC frames[J]. Front. Struct. Civ. Eng., 2024, 18(7): 1084-1102.
 URL:  
https://academic.hep.com.cn/fsce/EN/10.1007/s11709-024-1077-z
https://academic.hep.com.cn/fsce/EN/Y2024/V18/I7/1084
Fig.1  Methodology overview.
Fig.2  Violin plots showing the distribution of period (s) for various building characteristics: (a) fundamental period vs. number of storeys; (b) fundamental period vs. opening percentage (%); (c) fundamental period vs. length of spans (m); (d) fundamental period vs. number of spans; (e) fundamental period vs. masonry wall stiffness, Et (×10⁵ kN/m).
| Dataset | Property | Nos. of storeys | Nos. of spans | Length of spans (m) | Opening percentage (%) | Masonry wall stiffness (×10⁵ kN/m) | Period (s) |
| Training (3220 samples) | Min | 1.00 | 2.00 | 3.00 | 0.00 | 2.25 | 0.04 |
| | Max | 22.00 | 6.00 | 7.50 | 100.00 | 25.00 | 3.57 |
| | Mean | 11.61 | 4.95 | 4.99 | 63.07 | 11.77 | 1.11 |
| | SD | 6.34 | 1.55 | 1.57 | 40.11 | 7.79 | 0.78 |
| | Skewness | −0.01 | −1.05 | 0.17 | −0.51 | 0.38 | 0.80 |
| | Kurtosis | −1.21 | −0.52 | −1.19 | −1.37 | −1.17 | −0.11 |
| Test (806 samples) | Min | 1.00 | 2.00 | 3.00 | 0.00 | 2.25 | 0.04 |
| | Max | 22.00 | 6.00 | 7.50 | 100.00 | 25.00 | 3.50 |
| | Mean | 11.07 | 4.94 | 5.01 | 63.12 | 11.74 | 1.09 |
| | SD | 6.37 | 1.55 | 1.59 | 40.27 | 7.79 | 0.81 |
| | Skewness | 0.06 | −1.03 | 0.14 | −0.51 | 0.37 | 0.92 |
| | Kurtosis | −1.17 | −0.56 | −1.22 | −1.38 | −1.16 | 0.09 |
Tab.1  Statistical properties of the training and test data
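The descriptive statistics in Tab. 1 (min, max, mean, SD, skewness, kurtosis) can be reproduced with pandas; a minimal sketch is shown below, assuming the training split and the hypothetical column names from the earlier sketch.

```python
# Sketch of reproducing the Tab. 1 statistics with pandas; the training split
# and column names are assumed to come from the earlier sketch.
import pandas as pd

train_df = X_train.assign(period=y_train)  # hypothetical training frame
cols = ["storeys", "spans", "span_length", "opening_pct", "wall_stiffness", "period"]

summary = pd.DataFrame({
    "Min": train_df[cols].min(),
    "Max": train_df[cols].max(),
    "Mean": train_df[cols].mean(),
    "SD": train_df[cols].std(),
    "Skewness": train_df[cols].skew(),
    "Kurtosis": train_df[cols].kurtosis(),  # excess kurtosis, as tabulated
})
print(summary.round(2).T)
```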
| Hyperparameter | Value range |
| Maximum number of decision leaves (num_leaves) | (20, 150) |
| Maximum tree depth (max_depth) | (5, 50) |
| Learning rate (learning_rate) | (0.05, 0.5) |
| Number of estimators (n_estimators) | (100, 1000) |
| Minimum number of samples in a child node (min_child_samples) | (5, 20) |
| Fraction of observations randomly sampled for each tree (subsample) | (0.5, 1) |
| Column sample by tree (colsample_bytree) | (0.8, 1) |
| L1 regularization term (reg_alpha) | (0, 0.1) |
| L2 regularization term (reg_lambda) | (0, 0.5) |
Tab.2  Hyperparameters for the LightGBM model
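A hedged sketch of Bayesian hyperparameter tuning over the Tab. 2 ranges is given below. The use of the open-source bayesian-optimization package, a 5-fold cross-validated R2 objective, the evaluation budget, and the X_train/y_train names are assumptions for illustration, not the paper's exact configuration.

```python
# Hedged sketch of Bayesian hyperparameter tuning over the Tab. 2 ranges,
# using the open-source bayesian-optimization package (an assumption; the
# paper's exact optimizer settings and budget may differ).
from bayes_opt import BayesianOptimization
from lightgbm import LGBMRegressor
from sklearn.model_selection import cross_val_score

def cv_r2(num_leaves, max_depth, learning_rate, n_estimators,
          min_child_samples, subsample, colsample_bytree, reg_alpha, reg_lambda):
    # Integer-valued hyperparameters are rounded because the optimizer
    # searches a continuous box.
    model = LGBMRegressor(
        num_leaves=int(round(num_leaves)),
        max_depth=int(round(max_depth)),
        learning_rate=learning_rate,
        n_estimators=int(round(n_estimators)),
        min_child_samples=int(round(min_child_samples)),
        subsample=subsample,
        colsample_bytree=colsample_bytree,
        reg_alpha=reg_alpha,
        reg_lambda=reg_lambda,
    )
    # 5-fold cross-validated R2 on the training split from the earlier sketch
    return cross_val_score(model, X_train, y_train, cv=5, scoring="r2").mean()

pbounds = {  # search ranges copied from Tab. 2
    "num_leaves": (20, 150), "max_depth": (5, 50), "learning_rate": (0.05, 0.5),
    "n_estimators": (100, 1000), "min_child_samples": (5, 20),
    "subsample": (0.5, 1.0), "colsample_bytree": (0.8, 1.0),
    "reg_alpha": (0.0, 0.1), "reg_lambda": (0.0, 0.5),
}

optimizer = BayesianOptimization(f=cv_r2, pbounds=pbounds, random_state=42)
optimizer.maximize(init_points=10, n_iter=50)  # evaluation budget is assumed
print(optimizer.max)  # best CV R2 and the corresponding hyperparameters
```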
Fig.3  The Bayesian Optimization process for LightGBM hyperparameter tuning.
Fig.4  R2 scores of different hyperparameters.
Fig.5  Sensitivity analyses of LightGBM hyperparameters: (a) number of leaves sensitivity analysis; (b) maximum depth sensitivity analysis; (c) learning rate sensitivity analysis; (d) number of estimators sensitivity analysis; (e) minimum child samples sensitivity analysis; (f) subsample sensitivity analysis; (g) column sample by tree sensitivity analysis; (h) regularization alpha sensitivity analysis; (i) regularization lambda sensitivity analysis.
Fig.6  Predicted vs. actual values (LightGBM model): (a) training set; (b) test set.
Fig.7  Percent Error Distribution (LightGBM model): (a) training set; (b) test set.
| Model | Hyperparameter | Candidate values |
| Decision Tree | splitter of nodes (splitter) | 'best', 'random' |
| | maximum tree depth (max_depth) | 'None', 1, 3, 5, 7, 9, 11 |
| | minimum samples for split (min_samples_split) | 2, 5, 10, 20, 50 |
| | minimum samples of leaf node (min_samples_leaf) | 1, 2, 5, 10, 20, 50 |
| | number of features (max_features) | 'None', 'auto', 'sqrt', 'log2' |
| | minimal cost-complexity pruning (ccp_alpha) | 0.0, 0.01, 0.1, 1, 10 |
| Random Forest | number of estimators (n_estimators) | 100, 500, 1000 |
| | maximum tree depth (max_depth) | 'None', 5, 10, 15, 20 |
| | minimum samples for split (min_samples_split) | 2, 5, 10 |
| | minimum samples of leaf node (min_samples_leaf) | 1, 2, 4 |
| | number of features (max_features) | 'auto', 'sqrt', 'log2' |
| XGBoost | number of estimators (n_estimators) | 100, 500, 1000 |
| | maximum tree depth (max_depth) | 'None', 1, 3, 5, 7, 9, 11 |
| | learning rate (learning_rate) | 0.01, 0.1, 0.2 |
| | minimum sum of instance weight in a child node (min_child_weight) | 1, 3, 5 |
| | minimum loss reduction (gamma) | 0, 0.1, 0.2 |
| | fraction of observations randomly sampled for each tree (subsample) | 0.8, 1 |
| | column sample by tree (colsample_bytree) | 0.8, 1 |
| | L1 regularization term (reg_alpha) | 0, 0.1, 1 |
| | L2 regularization term (reg_lambda) | 0.1, 1, 10 |
| Gradient Boosting | number of estimators (n_estimators) | 100, 500, 1000 |
| | maximum tree depth (max_depth) | 'None', 1, 3, 5, 7, 9, 11 |
| | learning rate (learning_rate) | 0.01, 0.1, 0.2 |
| | minimum samples for split (min_samples_split) | 2, 5, 10 |
| | minimum samples of leaf node (min_samples_leaf) | 1, 3, 5 |
| | number of features (max_features) | 'None', 'sqrt', 'log2' |
| | fraction of observations randomly sampled for each tree (subsample) | 0.8, 1 |
| | regularization parameter (alpha) | 0.1, 0.5, 0.9 |
| | minimal cost-complexity pruning (ccp_alpha) | 0, 0.1, 1 |
| CatBoost | number of iterations (iterations) | 100, 500, 1000 |
| | maximum tree depth (depth) | 3, 5, 7, 9, 11 |
| | learning rate (learning_rate) | 0.01, 0.1, 0.2 |
| | L2 regularization (l2_leaf_reg) | 1, 3, 5, 7, 9 |
| | border count (border_count) | 32, 64, 128 |
| | bagging temperature (bagging_temperature) | 0, 0.5, 1 |
| | random strength (random_strength) | 0.2, 0.5, 0.8 |
| NGBoost | number of estimators (n_estimators) | 100, 500, 1000 |
| | learning rate (learning_rate) | 0.01, 0.1, 0.2 |
| | mini-batch fraction (minibatch_frac) | 0.5, 0.7, 1.0 |
| | natural gradient (natural_gradient) | 'True', 'False' |
| | verbosity (verbose) | 'True', 'False' |
Tab.3  Hyperparameters for the predictive models
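Tab. 3 lists discrete candidate values for each baseline model, so an exhaustive grid search is one plausible way to tune them; the sketch below shows this for the Random Forest baseline only, and whether the authors actually used scikit-learn's GridSearchCV is an assumption.

```python
# Hedged sketch: grid search over the Tab. 3 candidate values for the Random
# Forest baseline. Procedure and variable names are assumptions, not the
# authors' code.
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

param_grid = {
    "n_estimators": [100, 500, 1000],
    "max_depth": [None, 5, 10, 15, 20],
    "min_samples_split": [2, 5, 10],
    "min_samples_leaf": [1, 2, 4],
    "max_features": ["sqrt", "log2"],  # 'auto' is removed in recent scikit-learn
}

search = GridSearchCV(RandomForestRegressor(random_state=42), param_grid,
                      cv=5, scoring="r2", n_jobs=-1)
search.fit(X_train, y_train)  # training split from the earlier sketch
print(search.best_params_, search.best_score_)
```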
Fig.8  Predicted vs. actual values and Percent Error Distribution for the compared models: (a) predicted vs. actual values (Decision Tree model); (b) Percent Error Distribution (Decision Tree model); (c) predicted vs. actual values (Random Forest model); (d) Percent Error Distribution (Random Forest model); (e) predicted vs. actual values (XGBoost model); (f) Percent Error Distribution (XGBoost model); (g) predicted vs. actual values (Gradient Boosting model); (h) Percent Error Distribution (Gradient Boosting model); (i) predicted vs. actual values (CatBoost model); (j) Percent Error Distribution (CatBoost model); (k) predicted vs. actual values (NGBoost model); (l) Percent Error Distribution (NGBoost model); (m) predicted vs. actual values (LightGBM model); (n) Percent Error Distribution (LightGBM model).
| Machine learning model | R2 (Train) | R2 (Test) | RMSE (Train) | RMSE (Test) | MAPE (Train) | MAPE (Test) | MAE (Train) | MAE (Test) | Model training time (s) |
| LightGBM | 0.9999 | 0.9995 | 0.0072 | 0.0178 | 0.6983 | 1.4213 | 0.0047 | 0.0098 | 8.44 |
| Decision Tree | 1 | 0.9982 | 0.0007 | 0.0344 | 0.0129 | 2.5217 | 0 | 0.0178 | 6.42 |
| Random Forest | 0.9999 | 0.9992 | 0.0094 | 0.0232 | 0.6442 | 1.7469 | 0.0049 | 0.0124 | 58.24 |
| XGBoost | 0.9999 | 0.9997 | 0.0081 | 0.0149 | 0.8285 | 1.2754 | 0.0053 | 0.0081 | 196.23 |
| Gradient Boosting | 1 | 0.9998 | 0.0035 | 0.0126 | 0.275 | 0.7461 | 0.0018 | 0.0049 | 383.92 |
| CatBoost | 1 | 0.9997 | 0.0017 | 0.0137 | 0.1671 | 0.5858 | 0.0011 | 0.0036 | 410.69 |
| NGBoost | 0.9995 | 0.9992 | 0.0172 | 0.0224 | 1.59 | 2.5290 | 0.0101 | 0.0135 | 973.04 |
| ANN [40] | 0.991 | 0.993 | 0.0727 | 0.068 | – | – | – | – | – |
Tab.4  Performance evaluation of different models
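The Tab. 4 metrics can be computed as sketched below with scikit-learn; the model object and data splits are assumed to come from the earlier sketches, and MAPE is scaled to percent on the assumption that Tab. 4 reports it on that scale.

```python
# Sketch of computing the Tab. 4 metrics (R2, RMSE, MAPE, MAE, training time);
# "model", "X_train", "X_test", "y_train", "y_test" come from earlier sketches.
import time
import numpy as np
from sklearn.metrics import (mean_absolute_error, mean_absolute_percentage_error,
                             mean_squared_error, r2_score)

start = time.perf_counter()
model.fit(X_train, y_train)
train_time = time.perf_counter() - start  # "Model training time (s)"

for name, X_, y_ in [("Train", X_train, y_train), ("Test", X_test, y_test)]:
    pred = model.predict(X_)
    print(f"{name}: R2={r2_score(y_, pred):.4f}, "
          f"RMSE={np.sqrt(mean_squared_error(y_, pred)):.4f}, "
          f"MAPE={100 * mean_absolute_percentage_error(y_, pred):.4f}, "
          f"MAE={mean_absolute_error(y_, pred):.4f}")
print(f"Training time: {train_time:.2f} s")
```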
Fig.9  Model performance comparison on evaluation metrics.
Fig.10  Fundamental periods dashboard overview: (a) dashboard homepage; (b) predicted period (s) calculation; (c) data contribution.
Fig.11  Workflow of the user interaction process on the dashboard.
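A minimal Streamlit sketch of a prediction page along the lines of Figs. 10 and 11 is shown below; the saved-model file name, input ranges, and column names are hypothetical and do not describe the authors' deployed dashboard.

```python
# Minimal Streamlit sketch of a prediction page similar to Figs. 10-11.
# The model file name, input ranges, and column names are hypothetical.
import joblib
import pandas as pd
import streamlit as st

model = joblib.load("lgbm_period_model.pkl")  # hypothetical saved LightGBM model

st.title("Fundamental period predictor for masonry-infilled RC frames")
storeys = st.number_input("Number of storeys", 1, 22, 10)
spans = st.number_input("Number of spans", 2, 6, 4)
span_len = st.number_input("Length of spans (m)", 3.0, 7.5, 5.0)
opening = st.slider("Opening percentage (%)", 0, 100, 50)
stiffness = st.number_input("Masonry wall stiffness (x1e5 kN/m)", 2.25, 25.0, 12.0)

if st.button("Predict period"):
    features = pd.DataFrame(
        [[storeys, spans, span_len, opening, stiffness]],
        columns=["storeys", "spans", "span_length", "opening_pct", "wall_stiffness"])
    st.write(f"Predicted fundamental period: {model.predict(features)[0]:.3f} s")
```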
1 Y J Chiou, J C Tzeng, Y W Liou. Experimental and analytical study of masonry infilled frames. Journal of Structural Engineering, 1999, 125(10): 1109–1117
https://doi.org/10.1061/(ASCE)0733-9445(1999)125:10(1109)
2 F Colangelo. Pseudo-dynamic seismic response of reinforced concrete frames infilled with non-structural brick masonry. Earthquake Engineering & Structural Dynamics, 2005, 34(10): 1219–1241
https://doi.org/10.1002/eqe.477
3 A De Angelis, M R Pecce. The structural identification of the infill walls contribution in the dynamic response of framed buildings. Structural Control and Health Monitoring, 2019, 26(9): e2405
https://doi.org/10.1002/stc.2405
4 M N Fardis, T B Panagiotakos. Seismic design and response of bare and masonry-infilled reinforced concrete buildings. Part II: Infilled structures. Journal of Earthquake Engineering, 1997, 1(3): 475–503
https://doi.org/10.1080/13632469708962375
5 A S Gago, J Alfaiate, A Lamas. The effect of the infill in arched structures: Analytical and numerical modelling. Engineering Structures, 2011, 33(5): 1450–1458
https://doi.org/10.1016/j.engstruct.2010.12.037
6 H Singh, D K Paul, V V Sastry. Inelastic dynamic response of reinforced concrete infilled frames. Computers & Structures, 1998, 69(6): 685–693
https://doi.org/10.1016/S0045-7949(98)00124-2
7 F Wang, K Zhao, J Zhang, K Yan. Influence of different types of infill walls on the hysteretic performance of reinforced concrete frames. Buildings, 2021, 11(7): 310–328
https://doi.org/10.3390/buildings11070310
8 P G Asteris, A K Tsaris, L Cavaleri, C C Repapis, A Papalou, F Di Trapani, D F Karypidis. Prediction of the fundamental period of infilled RC frame structures using artificial neural networks. Computational Intelligence and Neuroscience, 2016, 2016: 1–12
https://doi.org/10.1155/2016/5104907
9 P G Asteris, C C Repapis, E V Repapi, L Cavaleri. Fundamental period of infilled reinforced concrete frame structures. Structure and Infrastructure Engineering, 2017, 13(7): 929–941
https://doi.org/10.1080/15732479.2016.1227341
10 P G Asteris, C C Repapis, L Cavaleri, V Sarhosis, A Athanasopoulou. On the fundamental period of infilled RC frame buildings. Structural Engineering and Mechanics, 2015, 54(6): 1175–1200
https://doi.org/10.12989/sem.2015.54.6.1175
11 P G Asteris, C C Repapis, A K Tsaris, F Di Trapani, L Cavaleri. Parameters affecting the fundamental period of infilled RC frame structures. Earthquakes and Structures, 2015, 9(5): 999–1028
https://doi.org/10.12989/eas.2015.9.5.999
12 K Chethan, R Babu, K Venkataramana, A Sharma. Influence of masonry infill on fundamental natural frequency of 2D RC frames. Journal of Structural Engineering, 2010, 37(2): 135–141
13 R Jiang, L Jiang, Y Hu, J Ye, L Zhou. A simplified method for estimating the fundamental period of masonry infilled reinforced concrete frames. Structural Engineering and Mechanics, 2020, 74(6): 821–832
14 A Koçak, A Kalyoncuoğlu, B Zengin. Effect of infill wall and wall openings on the fundamental period of RC buildings. Earthquake Resistant Engineering Structures IX, 2013, 132: 121–131
15 M M Kose. Parameters affecting the fundamental period of RC buildings with infill walls. Engineering Structures, 2009, 31(1): 93–102
https://doi.org/10.1016/j.engstruct.2008.07.017
16 A Masi, M Vona. Experimental and numerical evaluation of the fundamental period of undamaged and damaged RC framed buildings. Bulletin of Earthquake Engineering, 2010, 8(3): 643–656
https://doi.org/10.1007/s10518-009-9136-3
17 P Ricci, G M Verderame, G Manfredi. Analytical investigation of elastic period of infilled RC MRF buildings. Engineering Structures, 2011, 33(2): 308–319
https://doi.org/10.1016/j.engstruct.2010.10.009
18 D M Dimiduk, E A Holm, S R Niezgoda. Perspectives on the impact of machine learning, deep learning, and artificial intelligence on materials, processes, and structures engineering. Integrating Materials and Manufacturing Innovation, 2018, 7(3): 157–172
https://doi.org/10.1007/s40192-018-0117-8
19 P H Jasmine, S Arun. Machine learning applications in structural engineering—A review. IOP Conference Series: Materials Science and Engineering, 2021, 1114(1): 012012
20 S Lee, J Ha, M Zokhirova, H Moon, J Lee. Background information of deep learning for structural engineering. Archives of Computational Methods in Engineering, 2018, 25(1): 121–129
https://doi.org/10.1007/s11831-017-9237-0
21 H Salehi, R Burgueño. Emerging artificial intelligence methods in structural engineering. Engineering Structures, 2018, 171: 170–189
https://doi.org/10.1016/j.engstruct.2018.05.084
22 H Sun, H V Burton, H Huang. Machine learning applications for building structural design and performance assessment: State-of-the-art review. Journal of Building Engineering, 2021, 33: 101816
https://doi.org/10.1016/j.jobe.2020.101816
23 K M Hamdia, X Zhuang, T Rabczuk. An efficient optimization approach for designing machine learning models based on genetic algorithm. Neural Computing & Applications, 2021, 33(6): 1923–1933
https://doi.org/10.1007/s00521-020-05035-x
24 N A Nariman, K Hamdia, A M Ramadan, H Sadaghian. Optimum design of flexural strength and stiffness for reinforced concrete beams using machine learning. Applied Sciences, 2021, 11(18): 8762–8777
https://doi.org/10.3390/app11188762
25 H Guo, X Zhuang, N Alajlan, T Rabczuk. Physics-informed deep learning for melting heat transfer analysis with model-based transfer learning. Computers & Mathematics with Applications, 2023, 143: 303–317
https://doi.org/10.1016/j.camwa.2023.05.014
26 H Guo, X Zhuang, P Chen, N Alajlan, T Rabczuk. Stochastic deep collocation method based on neural architecture search and transfer learning for heterogeneous porous media. Engineering with Computers, 2022, 38(6): 5173–5198
https://doi.org/10.1007/s00366-021-01586-2
27 H Guo, X Zhuang, X Fu, Y Zhu, T Rabczuk. Physics-informed deep learning for three-dimensional transient heat transfer analysis of functionally graded materials. Computational Mechanics, 2023, 72(3): 513–524
https://doi.org/10.1007/s00466-023-02287-x
28 H Guo, X Zhuang, T Rabczuk. A deep collocation method for the bending analysis of Kirchhoff plate. Computers, Materials & Continua, 2019, 59(2): 433–456
https://doi.org/10.32604/cmc.2019.06660
29 E Samaniego, C Anitescu, S Goswami, V M Nguyen-Thanh, H Guo, K Hamdia, X Zhuang, T Rabczuk. An energy approach to the solution of partial differential equations in computational mechanics via machine learning: Concepts, implementation and applications. Computer Methods in Applied Mechanics and Engineering, 2020, 362: 112790
https://doi.org/10.1016/j.cma.2019.112790
30 X Zhuang, H Guo, N Alajlan, H Zhu, T Rabczuk. Deep autoencoder based energy method for the bending, vibration, and buckling analysis of Kirchhoff plates with transfer learning. European Journal of Mechanics. A, Solids, 2021, 87: 104225
https://doi.org/10.1016/j.euromechsol.2021.104225
31 T Sang-To, H Le-Minh, M Abdel Wahab, C L Thanh. A new metaheuristic algorithm: Shrimp and Goby association search algorithm and its application for damage identification in large-scale and complex structures. Advances in Engineering Software, 2023, 176: 103363
https://doi.org/10.1016/j.advengsoft.2022.103363
32 H L Minh, S Khatir, R V Rao, M Abdel Wahab, T Cuong-Le. A variable velocity strategy particle swarm optimization algorithm (VVS-PSO) for damage assessment in structures. Engineering with Computers, 2023, 39(2): 1055–1084
https://doi.org/10.1007/s00366-021-01451-2
33 L V Ho, T T Trinh, G De Roeck, T Bui-Tien, L Nguyen-Ngoc, M Abdel Wahab. An efficient stochastic-based coupled model for damage identification in plate structures. Engineering Failure Analysis, 2022, 131: 105866
https://doi.org/10.1016/j.engfailanal.2021.105866
34 T Nghia-Nguyen, M Kikumoto, H Nguyen-Xuan, S Khatir, M Abdel Wahab, T Cuong-Le. Optimization of artificial neutral networks architecture for predicting compression parameters using piezocone penetration test. Expert Systems with Applications, 2023, 223: 119832
https://doi.org/10.1016/j.eswa.2023.119832
35 V T Tran, T K Nguyen, H Nguyen-Xuan, M Abdel Wahab. Vibration and buckling optimization of functionally graded porous microplates using BCMO-ANN algorithm. Thin-walled Structures, 2023, 182: 110267
https://doi.org/10.1016/j.tws.2022.110267
36 P G Asteris, M Nikoo. Artificial bee colony-based neural network for the prediction of the fundamental period of infilled frame structures. Neural Computing & Applications, 2019, 31(9): 4837–4847
https://doi.org/10.1007/s00521-018-03965-1
37 M Mirrashid, H Naderpour. Computational intelligence-based models for estimating the fundamental period of infilled reinforced concrete frames. Journal of Building Engineering, 2022, 46: 103456
https://doi.org/10.1016/j.jobe.2021.103456
38 I Latif, A Banerjee, M Surana. Explainable machine learning aided optimization of masonry infilled reinforced concrete frames. Structures, 2022, 44: 1751–1766
https://doi.org/10.1016/j.istruc.2022.08.115
39 S N Somala, K Karthikeyan, S Mangalathu. Time period estimation of masonry infilled RC frames using machine learning techniques. Structures, 2021, 34: 1560–1566
https://doi.org/10.1016/j.istruc.2021.08.088
40 A E Charalampakis, G C Tsiatas, S B Kotsiantis. Machine learning and nonlinear models for the estimation of fundamental period of vibration of masonry infilled RC frame structures. Engineering Structures, 2020, 216: 110765
https://doi.org/10.1016/j.engstruct.2020.110765
41 N Bioud, I Laid, M A Benbouras. Estimating the fundamental period of infilled RC frame structures via deep learning. Urbanism. Architecture. Constructions, 2023, 14: 1–22
42 C Cakiroglu, G Bekdaş, S Kim, Z W Geem. Explainable ensemble learning models for the rheological properties of self-compacting concrete. Sustainability, 2022, 14(21): 14640
https://doi.org/10.3390/su142114640
43 D Chakraborty, H Elhegazy, H Elzarka, L Gutierrez. A novel construction cost prediction model using hybrid natural and light gradient boosting. Advanced Engineering Informatics, 2020, 46: 101201
https://doi.org/10.1016/j.aei.2020.101201
44 P Chun, S Izumi, T Yamane. Automatic detection method of cracks from concrete surface imagery using two-step light gradient boosting machine. Computer-Aided Civil and Infrastructure Engineering, 2021, 36(1): 61–72
https://doi.org/10.1111/mice.12564
45 S Kookalani, B Cheng, J L C Torres. Structural performance assessment of GFRP elastic gridshells by machine learning interpretability methods. Frontiers of Structural and Civil Engineering, 2022, 16(10): 1249–1266
https://doi.org/10.1007/s11709-022-0858-5
46 S Mangalathu, H Jang, S H Hwang, J S Jeon. Data-driven machine-learning-based seismic failure mode identification of reinforced concrete shear walls. Engineering Structures, 2020, 208: 110331
https://doi.org/10.1016/j.engstruct.2020.110331
47 M Z Naser, V Kodur, H T Thai, R Hawileh, J Abdalla, V V Degtyarev. StructuresNet and FireNet: Benchmarking databases and machine learning algorithms in structural and fire engineering domains. Journal of Building Engineering, 2021, 44: 102977
https://doi.org/10.1016/j.jobe.2021.102977
48 Z Ding, W Zhang, D Zhu. Neural-network based wind pressure prediction for low-rise buildings with genetic algorithm and Bayesian optimization. Engineering Structures, 2022, 260: 114203
https://doi.org/10.1016/j.engstruct.2022.114203
49 T Lookman, F Alexander, K Rajan. Information Science for Materials Discovery and Design. Switzerland: Springer, 2016
50 A Mathern, O S Steinholtz, A Sjöberg, M Önnheim, K Ek, R Rempling, E Gustavsson, M Jirstrand. Multi-objective constrained Bayesian optimization for structural design. Structural and Multidisciplinary Optimization, 2021, 63(2): 689–701
https://doi.org/10.1007/s00158-020-02720-2
51 S Sajedi, X Liang. Deep generative Bayesian optimization for sensor placement in structural health monitoring. Computer-Aided Civil and Infrastructure Engineering, 2022, 37(9): 1109–1127
https://doi.org/10.1111/mice.12799
52 W Zhang, C Wu, H Zhong, Y Li, L Wang. Prediction of undrained shear strength using extreme gradient boosting and random forest based on Bayesian optimization. Geoscience Frontiers, 2021, 12(1): 469–477
https://doi.org/10.1016/j.gsf.2020.03.007
53 P G Asteris. The FP4026 Research Database on the fundamental period of RC infilled frame structures. Data in Brief, 2016, 9: 704–709
https://doi.org/10.1016/j.dib.2016.10.002
54 G Ke, Q Meng, T Finley, T Wang, W Chen, W Ma, Q Ye, T Liu. LightGBM: A highly efficient gradient boosting decision tree. In: Proceedings of the 31st International Conference on Neural Information Processing Systems. New York: Curran Associates Inc., 2017, 3149–3157
55 J H Friedman. Greedy function approximation: A gradient boosting machine. Annals of Statistics, 2001, 29(5): 1189–1232
https://doi.org/10.1214/aos/1013203451
56 E Brochu, V M Cora, N de Freitas. A Tutorial on Bayesian Optimization of Expensive Cost Functions, with Application to Active User Modeling and Hierarchical Reinforcement Learning. 2010, arXiv: 1012.2599
57 P I Frazier. A Tutorial on Bayesian Optimization. 2018. arXiv: 1807.02811
58 B Shahriari, K Swersky, Z Wang, R P Adams, N de Freitas. Taking the human out of the loop: A review of Bayesian Optimization. Proceedings of the IEEE, 2016, 104(1): 148–175
https://doi.org/10.1109/JPROC.2015.2494218
59 C E Rasmussen. Gaussian Processes in Machine Learning. In: Bousquet O, von Luxburg U, Rätsch G, eds. Advanced Lectures on Machine Learning. Berlin: Springer, 2004, 63–71
60 J Snoek, H Larochelle, R P Adams. Practical Bayesian optimization of machine learning algorithms. In: Proceedings of the 25th International Conference on Neural Information Processing Systems—Volume 2. New York: Curran Associates Inc., 2012, 2951–2959