Abstract
Reinforced concrete (RC) infrastructure can be retrofitted by adhesively bonding fiber-reinforced polymers (FRPs) to the tension face. In the FRP-to-concrete bonding system, debonding of the FRP plate from the member is the most common failure type. Predicting the bond strength of FRP-to-concrete joints using traditional predictive models remains unsatisfactory because of the highly nonlinear relationships between the bond strength and a large number of influencing variables. To address this issue, this study proposes a metaheuristic-optimized least-squares support vector regression (LSSVR) model to predict the bond strength of FRP-to-concrete joints. The hyperparameters of the LSSVR model are tuned using the recently proposed beetle antennae search (BAS) algorithm, and the Levy flight is incorporated into BAS to improve its searching efficiency. The proposed model is trained on a dataset collected from internationally published literature. To understand the importance of each input variable for the bond strength, the variable importance is calculated using the random forest algorithm. The results show that the proposed LBAS-LSSVR model has high prediction accuracy, as indicated by a high correlation coefficient (0.983) and low root mean square error (1.99 MPa) on the test set. The width of the FRP is the variable to which the bond strength is most sensitive. The proposed model can be extended to solve other regression problems in structural engineering.
1 Introduction
Reinforced concrete (RC) beams can be retrofitted or strengthened by externally bonding fiber-reinforced polymer (FRP) plates. The flexural strength of an RC beam can be increased by bonding FRP plates to its tension face (soffit) [20]. This technique has been widely used in bridges, buildings and tunnel linings owing to a variety of advantages such as good corrosion resistance, ease of site handling and minimal increase in weight and structural size [28, 45, 55].
In the FRP-to-concrete bonding system, debonding of the FRP plate from the member is usually caused by concrete fracture close to the FRP-to-concrete interface. When debonding failure occurs, the strain in the bonded FRP plate is reduced to 20–50% of its rupture strain, so the FRP plate is not fully utilized [17, 26]. Moreover, this premature failure reduces the deformability of the strengthened member [5]. A variety of factors must be considered when explaining the bond mechanism, such as the FRP material properties, the adhesive, and the RC properties [10]. Many experiments have been conducted to evaluate the bond strength. The two most widely applied test types are modified beam tests [56, 68] and single shear tests [3, 11, 12], as they can be easily implemented in the laboratory. However, because a large number of variables influence the bond strength, it is impractical to evaluate it accurately using time-consuming and costly laboratory experiments alone.
To address this issue, statistical models have been proposed to predict the bond strength [14, 34, 48]. These models evaluate the bond strength of FRP-to-concrete joints by combining the fracture energy, the axial rigidity of the FRP system and the bond width. However, the empirical relationships in these models are generally established on small test datasets and consider only a limited number of factors, so their generalization ability is limited.
To overcome this limitation, machine learning (ML) algorithms have been used to predict the bond strength of FRP-to-concrete joints, as they do not rely on explicit equations. Among ML methods, support vector regression (SVR) has been extensively applied in civil engineering, for example to predict unconfined compressive strength (UCS) [19, 63], elastic modulus [50, 59], tensile strength [60] and fresh-state properties [47]. SVR solves regression problems by accurately estimating implicit functions using kernel tricks and the structural risk minimization principle [31]. It has several advantages over other ML models, including excellent performance on small datasets [13] and no convergence to local solutions [39]. Despite these benefits, SVR is computationally demanding. Greater computational efficiency can be achieved by introducing equality constraints, which yield closed-form, least-squares-type solutions; this form of SVR is called least-squares support vector regression (LSSVR) [52]. The prediction accuracy and generalization ability of LSSVR depend on its hyperparameters. To find optimal hyperparameters, this study employs a recently proposed metaheuristic, the beetle antennae search (BAS) algorithm, which is easy to implement and converges quickly because a single beetle searches rather than a swarm. However, BAS can easily become trapped in a local optimum [25, 64]. To address this, this study modifies BAS by incorporating the Levy flight to improve its searching efficiency. Based on this modification, a Levy-BAS-based LSSVR (LBAS-LSSVR) method is proposed for predicting the bond strength of FRP-to-concrete joints. This study makes the following contributions to the literature:
(i) It is the first to propose a metaheuristic-optimized regression model to predict the bond strength of FRP-to-concrete joints; the prediction accuracy of the proposed model is higher than that of other ML approaches.
(ii) The recently developed BAS algorithm is used for the first time to tune the hyperparameters of LSSVR; it outperforms other metaheuristic algorithms in terms of simplicity of implementation, rapid convergence, and ability to reach the global optimum.
(iii) The Levy flight is incorporated into the BAS algorithm to avoid entrapment in local optima, which markedly improves the searching efficiency of BAS.
(iv) The importance of each influencing variable for the bond strength of FRP-to-concrete joints is computed using the random forest algorithm.
The remainder of this paper will describe the applied ML methods in Sect. 2, construction of the proposed regression model in Sect. 3, and evaluation of the performance of the proposed model in Sect. 4.
2 Overview of the used algorithms
2.1 Least-squares support vector regression
Least-squares support vector regression (LSSVR) is the least-squares version of support vector regression (SVR). SVR converts a nonlinear problem into a linear one by mapping the data from the sample space into a higher-dimensional feature space using a kernel trick, allowing it to learn the complicated relationship between predictors and outputs. SVR is extensively used owing to its advantages, such as excellent generalization ability, rapid learning speed and good noise tolerance [9, 46].
Suppose a training dataset of n points is given as follows:

$$D = \left\{ {\left( {{\mathbf{x}}_{1} ,y_{1} } \right),\left( {{\mathbf{x}}_{2} ,y_{2} } \right), \ldots ,\left( {{\mathbf{x}}_{n} ,y_{n} } \right)} \right\}$$

where each \({\mathbf{x}}_{i}\) is an l-dimensional real vector and \(y_{i}\) is the scalar regression value. The regression function can be derived from this dataset in the following form:

$$f\left( {\mathbf{x}} \right) = {\mathbf{w}}^{T} \varphi \left( {\mathbf{x}} \right) + b$$
where \(\varphi \left( {\mathbf{x}} \right)\) is a nonlinear mapping function; w represents the weight vector; and b is the bias. f(x) is required to be as flat as possible; flatness in Eq. (2) means that the Euclidean norm \(\left\| {\mathbf{w}} \right\|^{2}\) needs to be minimized [46]. If, for each instance \({\mathbf{x}}_{i}\), the deviation between f(x<sub>i</sub>) and \(y_{i}\) is less than \(\varepsilon\) (the largest tolerated error), the function f(x) is said to be found. A loss function using the \(\varepsilon\)-insensitive factor is employed to measure the degree of deviation:
This function indicates that the training points within the \(\varepsilon\)-tube are not penalized and only the data situated on or outside the \(\varepsilon\)-tube will be used as support vectors to build f(x). According to the structural risk minimization [2], the problem can be written as follows:
Some errors are allowed sometimes, and therefore slack variables \(\xi_{i}\) and \(\xi_{i}^{*}\) are introduced to cope with infeasible constraints. The above problem can then be converted into the following convex optimization form:
where C is a penalty parameter to determine the trade-off between the flatness of f(x) and the penalizing extent of the sample outside the tube. An example of nonlinear SVR with an ε-tube is shown in Fig. 1.
To address problems with constraints, Lagrange multipliers can be used as follows:
where \(\alpha_{i} \ge 0\), \(\alpha_{i}^{*} \ge 0\), \(\mu_{i} \ge 0\) and \(\mu_{i}^{*} \ge 0\) are Lagrange multipliers. When the constraint functions have strong duality and the objective function is differentiable, KKT conditions must be satisfied for each pair of the primal and dual optimal points [6] as follows:
In addition, the product between the constraints and the dual variables ought to be 0 based on the KKT condition at the optimal solution:
By solving the above equations, the Lagrange dual problem can be derived as follows:
From Eq. 8, the weight vector can be obtained as \({\mathbf{w}} = \sum_{i = 1}^{n} \left( {\alpha_{i} - \alpha_{i}^{*} } \right)\varphi \left( {{\mathbf{x}}_{\varvec{i}} } \right)\), and therefore the regression function can be derived as:
For LSSVR, the total error in SVR is replaced by the sum of the squared error variable \(\xi_{i}\). Therefore, the primal optimization problem in Eq. 5 can be reorganized for the LSSVR model as follows:
In a way similar to solving the SVR problem, the Lagrangian function should be applied to obtain the dual optimization problem of LSSVR as follows:
where \(\lambda_{i}\) are the Lagrangian multipliers. The following conditions should be satisfied to achieve the optimal solutions:
After the w and \(\xi\) terms are eliminated, the solutions can be obtained as follows:
where \({\mathbf{e}}_{n \times 1} = [1,1, \ldots ,1]^{T}\); \(\varvec{\lambda} = [\lambda_{1} ,\lambda_{2} , \ldots ,\lambda_{n} ]\); \(\varvec{Y} = [y_{1} ,y_{2} , \ldots ,y_{n} ]\); \({\mathbf{I}}_{n}\) is the identity matrix; \({\varvec{\Omega}}_{{\varvec{i},\varvec{j}}} = \varphi^{T} \left( {{\mathbf{x}}_{i} } \right)\varphi \left( {{\mathbf{x}}_{j} } \right) = {\mathcal{X}}\left( {{\mathbf{x}}_{i} ,{\mathbf{x}}_{j} } \right)\); and \({\mathcal{X}}\) denotes the kernel function. After obtaining \(\varvec{\lambda}\) and b from Eq. (13), the regression model for LSSVR can be expressed as:

$$f\left( {\mathbf{x}} \right) = \sum\limits_{i = 1}^{n} {\lambda_{i} {\mathcal{X}}\left( {{\mathbf{x}},{\mathbf{x}}_{i} } \right)} + b$$
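As a concrete illustration, the LSSVR linear system can be solved directly with NumPy. The sketch below is a minimal, self-contained LSSVR with an RBF kernel; the function names and default hyperparameter values (C, γ) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    # Pairwise RBF kernel: K[i, j] = exp(-gamma * ||A[i] - B[j]||^2)
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def lssvr_fit(X, y, C=100.0, gamma=1.0):
    # Solve the LSSVR saddle-point system for the bias b and multipliers lambda:
    # [ 0    e^T         ] [ b      ]   [ 0 ]
    # [ e    Omega + I/C ] [ lambda ] = [ Y ]
    n = X.shape[0]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, gamma) + np.eye(n) / C
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]          # b, lambda

def lssvr_predict(X_train, b, lam, X_new, gamma=1.0):
    # f(x) = sum_i lambda_i * K(x, x_i) + b
    return rbf_kernel(X_new, X_train, gamma) @ lam + b
```

With a large penalty C the model nearly interpolates the training data, while a smaller C trades fit for smoothness.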
2.2 Beetle antennae search algorithm
Beetle antennae search (BAS) is a recently proposed metaheuristic algorithm for solving optimization problems, inspired by the searching behavior of beetles [25, 62]. A beetle searches neighboring areas using its two antennae and moves toward the location with the higher concentration of odour. Assume that the position of the beetle is represented by a vector \({\mathbf{x}}^{i}\) at the ith time instant (i = 1, 2, …). The fitness function f(x) represents the concentration of odour at position x, with its maximum value denoting the source point of the odour. The algorithm searches for the global optimum of this general function in a multi-dimensional space.
The searching behavior of the beetle can be defined as follows:

$${\mathbf{b}} = \frac{{rnd\left( {k,1} \right)}}{{\left\| {rnd\left( {k,1} \right)} \right\|}},\quad {\mathbf{x}}_{r} = {\mathbf{x}}^{i} + d{\mathbf{b}},\quad {\mathbf{x}}_{l} = {\mathbf{x}}^{i} - d{\mathbf{b}}$$

where b is a normalized random unit vector; \(rnd(\cdot)\) is a random function; k is the dimensionality of the position; \({\mathbf{x}}_{r}\) and \({\mathbf{x}}_{l}\) denote the positions in the right-hand and left-hand search areas, respectively; and d represents the antennae length.
The detecting behavior of the beetle is described by:

$${\mathbf{x}}^{i + 1} = {\mathbf{x}}^{i} + \delta^{i} {\mathbf{b}}\,{\text{sign}}\left( {f\left( {{\mathbf{x}}_{r} } \right) - f\left( {{\mathbf{x}}_{l} } \right)} \right)$$

where \(\delta^{i}\) represents the step size at the ith iteration and \({\text{sign}}\left( \cdot \right)\) is the sign function. The step size updating formula is written as:

$$\delta^{i + 1} = \eta \delta^{i}$$

where \(\eta\) is the attenuation coefficient of the step size.
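The searching and detecting rules above can be sketched in a few lines of NumPy. The following is a minimal minimization variant (the formulation above maximizes odour concentration; minimizing f is equivalent up to a sign), with illustrative, assumed values for the step-size and antenna-length parameters:

```python
import numpy as np

def bas_minimize(f, x0, d=1.0, step=1.0, eta=0.95, iters=100, seed=0):
    # Beetle antennae search: a single beetle probes left/right along a random
    # direction and steps toward the antenna that senses the better (lower) value.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for _ in range(iters):
        b = rng.normal(size=x.size)
        b /= np.linalg.norm(b) + 1e-12                 # normalized random direction
        xr, xl = x + d * b, x - d * b                  # right/left antenna positions
        x = x - step * b * np.sign(f(xr) - f(xl))      # move toward the lower side
        fx = f(x)
        if fx < best_f:                                # keep the best position found
            best_x, best_f = x.copy(), fx
        step *= eta                                    # attenuate the step size
    return best_x, best_f
```

Because only one candidate solution is updated per iteration, each iteration costs just two extra function evaluations, which is why BAS converges quickly compared with swarm-based metaheuristics.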
3 Methodology
3.1 Dataset description
The dataset, comprising 150 instances, is collected from internationally published literature [11, 36, 41, 43, 53, 54, 58, 61, 67]. The input variables are the width of the prism (WP), width of FRP (WF), modulus of elasticity of FRP (EF), thickness of FRP (TF), uniaxial compressive strength of the concrete cylinder (UCS), and bond length (BL). The output variable is the bond strength of the FRP-to-concrete joints (BS). The statistics of the parameters are summarized in Table 1. The relationships between the input variables are visualized using a correlation matrix (Fig. 2). It can be seen that the correlations between any two input variables are low, indicating that these variables will not cause multicollinearity issues [16].
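The multicollinearity check in Fig. 2 amounts to computing pairwise Pearson correlations between the six inputs. The sketch below uses randomly generated stand-in data (an assumption; with the real dataset, `X` would hold the 150 collected instances in the Table 1 column order WP, WF, EF, TF, UCS, BL):

```python
import numpy as np

# Stand-in for the 150 x 6 matrix of input variables (WP, WF, EF, TF, UCS, BL).
rng = np.random.default_rng(7)
X = rng.uniform(size=(150, 6))

# 6 x 6 matrix of pairwise Pearson correlations between the input variables;
# off-diagonal entries near zero indicate no multicollinearity concern.
corr = np.corrcoef(X, rowvar=False)
```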
3.2 Levy flight BAS algorithm
In the traditional BAS algorithm, the step size of a beetle is fixed or decreases with each iteration, so the algorithm can easily become trapped in a local optimum. To address this issue, this study introduces the Levy flight to adjust the step size of BAS and proposes a new Levy flight BAS (LBAS) algorithm for hyperparameter tuning. The pseudocode of LBAS is shown in Algorithm 1. The Levy flight has been used in many optimization algorithms with promising results [40, 44, 57].
In the implementation, the step size in LBAS is updated as follows
where \(\eta\) is the same as that in Eq. (21); the operator ⊕ denotes entrywise multiplication; and β is defined as follows:
where α ∼ U(0, 1); \(f_{w}\) and \(f_{b}\) are the historical worst and best fitness values; and μ is a coefficient, set to \(\mu = 10^{-5}\) in this study. Levy is defined as follows:
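A common way to sample the Levy-distributed term (an assumption here, since several samplers exist) is Mantegna's algorithm, sketched below:

```python
import numpy as np
from math import gamma as gamma_fn, pi, sin

def levy_step(size, beta=1.5, rng=None):
    # Mantegna's algorithm: draws heavy-tailed Levy-stable steps, mixing many
    # short moves with occasional long jumps that help escape local optima.
    rng = rng if rng is not None else np.random.default_rng()
    sigma_u = (gamma_fn(1 + beta) * sin(pi * beta / 2)
               / (gamma_fn((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, size)
    v = rng.normal(0.0, 1.0, size)
    return u / np.abs(v) ** (1 / beta)
```

The exponent β controls the tail heaviness: smaller β yields longer occasional jumps and therefore more aggressive exploration.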
3.3 Evaluation and validation methods
3.3.1 Performance evaluation methods
The following performance measures are applied to assess the predictive performance of the proposed model.
- Root-mean-square error (RMSE)
RMSE measures the difference between predicted and observed values using the following function [24]:

$${\text{RMSE}} = \sqrt {\frac{1}{n}\sum\limits_{i = 1}^{n} {\left( {y_{i}^{*} - y_{i} } \right)^{2} } }$$

where \(y_{i}^{*}\) is the predicted value; \(y_{i}\) is the actual value; and n is the number of data samples.
- Correlation coefficient (R)
R measures the strength of the correlation between predicted and observed values and is defined as follows [4]:

$$R = \frac{{\sum\nolimits_{i = 1}^{n} {\left( {y_{i}^{*} - \overline{{y^{*} }} } \right)\left( {y_{i} - \overline{y} } \right)} }}{{\sqrt {\sum\nolimits_{i = 1}^{n} {\left( {y_{i}^{*} - \overline{{y^{*} }} } \right)^{2} \sum\nolimits_{i = 1}^{n} {\left( {y_{i} - \overline{y} } \right)^{2} } } } }}$$

where \(\overline{{y^{*} }}\) and \(\overline{y}\) are the mean values of the predicted and observed values, respectively.
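Both measures can be implemented directly; a minimal NumPy sketch:

```python
import numpy as np

def rmse(y_true, y_pred):
    # Root-mean-square error between observed and predicted values
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_pred - y_true) ** 2)))

def corr_coef(y_true, y_pred):
    # Pearson correlation coefficient R between observed and predicted values
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    yt, yp = y_true - y_true.mean(), y_pred - y_pred.mean()
    return float((yt * yp).sum() / np.sqrt((yt ** 2).sum() * (yp ** 2).sum()))
```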
3.3.2 k-fold cross-validation
Several methods for validating regression models have been used, such as the simple substitution method [8], the bootstrap method [15], the holdout method [29], and the bolstered method [7]. Among these, k-fold cross-validation (CV) is probably the most widely applied validation method for training data [49]. In this study, k was set to 10 according to recommendations and the size of the dataset [30]. During hyperparameter tuning, the training set is split into 10 folds. The algorithms are trained on nine folds and validated on the remaining one. This procedure is repeated 10 times, with a different fold employed as the validation fold each time. The 10 results are then averaged to give a single estimate.
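The 10-fold procedure can be sketched as follows; `fit` and `predict` are placeholders for any regression model (illustrative names, not the authors' code):

```python
import numpy as np

def kfold_indices(n, k=10, seed=42):
    # Shuffle the sample indices and split them into k roughly equal folds.
    rng = np.random.default_rng(seed)
    return np.array_split(rng.permutation(n), k)

def cross_val_rmse(fit, predict, X, y, k=10, seed=42):
    # Train on k-1 folds, validate on the held-out fold, and average the RMSE.
    folds = kfold_indices(len(y), k, seed)
    scores = []
    for i in range(k):
        val = folds[i]
        tr = np.concatenate([folds[j] for j in range(k) if j != i])
        model = fit(X[tr], y[tr])
        err = predict(model, X[val]) - y[val]
        scores.append(np.sqrt(np.mean(err ** 2)))
    return float(np.mean(scores))
```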
3.4 Hyperparameter tuning
This study applies the radial basis function (RBF) as the kernel function for LSSVR. The reasons for choosing this function are as follows: (1) the polynomial kernel function has more parameters to tune, leading to a more difficult model selection process, and numerical difficulties such as underflow or overflow may be unavoidable; (2) the sigmoid kernel function is conditionally positive definite only for parameters in certain ranges, and it behaves similarly to the RBF kernel in those ranges [33]; and (3) the linear kernel function is a particular case of the RBF kernel [27].
Two hyperparameters need to be tuned on the training dataset when the RBF kernel is used: the kernel parameter γ and the slack penalty coefficient C. The hyperparameters are tuned on the training set, which includes 70% of the instances, as recommended in previous literature [23]. The training dataset is split into 10 subsets; the beetle searches for the optimal hyperparameters on 9 subsets and the RMSE is calculated on the remaining subset. After convergence, the optimal hyperparameters for this fold are selected. This process is repeated 10 times following ten-fold CV. After the ten folds, the hyperparameters corresponding to the smallest RMSE are selected as the optimal hyperparameters used in this study. Finally, the predictive performance of the model is tested on the test dataset (containing the remaining 30% of the data points). Independent of the training dataset, this dataset is used only for assessing the performance of the model. If a model fits well on both the training and test datasets, minimal overfitting has taken place [42].
3.5 Construction of the metaheuristic-optimized regression system
On the training set, the hyperparameters of the proposed model are tuned by 10-fold CV and LBAS. The predictive performance of the model with the optimal hyperparameters is then evaluated on the test dataset. The whole procedure of the implementation of the proposed LBAS-LSSVR system is shown in Fig. 3.
4 Results and discussion
4.1 Performance of the LBAS algorithm
To assess the performance of the proposed LBAS algorithm, eleven benchmark functions are used, including six unimodal and five multimodal functions [37]. The test results are summarized in Table 2. Here, f7 is taken as an example to show the searching trajectories and convergence curves of the LBAS algorithm (see Fig. 4). From the expression of f7, it is known that y decreases without bound as x tends to \(- \infty\); therefore, the smaller the obtained y, the better the performance of an optimization algorithm. It can be seen that the beetle gradually moves "deeper and deeper". For the BAS algorithm, y stops decreasing at the 23rd iteration (Fig. 4e), whereas for the LBAS, y reaches a much smaller value (about seven times smaller) after convergence (after 550 iterations) (Fig. 4f). It can also be observed from Table 2 that the solutions obtained by the LBAS algorithm are much closer to the global optima, indicating the high searching efficiency of the LBAS algorithm.
4.2 Results of hyperparameter tuning
As stated before, the hyperparameters (C and γ) of LSSVR are tuned using LBAS and ten-fold CV. At each fold, the LBAS algorithm searches for the hyperparameters on the training folds and, after convergence, a set of hyperparameters is selected. To validate the performance of the LSSVR model with this set of hyperparameters, the RMSE on the validation fold is calculated. After ten folds, ten sets of hyperparameters (C and γ) are obtained, as shown in Table 3. The hyperparameters corresponding to the smallest RMSE on the validation fold (typeset in bold) are selected as the optimal hyperparameters for further study (those of the tenth fold). The RMSE versus iteration curve for the best fold is plotted in Fig. 5. It can be seen that the RMSE curve converges within 20 iterations, indicating that LBAS is very efficient in tuning hyperparameters. In addition, the significant decrease in RMSE shows that LBAS can find optimal hyperparameters for LSSVR.
4.3 Performance of the LBAS-LSSVR algorithm
Figure 6 shows the predicted and actual BS values on the training and test sets. The bars on the horizontal line represent the differences between the actual and predicted values. It can be seen that most of the differences are small (apart from a few outliers), indicating that the LBAS-LSSVR model has high prediction accuracy. Figure 7 shows the correlation between the predicted and actual BS values on the training and test sets. The data points lie close to the ideal fit, with correlation coefficients of 0.9842 and 0.9828, respectively, indicating excellent prediction performance of the LBAS-LSSVR model. In addition, the similar and low RMSE values on the training and test sets suggest that there are no under-fitting or over-fitting issues.
4.4 Comparison of the LBAS-LSSVR model with other models
The predictive performance of the proposed model is compared with several widely used ML models, including back propagation neural network (BPNN) [21, 51], decision tree (DT) [32, 65, 66], k nearest neighbors (kNN) [1], logistic regression (LR) [22], and multiple linear regression (MLR) [38]. The prediction performances of different models in terms of RMSE and R on training and test sets are summarized in Table 4. It can be observed that the LBAS-LSSVR model achieves the highest prediction accuracy, as indicated by the highest R value (0.983) and the lowest RMSE value (1.99 MPa) on the test set, followed by the kNN with R value of 0.947 and the RMSE of 3.52 MPa. DT performs the worst on the test set (RMSE = 5.42 MPa, R = 0.880).
Figure 8 displays the distribution of the residuals between the actual and predicted BS values using a box plot. The box plot is constructed from five metrics: the minimum value (the lower whisker), the first quartile (the lower edge of the box), the median (the red line in the box), the third quartile (the upper edge of the box), and the maximum value (the upper whisker). It can be seen that LBAS-LSSVR achieves the smallest residuals for all five metrics, followed by kNN, while LR performs worst, with the widest distribution of residuals. However, it should be noted that although the kNN model achieves comparatively high prediction accuracy, slight overfitting occurs, as its RMSE on the test set is 4.5 times larger than that on the training set (see Table 4).
A Taylor diagram is employed to quantify the degree of correspondence between the actual and predicted BS values in terms of the standard deviation, RMSE and correlation coefficient. A model lies nearest the "Actual" point if its predicted BS values agree well with the actual ones, indicating comparatively low RMSE and high correlation. It can be observed that the LBAS-LSSVR model is situated closest to the "Actual" point, suggesting that this model has the highest prediction accuracy. LR shows larger variability than the actual values, while DT has a lower correlation coefficient; in both cases this leads to a comparatively large RMSE (Fig. 9).
4.5 Variable importance
In this study, the random forest (RF) algorithm is used to calculate the variable importance [18]. The procedure is as follows: the out-of-bag sample for a tree t is denoted OOBt and its prediction error is denoted errOOBt. The values of an influencing variable X are permuted randomly in OOBt to obtain a permuted sample \(O\tilde{O}B_{t}\), and the error of predictor t on this sample, \(errO\tilde{O}B_{t}\), is then calculated. The variable importance of X can be written as

$$VI\left( X \right) = \frac{1}{ntree}\sum\limits_{t} {\left( {errO\tilde{O}B_{t} - errOOB_{t} } \right)}$$

where \(ntree\) is the number of trees in the forest.
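The permutation idea generalizes beyond random forests; the sketch below computes a permutation-importance score for any fitted predictor (a simplified stand-in for the per-tree out-of-bag computation in the RF algorithm, with illustrative function names):

```python
import numpy as np

def permutation_importance(predict, X, y, rng=None):
    # For each variable j: permute column j, re-predict, and report the
    # increase in mean squared error over the unpermuted baseline.
    rng = rng if rng is not None else np.random.default_rng(0)
    base = np.mean((predict(X) - y) ** 2)
    scores = []
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] = rng.permutation(Xp[:, j])
        scores.append(np.mean((predict(Xp) - y) ** 2) - base)
    return np.array(scores)
```

Variables whose permutation sharply degrades the prediction receive high scores; variables the model effectively ignores score near zero.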
The results show that WF is the variable to which the bond strength of FRP-to-concrete joints is most sensitive, with the highest influence score (3.4776), followed by BL (1.1419), while UCS is the least sensitive (0.1195). TF, WP and EF fall in between (0.7917, 0.7360 and 0.6745, respectively) (see Fig. 10). It is unsurprising that the variables related to the rigidity of the FRP have a comparatively high influence on the bond strength of the FRP-to-concrete joints. Debonding cracks start to develop when the principal tensile stress reaches the concrete tensile strength; therefore, the bond strength is relatively insensitive to the UCS [35].
5 Conclusions
This study predicts the bond strength of FRP-to-concrete joints using an intelligent regression model that achieves high prediction accuracy and can be applied to other structural engineering problems. The recently proposed BAS algorithm is introduced to search for the optimal hyperparameters of the LSSVR model, and the Levy flight is employed to improve the searching efficiency of BAS. The results show that incorporating the Levy flight into the BAS algorithm avoids premature convergence to local optima and that LBAS is very efficient in tuning the hyperparameters of LSSVR. The prediction accuracy of the proposed LBAS-LSSVR is high, as indicated by a high correlation coefficient (0.9828) and low RMSE (1.99 MPa) on the test set. The variable importance results indicate that the width of the FRP has the most significant influence on the bond strength, while the concrete UCS is the least sensitive variable.
It should be noted, however, that the data samples used in this study come from laboratory experiments. Future work should therefore: (1) include a wider range of influencing variables (e.g., tensile strength of FRP, cement type, water-to-cement ratio of concrete, and elastic modulus of concrete) to increase the generalization ability of the proposed model, and (2) increase the volume of data used for training and testing the proposed model, which will enable more refined tuning of the hyperparameters and further improve the model's ability to extract meaningful patterns from noisy data. Since it is not convenient for engineers to use algorithms directly in practice, a GUI incorporating the proposed model could be developed in future work to facilitate the design of FRP-to-concrete composite joints for the construction of infrastructure and buildings.
References
Altman NS (1992) An introduction to kernel and nearest-neighbor nonparametric regression. Am Stat 46(3):175–185
Basak D, Pal S, Patranabis DC (2007) Support vector regression. Neural Inf Process Lett Rev 11(10):203–224
Bizindavyi L, Neale K (1999) Transfer lengths and bond strengths for composites bonded to concrete. J Compos Constr 3(4):153–160
Boddy R, Smith G (2009) Statistical methods in practice: for scientists and technologists. Wiley, New York
Bonacci J (1996) Strength, failure mode and deformability of concrete beams strengthened externally with advanced composites. Paper presented at the proceedings of the 2nd international conference on advanced composite materials in bridges and structures, ACMBS-II, Montreal 1996
Boyd S, Vandenberghe L (2004) Convex optimization. Cambridge University Press, Cambridge
Braga-Neto U, Dougherty E (2004) Bolstered error estimation. Pattern Recognit 37(6):1267–1281
Braga-Neto U, Hashimoto R, Dougherty ER, Nguyen DV, Carroll RJ (2004) Is cross-validation better than resubstitution for ranking genes? Bioinformatics 20(2):253–258
Burges CJ (1998) A tutorial on support vector machines for pattern recognition. Data Min Knowl Disc 2(2):121–167
Ceroni F, Ferracuti B, Pecce M, Savoia M (2014) Assessment of a bond strength model for FRP reinforcement externally bonded over masonry blocks. Compos B Eng 61:147–161
Chajes MJ, Finch WW, Thomson TA (1996) Bond and force transfer of composite-material plates bonded to concrete. Struct J 93(2):209–217
Chajes MJ, Januszka TF, Mertz DR, Thomson TA, Finch WW (1995) Shear strengthening of reinforced concrete beams using externally applied composite fabrics. Struct J 92(3):295–303
Cristianini N, Shawe-Taylor J (2000) An introduction to support vector machines and other kernel-based learning methods. Cambridge University Press, Cambridge
Dai J, Ueda T, Sato Y (2005) Development of the nonlinear bond stress–slip model of fiber reinforced plastics sheet–concrete interfaces with a simple method. J Compos Constr 9(1):52–62
Efron B, Tibshirani RJ (1994) An introduction to the bootstrap. CRC Press, Boca Raton
Farrar DE, Glauber RR (1967) Multicollinearity in regression analysis: the problem revisited. Rev Econ Stat 1967:92–107
Fu B, Chen G, Teng J (2017) Mitigation of intermediate crack debonding in FRP-plated RC beams using FRP U-jackets. Compos Struct 176:883–897
Genuer R, Poggi J-M, Tuleau-Malot C (2010) Variable selection using random forests. Pattern Recognit Lett 31(14):2225–2236
Gupta S (2007) Support vector machines based modelling of concrete strength. World Acad Sci Eng Technol 36:305–311
Hadi M (2003) Retrofitting of shear failed reinforced concrete beams. Compos Struct 62(1):1–6
Hecht-Nielsen R (1992) Theory of the backpropagation neural network. In: Neural networks for perception. Elsevier, pp 65–93
Hosmer DW Jr, Lemeshow S, Sturdivant RX (2013) Applied logistic regression, vol 398. Wiley, New York
Hsu C-W, Chang C-C, Lin C-J (2003) A practical guide to support vector classification
Hyndman RJ, Koehler AB (2006) Another look at measures of forecast accuracy. Int J Forecast 22(4):679–688
Jiang X, Li S (2017) BAS: beetle antennae search algorithm for optimization problems. arXiv preprint arXiv:1710.10724
Kalfat R, Gadd J, Al-Mahaidi R, Smith ST (2018) An efficiency framework for anchorage devices used to enhance the performance of FRP strengthened RC members. Constr Build Mater 191:354–375
Keerthi SS, Lin C-J (2003) Asymptotic behaviors of support vector machines with Gaussian kernel. Neural Comput 15(7):1667–1689
Khalifa A, Nanni A (2002) Rehabilitation of rectangular simply supported RC beams with shear deficiencies using CFRP composites. Constr Build Mater 16(3):135–146
Kim J-H (2009) Estimating classification error rate: repeated cross-validation, repeated hold-out and bootstrap. Comput Stat Data Anal 53(11):3735–3745
Kohavi R (1995) A study of cross-validation and bootstrap for accuracy estimation and model selection. Paper presented at the Ijcai
Kotsiantis SB, Zaharakis I, Pintelas P (2007) Supervised machine learning: a review of classification techniques. Emerg Artif Intell Appl Comput Eng 160:3–24
Lewis RJ (2000) An introduction to classification and regression tree (CART) analysis. Paper presented at the annual meeting of the Society for Academic Emergency Medicine, San Francisco, California
Lin H-T, Lin C-J (2003) A study on sigmoid kernels for SVM and the training of non-PSD kernels by SMO-type methods. Neural Comput 3:1–32
Lu X, Teng J, Ye L, Jiang J (2005) Bond–slip models for FRP sheets/plates bonded to concrete. Eng Struct 27(6):920–937
Mander JB, Priestley MJ, Park R (1988) Theoretical stress-strain model for confined concrete. J Struct Eng 114(8):1804–1826
Mashrei MA, Seracino R, Rahman M (2013) Application of artificial neural networks to predict the bond strength of FRP-to-concrete joints. Constr Build Mater 40:812–821
Mirjalili S, Mirjalili SM, Yang X-S (2014) Binary bat algorithm. Neural Comput Appl 25(3–4):663–681
Neter J, Kutner MH, Nachtsheim CJ, Wasserman W (1996) Applied linear statistical models, vol 4. Irwin, Chicago
Owolabi TO, Oloore LE, Akande KO, Olatunji SO (2019) Modeling of magnetic cooling power of manganite-based materials using computational intelligence approach. Neural Comput Appl 31(2):1291–1298
Pavlyukevich I (2007) Lévy flights, non-local search and simulated annealing. J Comput Phys 226(2):1830–1844
Ren H (2003) Study on basic theories and long time behavior of concrete structures strengthened by fiber reinforced polymers. Dalian University of Technology
Ripley BD (2007) Pattern recognition and neural networks. Cambridge University Press, Cambridge
Sharma S, Ali MM, Goldar D, Sikdar P (2006) Plate–concrete interfacial bond strength of FRP and metallic plated concrete specimens. Compos B Eng 37(1):54–63
Shlesinger MF (2006) Mathematical physics: search research. Nature 443(7109):281
Smith ST, Teng JG (2002) FRP-strengthened RC beams. I: review of debonding strength models. Eng Struct 24(4):385–395. https://doi.org/10.1016/S0141-0296(01)00105-5
Smola AJ, Schölkopf B (2004) A tutorial on support vector regression. Stat Comput 14(3):199–222. https://doi.org/10.1023/B:STCO.0000035301.49549.88
Sonebi M, Cevik A, Grünewald S, Walraven J (2016) Modelling the fresh properties of self-compacting concrete using support vector machine approach. Constr Build Mater 106:55–64. https://doi.org/10.1016/j.conbuildmat.2015.12.035
Soudki K, Alkhrdaji T (2005) Guide for the design and construction of externally bonded FRP systems for strengthening concrete structures (ACI 440.2 R-02). Paper presented at the Structures Congress 2005: Metropolis and Beyond
Stone M (1974) Cross-validatory choice and assessment of statistical predictions. J Roy Stat Soc: Ser B (Methodol) 36(2):111–147
Sun Y, Zhang J, Li G, Ma G, Huang Y, Sun J, Nener B (2019) Determination of Young’s modulus of jet grouted coalcretes using an intelligent model. Eng Geol 252:43–53
Sun Y, Zhang J, Li G, Wang Y, Sun J, Jiang C (2019) Optimized neural network using beetle antennae search for predicting the unconfined compressive strength of jet grouting coalcretes. Int J Numer Anal Methods Geomech 43(4):801–813
Suykens JA, Vandewalle J (1999) Least squares support vector machine classifiers. Neural Process Lett 9(3):293–300
Takeo K, Matsushita H, Makizumi T, Nagashima G (1997) Bond characteristics of CFRP sheets in the CFRP bonding technique. Proc Jpn Concr Inst 19(2):1599–1604
Toutanji H, Saxena P, Zhao L, Ooi T (2007) Prediction of interfacial bond failure of FRP–concrete surface. J Compos Constr 11(4):427–436
Tureyen AK, Frosch RJ (2002) Shear tests of FRP-reinforced concrete beams without stirrups. Struct J 99(4):427–434
Van Gemert D (1980) Force transfer in epoxy bonded steel/concrete joints. Int J Adhes Adhes 1(2):67–72
Viswanathan GM, Afanasyev V, Buldyrev S, Murphy E, Prince P, Stanley HE (1996) Lévy flight search patterns of wandering albatrosses. Nature 381(6581):413
Woo S-K, Lee Y (2010) Experimental study on interfacial behavior of CFRP-bonded concrete. KSCE J Civ Eng 14(3):385–393
Yan K, Shi C (2010) Prediction of elastic modulus of normal and high strength concrete by support vector machine. Constr Build Mater 24(8):1479–1485. https://doi.org/10.1016/j.conbuildmat.2010.01.006
Yan K, Xu H, Shen G, Liu P (2013) Prediction of splitting tensile strength from cylinder compressive strength of concrete by support vector machine. Adv Mater Sci Eng 2013
Yao J, Teng J, Chen J (2005) Experimental study on FRP-to-concrete bonded joints. Compos B Eng 36(2):99–113
Zhang J, Huang Y, Ma G, Nener B (2020) Multi-objective beetle antennae search algorithm. arXiv preprint arXiv:2002.10090
Zhang J, Huang Y, Ma G, Sun J, Nener B (2020) A metaheuristic-optimized multi-output model for predicting multiple properties of pervious concrete. Constr Build Mater 249:118803
Zhang J, Li D, Wang Y (2020) Predicting uniaxial compressive strength of oil palm shell concrete using a hybrid artificial intelligence model. J Build Eng 30:101282
Zhang J, Li D, Wang Y (2020) Toward intelligent construction: prediction of mechanical properties of manufactured-sand concrete using tree-based models. J Clean Prod 258:120665
Zhang J, Ma G, Huang Y, Aslani F, Nener B (2019) Modelling uniaxial compressive strength of lightweight self-compacting concrete using random forest regression. Constr Build Mater 210:713–719
Zhao H, Zhang Y, Zhao M (2000) Research on the bond performance between CFRP plate and concrete. Paper presented at the Proc., 1st Conf. on FRP Concrete Structures of China
Ziraba Y, Baluch M, Basunbul I, Azad A, Al-Sulaimani G, Sharif A (1995) Combined experimental-numerical approach to characterization of steel-glue-concrete interface. Mater Struct 28(9):518–525
Acknowledgements
The first author is supported by China Scholarship Council (Grant Number: 201706460008).
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Zhang, J., Wang, Y. Evaluating the bond strength of FRP-to-concrete composite joints using metaheuristic-optimized least-squares support vector regression. Neural Comput & Applic 33, 3621–3635 (2021). https://doi.org/10.1007/s00521-020-05191-0