Laminated Composites Buckling Analysis Using Lamination Parameters, Neural Networks and Support Vector Regression

2015 ◽  
Vol 12 (2) ◽  
pp. 271-294 ◽  
Author(s):  
Rubem M. Koide ◽  
Ana Paula C. S. Ferreira ◽  
Marco A. Luersen
Materials ◽  
2020 ◽  
Vol 13 (17) ◽  
pp. 3766 ◽  
Author(s):  
Shin-Hyung Song

In this research, hot deformation experiments on 316L stainless steel were carried out over a temperature range of 800–1000 °C and strain rates of 2 × 10⁻³–2 × 10⁻¹ s⁻¹. The flow stress behavior of 316L stainless steel was found to be highly dependent on strain rate and temperature. After the experimental study, the flow stress was modeled using an Arrhenius-type constitutive equation, a neural network approach, and the support vector regression algorithm. The present research focused mainly on a comparative study of the three algorithms for modeling the characteristics of hot deformation. The results indicated that the neural network approach and the support vector regression algorithm model the flow stress better than the Arrhenius-type equation. The support vector regression algorithm was also found to be more computationally efficient than the neural network approach.
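As an illustrative sketch only, the kind of modeling the abstract describes can be reproduced on synthetic data: flow stress is generated from an assumed Arrhenius-type (Zener–Hollomon) relation and then fitted with support vector regression. The constants `A = 5.0`, exponent `0.12`, and activation energy `Q` below are hypothetical placeholders, not values from the paper.

```python
# Sketch, not the paper's data or method: SVR fitted to synthetic
# hot-deformation flow stress from an assumed Zener-Hollomon relation.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
R, Q = 8.314, 460e3                        # gas constant (J/mol/K); Q is an assumed activation energy
T = rng.uniform(1073.0, 1273.0, 300)       # 800-1000 degC expressed in kelvin
rate = 10 ** rng.uniform(-2.7, -0.7, 300)  # strain rates roughly 2e-3 to 2e-1 1/s
Z = rate * np.exp(Q / (R * T))             # Zener-Hollomon parameter
log_stress = np.log(5.0) + 0.12 * np.log(Z) + 0.02 * rng.standard_normal(300)

X = np.column_stack([T, np.log10(rate)])   # inputs: temperature and log strain rate
model = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.01))
model.fit(X, log_stress)
print(round(model.score(X, log_stress), 2))  # R^2 of the SVR fit
```

A classical Arrhenius-type fit would instead be a linear regression of log stress on log Z; the comparison in the paper amounts to checking which model tracks the measured flow curves more closely.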


2021 ◽  
Vol 47 ◽  
Author(s):  
Feliksas Ivanauskas ◽  
Robertas Paulauskas ◽  
Pranas Vaitkus

In this paper, an extreme learning machine and support vector regression are used to classify biosensor responses to mixtures of compounds. The results are compared with those obtained using artificial neural networks and other methods.
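A minimal sketch of the comparison the abstract describes, on synthetic data standing in for biosensor responses: a kernel support vector machine is compared against a small neural network classifier. The dataset, class count, and hyperparameters are illustrative assumptions (scikit-learn has no extreme learning machine, so only the SVM-vs-network part is sketched).

```python
# Illustrative only: synthetic "mixture response" classification, comparing
# a support vector machine with a small neural network.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=300, n_features=8, n_informative=5,
                           n_classes=3, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

svm = SVC(kernel="rbf", C=10.0).fit(Xtr, ytr)
net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000,
                    random_state=0).fit(Xtr, ytr)
print(round(svm.score(Xte, yte), 2), round(net.score(Xte, yte), 2))
```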


Author(s):  
Zhao Lu ◽  
Gangbing Song ◽  
Leang-san Shieh

As a general framework for representing data, the kernel method can be used whenever the interactions between elements of the domain occur only through inner products. As a major stride towards nonlinear feature extraction and dimension reduction, two important kernel-based feature extraction algorithms have been proposed: kernel principal component analysis and kernel Fisher discriminant analysis. Both create a projection of multivariate data onto a space of lower dimensionality while attempting to preserve as much of the structural nature of the data as possible. However, both methods suffer from a complete loss of sparsity and from redundancy in the nonlinear feature representation. To mitigate these drawbacks, this article focuses on applying the newly developed polynomial kernel higher-order neural networks to improve sparsity and thereby obtain a succinct representation for kernel-based nonlinear feature extraction algorithms. In particular, the learning algorithm is based on linear programming support vector regression, which outperforms conventional quadratic programming support vector regression in model sparsity and computational efficiency.
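To make the kernel feature-extraction idea concrete, here is a minimal sketch of kernel PCA, one of the two methods the abstract discusses. The dataset (concentric circles) and the RBF kernel parameters are illustrative choices, not taken from the article.

```python
# Sketch of kernel PCA: a nonlinear projection onto a lower-dimensional
# feature space, here on a toy dataset that linear PCA cannot separate.
from sklearn.datasets import make_circles
from sklearn.decomposition import KernelPCA

X, _ = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)
kpca = KernelPCA(n_components=2, kernel="rbf", gamma=10.0)
X_proj = kpca.fit_transform(X)  # the two concentric circles become separable
print(X_proj.shape)
```

The sparsity problem the abstract targets is that such projections depend on kernel evaluations against every training point; the linear programming SVR approach it proposes selects only a small subset of them.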

