A Novel Sparse Least Squares Support Vector Machines

2013 ◽  
Vol 2013 ◽  
pp. 1-10 ◽  
Author(s):  
Xiao-Lei Xia ◽  
Weidong Jiao ◽  
Kang Li ◽  
George Irwin

The solution of a Least Squares Support Vector Machine (LS-SVM) suffers from the problem of nonsparseness. The Forward Least Squares Approximation (FLSA) is a greedy approximation algorithm with a least-squares loss function. This paper proposes a new Support Vector Machine for which the FLSA is the training algorithm: the Forward Least Squares Approximation SVM (FLSA-SVM). A major novelty of the FLSA-SVM is that the number of support vectors is the regularization parameter for tuning the tradeoff between generalization ability and training cost. The FLSA-SVM can also detect linear dependencies among columns of the input Gram matrix. These attributes together contribute to its extreme sparseness. Experiments on benchmark datasets show that, compared to various SVM algorithms, the FLSA-SVM is extremely compact while maintaining competitive generalization ability.
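The greedy forward selection underlying the FLSA can be sketched as follows. This is a simplified illustration of the idea only (score each unused Gram-matrix column by its correlation with the residual, pick the best, refit, repeat), not the authors' exact algorithm; the function name `forward_least_squares`, the dependence tolerance `1e-12`, and the RBF kernel in the test are assumptions:

```python
import numpy as np

def forward_least_squares(K, y, n_sv):
    """Greedy forward selection: pick n_sv columns of the Gram matrix K that
    best approximate the targets y in the least-squares sense."""
    n = K.shape[1]
    selected = []
    coef = np.zeros(0)
    residual = y.copy()
    for _ in range(n_sv):
        # score each unselected column by its squared correlation with the residual
        best, best_score = None, -np.inf
        for j in range(n):
            if j in selected:
                continue
            c = K[:, j]
            denom = c @ c
            if denom < 1e-12:  # skip (near) linearly dependent columns
                continue
            score = (c @ residual) ** 2 / denom
            if score > best_score:
                best, best_score = j, score
        if best is None:
            break
        selected.append(best)
        # refit coefficients on all selected columns and update the residual
        coef, *_ = np.linalg.lstsq(K[:, selected], y, rcond=None)
        residual = y - K[:, selected] @ coef
    return selected, coef
```

The selected columns play the role of support vectors, so `n_sv` directly controls model size, which mirrors the abstract's point that the number of support vectors acts as the regularization parameter.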

2014 ◽  
Vol 1061-1062 ◽  
pp. 935-938
Author(s):  
Xin You Wang ◽  
Guo Fei Gao ◽  
Zhan Qu ◽  
Hai Feng Pu

Predictions of chaotic time series using the least squares support vector machine (LS-SVM) were investigated and compared with traditional SVM variants. The results show that the prediction accuracy of the LS-SVM is better than that of the traditional SVM, making it more suitable for online time series prediction.


2018 ◽  
Vol 18 (3) ◽  
pp. 715-724 ◽  
Author(s):  
Xiao Li ◽  
Xin Liu ◽  
Clyde Zhengdao Li ◽  
Zhumin Hu ◽  
Geoffrey Qiping Shen ◽  
et al.

Foundation pit displacement is a critical safety risk for both building structures and people's lives. Accurate displacement monitoring and prediction of a deep foundation pit are essential to prevent potential risks at an early construction stage. Machine learning methods are extensively applied to achieve accurate prediction. However, these approaches, such as support vector machines, have limitations in terms of data processing efficiency and prediction accuracy. As an emerging approach derived from support vector machines, the least squares support vector machine improves data processing efficiency through its use of equality constraints and a least squares loss function. However, the accuracy of this approach relies heavily on a large volume of influencing factors from the measurement of adjacent critical points, which is not normally available during the construction process. To address this issue, this study proposes an improved least squares support vector machine algorithm based on multi-point measuring techniques, namely, the multi-point least squares support vector machine. To evaluate the effectiveness of the proposed approach, a real case study project was selected, and the results illustrate that the multi-point least squares support vector machine on average outperformed the single-point least squares support vector machine in terms of prediction accuracy during foundation pit monitoring and prediction.
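The efficiency gain mentioned above comes from the equality constraints turning the SVM quadratic program into a single linear solve. A minimal sketch of the standard LS-SVM regression KKT system, assuming the usual formulation; the function name `lssvm_fit` and the value of `gamma` are illustrative:

```python
import numpy as np

def lssvm_fit(K, y, gamma=10.0):
    """Solve the LS-SVM regression KKT linear system
        [ 0   1^T         ] [b]   [0]
        [ 1   K + I/gamma ] [a] = [y]
    for the bias b and coefficients a. Equality constraints reduce
    training to one linear solve instead of a quadratic program."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]
```

Predictions follow as `K @ a + b`; a multi-point variant as described in the abstract would build the feature set from several measuring points rather than one.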


2008 ◽  
Vol 20 (1) ◽  
pp. 271-287 ◽  
Author(s):  
Tilman Knebel ◽  
Sepp Hochreiter ◽  
Klaus Obermayer

We describe a fast sequential minimal optimization (SMO) procedure for solving the dual optimization problem of the recently proposed potential support vector machine (P-SVM). The new SMO consists of a sequence of iteration steps in which the Lagrangian is optimized with respect to either one (single SMO) or two (dual SMO) of the Lagrange multipliers while keeping the other variables fixed. An efficient selection procedure for Lagrange multipliers is given, and two heuristics for improving the SMO procedure are described: block optimization and annealing of the regularization parameter ε. A comparison of the variants shows that the dual SMO, including block optimization and annealing, performs efficiently in terms of computation time. In contrast to standard support vector machines (SVMs), the P-SVM is applicable to arbitrary dyadic data sets, but benchmarks are provided against libSVM's ε-SVR and C-SVC implementations for problems that are also solvable by standard SVM methods. For those problems, computation time of the P-SVM is comparable to or somewhat higher than the standard SVM. The number of support vectors found by the P-SVM is usually much smaller for the same generalization performance.
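The single-multiplier ("single SMO") step can be illustrated on a generic box-constrained quadratic dual. The P-SVM dual has its own constraints and an annealing schedule for ε, so this is only a sketch of the coordinate update; `smo_single_step` and the box bounds are assumptions:

```python
import numpy as np

def smo_single_step(Q, p, alpha, i, lo=0.0, hi=np.inf):
    """One single-multiplier SMO step for the quadratic dual
        min_a 0.5 a^T Q a - p^T a,   lo <= a_j <= hi,
    optimizing only multiplier i while all others stay fixed.
    The unconstrained minimizer along coordinate i is clipped to the box."""
    grad_i = Q[i] @ alpha - p[i]     # partial derivative w.r.t. alpha_i
    if Q[i, i] <= 0:
        return alpha                 # no curvature: leave the multiplier alone
    new_i = np.clip(alpha[i] - grad_i / Q[i, i], lo, hi)
    alpha = alpha.copy()
    alpha[i] = new_i
    return alpha
```

A "dual SMO" variant would update two multipliers jointly; the selection heuristic in the paper picks the multipliers with the largest optimality violation.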


2014 ◽  
Vol 2014 ◽  
pp. 1-9 ◽  
Author(s):  
Lean Yu

A least squares fuzzy support vector machine (LS-FSVM) model that integrates advantages of fuzzy support vector machine (FSVM) and least squares method is proposed for credit risk evaluation. In the proposed LS-FSVM model, the purpose of incorporating the concepts of fuzzy sets is to add generalization capability and outlier insensitivity, while the least squares method is adopted to reduce the computational complexity. For illustrative purposes, a real-world credit risk dataset is used to test the effectiveness and robustness of the proposed LS-FSVM methodology.
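One common way to obtain the fuzzy memberships that make the model outlier-insensitive is to let membership decay with distance from the class centre. The abstract does not specify this exact heuristic, so `fuzzy_memberships` and the small offset `delta` are assumptions:

```python
import numpy as np

def fuzzy_memberships(X, delta=1e-3):
    """Assign each sample a membership in (0, 1] that decays linearly with
    its distance from the class centre, so likely outliers are down-weighted."""
    centre = X.mean(axis=0)
    d = np.linalg.norm(X - centre, axis=1)
    r = d.max()
    return 1.0 - d / (r + delta)
```

In a weighted LS-SVM, each membership s_i then scales the error penalty, e.g. by replacing the diagonal regularizer I/γ with diag(1/(γ·s_i)), so low-membership samples contribute less to the fit.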


Author(s):  
Maryam Yalsavar ◽  
Paknoosh Karimaghaei ◽  
Akbar Sheikh-Akbari ◽  
Pancham Shukla ◽  
Peyman Setoodeh

The application of the support vector machine (SVM) classification algorithm to large-scale datasets is limited by its use of a large number of support vectors and by the dependency of its performance on its kernel parameter. In this paper, the SVM is redefined as a control system, and the iterative learning control (ILC) method is used to optimize the SVM's kernel parameter. The ILC technique first defines an error equation and then iteratively updates the kernel function and its regularization parameter using the training error and the previous state of the system. The closed-loop structure of the proposed algorithm increases the robustness of the technique to uncertainty and improves its convergence speed. Experimental results were generated using nine standard benchmark datasets covering a wide range of applications. The results show that the proposed method generates superior or very competitive results in terms of accuracy compared with classical and state-of-the-art SVM-based techniques, while using a significantly smaller number of support vectors.
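The closed-loop idea can be sketched with a generic P-type ILC law that nudges the parameter by the deviation between the observed error signal and a target. The paper's actual error equation and joint kernel/regularizer update are more elaborate, so `ilc_tune`, the signed error proxy `err_fn`, and `gain` are all assumptions:

```python
def ilc_tune(err_fn, param0, target=0.0, gain=0.5, n_iter=50):
    """Generic P-type iterative-learning-control loop: at each iteration,
    update the parameter proportionally to the deviation between the
    observed (signed) error signal and the target."""
    param = param0
    for _ in range(n_iter):
        e = target - err_fn(param)   # feedback error for this iteration
        param = param + gain * e     # proportional update
    return param
```

With a well-behaved error signal, the feedback loop contracts toward the parameter value where the error meets the target, which is the convergence behaviour the abstract attributes to the closed-loop structure.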


2011 ◽  
Vol 204-210 ◽  
pp. 879-882
Author(s):  
Kai Li ◽  
Xiao Xia Lu

By combining the fuzzy support vector machine with rough set theory, we propose a rough margin based fuzzy support vector machine (RFSVM). It inherits the characteristics of the FSVM method and considers the positions of training samples relative to the rough margin in order to reduce overfitting due to noise or outliers. The proposed algorithm finds the optimal separating hyperplane that maximizes the rough margin, which consists of a lower margin and an upper margin. Meanwhile, points lying within the lower margin incur a larger penalty than those lying between the lower and upper margins. Experiments on several benchmark datasets show that the RFSVM algorithm is effective and feasible compared with existing support vector machines.


2013 ◽  
Vol 321-324 ◽  
pp. 1917-1920
Author(s):  
Li Wei Wei ◽  
Qiang Xiao ◽  
Ying Zhang ◽  
Xiong Fei Ji

The least squares support vector machine (LS-SVM) has an outstanding advantage in computational complexity over standard support vector machines. Its shortcomings are the loss of sparseness and robustness, which usually result in slow testing speed and poor generalization performance. In this paper, a least squares support vector machine with an L1 penalty (L1-LS-SVM) is proposed to address these shortcomings. An objective function based on minimizing the 1-norm is chosen, following the idea of basis pursuit (BP), to obtain a sparse and robust solution over the whole feasible region. Several UCI datasets are used to demonstrate the effectiveness of this model. The experimental results show that the L1-LS-SVM obtains a small number of support vectors and improves the generalization ability of the LS-SVM.
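A 1-norm objective of this kind can be minimized by coordinate descent with soft-thresholding. The sketch below assumes the basis-pursuit-style problem min_a ½‖y − Ka‖² + λ‖a‖₁ over the kernel expansion; the names `l1_ls_svm` and `soft_threshold` and the column tolerance are illustrative, not the paper's exact solver:

```python
import numpy as np

def soft_threshold(z, t):
    """Shrink z toward zero by t; the proximal operator of the 1-norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def l1_ls_svm(K, y, lam=0.1, n_iter=200):
    """Coordinate descent for min_a 0.5*||y - K a||^2 + lam*||a||_1.
    The soft threshold drives most coefficients exactly to zero, which is
    what restores sparseness (few support vectors) to the LS-SVM."""
    n = K.shape[1]
    a = np.zeros(n)
    col_sq = (K ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(n):
            if col_sq[j] < 1e-12:
                continue
            # partial residual with coordinate j removed from the fit
            r_j = y - K @ a + K[:, j] * a[j]
            a[j] = soft_threshold(K[:, j] @ r_j, lam) / col_sq[j]
    return a
```

Samples whose coefficient survives the shrinkage are the retained support vectors; increasing `lam` trades training fit for a smaller model.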


Sensors ◽  
2018 ◽  
Vol 18 (11) ◽  
pp. 4059 ◽  
Author(s):  
Benny Chitambira ◽  
Simon Armour ◽  
Stephen Wales ◽  
Mark Beach

This article evaluates the use of least-squares support vector machines, with ray-traced data, to solve the problem of localisation in multipath environments. The schemes discussed concern 2-D localisation but could easily be extended to 3-D. The approach does not require NLOS identification and mitigation and can therefore be applied in any environment. Some background details and a detailed experimental setup are provided. Comparisons with schemes from earlier work that require NLOS identification and mitigation are also presented. The results demonstrate that the direct localisation scheme using the least-squares support vector machine (the Direct method) achieves superior outage performance to TDOA and TOA/AOA in NLOS environments. TDOA has better outage performance in LOS environments. TOA/AOA performs better for an accepted outage probability of 20 percent or greater, but as the accepted outage probability decreases, the Direct method becomes better.


2016 ◽  
Vol 40 (4) ◽  
pp. 541-549
Author(s):  
Zengshou Dong ◽  
Zhaojing Ren ◽  
You Dong

Mechanical fault vibration signals are non-stationary, which causes system instability, and traditional methods have difficulty accurately extracting fault information. This paper therefore proposes a fault identification method combining local mean decomposition (LMD) and the least squares support vector machine. The article introduces waveform matching to handle the end effects of the signals, uses linear interpolation to obtain the local mean and envelope functions, and then derives the product function (PF) vectors through local mean decomposition. The energy entropy of the PF vectors is taken as the identification input vector. These vectors are input to BP neural networks, support vector machines, and least squares support vector machines, respectively, to identify faults. Experimental results show that the least squares support vector machine achieves the highest classification accuracy.
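The energy entropy feature used as the identification input can be computed from the PF components as follows; a minimal sketch assuming component energies are normalised to probabilities (`energy_entropy` is an assumed name):

```python
import numpy as np

def energy_entropy(components):
    """Energy entropy of decomposition components (e.g. PFs from LMD):
    normalise each component's energy to a probability p_i, then return
    the Shannon entropy -sum p_i * log(p_i)."""
    E = np.array([np.sum(c ** 2) for c in components], dtype=float)
    p = E / E.sum()
    p = p[p > 0]          # drop zero-energy components (log(0) undefined)
    return -np.sum(p * np.log(p))
```

Energy concentrated in one PF gives entropy near zero, while energy spread evenly across n PFs gives log(n); fault types that redistribute vibration energy across frequency bands therefore map to different entropy values.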

