Density-Based Penalty Parameter Optimization on C-SVM

2014 ◽  
Vol 2014 ◽  
pp. 1-9
Author(s):  
Yun Liu ◽  
Jie Lian ◽  
Michael R. Bartolacci ◽  
Qing-An Zeng

The support vector machine (SVM) is one of the most widely used approaches for data classification and regression. SVM maximizes the distance between the positive and negative support vectors, neglecting instances that lie far from the SVM interface. To avoid a shift of the SVM interface caused by erroneous outliers, C-SVM was introduced to reduce the influence of such outliers. Traditional C-SVM holds a uniform parameter C for both positive and negative instances; however, given differing class proportions and data distributions, positive and negative instances should be assigned different weights for the penalty parameter of the error terms. Therefore, in this paper, we propose density-based penalty parameter optimization of C-SVM. The experimental results indicate that our proposed algorithm achieves outstanding performance with respect to both precision and recall.
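
To make the idea of per-class penalties concrete, here is a minimal sketch of a linear C-SVM trained by subgradient descent with separate penalties for positive and negative instances. The class-proportion weighting and all names here are illustrative assumptions, not the paper's density-based scheme, which is not spelled out in the abstract:

```python
import numpy as np

def weighted_csvm_fit(X, y, C_pos, C_neg, lr=0.01, epochs=300):
    """Linear C-SVM with per-class penalties via subgradient descent.

    Minimizes 0.5*||w||^2 + sum_i C_i * max(0, 1 - y_i*(w.x_i + b)),
    where C_i is C_pos for y_i = +1 and C_neg for y_i = -1.
    """
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    C = np.where(y > 0, C_pos, C_neg)
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1                            # instances inside the margin
        grad_w = w - (C[viol] * y[viol]) @ X[viol]    # subgradient of the objective
        grad_b = -np.sum(C[viol] * y[viol])
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# imbalanced toy data: the minority (negative) class gets a larger penalty
X = np.array([[2.0, 2.0], [3.0, 3.0], [2.5, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1, 1, 1, -1, -1])
w, b = weighted_csvm_fit(X, y, C_pos=1.0, C_neg=1.5)
pred = np.sign(X @ w + b)
```

Raising `C_neg` relative to `C_pos` makes misclassifying a negative instance more expensive, shifting the interface away from the minority class.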

2015 ◽  
Vol 2015 ◽  
pp. 1-9 ◽  
Author(s):  
Kuan-Cheng Lin ◽  
Sih-Yang Chen ◽  
Jason C. Hung

Rapid advances in information and communication technology have made ubiquitous computing and the Internet of Things popular and practicable. These applications create enormous volumes of data, which are available for analysis and classification as an aid to decision-making. Among the classification methods used to deal with big data, feature selection has proven particularly effective. One common approach involves searching for the subset of features that is most relevant to the topic or represents the most accurate description of the dataset. Unfortunately, searching for such a subset is a combinatorial problem that can be very time consuming. Metaheuristic algorithms are commonly used to facilitate the selection of features. The artificial fish swarm algorithm (AFSA) employs the intelligence underlying fish swarming behavior to solve combinatorial optimization problems. AFSA has proven highly successful in a diversity of applications; however, shortcomings remain, such as the likelihood of falling into a local optimum and a lack of population diversity. This study proposes a modified AFSA (MAFSA) to improve feature selection and parameter optimization for support vector machine classifiers. Experimental results demonstrate the superiority of MAFSA over the original AFSA in classification accuracy, using subsets with fewer features for given UCI datasets.
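
The combinatorial subset search can be illustrated with a deliberately simple wrapper: a random search over boolean feature masks, scored here by a nearest-centroid rule. Both the random search (standing in for AFSA's guided swarm moves) and the cheap fitness function (standing in for the SVM wrapper) are assumptions for brevity:

```python
import numpy as np

def subset_fitness(X, y, mask):
    # training accuracy of a nearest-centroid rule on the selected
    # features -- a cheap stand-in for the SVM wrapper used with AFSA
    if not mask.any():
        return 0.0
    Xs = X[:, mask]
    centroids = {c: Xs[y == c].mean(axis=0) for c in np.unique(y)}
    pred = [min(centroids, key=lambda c: np.linalg.norm(r - centroids[c])) for r in Xs]
    return float(np.mean(np.array(pred) == y))

def random_subset_search(X, y, n_iter=100, seed=0):
    # blind random search over feature masks; a metaheuristic such as
    # AFSA explores this same combinatorial space more cleverly
    rng = np.random.default_rng(seed)
    best_mask, best_fit = None, -1.0
    for _ in range(n_iter):
        mask = rng.random(X.shape[1]) < 0.5
        fit = subset_fitness(X, y, mask)
        if fit > best_fit:
            best_mask, best_fit = mask, fit
    return best_mask, best_fit

# toy data: only feature 0 separates the two classes
rng = np.random.default_rng(1)
y = np.repeat([0, 1], 20)
X = rng.normal(size=(40, 6))
X[:, 0] += 3 * y
mask, fit = random_subset_search(X, y)
```

With 6 features there are only 2^6 masks, so random search suffices; real feature counts are what make a guided metaheuristic worthwhile.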


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Danni Chen ◽  
JianDong Zhao ◽  
Peng Huang ◽  
Xiongna Deng ◽  
Tingting Lu

Purpose The sparrow search algorithm (SSA) is a novel global optimization method, but it easily falls into local optima, which degrades its search accuracy and stability. The purpose of this study is to propose an improved SSA, called LOSSA, based on Levy flight and opposition-based learning strategies. The LOSSA shows better search accuracy, faster convergence speed and stronger stability. Design/methodology/approach To further enhance the optimization performance of the algorithm, the Levy flight operation is introduced into the producer search process of the original SSA to strengthen its ability to jump out of local optima. The opposition-based learning strategy generates better candidate solutions for SSA, which helps accelerate convergence. On the one hand, the performance of the LOSSA is evaluated in a set of numerical experiments on classical benchmark functions. On the other hand, the hyper-parameter optimization problem of the Support Vector Machine (SVM) is used to test the ability of LOSSA to solve practical problems. Findings First, the effectiveness of the two improvements is verified by the Wilcoxon signed-rank test. Second, the statistical results of the numerical experiments show the significant improvement of the LOSSA over the original algorithm and other nature-inspired heuristic algorithms. Finally, the feasibility and effectiveness of the LOSSA in solving the hyper-parameter optimization problem of machine learning algorithms are demonstrated. Originality/value An improved SSA based on Levy flight and opposition-based learning is proposed in this paper. The experimental results show that the overall performance of the LOSSA is satisfactory: compared with the SSA and other nature-inspired heuristic algorithms, the LOSSA shows better search accuracy, faster convergence speed and stronger stability. Moreover, the LOSSA also shows strong optimization performance in the hyper-parameter optimization of the SVM model.
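
The two ingredients can be sketched in isolation. Below, Levy-flight step lengths are drawn with Mantegna's algorithm (a standard construction, assumed here; the paper may use a different variant), and the opposition operator mirrors a candidate within its bounds:

```python
import math
import numpy as np

def levy_step(beta=1.5, size=1, rng=None):
    # Mantegna's algorithm: heavy-tailed step lengths that mix many
    # small moves with occasional long jumps out of local optima
    rng = np.random.default_rng(0) if rng is None else rng
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, size)
    v = rng.normal(0.0, 1.0, size)
    return u / np.abs(v) ** (1 / beta)

def opposition(x, lb, ub):
    # opposition-based learning: mirror a candidate inside its bounds,
    # giving the optimizer a second, complementary point to evaluate
    return lb + ub - x

steps = levy_step(size=100)
mirrored = opposition(np.array([1.0, 3.0]), 0.0, 4.0)
```

In LOSSA, a producer's position update would be perturbed by `levy_step`, and the better of each candidate and its `opposition` image would be kept.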


2021 ◽  
Author(s):  
Lance F Merrick ◽  
Dennis N Lozada ◽  
Xianming Chen ◽  
Arron H Carter

Most genomic prediction models are linear regression models that assume continuous and normally distributed phenotypes, but responses to diseases such as stripe rust (caused by Puccinia striiformis f. sp. tritici) are commonly recorded in ordinal scales and percentages. Disease severity (SEV) and infection type (IT) data in germplasm screening nurseries generally do not follow these assumptions. In this regard, researchers may ignore the lack of normality, transform the phenotypes, use generalized linear models, or use supervised learning algorithms and classification models with no restriction on the distribution of response variables, which are less sensitive when modeling ordinal scores. The goal of this research was to compare classification and regression genomic selection models for skewed phenotypes using stripe rust SEV and IT in winter wheat. We extensively compared both regression and classification prediction models using two training populations composed of breeding lines phenotyped in four years (2016-2018, and 2020) and a diversity panel phenotyped in four years (2013-2016). The prediction models used 19,861 genotyping-by-sequencing single-nucleotide polymorphism markers. Overall, square root transformed phenotypes using rrBLUP and support vector machine regression models displayed the highest combination of accuracy and relative efficiency across the regression and classification models. Further, a classification system based on support vector machine and ordinal Bayesian models with a 2-Class scale for SEV reached the highest class accuracy of 0.99. This study showed that breeders can use linear and non-parametric regression models within their own breeding lines over combined years to accurately predict skewed phenotypes.
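
The transform-then-regress idea can be sketched with plain ridge regression standing in for rrBLUP (whose marker-effect model amounts to ridge with shrinkage set by variance components). The toy SNP matrix and phenotype below are synthetic assumptions for illustration only:

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    # ridge regression on marker genotypes; a lightweight stand-in
    # for rrBLUP's marker-effect model
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# skewed severity scores: square-root transform before fitting,
# then square the predictions to return to the original scale
rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(100, 20)).astype(float)   # toy SNP matrix (0/1/2)
beta = 0.1 * rng.normal(size=20)
z = X @ beta
sev = (z - z.min()) ** 2                               # skewed, non-negative phenotype
Xi = np.hstack([np.ones((len(X), 1)), X])              # add an intercept column
w = ridge_fit(Xi, np.sqrt(sev), lam=1.0)
pred = (Xi @ w) ** 2
```

Back-transforming (`** 2`) keeps predictions non-negative on the severity scale, which a linear model on the raw skewed phenotype would not guarantee.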


Author(s):  
Qingmi Yang

Parameter optimization for the Support Vector Machine (SVM) has been a hot research direction. To improve the optimization rate and classification performance of SVM, a Principal Component Analysis (PCA) - Particle Swarm Optimization (PSO) algorithm was used to optimize the penalty and kernel parameters of SVM. PSO, which finds the optimal solution through continuous iteration, combined with PCA, which eliminates linear redundancy in the data, effectively enhances the generalization ability of the model, reduces the parameter optimization time, and improves the recognition accuracy. Simulation comparison experiments on 6 UCI datasets illustrate the excellent performance of the PCA-PSO-SVM model. The results show that the proposed algorithm achieves higher recognition accuracy and a better recognition rate than the plain PSO algorithm in SVM parameter optimization. It is an effective parameter optimization method.
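
A minimal sketch of the two building blocks follows: PCA via SVD for removing linear redundancy, and a bare-bones global-best PSO. The quadratic objective is a stand-in assumption; in the real pipeline `f` would be the cross-validated SVM error at a point (log C, log γ):

```python
import numpy as np

def pca_reduce(X, k):
    # project onto the top-k principal components, removing the linear
    # redundancy that slows down the downstream parameter search
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def pso_minimize(f, lb, ub, n=15, iters=60, seed=0):
    # bare-bones global-best PSO with inertia 0.7 and cognitive/social
    # weights 1.5 (common defaults, assumed here)
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    pos = rng.uniform(lb, ub, (n, len(lb)))
    vel = np.zeros_like(pos)
    pbest, pbest_f = pos.copy(), np.array([f(p) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n, len(lb)))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lb, ub)
        fv = np.array([f(p) for p in pos])
        improved = fv < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], fv[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# stand-in error surface with its optimum at (log C, log gamma) = (2, -3)
best, err = pso_minimize(lambda p: (p[0] - 2) ** 2 + (p[1] + 3) ** 2,
                         lb=[-5, -5], ub=[5, 5])
```

Searching C and γ in log space is the usual convention, since both parameters act multiplicatively.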


2021 ◽  
Author(s):  
Leila Zahedi ◽  
Farid Ghareh Mohammadi ◽  
M. Hadi Amini

Machine learning techniques lend themselves as promising decision-making and analytic tools in a wide range of applications. Different ML algorithms have various hyper-parameters. In order to tailor an ML model towards a specific application, a large number of hyper-parameters should be tuned. Tuning the hyper-parameters directly affects the performance (accuracy and run-time). However, for large-scale search spaces, efficiently exploring the ample number of combinations of hyper-parameters is computationally challenging. Existing automated hyper-parameter tuning techniques suffer from high time complexity. In this paper, we propose HyP-ABC, an automatic innovative hybrid hyper-parameter optimization algorithm using the modified artificial bee colony approach, to measure the classification accuracy of three ML algorithms, namely random forest, extreme gradient boosting, and support vector machine. Compared to the state-of-the-art techniques, HyP-ABC is more efficient and has a limited number of parameters to be tuned, making it worthwhile for real-world hyper-parameter optimization problems. We further compare our proposed HyP-ABC algorithm with state-of-the-art techniques. In order to ensure the robustness of the proposed method, the algorithm takes a wide range of feasible hyper-parameter values, and is tested using a real-world educational dataset.
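
For orientation, here is a compact sketch of a *generic* artificial bee colony minimizer over a continuous box, not HyP-ABC's modified version (whose changes are not detailed in the abstract); the sphere objective is an assumed stand-in for a hyper-parameter error surface:

```python
import numpy as np

def abc_minimize(f, lb, ub, n_food=10, iters=60, limit=10, seed=0):
    # textbook artificial bee colony: employed and onlooker bees perturb
    # one dimension of a food source toward a random peer; scouts replace
    # sources that fail to improve for `limit` trials
    rng = np.random.default_rng(seed)
    dim = len(lb)
    foods = rng.uniform(lb, ub, (n_food, dim))
    fit = np.array([f(x) for x in foods])
    trials = np.zeros(n_food, dtype=int)
    for _ in range(iters):
        for phase in range(2):
            if phase == 0:                       # employed bees: every source
                order = np.arange(n_food)
            else:                                # onlookers: favor better sources
                p = fit.max() - fit + 1e-12
                order = rng.choice(n_food, n_food, p=p / p.sum())
            for i in order:
                k, j = rng.integers(n_food), rng.integers(dim)
                cand = foods[i].copy()
                cand[j] += rng.uniform(-1, 1) * (foods[i][j] - foods[k][j])
                cand = np.clip(cand, lb, ub)
                fc = f(cand)
                if fc < fit[i]:
                    foods[i], fit[i], trials[i] = cand, fc, 0
                else:
                    trials[i] += 1
        worn = trials > limit                    # scout phase: restart stale sources
        foods[worn] = rng.uniform(lb, ub, (worn.sum(), dim))
        fit[worn] = [f(x) for x in foods[worn]]
        trials[worn] = 0
    best = fit.argmin()
    return foods[best], fit[best]

best_x, best_f = abc_minimize(lambda x: float((x ** 2).sum()),
                              lb=np.array([-5.0, -5.0]), ub=np.array([5.0, 5.0]))
```

A hybrid such as HyP-ABC would additionally handle integer and categorical hyper-parameters, which this continuous sketch does not.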


2014 ◽  
Vol 543-547 ◽  
pp. 2045-2048
Author(s):  
Yuan Lv ◽  
Zhong Gan

In the case of experimental data contaminated with errors and noise, the robust ε-support vector regression has good forecast accuracy and high generalization ability. However, its performance depends on the selection of the system parameter. This paper first introduces the robust ε-support vector regression method. Then, as the experiments show, the new method achieves high forecast accuracy by virtue of the optimal penalty parameter C. Finally, a method for selecting the optimal parameter C is presented.
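
A minimal linear ε-SVR can be sketched by subgradient descent on the ε-insensitive loss; the averaged update and all constants here are illustrative assumptions, not the paper's solver:

```python
import numpy as np

def eps_svr_fit(X, y, C=10.0, eps=0.1, lr=0.01, epochs=800):
    # linear eps-SVR via averaged subgradient descent on
    #   0.5*||w||^2 + C * mean(max(0, |w.x + b - y| - eps))
    # residuals inside the eps-tube contribute no loss, which is what
    # makes the estimator robust to small measurement noise
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        r = X @ w + b - y
        s = np.where(r > eps, 1.0, np.where(r < -eps, -1.0, 0.0))
        w -= lr * (w + C * (s @ X) / n)
        b -= lr * (C * s.sum() / n)
    return w, b

# noise-free toy data: y = 2x + 1; a larger C weights the data-fit term
# more heavily against the flatness (regularization) term
X = np.arange(0.0, 4.0).reshape(-1, 1)
y = 2.0 * X.ravel() + 1.0
w, b = eps_svr_fit(X, y)
```

Shrinking C flattens the fitted function, which is exactly the trade-off the optimal-C selection in the paper targets.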


2011 ◽  
Vol 216 ◽  
pp. 153-157
Author(s):  
D.L. Yang ◽  
Xue Jun Li ◽  
K. Wang ◽  
Ling Li Jiang

Parameter optimization is key to the study of the support vector machine (SVM). Exploiting the strong global search capability of the bacterial foraging algorithm (BFA), an SVM parameter optimization method based on BFA is proposed. It achieves dynamic optimization of the parameters C and γ, and overcomes the inefficiency of selecting reasonable parameters by experience in traditional fault diagnosis. Compared with other methods, the BFA is simpler and easier to program, and the optimized SVM model is smaller. The rolling bearing fault diagnosis results show that the bacterial foraging algorithm is suitable for SVM parameter optimization.
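
To show the (C, γ) landscape that such a search traverses, here is a naive grid-search baseline; BFA explores the same landscape adaptively instead of sweeping it exhaustively. A kernel ridge classifier is used as a lightweight stand-in for an RBF-kernel SVM, and the XOR data and parameter grids are assumptions for illustration:

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    # Gaussian (RBF) kernel matrix between row sets A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def train_error(X, y, C, gamma):
    # kernel ridge classifier standing in for an RBF-SVM;
    # C acts as the inverse regularization strength
    K = rbf_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + np.eye(len(y)) / C, y)
    return float(np.mean(np.sign(K @ alpha) != y))

# XOR labels are not linearly separable, so the kernel (gamma) matters
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([-1.0, 1.0, 1.0, -1.0])
best = min(((train_error(X, y, C, g), C, g)
            for C in [0.1, 1.0, 10.0, 100.0]
            for g in [0.1, 1.0, 10.0]), key=lambda t: t[0])
```

The grid costs one full model fit per (C, γ) pair; a foraging-style search spends its evaluations near promising regions instead.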

