Nonparallel Hyperplanes
Recently Published Documents

TOTAL DOCUMENTS: 9 (FIVE YEARS: 0)
H-INDEX: 2 (FIVE YEARS: 0)

2020 ◽  
Vol 2020 ◽  
pp. 1-18
Author(s):  
Qianru Zhai ◽  
Ye Tian ◽  
Jingyue Zhou

Twin support vector regression (TSVR) generates two nonparallel hyperplanes by solving a pair of smaller-sized problems instead of the single larger problem of standard SVR. Owing to this efficiency, TSVR is frequently applied in various areas. In this paper, we propose a new variant of TSVR, Linear Twin Quadratic Surface Support Vector Regression (LTQSSVR), which performs regression directly with two quadratic surfaces in the original space. Notably, the new approach not only avoids the difficult and time-consuming search for a suitable kernel function and its parameters that traditional SVR-based methods require but also achieves better generalization performance. Furthermore, to improve the efficiency and robustness of the model, we introduce the 1-norm to measure the error. The resulting linear programming structure avoids matrix inversion and makes the model solvable for very large problems, a capability that matters in the big-data era. Finally, to verify the effectiveness and efficiency of the model, we compare it with several well-known methods: numerical experiments on 2 artificial data sets and 12 benchmark data sets demonstrate the validity and applicability of the proposed method.
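The twin mechanism described above, two small problems instead of one large one, can be illustrated with a heavily simplified, unconstrained least-squares stand-in for linear twin SVR. This is only a sketch of the general twin idea under assumed shifted-target formulations; it is not the LTQSSVR model, whose quadratic surfaces and 1-norm linear program are omitted here.

```python
import numpy as np

def ls_twin_svr_fit(X, y, eps=0.1, C=1.0):
    """Fit two bounding functions f1 (down-shifted) and f2 (up-shifted).

    Each function comes from its own small ridge-regularised problem on
    shifted targets (a simplified, assumption-laden stand-in for the two
    constrained QPs of twin SVR); the final regressor averages the two.
    """
    n, d = X.shape
    A = np.hstack([X, np.ones((n, 1))])       # augment with a bias column
    M = A.T @ A + (1.0 / C) * np.eye(d + 1)   # regularised normal equations
    u1 = np.linalg.solve(M, A.T @ (y - eps))  # lower-bound function f1
    u2 = np.linalg.solve(M, A.T @ (y + eps))  # upper-bound function f2
    return u1, u2

def ls_twin_svr_predict(X, u1, u2):
    A = np.hstack([X, np.ones((X.shape[0], 1))])
    return 0.5 * (A @ u1 + A @ u2)            # average of the two functions

# Toy 1-D regression problem.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = 2.0 * X[:, 0] + 0.5 + 0.05 * rng.standard_normal(200)
u1, u2 = ls_twin_svr_fit(X, y)
pred = ls_twin_svr_predict(X, u1, u2)
```

Averaging the two shifted solutions cancels the eps offsets, so the combined predictor tracks the underlying line; the point of the sketch is that each of the two systems solved is smaller and simpler than one joint problem.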


Electronics ◽  
2019 ◽  
Vol 8 (10) ◽  
pp. 1195 ◽  
Author(s):  
Qing Ai ◽  
Anna Wang ◽  
Aihua Zhang ◽  
Wenhui Wang ◽  
Yang Wang

Twin-KSVC (twin support vector classification for K classes) is a novel and efficient multiclass twin support vector machine, but it has two disadvantages: (1) each pair of binary sub-classifiers must compute inverse matrices, and (2) for nonlinear problems, a pair of additional primal problems must be constructed in each pair of binary sub-classifiers. To address these disadvantages, a new multiclass twin hypersphere support vector machine, named Twin Hypersphere-KSVC, is proposed in this paper. Like Twin-KSVC, Twin Hypersphere-KSVC evaluates each sample in a 1-vs-1-vs-rest structure; unlike Twin-KSVC, however, each pair of binary sub-classifiers seeks a pair of hyperspheres rather than two nonparallel hyperplanes. Compared with Twin-KSVC, Twin Hypersphere-KSVC avoids computing inverse matrices and, for nonlinear problems, can apply the kernel trick to the linear case directly. Extensive comparisons with Twin-KSVC on benchmark datasets from the UCI repository and on several real engineering applications show that the proposed algorithm trains faster and generalizes better.
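To make the hypersphere idea concrete, here is a deliberately crude nearest-hypersphere classifier: one sphere per class, with the class centroid as centre and the maximum within-class distance as radius, classifying by signed distance to each sphere's surface. This is an illustrative assumption-based sketch of hypersphere classification in general, not the Twin Hypersphere-KSVC model, which solves coupled quadratic programs per pair of classes.

```python
import numpy as np

def fit_spheres(X, y):
    """One crude hypersphere per class: centroid centre, max-distance radius."""
    spheres = {}
    for label in np.unique(y):
        P = X[y == label]
        c = P.mean(axis=0)                       # sphere centre
        r = np.linalg.norm(P - c, axis=1).max()  # sphere radius
        spheres[label] = (c, r)
    return spheres

def predict(X, spheres):
    labels = sorted(spheres)
    # Signed distance to each sphere's surface; most negative = deepest inside.
    scores = np.array([[np.linalg.norm(x - spheres[l][0]) - spheres[l][1]
                        for l in labels] for x in X])
    return np.array(labels)[scores.argmin(axis=1)]

# Three well-separated Gaussian blobs as a toy 3-class problem.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([0, 0], 0.3, (50, 2)),
               rng.normal([3, 3], 0.3, (50, 2)),
               rng.normal([0, 3], 0.3, (50, 2))])
y = np.repeat([0, 1, 2], 50)
spheres = fit_spheres(X, y)
acc = (predict(X, spheres) == y).mean()
```

Note that fitting each sphere here needs only means and norms, never a matrix inverse, which mirrors the computational advantage the abstract claims for the hypersphere formulation over the hyperplane-based Twin-KSVC.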


2015 ◽  
Vol 2015 ◽  
pp. 1-12
Author(s):  
Zhi-Xia Yang ◽  
Yuan-Hai Shao ◽  
Yao-Lin Jiang

A novel learning framework of nonparallel hyperplane support vector machines (NPSVMs) is proposed for binary and multiclass classification. The framework not only includes twin SVM (TWSVM) and many of its variants but also extends them to multiclass classification when different parameters or loss functions are chosen. Concretely, we discuss the linear and nonlinear cases of the framework, taking the hinge loss function as an example, and we give the primal problems of several extensions of TWSVM's variants. It is worth mentioning that, in the decision function, the Euclidean distance is replaced by the absolute value |w^T x + b|, which keeps the decision function consistent with the optimization problem and reduces the computational cost, particularly when a kernel function is introduced. Numerical experiments on several artificial and benchmark datasets indicate that the framework is not only fast but also generalizes well.
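The decision-rule substitution mentioned above is easy to show directly: the true point-to-plane distance is |w^T x + b| / ||w||, and dropping the norm saves computing ||w|| (which, in the kernel case, would itself require kernel evaluations). A small sketch with two hypothetical nonparallel planes, assigning each point to the plane it lies nearest:

```python
import numpy as np

# Two hypothetical nonparallel hyperplanes, one per class (illustrative values).
w1, b1 = np.array([1.0, -1.0]), 0.0    # plane for class 0: x - y = 0
w2, b2 = np.array([1.0, 1.0]), -4.0    # plane for class 1: x + y = 4

def decide_abs(x):
    # Decision via the raw absolute value |w^T x + b| (no norms computed).
    return int(abs(w2 @ x + b2) < abs(w1 @ x + b1))

def decide_euclidean(x):
    # Decision via the true point-to-plane distance |w^T x + b| / ||w||.
    d1 = abs(w1 @ x + b1) / np.linalg.norm(w1)
    d2 = abs(w2 @ x + b2) / np.linalg.norm(w2)
    return int(d2 < d1)
```

When the two weight vectors have equal norms (as here), the two rules agree exactly while the absolute-value rule skips two norm computations; in general the absolute-value rule is the one that matches the quantity the twin optimization problems actually control.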


2015 ◽  
Vol 51 ◽  
pp. 1574-1582 ◽  
Author(s):  
Xuchan Ju ◽  
Yingjie Tian ◽  
Dalian Liu ◽  
Zhiquan Qi

2014 ◽  
Vol 668-669 ◽  
pp. 1170-1173 ◽  
Author(s):  
Hai Fa Shi ◽  
Xin Bin Zhao ◽  
Ling Jing

Nowadays, much data is naturally represented as tensors, and how to handle such tensor data directly remains a significant challenge. In this paper, we propose a new tensor distance (TD) based least squares twin support tensor machine (TDLS-TSTM). Unlike the traditional Euclidean distance, TD takes the relationships among the coordinates into account. TDLS-TSTM works directly on tensor data and seeks two nonparallel hyperplanes for classification based on TD, which makes full use of the structural information of the data, and it solves two systems of linear equations rather than two quadratic programming problems. Compared with other classifiers, our method achieves higher precision at acceptable time cost. Numerical experiments demonstrate the validity and efficiency of TDLS-TSTM.
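A common construction of a tensor distance, assumed here since the abstract does not spell it out, replaces the identity metric of the Euclidean distance with a matrix G that couples tensor entries whose index positions are close, e.g. via a Gaussian decay: TD(T1, T2) = sqrt((x1 - x2)^T G (x1 - x2)) on the vectorised tensors. The sketch below shows why such a metric "considers the relationship information of various coordinates" where the Euclidean distance cannot.

```python
import numpy as np

def tensor_distance(T1, T2, sigma=1.0):
    """TD(T1, T2) = sqrt((x1 - x2)^T G (x1 - x2)) on vectorised tensors.

    G couples coordinates that are near each other in the tensor's index
    grid (a Gaussian-decay metric is assumed here for illustration), so
    the spatial structure of the tensor enters the distance.
    """
    assert T1.shape == T2.shape
    # Grid position of every tensor entry, e.g. (i, j) for a matrix.
    coords = np.argwhere(np.ones(T1.shape, dtype=bool)).astype(float)
    # Pairwise squared distances between grid positions.
    sq = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1)
    G = np.exp(-sq / (2.0 * sigma ** 2))  # positive semidefinite metric
    d = (T1 - T2).ravel()
    return float(np.sqrt(d @ G @ d))

A = np.zeros((3, 3)); A[0, 0] = 1.0
B = np.zeros((3, 3)); B[0, 1] = 1.0   # same pattern shifted by one cell
C = np.zeros((3, 3)); C[2, 2] = 1.0   # same pattern in the far corner
# Euclidean distance cannot tell the two shifts apart; TD can.
assert np.isclose(np.linalg.norm(A - B), np.linalg.norm(A - C))
```

Under this metric a one-cell shift yields a smaller distance than a shift to the opposite corner, exactly the kind of structural information a flattened Euclidean comparison throws away.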

