Cost-Sensitive Attribute Reduction in Decision-Theoretic Rough Set Models

2014 ◽  
Vol 2014 ◽  
pp. 1-9 ◽  
Author(s):  
Shujiao Liao ◽  
Qingxin Zhu ◽  
Fan Min

In recent years, the theory of decision-theoretic rough sets and its applications have been studied extensively, including the attribute reduction problem. However, most researchers focus only on decision cost rather than test cost. In this paper, we study the attribute reduction problem with both types of cost in decision-theoretic rough set models. A new definition of attribute reduct is given, and attribute reduction is formulated as an optimization problem that aims to minimize the total cost of classification. Both backtracking and heuristic algorithms for the new problem are then proposed. The algorithms are tested on four UCI (University of California, Irvine) datasets. Experimental results demonstrate the efficiency and effectiveness of both algorithms. This study provides new insight into the attribute reduction problem in decision-theoretic rough set models.
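The abstract's heuristic idea, minimizing the combined test and decision cost by adding attributes greedily, can be sketched as follows. This is an illustrative reconstruction, not the paper's algorithm: the dataset, the test costs, the misclassification `penalty`, and the inconsistency-based decision cost are all assumptions made for the example.

```python
# Hypothetical sketch of greedy cost-sensitive attribute reduction:
# repeatedly add the attribute that most reduces total cost
# (sum of test costs + decision cost), stopping when no addition helps.
# Data, costs, and the decision-cost definition are illustrative.

def decision_cost(data, labels, attrs, penalty=10.0):
    """Toy decision cost: charge `penalty` for each object whose label
    disagrees with the majority label of its indiscernibility class
    under the selected attributes."""
    groups = {}
    for i, row in enumerate(data):
        key = tuple(row[a] for a in attrs)  # all objects share key () if attrs is empty
        groups.setdefault(key, []).append(i)
    cost = 0.0
    for idx in groups.values():
        lbls = [labels[i] for i in idx]
        majority = max(set(lbls), key=lbls.count)
        cost += penalty * sum(1 for l in lbls if l != majority)
    return cost

def greedy_reduct(data, labels, test_costs):
    """Greedily select attributes while the total cost keeps decreasing."""
    attrs, chosen = set(range(len(data[0]))), []
    best_total = decision_cost(data, labels, chosen)
    improved = True
    while improved:
        improved = False
        for a in sorted(attrs - set(chosen)):
            trial = chosen + [a]
            total = (sum(test_costs[x] for x in trial)
                     + decision_cost(data, labels, trial))
            if total < best_total:
                best_total, best_attr, improved = total, a, True
        if improved:
            chosen.append(best_attr)
    return chosen, best_total
```

On a toy table where attribute 0 alone determines the label, the sketch selects only that attribute, since paying the extra test cost of attribute 1 buys no reduction in decision cost.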

2013 ◽  
Vol 2013 ◽  
pp. 1-12 ◽  
Author(s):  
Hong Zhao ◽  
Fan Min ◽  
William Zhu

Measurement error with a normal distribution is universal in applications. Generally, a smaller measurement error requires a better instrument and a higher test cost. In decision making, we select an attribute subset with appropriate measurement errors to minimize the total test cost. Recently, an error-range-based covering rough set with uniformly distributed error was proposed to investigate this issue. However, measurement errors typically follow a normal distribution; the uniform distribution is rather simplistic for most applications. In this paper, we introduce normally distributed measurement errors into the covering-based rough set model and address the test-cost-sensitive attribute reduction problem in this new model. The contributions of this paper are fourfold. First, we build a new data model based on normally distributed measurement errors. Second, a covering-based rough set model with measurement errors is constructed through the “3-sigma” rule of the normal distribution; with this model, coverings are constructed from data rather than assigned by users. Third, the test-cost-sensitive attribute reduction problem is redefined on this covering-based rough set. Fourth, a heuristic algorithm is proposed for the new problem. Experimental results show that the algorithm is more effective and efficient than the existing one. This study suggests new research trends in cost-sensitive learning.
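The idea of building a covering from data via the 3-sigma rule can be sketched on a single numeric attribute. This is a minimal illustration under assumed notation, not the paper's exact construction: `sigma` stands for an assumed measurement-error standard deviation, and each object's block collects the objects indistinguishable from it within the 3-sigma error band.

```python
# Illustrative sketch: derive a covering of the universe from numeric data
# using the 3-sigma rule. Two objects are considered indistinguishable on
# an attribute if their values differ by at most 3*sigma, where sigma is
# an assumed measurement-error standard deviation (not from the paper).

def three_sigma_covering(values, sigma):
    """Return one block per object: the set of object indices whose
    attribute value lies within 3*sigma of that object's value."""
    return [
        {i for i, u in enumerate(values) if abs(u - v) <= 3 * sigma}
        for v in values
    ]
```

Because the blocks come from the data and the error model rather than from user-assigned neighborhoods, the covering changes automatically when either the measurements or the assumed error scale change.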


2016 ◽  
Vol 21 (20) ◽  
pp. 6159-6173 ◽  
Author(s):  
Anhui Tan ◽  
Weizhi Wu ◽  
Yuzhi Tao

Filomat ◽  
2018 ◽  
Vol 32 (5) ◽  
pp. 1817-1822
Author(s):  
Jingzheng Li ◽  
Xiangjian Chen ◽  
Pingxin Wang ◽  
Xibei Yang

In traditional cost-sensitive attribute reduction, the variation of decision cost is referred to as a global difference of costs, because the considered decision cost is the variation of the sum of decision costs over all objects. However, such reduction does not take the variation of each object's decision cost into account. To solve this problem, a local-view-based cost-sensitive attribute reduction is introduced. Firstly, by considering how the decision cost of a single object varies when the used attributes change, a local difference of costs is presented. Secondly, on the basis of the fuzzy decision-theoretic rough set model, a new significance function is given to measure the importance of attributes. Finally, experimental results on several UCI data sets illustrate that, compared with the traditional reduction, the proposed local view effectively decreases both the global and local differences of costs.
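The distinction between the global and local views can be made concrete with toy per-object decision costs (the numbers below are illustrative, not from the paper): the global difference aggregates costs before comparing, so per-object increases and decreases can cancel out, while the local difference compares object by object.

```python
# Hedged sketch of the global vs. local difference of decision costs,
# on assumed per-object cost vectors (not the paper's data).

def global_difference(costs_before, costs_after):
    # Aggregate first, then compare: opposite per-object changes cancel.
    return abs(sum(costs_after) - sum(costs_before))

def local_difference(costs_before, costs_after):
    # Compare object by object, then aggregate: no cancellation.
    return sum(abs(a - b) for a, b in zip(costs_before, costs_after))
```

For example, if one object's cost rises by 1 while another's falls by 1, the global difference is 0 even though both objects' costs changed, whereas the local difference is 2; this is exactly the variation the local view is designed to capture.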


2019 ◽  
Vol 186 ◽  
pp. 104938 ◽  
Author(s):  
Xiaojun Xie ◽  
Xiaolin Qin ◽  
Qian Zhou ◽  
Yanghao Zhou ◽  
Tong Zhang ◽  
...  

2016 ◽  
Vol 28 (15) ◽  
pp. 4125-4143 ◽  
Author(s):  
Zhongqin Bi ◽  
Feifei Xu ◽  
Jingsheng Lei ◽  
Teng Jiang

2011 ◽  
Vol 181 (22) ◽  
pp. 4928-4942 ◽  
Author(s):  
Fan Min ◽  
Huaping He ◽  
Yuhua Qian ◽  
William Zhu
