Weighted Entropy Based Knowledge Reduction in Learning from Significance-imbalanced Instances
Cite this article: LIU Jin-fu, YU Da-ren, HU Qing-hua. Weighted Entropy Based Knowledge Reduction in Learning from Significance-imbalanced Instances[J]. Journal of Guangxi Normal University (Natural Science Edition), 2006, 24(4): 38-41.
Authors: LIU Jin-fu  YU Da-ren  HU Qing-hua
Affiliation: Institute of Advanced Power Control and Reliability, Harbin Institute of Technology, Harbin 150001, Heilongjiang, China
Funding: National Natural Science Foundation of China (50306003); Harbin Institute of Technology Foundation (HIT2003.35)
Abstract: Because training instances differ in their distribution and in the subjective characteristics they reflect, individual instances are usually of unequal significance to a learning algorithm. To take this inequality into account during knowledge reduction, a knowledge reduction method based on weighted entropy is proposed. For the case where the inequality arises from an imbalanced class distribution, an inverse class-probability weighting scheme is proposed so that the knowledge carried by small-class instances is better reflected in the reduction. Experiments show that this weighting clearly improves the classification accuracy on small-class instances, and confirm that the weighted-entropy-based reduction method does reflect instance inequality in the reduction result.

Keywords: rough set  knowledge reduction  weighted entropy  imbalanced instances
Article ID: 1001-6600(2006)04-0038-04
Received: 2006-05-31
Revised: 2006-05-31

Weighted Entropy Based Knowledge Reduction in Learning from Significance-imbalanced Instances
LIU Jin-fu, YU Da-ren, HU Qing-hua. Weighted Entropy Based Knowledge Reduction in Learning from Significance-imbalanced Instances[J]. Journal of Guangxi Normal University (Natural Science Edition), 2006, 24(4): 38-41.
Authors: LIU Jin-fu  YU Da-ren  HU Qing-hua
Institution: Institute of Advanced Power Control and Reliability, Harbin Institute of Technology, Harbin 150001, China
Abstract: Because training instances differ in their distribution and subjective characteristics, the significance of each instance to a learning algorithm is usually unequal. To take this imbalance into account, this paper proposes a knowledge reduction approach based on weighted entropy. For imbalanced class distributions, the paper presents an inverse class-probability weighting approach that strengthens the influence of small-class instances in knowledge reduction. Experiments show that this weighting clearly improves the classification accuracy of the small-class instances, which confirms the validity of the weighted-entropy-based knowledge reduction approach in dealing with instance imbalance.
Keywords: rough set  knowledge reduction  weighted entropy  imbalanced instances
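
The two ideas named in the abstract, entropy computed with instance weights over the equivalence classes induced by the condition attributes, and inverse class-probability weights for imbalanced classes, can be illustrated with a small sketch. The Python below is a minimal illustration under assumptions, not the authors' implementation; the function names, the log base, and the toy data are hypothetical.

# Minimal sketch (assumption: not the paper's implementation) of weighted
# conditional entropy for rough-set knowledge reduction, using hypothetical
# inverse class-probability instance weights.
from collections import Counter, defaultdict
import math

def inverse_class_probability_weights(labels):
    # Weight each instance by 1 / P(its class): small classes get large weights.
    n = len(labels)
    freq = Counter(labels)
    return [n / freq[y] for y in labels]

def weighted_conditional_entropy(rows, labels, weights, attrs):
    # H_w(D | attrs): group instances into equivalence classes of the condition
    # attributes, then accumulate the decision entropy with instance weights
    # replacing raw counts.
    blocks = defaultdict(list)
    for i, row in enumerate(rows):
        blocks[tuple(row[a] for a in attrs)].append(i)
    total_w = sum(weights)
    h = 0.0
    for idx in blocks.values():
        block_w = sum(weights[i] for i in idx)
        class_w = defaultdict(float)
        for i in idx:
            class_w[labels[i]] += weights[i]
        h -= sum(cw / total_w * math.log2(cw / block_w) for cw in class_w.values())
    return h

# Toy usage: an imbalanced two-class decision over two condition attributes.
rows    = [{'a': 0, 'b': 1}, {'a': 0, 'b': 1}, {'a': 1, 'b': 0}, {'a': 1, 'b': 1}]
labels  = ['big', 'big', 'big', 'small']
weights = inverse_class_probability_weights(labels)
print(weighted_conditional_entropy(rows, labels, weights, ['a', 'b']))

In a reduction procedure of this kind, one would compare the weighted conditional entropy of candidate attribute subsets and retain attributes whose removal increases it, so that the weighting of small-class instances is carried into the reduct.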