Induction of hybrid decision tree based on post-discretization strategy
Authors: WANG Limin, YUAN Senmiao
Institution: College of Computer Science and Technology, Jilin University, Changchun 130012, China
Abstract: By redefining the test selection measure, we propose in this paper a new algorithm, Flexible NBTree, which induces a hybrid of a decision tree and Naive Bayes. Flexible NBTree mitigates the negative effect of information loss on test selection by applying a post-discretization strategy: at each internal node in the tree, we first select the test that is most useful for improving classification accuracy, and then apply discretization to continuous tests. The final decision tree contains univariate splits at its internal nodes, as in regular decision trees, but its leaves contain Naive Bayesian classifiers. To evaluate the performance of Flexible NBTree, we compare it with NBTree and C4.5, both of which apply pre-discretization of continuous attributes. Experimental results on a variety of natural domains indicate that Flexible NBTree achieves substantially higher classification accuracy.
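The abstract describes the algorithm only at a high level. Below is a minimal Python sketch of the general idea of a hybrid tree with univariate splits at internal nodes and Naive Bayes classifiers at the leaves; it is not the authors' Flexible NBTree implementation. The names (NBLeaf, SplitNode, build_tree, predict_one), the midpoint cut points, and the resubstitution-accuracy split score are illustrative assumptions standing in for the paper's test selection measure and post-discretization step.

# Illustrative sketch only (not the authors' code). Assumes a numeric
# feature matrix X and label vector y; requires numpy and scikit-learn.
import numpy as np
from sklearn.naive_bayes import GaussianNB

class NBLeaf:
    def __init__(self, X, y):
        self.model = GaussianNB().fit(X, y)      # local Naive Bayes leaf

class SplitNode:
    def __init__(self, feature, threshold, left, right):
        self.feature, self.threshold = feature, threshold
        self.left, self.right = left, right

def leaf_accuracy(X, y):
    # Resubstitution accuracy of a Naive Bayes model on this subset
    # (a crude stand-in for the paper's test selection measure).
    return np.mean(GaussianNB().fit(X, y).predict(X) == y)

def build_tree(X, y, min_leaf=30, depth=0, max_depth=3):
    # Stop and fit a Naive Bayes leaf when the node is small, pure, or deep.
    if depth >= max_depth or len(y) < 2 * min_leaf or len(np.unique(y)) == 1:
        return NBLeaf(X, y)
    best = None
    for f in range(X.shape[1]):
        vals = np.unique(X[:, f])
        # Candidate cut points: midpoints between adjacent observed values,
        # a simple stand-in for discretizing the continuous test.
        for t in (vals[:-1] + vals[1:]) / 2.0:
            mask = X[:, f] <= t
            nl, nr = mask.sum(), (~mask).sum()
            if nl < min_leaf or nr < min_leaf:
                continue
            # Weighted accuracy of Naive Bayes leaves on each side: the split
            # judged most useful for classification accuracy wins.
            score = (nl * leaf_accuracy(X[mask], y[mask])
                     + nr * leaf_accuracy(X[~mask], y[~mask])) / len(y)
            if best is None or score > best[0]:
                best = (score, f, t)
    if best is None:
        return NBLeaf(X, y)
    _, f, t = best
    mask = X[:, f] <= t
    return SplitNode(f, t,
                     build_tree(X[mask], y[mask], min_leaf, depth + 1, max_depth),
                     build_tree(X[~mask], y[~mask], min_leaf, depth + 1, max_depth))

def predict_one(node, x):
    # Route a sample down the univariate splits to a Naive Bayes leaf.
    while isinstance(node, SplitNode):
        node = node.left if x[node.feature] <= node.threshold else node.right
    return node.model.predict(x.reshape(1, -1))[0]

Usage: tree = build_tree(X, y) grows the hybrid tree, and predict_one(tree, x) classifies a new sample x by routing it through the splits to its Naive Bayes leaf.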

Keywords: machine learning; hybrid decision tree; Naive Bayes

WANG Limin,YUAN Senmiao.Induction of hybrid decision tree based on post-discretization strategy[J].Progress in Natural Science,2004,14(6):541-545.