Semi-Supervised Additive Logistic Regression: A Gradient Descent Solution
Citation: Yangqiu Song, Qutang Cai, Feiping Nie, Changshui Zhang. Semi-Supervised Additive Logistic Regression: A Gradient Descent Solution[J]. Tsinghua Science and Technology, 2007, 12(6): 638-646. DOI: 10.1016/S1007-0214(07)70168-2
Authors: Yangqiu Song, Qutang Cai, Feiping Nie, Changshui Zhang
Affiliation: State Key Laboratory on Intelligent Technology and Systems, Tsinghua National Laboratory for Information Science and Technology, Department of Automation, Tsinghua University, Beijing 100084, China
Funding: Open Fund of the State Key Laboratory on Intelligent Technology and Systems, Tsinghua University
Abstract: This paper describes a semi-supervised regularized method for additive logistic regression. A graph regularization term on the combined functions is added to the original cost functional used in AdaBoost; this term constrains the learned function to be smooth on a graph. The gradient solution is then computed, with the advantage that the regularization parameter can be selected adaptively. Finally, the function step size of each iteration is computed by Newton-Raphson iteration. Experiments on benchmark data sets show that the algorithm gives better results than existing methods.
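The procedure in the abstract (AdaBoost's exponential cost plus a graph-smoothness penalty, minimized by gradient descent with a Newton-Raphson line search for the step size) can be sketched roughly as below. This is not the authors' implementation: the additive weak-learner structure is abbreviated to a direct functional gradient on the vector F of function values, and the RBF affinity graph, `gamma`, `sigma`, and iteration counts are illustrative assumptions.

```python
import numpy as np

def rbf_laplacian(X, sigma=1.0):
    """Unnormalized graph Laplacian L = D - W from an RBF affinity graph
    (illustrative graph construction; the paper's graph may differ)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    return np.diag(W.sum(axis=1)) - W

def graph_regularized_boost(X, y, labeled, gamma=0.1, iters=50):
    """Gradient descent on
        J(F) = sum_{i in labeled} exp(-y_i F_i) + gamma * F^T L F,
    i.e. the AdaBoost cost functional plus a graph-smoothness penalty.
    F holds the combined function's values at all labeled and unlabeled
    points; the step size along the descent direction is found by a
    one-dimensional Newton-Raphson iteration, as in the abstract."""
    n = X.shape[0]
    L = rbf_laplacian(X)
    F = np.zeros(n)
    yl = y[labeled]
    for _ in range(iters):
        # Gradient: -y_i exp(-y_i F_i) on labeled points, 2*gamma*L F everywhere.
        g = 2.0 * gamma * (L @ F)
        g[labeled] += -yl * np.exp(np.clip(-yl * F[labeled], -30, 30))
        d = -g  # steepest-descent direction
        # Newton-Raphson on the scalar step a, for phi(a) = J(F + a*d).
        a = 0.0
        for _ in range(10):
            Fa = F + a * d
            e = np.exp(np.clip(-yl * Fa[labeled], -30, 30))
            phi1 = -(yl * d[labeled] * e).sum() + 2.0 * gamma * d @ L @ Fa
            phi2 = (d[labeled] ** 2 * e).sum() + 2.0 * gamma * d @ L @ d
            if phi2 <= 1e-12:   # guard against a degenerate direction
                break
            a -= phi1 / phi2
        F = F + a * d
    return np.sign(F)
```

With two well-separated Gaussian clusters and one labeled point per cluster, the graph penalty propagates the two labels to the unlabeled points in each cluster, which is the semi-supervised effect the abstract relies on.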

Keywords: semi-supervised; Boosting; graph regularization
Received: 2007-03-15
Revised: 2007-07-10

Indexed by CNKI, VIP, Wanfang Data, ScienceDirect, and other databases.