Investigation on DirectSVM Algorithm for Support Vector Machine
Cite this article: Zhao Ying, Liu Hong-Xing, Gao Dun-Tang. Investigation on DirectSVM Algorithm for Support Vector Machine[J]. Journal of Nanjing University (Natural Science Edition), 2006, 42(4): 368-372.
Authors: Zhao Ying  Liu Hong-Xing  Gao Dun-Tang
Institution: Department of Electronic Science and Engineering, Nanjing University, Nanjing 210093, China
Abstract: DirectSVM is a simple, fast iterative algorithm for training support vector machines, and it has the most direct geometric interpretation. For a linearly separable two-class training set, the algorithm takes the two closest samples of opposite classes as support vectors and the perpendicular bisector plane of the segment joining them as the initial classification hyperplane; it then determines new support vectors step by step according to the classification results, gradually converging toward the optimal separating hyperplane. We tested the algorithm and found that it has limitations; we analyze the root cause of these limitations and discuss how DirectSVM can be used appropriately. The conclusion is that solving for the optimal separating hyperplane directly with DirectSVM is unreliable, but the algorithm can serve as an approximate SVM method, or as a way to obtain a candidate support-vector set that is then combined with other classical algorithms.
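
As a minimal illustration of the initialization step described in the abstract, the sketch below (Python with NumPy; the names initial_hyperplane, X_pos, and X_neg are ours, not from the paper) finds the closest pair of opposite-class samples and returns the perpendicular bisector of the segment joining them as the initial hyperplane w·x + b = 0:

import numpy as np

def initial_hyperplane(X_pos, X_neg):
    # Pairwise squared distances between the two classes.
    d2 = ((X_pos[:, None, :] - X_neg[None, :, :]) ** 2).sum(axis=2)
    i, j = np.unravel_index(np.argmin(d2), d2.shape)
    sv_pos, sv_neg = X_pos[i], X_neg[j]   # initial support vectors
    # The normal vector points from the negative to the positive support
    # vector; the plane passes through the midpoint of the segment.
    w = sv_pos - sv_neg
    b = -w @ (sv_pos + sv_neg) / 2.0
    return w, b, sv_pos, sv_neg

The decision value w·x + b is +|w|²/2 at sv_pos and -|w|²/2 at sv_neg, confirming that the plane perpendicularly bisects the segment between the two initial support vectors.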

Keywords: support vector machine  direct support vector machine (DirectSVM)  optimal separating hyperplane
Received: 24 December 2005

Investigation on DirectSVM Algorithm for Support Vector Machine
Zhao Ying, Liu Hong-Xing, Gao Dun-Tang. Investigation on DirectSVM Algorithm for Support Vector Machine[J]. Journal of Nanjing University: Nat Sci Ed, 2006, 42(4): 368-372.
Authors: Zhao Ying  Liu Hong-Xing  Gao Dun-Tang
Institution: Department of Electronic Science and Engineering, Nanjing University, Nanjing 210093, China
Abstract: Support vector machines (SVMs), which are based on the principle of structural risk minimization, are among the most sophisticated and powerful classifiers available today. Training an SVM classifier essentially amounts to solving a quadratic programming (QP) problem. Among SVM training algorithms, Sequential Minimal Optimization (SMO) and the Nearest Point Algorithm (NPA) have attracted particular attention. Platt's SMO is a fast iterative algorithm that divides the large-scale QP problem into a series of small-scale QP sub-problems, thereby avoiding the enormous matrix storage and expensive matrix operations required by the original QP problem. The NPA transforms a particular SVM classification formulation into the problem of finding the nearest points between two closed convex polytopes, formed from the two training sample sets, in the feature space. DirectSVM is a very simple iterative algorithm for constructing SVM classifiers, and it is the most geometrically intuitive. The algorithm rests on the proposition that the two closest training points of opposite classes in a training set are support vectors. Further support vectors are found using the following conjecture: the training point that maximally violates the current hyperplane is also a support vector. For linearly separable data the algorithm runs as follows: first, the two nearest training samples of opposite classes are taken as the initial support vectors, and the corresponding initial classification hyperplane is constructed from them; then the training point that maximally violates the current hyperplane is added as a new support vector, and the hyperplane is modified accordingly; the support vector set and the hyperplane are updated iteratively in this way until no sample lies closer to the classification hyperplane than the support vectors do, at which point the optimal hyperplane is obtained. In this paper the algorithm is tested on several instances and a limitation is found. The cause of the limitation and the appropriate way to use the algorithm are discussed. The conclusion is that DirectSVM is not always reliable as a direct solver, but it can be used to construct an approximately optimal hyperplane. It can also be used to search for a candidate support vector set, which is then taken as a new training set and trained with other classical SVM algorithms such as SMO and NPA.
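
The full iteration described above can be sketched as follows, again for the linearly separable case. This is an illustrative outline, not the paper's implementation: refitting the hyperplane on the current candidate set is done here with scikit-learn's linear SVC at a large C (a hard-margin stand-in for DirectSVM's geometric update rule), and the function name direct_svm_candidates is ours.

import numpy as np
from sklearn.svm import SVC

def direct_svm_candidates(X, y, max_iter=100, tol=1e-9):
    # Labels y must be in {-1, +1}. Initial support vectors: the
    # closest pair of opposite-class samples (stored as global indices).
    pos_idx, neg_idx = np.where(y == 1)[0], np.where(y == -1)[0]
    d2 = ((X[pos_idx][:, None, :] - X[neg_idx][None, :, :]) ** 2).sum(axis=2)
    i, j = np.unravel_index(np.argmin(d2), d2.shape)
    cand = [int(pos_idx[i]), int(neg_idx[j])]
    for _ in range(max_iter):
        # Refit on the candidates; a large C approximates a hard margin.
        clf = SVC(kernel="linear", C=1e6).fit(X[cand], y[cand])
        w, b = clf.coef_[0], clf.intercept_[0]
        margins = y * (X @ w + b)     # signed functional margins
        k = int(np.argmin(margins))   # maximal violator of current plane
        # Stop when no sample lies closer to the plane than the current
        # support vectors do.
        if k in cand or margins[k] >= margins[cand].min() - tol:
            break
        cand.append(k)
    return np.array(cand)

In line with the paper's conclusion, the returned indices are best treated as a candidate support-vector set: the samples X[cand], y[cand] can be handed to a classical solver such as SMO for the final training, rather than trusting the hyperplane from the iteration itself.
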
Keywords: support vector machine  DirectSVM  optimal hyperplane
This article is indexed in CNKI, VIP (Weipu), Wanfang Data, and other databases.