Scene matching based on speeded up robust features

Citation: CHEN Bing, ZHAO Yi-gong, LI Xin. Scene matching based on speeded up robust features[J]. Systems Engineering and Electronics, 2009, 31(11): 2714-2718.
Authors: CHEN Bing  ZHAO Yi-gong  LI Xin
Institution: Institute of Pattern Recognition and Intelligent Control, Xidian University, Xi'an, Shaanxi 710071, China
Abstract: To address the large geometric distortions that arise in scene matching for electro-optical imaging guidance, a scene-matching algorithm based on speeded up robust features (SURF) is proposed. SURF features are invariant to image scale and rotation, robust to illumination change, and fast to compute. The algorithm first applies affine transforms to the reference image for 3D viewpoint compensation, simulating its appearance under different viewing angles so as to reduce the viewpoint difference between the reference image and the real-time image, and then extracts SURF features from both images. Matched SURF feature-point pairs are selected by a minimum Euclidean distance criterion, and the fundamental matrix is estimated from these pairs to obtain the projective relation between the two images. Simulation results show that the algorithm accommodates the geometric distortions encountered in electro-optical imaging guidance and achieves stable scene matching.

Keywords: scene matching  speeded up robust features  3D viewpoint compensation  feature matching  fundamental matrix
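The pipeline described in the abstract maps onto standard computer-vision primitives. The sketch below is a minimal Python illustration using OpenCV's contrib SURF module, not the paper's implementation: the function names, the single tilt angle, the Hessian threshold, the ratio test, and the RANSAC parameters are all illustrative assumptions.

```python
# Minimal sketch of the abstract's pipeline. All parameter values are
# assumptions; SURF requires opencv-contrib-python built with nonfree modules.
import cv2
import numpy as np

def simulate_viewpoint(image, tilt_deg):
    """Approximate imaging of the reference scene under an oblique
    viewing angle with an affine warp (3D viewpoint compensation)."""
    h, w = image.shape[:2]
    s = np.cos(np.deg2rad(tilt_deg))                 # foreshortening factor
    A = np.float32([[s, 0.0, (1.0 - s) * w / 2.0],   # squeeze x about the
                    [0.0, 1.0, 0.0]])                # image center
    return cv2.warpAffine(image, A, (w, h))

def match_scene(ref, live, tilt_deg=30.0, ratio=0.7):
    """ref, live: 8-bit grayscale images. Returns the estimated
    fundamental matrix and the inlier point arrays."""
    # 1) Compensate the reference view, then extract SURF features
    #    from both images.
    ref_warped = simulate_viewpoint(ref, tilt_deg)
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    kp1, des1 = surf.detectAndCompute(ref_warped, None)
    kp2, des2 = surf.detectAndCompute(live, None)

    # 2) Match descriptors by minimum Euclidean (L2) distance; the
    #    ratio test (an added assumption) rejects ambiguous matches.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = []
    for pair in matcher.knnMatch(des1, des2, k=2):
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good.append(pair[0])
    if len(good) < 8:
        return None, None, None      # too few matches to estimate F

    # 3) Estimate the fundamental matrix with RANSAC; F encodes the
    #    projective relation between the two images.
    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])
    F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 3.0, 0.99)
    if F is None:
        return None, None, None
    inliers = mask.ravel() == 1
    return F, pts1[inliers], pts2[inliers]
```

In the paper's setting, presumably several simulated views (multiple tilt angles) of the reference image would be matched against the real-time image and the view yielding the most RANSAC inliers retained; the single-tilt call above shows one such comparison.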
