RGB and LBP-texture deep nonlinearly fusion features for fabric retrieval
Abstract: Fabric retrieval is challenging because fabric images often suffer from viewpoint variations, illumination changes, blots, and poor image quality. In this work, a novel deep feature nonlinear fusion network (DFNFN) is proposed to nonlinearly fuse features learned from RGB and texture images to improve fabric retrieval. Texture images are obtained by describing RGB fabric images with local binary pattern texture (LBP-Texture) features. The DFNFN first applies two feature-learning branches that process the RGB images and the corresponding LBP-Texture images simultaneously. Each branch uses the same convolutional neural network (CNN) architecture but learns its parameters independently. Then, a nonlinear fusion module (NFM) is designed to concatenate the features produced by the two branches and nonlinearly fuse the concatenated features via a convolutional layer followed by a rectified linear unit (ReLU). The NFM is flexible in that it can be embedded at different depths of the DFNFN to find the best fusion position. Consequently, the DFNFN can optimally fuse features learned from RGB and LBP-Texture images to boost retrieval accuracy. Extensive experiments on the Fabric 1.0 dataset show that the proposed method is superior to many state-of-the-art methods.
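The two building blocks the abstract describes — deriving an LBP-Texture image from a grayscale fabric image, and the NFM's concatenate-then-convolve-then-ReLU fusion — can be sketched in NumPy. This is a minimal illustration assuming the basic 8-neighbour LBP operator and a 1×1 fusion convolution; the paper's exact LBP variant, kernel size, and channel counts are not specified in the abstract, so the shapes and names below are illustrative only.

```python
import numpy as np

def lbp_image(gray):
    """Basic 8-neighbour LBP code for each interior pixel of a grayscale image.

    Each neighbour >= centre contributes one bit, giving a code in [0, 255].
    """
    g = gray.astype(np.int32)
    c = g[1:-1, 1:-1]                      # centre pixels
    H, W = g.shape
    # clockwise neighbour offsets, starting at the top-left pixel
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offsets):
        n = g[1 + dy:H - 1 + dy, 1 + dx:W - 1 + dx]   # shifted neighbour view
        codes |= (n >= c).astype(np.int32) << bit
    return codes.astype(np.uint8)

def nfm_fuse(feat_rgb, feat_lbp, weight):
    """Nonlinear fusion module sketch: concatenate two (C, H, W) feature maps
    along channels, apply a 1x1 convolution (a per-pixel channel-mixing
    matrix multiply with `weight` of shape (C_out, 2C)), then ReLU."""
    concat = np.concatenate([feat_rgb, feat_lbp], axis=0)   # (2C, H, W)
    fused = np.einsum('oc,chw->ohw', weight, concat)        # 1x1 convolution
    return np.maximum(fused, 0.0)                           # ReLU
```

In the full network, `nfm_fuse` would sit between the two CNN branches at whichever depth gives the best retrieval accuracy; here the 1×1 convolution stands in for the fusion convolution, whose actual kernel size in the paper is not given in the abstract.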
This article is indexed by CNKI and other databases.