
Few-Shot Learning Based on Contrastive Learning Method
FU Haitao, LIU Shuo, FENG Yuxuan, ZHU Li, ZHANG Jingji, GUAN Lu. Few-Shot Learning Based on Contrastive Learning Method[J]. Journal of Jilin University (Science Edition), 2023, 61(1): 111-117
Authors: FU Haitao, LIU Shuo, FENG Yuxuan, ZHU Li, ZHANG Jingji, GUAN Lu
Affiliation: 1. College of Information Technology, Jilin Agricultural University, Changchun 130118, China; 2. School of Economics and Management, Changchun University of Science and Technology, Changchun 130022, China
Abstract: To address current problems in few-shot learning, we design a new network structure and a corresponding training method. In the feature-embedding part of the network, a convolutional network combined with multi-scale sliding pooling is used to enhance feature extraction. The main body of the network is a Siamese-style network, which facilitates learning semantics from small amounts of data through comparison between samples. The training method adopts nested-level parameter updating to ensure stable convergence. Comparative experiments with common visual models and state-of-the-art few-shot learning methods on two classical few-shot learning datasets show that the proposed method significantly improves few-shot learning accuracy and can serve as a solution when samples are insufficient.
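The abstract describes the approach only at a high level: a convolutional embedding enhanced with multi-scale sliding pooling, a Siamese-style comparison between sample pairs, and a contrastive objective. The paper's actual architecture and training code are not given here, so the following PyTorch sketch is purely illustrative; all layer sizes, the pooling scales (2, 3, 5), and the margin-based contrastive loss are assumptions, and the nested-level parameter-update scheme mentioned in the abstract is not shown.

```python
# Illustrative sketch only: layer sizes, pooling scales, and the margin loss
# are assumptions; this is not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiScaleSlidePool(nn.Module):
    """Pool the same feature map with several sliding-window sizes and concatenate."""
    def __init__(self, kernel_sizes=(2, 3, 5)):
        super().__init__()
        self.kernel_sizes = kernel_sizes

    def forward(self, x):                      # x: (B, C, H, W)
        pooled = [F.avg_pool2d(x, k, stride=1, padding=k // 2) for k in self.kernel_sizes]
        # Crop back to a common spatial size before concatenating along channels.
        h, w = x.shape[-2:]
        pooled = [p[..., :h, :w] for p in pooled]
        return torch.cat(pooled, dim=1)        # (B, C * len(kernel_sizes), H, W)

class Embedding(nn.Module):
    """Convolutional feature-embedding branch shared by both inputs of the Siamese pair."""
    def __init__(self, out_dim=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.BatchNorm2d(32), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.ms_pool = MultiScaleSlidePool()
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                  nn.Linear(64 * 3, out_dim))

    def forward(self, x):
        return self.head(self.ms_pool(self.conv(x)))

class SiameseNet(nn.Module):
    """Siamese-style network: both samples pass through the same embedding."""
    def __init__(self):
        super().__init__()
        self.embed = Embedding()

    def forward(self, x1, x2):
        z1, z2 = self.embed(x1), self.embed(x2)
        return F.pairwise_distance(z1, z2)     # small distance => likely same class

def contrastive_loss(dist, same_class, margin=1.0):
    """Classic margin-based contrastive loss over pairwise distances."""
    pos = same_class * dist.pow(2)
    neg = (1 - same_class) * F.relu(margin - dist).pow(2)
    return (pos + neg).mean()

# Toy usage on random tensors (shapes only; no real few-shot episode sampling here).
net = SiameseNet()
x1, x2 = torch.randn(4, 3, 84, 84), torch.randn(4, 3, 84, 84)
y = torch.tensor([1., 0., 1., 0.])             # 1 = same class, 0 = different
loss = contrastive_loss(net(x1, x2), y)
loss.backward()
```

In this sketch, multi-scale sliding pooling simply averages the same feature map with several window sizes at stride 1 and concatenates the results along the channel dimension; the pooling scheme and the contrastive training procedure used in the paper may differ.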
Keywords: few-shot learning; contrastive learning; Siamese network; sliding pooling
Received: 2022-07-24