
Lightweight high-resolution remote sensing scene classification based on multi-level adaptive knowledge distillation
Cite this article: WENG Qian, HUANG Zhiming, LIN Jiawen, JIAN Cairen, LIAO Xiangwen. Lightweight high-resolution remote sensing scene classification based on multi-level adaptive knowledge distillation[J]. Journal of Fuzhou University (Natural Science Edition), 2023, 51(4): 459-466.
Authors: WENG Qian, HUANG Zhiming, LIN Jiawen, JIAN Cairen, LIAO Xiangwen
Institutions: College of Computer and Data Science, Fuzhou University (WENG, HUANG, LIN, LIAO); School of Information Science & Technology, Tan Kah Kee College, Xiamen University (JIAN)
Funding: National Natural Science Foundation of China (General, Key and Major Programs); Natural Science Foundation of Fujian Province (General, Key and Major Programs); Education and Scientific Research Project for Young and Middle-aged Teachers of Fujian Province (Science and Technology)
Abstract: Deep neural network models are computationally expensive and slow, while lightweight models are fast but less accurate, so neither can be deployed directly on embedded devices. To address this, a multi-level adaptive knowledge distillation method is proposed to improve the performance of a lightweight model. First, because the degree of difference between remote sensing image categories is uneven, the temperature mechanism of output-layer knowledge distillation is improved into an adaptive temperature mechanism, helping the student model better learn the output-layer probability distribution of a large, deep teacher model. Then, auxiliary convolution blocks are added to incorporate feature-layer knowledge distillation, so that the student model learns multi-level knowledge from the teacher model. Experiments on two high-resolution remote sensing scene classification datasets verify the effectiveness of the proposed method.
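The output-layer distillation described in the abstract builds on the standard temperature-scaled soft-target mechanism that the paper's adaptive temperature extends. The adaptive rule itself is not specified in this record, so the following NumPy sketch illustrates only the fixed-temperature baseline: a temperature-softened softmax and the KL-divergence distillation loss between teacher and student. All function names and logit values are illustrative, not taken from the paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: a higher T flattens the distribution,
    # exposing the teacher's relative confidence in non-target classes.
    z = logits / T
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(teacher_logits, student_logits, T=4.0):
    # KL divergence between softened teacher and student distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across T.
    p = softmax(teacher_logits, T)   # soft teacher targets
    q = softmax(student_logits, T)   # soft student predictions
    return (T ** 2) * np.sum(p * (np.log(p) - np.log(q)))

teacher = np.array([8.0, 2.0, 1.0, 0.5])   # confident teacher logits
student = np.array([3.0, 1.5, 1.0, 0.8])   # less confident student logits

print(softmax(teacher, T=1.0).round(3))    # nearly one-hot
print(softmax(teacher, T=4.0).round(3))    # inter-class similarities visible
print(kd_loss(teacher, student))
```

An adaptive scheme in the spirit of the paper would replace the fixed `T` with a value computed per sample or per class, e.g. from how separable the teacher's logits are; the loss itself would stay in this form.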

Keywords: knowledge distillation; scene classification; adaptive temperature distillation; feature distillation; convolutional neural networks
Received: 2022-06-24
Revised: 2023-03-06
