Layer Pruning via Fusible Residual Convolutional Block for Deep Neural Networks
Citation: XU Pengtao, CAO Jian, SUN Wenyu, LI Pu, WANG Yuan, ZHANG Xing. Layer Pruning via Fusible Residual Convolutional Block for Deep Neural Networks[J]. Acta Scientiarum Naturalium Universitatis Pekinensis, 2022, 58(5): 801-807.
Authors: XU Pengtao  CAO Jian  SUN Wenyu  LI Pu  WANG Yuan  ZHANG Xing
Institution: School of Software and Microelectronics, Peking University, Beijing 102600
Foundation item: Supported by the National Key R&D Program of China (2018YFE0203801)
Abstract: Compressed models produced by current mainstream pruning methods tend to suffer from long inference time and degraded performance. To address this, an easy-to-use and high-performing layer pruning method is proposed. The original convolutional layers of the model are transformed into fusible residual convolutional blocks, and layer pruning is then realized through sparse training. The resulting method is straightforward to apply in engineering practice and combines short inference time with good pruning quality. Experiments on image classification and object detection tasks show that the method achieves very high compression rates with little accuracy loss, outperforming state-of-the-art filter (convolutional kernel) pruning methods.
Keywords: convolutional neural network  layer pruning  fusible residual convolutional block  sparse training  image classification
Received: 2021-09-30
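To make the mechanism described in the abstract concrete, below is a minimal PyTorch sketch of how such a block and its sparse training could look. It is a reconstruction from the abstract alone, not the authors' implementation: the block form y = gamma * BN(Conv(x)) + x, the L1 penalty on the scalar gate gamma, and the names FusibleResidualBlock, sparsity_penalty, and prune_layers are all assumptions made for illustration.

# Hypothetical illustration in PyTorch; not the authors' released code.
# Assumed block form: y = gamma * BN(Conv(x)) + x, with one learnable
# scalar gate gamma per block that an L1 penalty drives toward zero
# during sparse training. Assumes channel-preserving, stride-1
# convolutions so the identity shortcut is shape-compatible.

import torch
import torch.nn as nn

class FusibleResidualBlock(nn.Module):
    def __init__(self, channels, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size,
                              padding=kernel_size // 2, bias=False)
        self.bn = nn.BatchNorm2d(channels)
        self.gamma = nn.Parameter(torch.ones(1))  # scalar gate on the conv branch

    def forward(self, x):
        return self.gamma * self.bn(self.conv(x)) + x

def sparsity_penalty(model, weight=1e-4):
    # Added to the task loss during sparse training:
    #   loss = task_loss + sparsity_penalty(model)
    return weight * sum(m.gamma.abs().sum()
                        for m in model.modules()
                        if isinstance(m, FusibleResidualBlock))

def prune_layers(model, threshold=1e-3):
    # After sparse training, a block whose gate has collapsed to ~0
    # computes approximately the identity, so the whole layer can be
    # removed from the inference graph.
    for name, child in model.named_children():
        if isinstance(child, FusibleResidualBlock) and child.gamma.abs().item() < threshold:
            setattr(model, name, nn.Identity())
        else:
            prune_layers(child, threshold)

Under this reading, pruning operates at the granularity of whole layers rather than individual filters: a block whose gate reaches zero is replaced by an identity mapping and disappears from the inference graph entirely, which is consistent with the short inference times the abstract reports.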

  