Image Classification Method Based on an Improved S-ReLU Activation Function

Citation: Xu Jingping, Wang Fang. Image Classification Method Based on an Improved S-ReLU Activation Function[J]. Science Technology and Engineering, 2022, 22(29): 12963-12968
Authors: Xu Jingping, Wang Fang
Affiliation: School of Science, Yanshan University
Funding: National Natural Science Foundation of China; Natural Science Foundation of Hebei Province
Abstract: To address the problem that ReLU outputs zero for all negative inputs, leaving the corresponding weights unable to update, a new activation function, S-ReLU, is proposed. S-ReLU is softly saturating in the negative region, which increases the attention paid to negative sample data. By assigning a small nonzero derivative to negative outputs, it allows gradients from negative inputs to propagate backward and improves the robustness of the model. The image classification performance of S-ReLU is examined through comparative experiments against other common activation functions, using the LeNet-5 model on the MNIST and CIFAR-10 datasets. The results show that on both MNIST and CIFAR-10, S-ReLU achieves higher classification accuracy than the other activation functions.

Keywords: activation function; image classification; convolutional neural network; feature extraction
Received: 2022-01-11
Revised: 2022-07-19
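
The abstract names the defining properties of S-ReLU (identity on positive inputs, soft saturation and a small nonzero derivative on negative inputs) without reproducing its formula. The following is only a minimal sketch of a soft-saturating ReLU variant with those properties, assuming an ELU-style form with a small scale alpha; it is not the paper's exact S-ReLU definition.

# Hypothetical sketch of a soft-saturating ReLU variant with the
# properties the abstract describes; the exact S-ReLU formula is not
# given on this page, so an ELU-style stand-in with scale alpha is used.
import numpy as np

def s_relu_sketch(x, alpha=0.1):
    # Identity for x >= 0; alpha*(exp(x) - 1) for x < 0, which
    # saturates softly toward -alpha as x -> -inf (soft saturation).
    neg = alpha * np.expm1(np.minimum(x, 0.0))
    return np.where(x >= 0, x, neg)

def s_relu_sketch_grad(x, alpha=0.1):
    # Derivative: 1 for x >= 0; alpha*exp(x) for x < 0 -- small but
    # nonzero, so negative inputs still receive gradient in backprop,
    # unlike ReLU, whose derivative is exactly 0 there.
    return np.where(x >= 0, 1.0, alpha * np.exp(np.minimum(x, 0.0)))

The key contrast with plain ReLU is the gradient function: because it is never exactly zero, weights feeding units that receive mostly negative inputs can still be updated, which is the "dying ReLU" problem the abstract sets out to solve.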

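The comparison the abstract describes, i.e. the same LeNet-5 network trained with different activation functions, could be set up as in the hypothetical PyTorch sketch below. The layer sizes follow the classic LeNet-5 layout for 28x28 MNIST images; the paper's exact architecture and training configuration are not given on this page, so all details here are assumptions.

# Hypothetical LeNet-5 with a pluggable activation, so ReLU, the
# s_relu_sketch above (wrapped as an nn.Module), or any other candidate
# can be compared under identical conditions on MNIST.
import torch
import torch.nn as nn

class LeNet5(nn.Module):
    def __init__(self, act: nn.Module, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5, padding=2), act,  # 28x28 -> 28x28
            nn.MaxPool2d(2),                                 # -> 14x14
            nn.Conv2d(6, 16, kernel_size=5), act,            # -> 10x10
            nn.MaxPool2d(2),                                 # -> 5x5
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 5 * 5, 120), act,
            nn.Linear(120, 84), act,
            nn.Linear(84, num_classes),
        )
        # CIFAR-10 would instead need 3 input channels (32x32 images)
        # and a correspondingly larger flattened feature size.

    def forward(self, x):
        return self.classifier(self.features(x))

# e.g. train LeNet5(nn.ReLU()) and LeNet5(nn.ELU(alpha=0.1)) on the same
# data and compare test accuracy, as in the experiments described above.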