
Text classification method for textbook moral education based on deep learning
Cite this article: CHEN Haomiao, CHEN Junhua. Text classification method for textbook moral education based on deep learning[J]. Journal of Shanghai Normal University (Natural Sciences), 2024, 53(2): 172-180
Authors: CHEN Haomiao  CHEN Junhua
Affiliation: College of Information, Mechanical and Electrical Engineering, Shanghai Normal University, Shanghai 201418, China
Funding: National Social Science Foundation of China (13JZD046)
Abstract: The classification of moral education texts in Shanghai primary and secondary school textbooks is studied, and a model named IoMET_BBA, based on the bidirectional encoder representations from transformers (BERT) pre-trained model, a bidirectional long short-term memory (BiLSTM) network, and an attention mechanism, is proposed. Data augmentation is performed with the synthetic minority oversampling technique (SMOTE) and exploratory data analysis (EDA); BERT generates semantic vectors rich in contextual information; BiLSTM extracts features, combined with an attention mechanism to obtain word-level weight information; and a fully connected layer performs the final classification. Comparative experiments show that IoMET_BBA reaches an F1 score of 86.14%, outperforming the other models, and can accurately evaluate moral education texts in textbooks.
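The SMOTE augmentation step mentioned in the abstract rests on a simple idea: synthesize a new minority-class sample by interpolating between an existing sample and one of its nearest neighbours. A minimal sketch of that interpolation, assuming plain feature vectors (neighbour search and the paper's actual settings are omitted):

```python
import random

def smote_sample(x, x_neighbor, rng=random.Random(0)):
    """Create one synthetic minority-class sample by linear interpolation
    between a minority sample and a nearest neighbour (the core SMOTE idea;
    the k-nearest-neighbour search is omitted in this sketch)."""
    lam = rng.random()  # interpolation factor in [0, 1)
    return [xi + lam * (ni - xi) for xi, ni in zip(x, x_neighbor)]

# Two toy minority-class feature vectors:
a, b = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]
synthetic = smote_sample(a, b)  # lies on the segment between a and b
```

Each coordinate of the synthetic sample falls between the corresponding coordinates of the two inputs, which is why SMOTE tends to stay within the minority-class region rather than duplicating points outright.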

Keywords: moral education index  Chinese text classification  bidirectional encoder representations from transformers (BERT) model  bidirectional long short-term memory (BiLSTM) network  attention mechanism
Received: 2023-12-23

Text classification method for textbook moral education based on deep learning
CHEN Haomiao, CHEN Junhua. Text classification method for textbook moral education based on deep learning[J]. Journal of Shanghai Normal University (Natural Sciences), 2024, 53(2): 172-180
Authors:CHEN Haomiao  CHEN Junhua
Affiliation:College of Information, Mechanical and Electrical Engineering, Shanghai Normal University, Shanghai 201418, China
Abstract: The classification of moral education texts in Shanghai primary and secondary school textbooks was studied, and IoMET_BBA (indicators of moral education target based on BERT, BiLSTM, and attention), a model combining the bidirectional encoder representations from transformers (BERT) pre-trained model, a bidirectional long short-term memory (BiLSTM) network, and an attention mechanism, was proposed. Firstly, data augmentation was performed using the synthetic minority oversampling technique (SMOTE) and exploratory data analysis (EDA). Secondly, BERT was used to generate semantic vectors with rich contextual information. Thirdly, BiLSTM was adopted to extract features, combined with an attention mechanism to obtain word weight information. Finally, classification was performed through a fully connected layer. Comparative experimental results indicated that the F1 score of IoMET_BBA reached 86.14%, higher than that of the other models, and that the model could accurately evaluate the moral education texts of textbooks.
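The BiLSTM-plus-attention pooling described in the abstract can be sketched in plain Python: each timestep's hidden state is scored against a learned vector, the scores are softmax-normalized into word weights, and the weighted sum forms the sentence representation passed to the classifier. The weight vector `w` and the toy hidden states below are illustrative assumptions, not the paper's trained parameters:

```python
import math

def attention_pool(hidden_states, w):
    """Attention over (Bi)LSTM hidden states: score each timestep with a
    scoring vector w, softmax the scores into weights, and return both the
    weights and the weighted sum (the pooled context vector)."""
    scores = [sum(wi * hi for wi, hi in zip(w, h)) for h in hidden_states]
    m = max(scores)                         # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    alphas = [e / z for e in exps]          # word weights, sum to 1
    dim = len(hidden_states[0])
    context = [sum(a * h[d] for a, h in zip(alphas, hidden_states))
               for d in range(dim)]
    return alphas, context

# 3 timesteps of 2-dimensional hidden states (toy values):
H = [[0.1, 0.2], [0.4, 0.3], [0.9, 0.5]]
alphas, context = attention_pool(H, w=[1.0, 0.0])
```

In the full model, `context` would feed the fully connected classification layer; the weights `alphas` are the per-word importance information the abstract refers to.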
Keywords: moral education index  Chinese text classification  bidirectional encoder representations from transformers (BERT) model  bidirectional long short-term memory (BiLSTM) network  attention mechanism
