Cycle consistent style transfer based on style-transition attention
Cite this article: ZHANG Rui'er, BIAN Xiaohang, LIU Siyuan, LIU Bin, LI Jianwu, LUO Jun, QI Mingyue. Cycle consistent style transfer based on style-transition attention[J]. Journal of Hebei University of Science and Technology, 2024, 45(3): 328-340.
Authors: ZHANG Rui'er  BIAN Xiaohang  LIU Siyuan  LIU Bin  LI Jianwu  LUO Jun  QI Mingyue
Affiliations: College of Fine Arts and Design, Shenyang Normal University; Zhengzhou Electronic Information Vocational Technical College; School of Computer Science and Technology, Beijing Institute of Technology; School of Economics and Management, Hebei University of Science and Technology; Hebei Shenyue Software Technology Co., Ltd.; Hebei Yuesi Information Technology Co., Ltd.
Funding: National Cultural and Tourism Science and Technology Innovation Project (2020); Hebei Provincial Science and Technology Program (21310101D)
Abstract: To address the difficulty that existing artistic style transfer methods have in simultaneously preserving image content and transferring style patterns with high quality, a novel style-transition attention network (STANet) was introduced. It comprises two key parts: an asymmetric attention module that determines the style features of the reference image, and a cyclic structure that preserves the image content. First, a two-stream architecture was adopted to encode the style and content images separately. Second, the attention module was seamlessly integrated into the encoder to generate style-attention representations. Finally, the module was inserted at different convolution stages, making the encoder interleaved and promoting hierarchical information propagation from the style stream to the content stream. In addition, a cycle-consistency loss was proposed to force the network to preserve content structure and style patterns in a holistic manner. The results show that the interleaved encoder outperforms the traditional two-stream architecture, and that STANet can exchange the style patterns of two images with arbitrary styles, synthesizing higher-quality stylized images while better preserving their respective content. The proposed cycle network for style transfer with style-transition attention retains more content detail in the stylized images and generalizes well to arbitrary styles.
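The encoder described above can be illustrated with a brief sketch. The following PyTorch-style code shows a two-stream encoder in which an asymmetric attention module is inserted after each convolution stage so that style information flows into the content stream; all module names, layer sizes, and the exact attention formulation are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of an interleaved two-stream encoder with style-transition
# attention. Channel sizes and the attention form are assumed for illustration.
import torch
import torch.nn as nn


class StyleTransitionAttention(nn.Module):
    """Asymmetric attention: content features query style features."""
    def __init__(self, channels):
        super().__init__()
        self.q = nn.Conv2d(channels, channels, 1)  # queries from content
        self.k = nn.Conv2d(channels, channels, 1)  # keys from style
        self.v = nn.Conv2d(channels, channels, 1)  # values from style

    def forward(self, content_feat, style_feat):
        b, c, h, w = content_feat.shape
        q = self.q(content_feat).flatten(2).transpose(1, 2)   # B x HW x C
        k = self.k(style_feat).flatten(2)                      # B x C x HW'
        v = self.v(style_feat).flatten(2).transpose(1, 2)      # B x HW' x C
        attn = torch.softmax(q @ k / c ** 0.5, dim=-1)         # B x HW x HW'
        out = (attn @ v).transpose(1, 2).reshape(b, c, h, w)
        return content_feat + out                               # residual fusion


class InterleavedEncoder(nn.Module):
    """Two streams (content, style) with attention after every stage."""
    def __init__(self, stage_channels=(64, 128, 256)):
        super().__init__()
        def stage(cin, cout):
            return nn.Sequential(nn.Conv2d(cin, cout, 3, stride=2, padding=1),
                                 nn.ReLU(inplace=True))
        chans = (3,) + tuple(stage_channels)
        self.content_stages = nn.ModuleList(stage(chans[i], chans[i + 1])
                                            for i in range(len(stage_channels)))
        self.style_stages = nn.ModuleList(stage(chans[i], chans[i + 1])
                                          for i in range(len(stage_channels)))
        self.attn = nn.ModuleList(StyleTransitionAttention(c)
                                  for c in stage_channels)

    def forward(self, content, style):
        c_feat, s_feat = content, style
        for c_stage, s_stage, attn in zip(self.content_stages,
                                          self.style_stages, self.attn):
            c_feat, s_feat = c_stage(c_feat), s_stage(s_feat)
            c_feat = attn(c_feat, s_feat)  # style flows into the content stream
        return c_feat
```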

Keywords: image content  style transfer  style restoration  neural attention  cycle network
Received: 2024-03-05
Revised: 2024-04-30

Cycle consistent style transfer based on style-transition attention
ZHANG Rui'er, BIAN Xiaohang, LIU Siyuan, LIU Bin, LI Jianwu, LUO Jun, QI Mingyue. Cycle consistent style transfer based on style-transition attention[J]. Journal of Hebei University of Science and Technology, 2024, 45(3): 328-340.
Authors:ZHANG Rui'er  BIAN Xiaohang  LIU Siyuan  LIU Bin  LI Jianwu  LUO Jun  QI Mingyue
Abstract: To solve the problem that existing artistic style transfer methods cannot simultaneously preserve image content and transfer style patterns with high quality, a novel style-transition attention network (STANet) was introduced, which consists of two key parts: an asymmetric attention module used to determine the style features of the reference image, and a cyclic structure used to preserve the image content. Firstly, a two-stream architecture was adopted to encode the style and content images. Secondly, the attention module was seamlessly integrated into the encoder to generate style-attention representations. Finally, the module was inserted into different convolution stages, making the encoder interleaved and facilitating hierarchical information propagation from the style stream to the content stream. In addition, a cycle-consistency loss was proposed to force the network to retain the content structure and style patterns in a holistic manner. The results show that the interleaved encoder is superior to the traditional two-stream architecture, and that STANet can be used to exchange the style patterns of two images with arbitrary styles, synthesizing higher-quality stylized images while better preserving their respective content. The proposed cycle style transfer network with style-transition attention retains more content detail in the stylized images and achieves good performance in generalizing to arbitrary styles.
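The cycle-consistency objective mentioned in the abstract can be sketched as follows. Since the paper's exact loss is not given here, the exchange-then-restore formulation, the `stylize` callable, and the L1 reconstruction term are assumptions made only for illustration.

```python
# Hedged sketch of a cycle-consistency loss for style exchange between two
# images: swap styles, swap them back, and penalize the reconstruction error.
import torch.nn.functional as F


def cycle_consistency_loss(stylize, img_a, img_b):
    """stylize(content_img, style_img) -> stylized image (any callable model)."""
    # Forward pass: exchange the style patterns of the two images.
    a2b = stylize(img_a, img_b)   # content of img_a rendered in the style of img_b
    b2a = stylize(img_b, img_a)   # content of img_b rendered in the style of img_a
    # Backward pass: exchanging styles again should restore the originals.
    a_rec = stylize(a2b, b2a)
    b_rec = stylize(b2a, a2b)
    return F.l1_loss(a_rec, img_a) + F.l1_loss(b_rec, img_b)
```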
Keywords: image content  style transfer  style restoration  neural attention  cycle network