Telecommunications Science ›› 2022, Vol. 38 ›› Issue (1): 83-94. doi: 10.11959/j.issn.1000-0801.2022004

• Research and Development •

A flexible pruning strategy for deep convolutional neural networks

Liang CHEN1,2, Yaguan QIAN1,2, Zhiqiang HE1,2, Xiaohui GUAN3, Bin WANG4, Xing WANG4

  1 School of Science/School of Big-data Science, Zhejiang University of Science and Technology, Hangzhou 310023, China
    2 Hikvision-Zhejiang University of Science and Technology Edge Intelligence Security Lab, Hangzhou 310023, China
    3 College of Information Engineering & Art Design, Zhejiang University of Water Resources and Electric Power, Hangzhou 310023, China
    4 College of Electrical Engineering, Zhejiang University, Hangzhou 310063, China
  • Revised: 2021-12-13 Online: 2022-01-20 Published: 2022-01-01
  • About the authors: Liang CHEN (1995- ), male, is a master's student at Zhejiang University of Science and Technology; his main research interest is network pruning.
    Yaguan QIAN (1976- ), male, Ph.D., is an associate professor at Zhejiang University of Science and Technology; his main research interests are deep learning, artificial intelligence security, and big data processing.
    Zhiqiang HE (1996- ), male, is a master's student at Zhejiang University of Science and Technology; his main research interest is network pruning.
    Xiaohui GUAN (1977- ), female, Ph.D., is an associate professor at Zhejiang University of Water Resources and Electric Power; her main research interests are digital image processing and pattern recognition.
    Bin WANG (1978- ), male, Ph.D., is a research fellow at Zhejiang University; his main research interests are artificial intelligence security, IoT security, and cryptography.
    Xing WANG (1985- ), male, Ph.D., is a postdoctoral researcher at Zhejiang University; his main research interests are machine learning and IoT security.
  • Supported by:
    The National Key Research and Development Program of China (2018YFB2100400); The National Natural Science Foundation of China (61902082)



Abstract:

Despite the great success of deep convolutional neural networks in many applications, the redundancy of their structure leads to large memory requirements and high computational cost, making them hard to deploy on resource-constrained edge devices. Network pruning is an effective way to eliminate network redundancy. An efficient flexible pruning strategy was proposed to find the best network architecture under limited resources. On the one hand, the contribution of each channel was calculated while taking the distribution of channel scaling factors into account. On the other hand, the pruning result was reasonably estimated and simulated in advance to improve the efficiency of the pruning process. Experimental results with VGG16 and ResNet56 on CIFAR-10 show that the flexible pruning strategy reduces FLOPs by 71.3% and 54.3%, respectively, while accuracy drops by only 0.15 and 0.20 percentage points compared with the benchmark models.
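The channel-contribution and pre-simulation ideas described in the abstract can be sketched as follows. This is a minimal illustration in the spirit of scaling-factor-based channel pruning (e.g. ranking channels by their batch-norm γ); the function names (`channel_contribution`, `simulate_pruning`, `conv_flops`) and the normalized-|γ| contribution measure are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def channel_contribution(gammas):
    """Contribution of each channel from its scaling factor:
    normalized |gamma|, so a layer's contributions sum to 1."""
    g = np.abs(np.asarray(gammas, dtype=float))
    return g / g.sum()

def simulate_pruning(layer_gammas, keep_ratio):
    """Pre-simulate a prune: select which channels to keep per layer
    without touching any weights, so the outcome can be estimated first."""
    kept = {}
    for name, gammas in layer_gammas.items():
        contrib = channel_contribution(gammas)
        n_keep = max(1, int(round(keep_ratio * len(contrib))))
        # indices of the n_keep largest contributions
        kept[name] = sorted(np.argsort(contrib)[::-1][:n_keep].tolist())
    return kept

def conv_flops(c_in, c_out, k, h_out, w_out):
    """Multiply-add count of a k x k convolution layer."""
    return 2 * k * k * c_in * c_out * h_out * w_out

# Estimated FLOPs reduction if both input and output channels of a
# 3x3, 64->128 conv on a 32x32 feature map are cut in half:
before = conv_flops(64, 128, 3, 32, 32)
after = conv_flops(32, 64, 3, 32, 32)
reduction = 1 - after / before  # halving both channel counts cuts FLOPs by 75%
```

Because the simulation only ranks scaling factors and counts FLOPs, many candidate pruning ratios can be evaluated cheaply before a single channel is actually removed.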

Key words: convolutional neural network, network pruning, scaling factor, channel contribution
