Space-Integrated-Ground Information Networks ›› 2023, Vol. 4 ›› Issue (4): 79-85.doi: 10.11959/j.issn.2096-8930.2023045

• Applications •

Scalable Low Power Accelerator for Sparse Recurrent Neural Network

Panshi JIN1, Junjie LI2, Jingyi WANG2, Pengchong LI3, Lei XING2, Xiaodong LI1   

  1 China Construction Bank Co., Ltd., Beijing 100034, China
    2 Jianxin Financial Technology Co., Ltd., Shanghai 321004, China
    3 Inspur Electronic Information Industry Co., Ltd., Jinan, Shandong 250000, China
  • Revised:2023-11-30 Online:2023-12-01 Published:2023-12-01

Abstract:

The use of edge computing devices in bank outlets for passenger-flow analysis, security protection, and risk prevention and control is increasingly widespread, and the performance and power consumption of AI inference chips have become key factors in selecting such devices. Recurrent neural networks (RNN) suffer from high power consumption, weak inference performance, and low energy efficiency, caused by data dependence and low data reusability. To address these problems, this paper implemented a voltage-scalable low-power accelerator for sparse RNN on FPGA and verified it on an edge computing device. First, the sparse RNN was analyzed and the processing array was designed based on network compression. Second, because the workload of a sparse RNN is unbalanced, a voltage scaling method was introduced to maintain low power consumption and high throughput. Experiments show that this method significantly improves the RNN inference speed of the system and reduces the processing power consumption of the chip.
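The accelerator design itself is hardware (FPGA) and is not reproduced here, but the network-compression idea the abstract mentions can be sketched in software: magnitude pruning turns the RNN weight matrices sparse, and a compressed (CSR-style) layout lets each processing element skip zero weights. The sketch below is a minimal illustration under those assumptions; the function names, the pruning threshold, and the plain vanilla-RNN cell are all illustrative choices, not the authors' actual design.

```python
import math

def to_csr(dense, threshold=0.0):
    """Magnitude pruning + CSR compression: drop weights with |w| <= threshold.

    Returns (values, col_idx, row_ptr), the classic CSR triple. Only the
    surviving nonzeros are stored, which is what lets sparse hardware skip
    the pruned multiplications entirely.
    """
    values, col_idx, row_ptr = [], [], [0]
    for row in dense:
        for j, w in enumerate(row):
            if abs(w) > threshold:
                values.append(w)
                col_idx.append(j)
        row_ptr.append(len(values))
    return values, col_idx, row_ptr

def csr_matvec(csr, x):
    """y = W @ x using only the stored nonzeros of W."""
    values, col_idx, row_ptr = csr
    y = []
    for r in range(len(row_ptr) - 1):
        acc = 0.0
        for k in range(row_ptr[r], row_ptr[r + 1]):
            acc += values[k] * x[col_idx[k]]
        y.append(acc)
    return y

def rnn_step(Wx_csr, Wh_csr, b, x, h):
    """One vanilla-RNN step, h' = tanh(Wx·x + Wh·h + b), with pruned weights."""
    wx = csr_matvec(Wx_csr, x)
    wh = csr_matvec(Wh_csr, h)
    return [math.tanh(wx[i] + wh[i] + b[i]) for i in range(len(b))]
```

Note how the per-row nonzero counts (`row_ptr[r+1] - row_ptr[r]`) differ from row to row: this is exactly the workload imbalance the abstract refers to, which motivates scaling voltage and frequency down on lightly loaded processing elements while keeping overall throughput.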

Key words: RNN, sparse, low power consumption, acceleration scheme
