Big Data Research ›› 2020, Vol. 6 ›› Issue (4): 56-68. doi: 10.11959/j.issn.2096-0271.2020033

• TOPIC: HETEROGENEOUS PARALLEL SYSTEMS FOR BIG DATA •

Memory management in deep learning: a survey

Weiliang MA1,2, Xuan PENG1,2, Qian XIONG1,2, Xuanhua SHI1,2, Hai JIN1,2

  1 School of Computer Science and Technology, Huazhong University of Science and Technology, Wuhan 430074, China
  2 National Engineering Research Center for Big Data Technology and System, Services Computing Technology and System Lab, Huazhong University of Science and Technology, Wuhan 430074, China
  • Online: 2020-07-15  Published: 2020-07-18
  • Supported by:
    The National Natural Science Foundation of China (61772218)

Abstract:

In recent years, deep learning has achieved great success in many fields. As deep neural networks grow deeper and wider, their training and inference face huge memory pressure. The limited memory capacity of accelerator devices has become an important factor restricting the rapid development of deep neural networks, and efficient memory management has become a key issue for the further development of deep learning. Therefore, the basic characteristics of deep neural networks were introduced first, and the memory bottleneck in deep learning training was analyzed. Representative research works were then classified, and their advantages and disadvantages were discussed. Finally, important directions and trends of memory management in deep learning were suggested.
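As an illustration of one technique family covered by the survey (recomputation, also known as gradient checkpointing), the minimal PyTorch sketch below discards intermediate activations during the forward pass and recomputes them during the backward pass, trading extra compute for lower peak memory. The model, batch size, and segment count are illustrative assumptions, not taken from the paper.

    # Minimal sketch of recomputation (gradient checkpointing) in PyTorch.
    # Activations inside each checkpointed segment are dropped after the
    # forward pass and recomputed during the backward pass, reducing the
    # peak memory footprint at the cost of extra computation.
    # The model, sizes, and segment count are illustrative assumptions.
    import torch
    import torch.nn as nn
    from torch.utils.checkpoint import checkpoint_sequential

    model = nn.Sequential(
        *[nn.Sequential(nn.Linear(1024, 1024), nn.ReLU()) for _ in range(8)]
    )
    x = torch.randn(32, 1024, requires_grad=True)

    # Split the 8 blocks into 2 segments: only activations at segment
    # boundaries are stored; interior activations are recomputed in backward.
    out = checkpoint_sequential(model, 2, x)
    out.sum().backward()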

Key words: memory management, deep learning, memory swapping, recomputation, memory sharing, compression

