Paper

Peer-reviewed (full text link available)
November 1, 2022

One-shot pruning of gated recurrent unit neural network by sensitivity for time-series prediction

Neurocomputing
Hong Tang, Xiangzheng Ling, Liangzhi Li, Liyan Xiong, Yu Yao, Xiaohui Huang

Volume: 512
Start page: 15
End page: 24
Language: English
Publication type: Research article (academic journal)
DOI: 10.1016/j.neucom.2022.09.026

Although deep learning models have been successfully adopted in many applications, their high computational complexity makes them difficult to deploy on energy-limited devices (e.g., mobile devices). In this paper, we focus on reducing the cost of Gated Recurrent Units (GRUs) for time-series prediction tasks and propose a new pruning method that recognizes and removes the neural connections that have little influence on the network loss, using a controllable threshold on the absolute values of the pre-trained GRU weights. This differs from existing approaches, which usually try to find and preserve the connections with large weight values. We further propose a sparse-connection GRU model (SCGRU) that needs only a one-time pruning (with fine-tuning), rather than multiple prune-retrain cycles. Extensive experimental results demonstrate that the proposed method can largely reduce storage and computation costs while achieving state-of-the-art performance on two datasets. Code is available at https://github.com/imLingo/SCGRU.
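
To make the one-shot idea in the abstract concrete, below is a minimal PyTorch sketch, not the authors' released implementation: the function names one_shot_prune and apply_masks, the threshold value tau=0.05, and the toy model and data are all illustrative assumptions. It prunes a pre-trained GRU once by zeroing weights whose absolute value falls below a controllable threshold, then fine-tunes while re-applying the masks so the pruned connections stay at zero, mirroring the single prune-then-fine-tune cycle of SCGRU; the paper's actual sensitivity-based criterion may differ in detail.

    # Minimal sketch (assumptions as noted above): one-shot threshold pruning
    # of a pre-trained GRU, then fine-tuning with the sparsity pattern fixed.
    # Not the authors' released code.
    import torch
    import torch.nn as nn

    def one_shot_prune(gru: nn.GRU, tau: float) -> dict:
        """Zero every GRU weight with |w| < tau; return binary masks."""
        masks = {}
        with torch.no_grad():
            for name, param in gru.named_parameters():
                if "weight" in name:  # prune weight matrices, keep biases dense
                    mask = (param.abs() >= tau).float()
                    param.mul_(mask)
                    masks[name] = mask
        return masks

    def apply_masks(gru: nn.GRU, masks: dict) -> None:
        """Re-zero pruned weights after an optimizer step."""
        with torch.no_grad():
            for name, param in gru.named_parameters():
                if name in masks:
                    param.mul_(masks[name])

    # Toy usage: in practice the GRU would already be trained to convergence.
    gru = nn.GRU(input_size=8, hidden_size=32, batch_first=True)
    head = nn.Linear(32, 1)
    masks = one_shot_prune(gru, tau=0.05)  # tau: the controllable threshold
    opt = torch.optim.Adam(list(gru.parameters()) + list(head.parameters()), lr=1e-3)
    x, y = torch.randn(16, 20, 8), torch.randn(16, 1)  # dummy time-series batch
    for _ in range(10):  # single fine-tuning phase, no prune-retrain cycles
        out, _ = gru(x)
        loss = nn.functional.mse_loss(head(out[:, -1]), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
        apply_masks(gru, masks)  # keep pruned connections at zero

Re-applying the masks after each optimizer step matters because adaptive optimizers such as Adam can otherwise push zeroed weights back to nonzero values through their momentum terms.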

Link information
DOI: https://doi.org/10.1016/j.neucom.2022.09.026
Scopus: https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85138453665&origin=inward (full text link available)
Scopus Citedby: https://www.scopus.com/inward/citedby.uri?partnerID=HzOxMe3b&scp=85138453665&origin=inward
ID information
  • DOI: 10.1016/j.neucom.2022.09.026
  • ISSN: 0925-2312
  • eISSN: 1872-8286
  • Scopus ID: 85138453665
