Papers

Peer-reviewed  Lead author  Corresponding author
May 2018

Hyperbolic Gradient Operator and Hyperbolic Back-Propagation Learning Algorithms

IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
  • Tohru Nitta
  • Yasuaki Kuroe

Volume
29
Issue
5
Start page
1689
End page
1702
Language
English
Publication type
Research paper (academic journal)
DOI
10.1109/TNNLS.2017.2677446
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC

In this paper, we first extend the Wirtinger derivative, which is defined for complex functions, to hyperbolic functions, and use it to derive the hyperbolic gradient operator that yields the steepest descent direction. Next, we derive hyperbolic backpropagation learning algorithms for some multilayered hyperbolic neural networks (NNs) using the hyperbolic gradient operator. It is shown that the use of the Wirtinger derivative reduces the effort necessary for the derivation of the learning algorithms by half, simplifies the representation of the learning algorithms, and makes their computer programs easier to code. In addition, we discuss the differences between the derived Hyperbolic-BP rules and the complex-valued backpropagation learning rule (Complex-BP). Finally, we conduct experiments with the derived learning algorithms. As a result, we find that the convergence rates of the Hyperbolic-BP learning algorithms are high even if fully hyperbolic activation functions are used, and discover that the Hyperbolic-BP learning algorithm for the hyperbolic NN with the split-type hyperbolic activation function has the ability to learn hyperbolic rotation as an inherent property.
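For context, a minimal sketch of the construction the abstract describes, assuming the standard split-complex (hyperbolic-number) convention z = x + uy with u² = 1; this is illustrative and the paper's own notation may differ:

% Assumption: split-complex convention u^2 = 1; notation is a sketch,
% not necessarily identical to the paper's.
\[
z = x + u\,y, \qquad \bar{z} = x - u\,y, \qquad u^{2} = 1,
\]
% Hyperbolic analogues of the Wirtinger derivatives (derived from
% x = (z + \bar{z})/2 and y = u(z - \bar{z})/2, using 1/u = u):
\[
\frac{\partial}{\partial z}
  = \frac{1}{2}\left(\frac{\partial}{\partial x} + u\,\frac{\partial}{\partial y}\right),
\qquad
\frac{\partial}{\partial \bar{z}}
  = \frac{1}{2}\left(\frac{\partial}{\partial x} - u\,\frac{\partial}{\partial y}\right),
\]
% so that, mirroring the complex-valued Wirtinger-calculus case, a
% gradient-descent update for a real-valued error E would plausibly read
\[
\Delta w \;\propto\; -\,\frac{\partial E}{\partial \bar{w}}
\]
for each hyperbolic weight $w$. A single derivative with respect to $\bar{w}$ then replaces separate derivations in $x$ and $y$, which is the "by half" saving the abstract mentions. Here a split-type hyperbolic activation applies a real function componentwise, e.g. $f(z) = \tanh(x) + u\tanh(y)$, while a fully hyperbolic activation acts on $z$ as a whole.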

Links
DOI
https://doi.org/10.1109/TNNLS.2017.2677446
Web of Science
https://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcAuth=JSTA_CEL&SrcApp=J_Gate_JST&DestLinkType=FullRecord&KeyUT=WOS:000430729100024&DestApp=WOS_CPL
ID information
  • DOI : 10.1109/TNNLS.2017.2677446
  • ISSN : 2162-237X
  • eISSN : 2162-2388
  • Web of Science ID : WOS:000430729100024
