<Journal Article>
A Lyapunov-Based Method of Reducing Activation Functions of Recurrent Neural Networks for Stability Analysis

Abstract: This letter proposes a Lyapunov-based method of reducing the number of activation functions of a recurrent neural network (RNN) for its stability analysis. To the best of the authors’ knowledge, no method has been presented for pruning RNNs while respecting their stability properties. We are the first to present an effective solution method for this important problem in the control and machine learning communities. The proposed reduction method follows an intuitive policy: compose a reduced RNN by removing activation functions whose “magnitudes” with respect to their weighted actions are “small” in some sense, and analyze its stability to guarantee the stability of the original RNN. Moreover, we theoretically justify this policy by proving several theorems that are applicable to general reduction methods. In addition, we propose a method of rendering the proposed reduction method less conservative, on the basis of semidefinite programming. The effectiveness of the proposed methods is demonstrated on a numerical example.
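The pruning policy described in the abstract can be illustrated with a minimal sketch. The model form (a state equation with a static activation vector), the scoring rule (product of input- and output-weight norms as the “weighted magnitude” of each activation), and the `keep_ratio` parameter are all assumptions for illustration; the paper's exact criterion and the SDP-based refinement are not reproduced here.

```python
import numpy as np

def prune_activations(A, B, C, keep_ratio=0.5):
    """Heuristic sketch of activation-function pruning for an RNN of the
    assumed form  x+ = A x + B * phi(C x),  where phi acts elementwise.

    Activation i is scored by the product of the norms of its output
    weights B[:, i] and input weights C[i, :]; activations with small
    scores are removed, yielding a reduced RNN whose stability would
    then be analyzed in place of the original's.
    """
    # "Weighted magnitude" proxy for each activation function.
    scores = np.linalg.norm(B, axis=0) * np.linalg.norm(C, axis=1)
    n_keep = max(1, int(np.ceil(keep_ratio * len(scores))))
    # Indices of the strongest activations, in their original order.
    keep = np.sort(np.argsort(scores)[::-1][:n_keep])
    # Reduced RNN: x+ = A x + B[:, keep] * phi(C[keep, :] x)
    return A, B[:, keep], C[keep, :], keep
```

The reduced matrices `B[:, keep]` and `C[keep, :]` define a smaller RNN; per the abstract's policy, a stability certificate for this reduced system is then used to guarantee stability of the original.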


Details

Registered: 2025.03.03
Updated: 2025.03.04