<Conference Paper>
On the convergence of adaptive first order methods: Proximal gradient and alternating minimization algorithms

Creator(s)
Language
Publisher
Publication Date
Source Title
Start Page
End Page
Conference Information
Publication Type
Access Rights
Rights
Related DOI
Related URI
Related HDL
Abstract: Building upon recent works on linesearch-free adaptive proximal gradient methods, this paper proposes AdaPG^{q,r}, a framework that unifies and extends existing results by providing larger stepsize policies and improved lower bounds. Different choices of the parameters are discussed, and the efficacy of the resulting methods is demonstrated through numerical simulations. To better understand the underlying theory, convergence is established in a more general setting that allows for time-varying parameters. Finally, an adaptive alternating minimization algorithm is presented by exploring the dual setting. This algorithm not only incorporates additional adaptivity but also expands its applicability beyond standard strongly convex settings.
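
The abstract refers to linesearch-free adaptive stepsize policies for the proximal gradient method. The following is a minimal Python sketch of a generic iteration of this kind, driven by a Malitsky-Mishchenko-style local curvature estimate; the exact AdaPG^{q,r} stepsize rule and the roles of the parameters q and r are defined in the paper and are not reproduced here. The names grad_f and prox_g, and the specific stepsize bounds used below, are illustrative assumptions.

import numpy as np

def adaptive_proximal_gradient(grad_f, prox_g, x0, gamma0=1e-2, max_iter=500):
    # Sketch of a linesearch-free adaptive proximal gradient method.
    # The stepsize is driven by the local curvature estimate
    #   L_k = ||grad_f(x_k) - grad_f(x_{k-1})|| / ||x_k - x_{k-1}||,
    # following a Malitsky-Mishchenko-style update; this rule is only
    # illustrative and is NOT the AdaPG^{q,r} policy of the paper.
    x_prev = np.asarray(x0, dtype=float)
    g_prev = grad_f(x_prev)
    gamma_prev = gamma0
    gamma = gamma0
    x = prox_g(x_prev - gamma * g_prev, gamma)   # first proximal gradient step
    for _ in range(max_iter):
        g = grad_f(x)
        dx, dg = x - x_prev, g - g_prev
        # Local Lipschitz estimate (guarded against division by zero).
        L = np.linalg.norm(dg) / max(np.linalg.norm(dx), 1e-16)
        # Next stepsize: capped by a growth bound and by the curvature bound 1/(2L).
        gamma_next = min(gamma * np.sqrt(1.0 + gamma / gamma_prev),
                         1.0 / (2.0 * L) if L > 0 else np.inf)
        x_prev, g_prev = x, g
        gamma_prev, gamma = gamma, gamma_next
        x = prox_g(x - gamma * g, gamma)         # proximal step with the new stepsize
    return x

As a usage example, for f(x) = 0.5*||Ax - b||^2 one could pass grad_f = lambda x: A.T @ (A @ x - b) and, for an l1 penalty, prox_g as elementwise soft-thresholding.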

Full Text File

7234381.pdf (483 KB)

Details

EISSN
Record ID
Related URI
Subject
Funding Information
Registered: 2024.09.12
Updated: 2024.12.02