Abstract |
Building upon recent work on linesearch-free adaptive proximal gradient methods, this paper proposes AdaPG^{q,r}, a framework that unifies and extends existing results by providing larger stepsize policies and improved lower bounds. Different choices of the parameters are discussed, and the efficacy of the resulting methods is demonstrated through numerical simulations. To better understand the underlying theory, convergence is established in a more general setting that allows for time-varying parameters. Finally, an adaptive alternating minimization algorithm is presented by exploring the dual setting. This algorithm not only incorporates additional adaptivity but also expands its applicability beyond standard strongly convex settings.
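To illustrate the iteration structure that linesearch-free adaptive proximal gradient methods share, here is a minimal sketch in Python. Note the hedges: the stepsize rule below is a simplified Malitsky–Mishchenko-style policy (geometric growth capped by a local inverse-curvature estimate), not the AdaPG^{q,r} policy of the paper, whose stepsizes are larger; the helper names `prox_l1` and `adaptive_proxgrad` are illustrative, not from the paper.

```python
import numpy as np

def prox_l1(x, t):
    """Soft-thresholding: proximal operator of t * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def adaptive_proxgrad(grad_f, prox_g, x0, gamma0=1e-6, iters=300):
    """Linesearch-free adaptive proximal gradient sketch.

    The stepsize grows geometrically but is capped by the local
    inverse-curvature estimate
        ||x_k - x_{k-1}|| / (2 ||grad_f(x_k) - grad_f(x_{k-1})||),
    in the spirit of Malitsky-Mishchenko adaptive stepsizes.  The
    AdaPG^{q,r} policy differs and admits larger stepsizes; this is
    only a stand-in showing the shape of such iterations.
    """
    x_prev, g_prev, gamma = x0, grad_f(x0), gamma0
    x = prox_g(x0 - gamma * g_prev, gamma)
    theta = 1e12  # lets the first stepsize jump straight to the cap
    for _ in range(iters):
        g = grad_f(x)
        dx = np.linalg.norm(x - x_prev)
        dg = np.linalg.norm(g - g_prev)
        cap = dx / (2.0 * dg) if dg > 0 else np.inf
        gamma_new = min(np.sqrt(1.0 + theta) * gamma, cap)
        theta = gamma_new / gamma
        x_prev, g_prev, gamma = x, g, gamma_new
        x = prox_g(x - gamma * g, gamma)  # forward-backward step
    return x
```

On a small lasso instance (f(x) = 0.5‖Ax − b‖², g = λ‖x‖₁), the stepsize adapts to the local curvature of f with no linesearch and no knowledge of the global Lipschitz constant, which is the feature the abstract's stepsize policies refine.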