<Journal Article>
On the Convergence of Proximal Gradient Methods for Convex Simple Bilevel Optimization
| Creator | |
|---|---|
| Language | |
| Publisher | |
| Publication Date | |
| Journal Title | |
| Volume | |
| Start Page | |
| Publication Type | |
| Access Rights | |
| Rights | |
| Related DOI | |
| Related HDL | |
| Abstract | This paper studies proximal gradient iterations for solving simple bilevel optimization problems in which both the upper- and lower-level cost functions split as the sum of a differentiable function and a (possibly nonsmooth) prox-friendly function. We develop a novel convergence recipe for iteration-varying stepsizes that relies on Barzilai-Borwein type local estimates for the differentiable terms. Leveraging this recipe, under global Lipschitz gradient continuity, we establish convergence for a nonadaptive stepsize sequence, without requiring strong convexity or a linesearch. In the locally Lipschitz differentiable setting, we develop an adaptive linesearch method with a systematic adaptive scheme that enables large and nonmonotonic stepsize sequences while remaining insensitive to the choice of hyperparameters and initialization. Numerical simulations showcase the favorable convergence speed of our methods. |
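For context on the abstract's key ingredient, the sketch below illustrates a proximal gradient iteration whose stepsize comes from a Barzilai-Borwein (BB) type local curvature estimate of the differentiable term. This is a minimal single-level example (a lasso problem), not the paper's bilevel method; the problem data, the BB1 formula, and the fallback safeguard are illustrative assumptions.

```python
# Minimal sketch: proximal gradient with a Barzilai-Borwein stepsize on
# 0.5*||Ax - b||^2 + lam*||x||_1. Illustrative only; NOT the paper's
# bilevel algorithm. All problem data and safeguards are assumptions.
import numpy as np

def soft_threshold(v, t):
    """Prox of t * ||.||_1 (the nonsmooth, prox-friendly term)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_bb(A, b, lam, x0, iters=200, gamma0=1.0):
    """Proximal gradient iterations with BB1 stepsizes for the smooth term."""
    x = x0.copy()
    grad = A.T @ (A @ x - b)          # gradient of the differentiable term
    gamma = gamma0
    for _ in range(iters):
        x_new = soft_threshold(x - gamma * grad, gamma * lam)
        grad_new = A.T @ (A @ x_new - b)
        s, y = x_new - x, grad_new - grad
        sy = s @ y
        # BB1 local estimate <s, s> / <s, y>; fall back to the initial
        # stepsize when the curvature estimate is nonpositive or tiny.
        gamma = (s @ s) / sy if sy > 1e-12 else gamma0
        x, grad = x_new, grad_new
    return x

# Small synthetic instance to exercise the iteration.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
x_hat = prox_grad_bb(A, b, lam=0.1, x0=np.zeros(100))
print("objective:",
      0.5 * np.linalg.norm(A @ x_hat - b) ** 2 + 0.1 * np.abs(x_hat).sum())
```

The BB estimate lets the stepsize adapt to local curvature without a global Lipschitz constant, which is the role the abstract ascribes to the iteration-varying stepsizes in the paper's convergence recipe.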
Details
| PISSN | |
|---|---|
| EISSN | |
| NCID | |
| Record ID | |
| Subject | |
| Notes | |
| Type | |
| Funding Information | |
| Registered | 2025.03.05 |
| Last Updated | 2026.02.02 |