Creator |
Language |
Publisher |
Date Issued |
Journal Title |
Volume |
Start Page |
End Page |
Conference Information |
Publication Type |
Access Rights |
Rights |
Related DOI |
Related URI |
Related HDL |
Abstract |
We show that adaptive proximal gradient methods for convex problems are not restricted to traditional Lipschitzian assumptions. Our analysis reveals that a class of linesearch-free methods is still convergent under mere local Hölder gradient continuity, covering in particular continuously differentiable semi-algebraic functions. To mitigate the lack of local Lipschitz continuity, popular approaches revolve around ε-oracles and/or linesearch procedures. In contrast, we exploit plain Hölder inequalities not entailing any approximation, all while retaining the linesearch-free nature of adaptive schemes. Furthermore, we prove full sequence convergence without prior knowledge of the local Hölder constants or of the order of Hölder continuity. Numerical experiments compare against baseline methods on diverse machine learning tasks covering both the locally and the globally Hölder setting.
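The abstract refers to linesearch-free adaptive proximal gradient schemes, which estimate stepsizes from observed iterates rather than from a known Lipschitz constant or a backtracking loop. The sketch below is an illustration only, not the paper's algorithm: it implements one well-known adaptive stepsize rule in the style of Malitsky–Mishchenko, where local curvature is estimated from successive gradient differences. The function names `adaptive_proximal_gradient`, `grad_f`, `prox_g` and the specific constants are assumptions of this sketch.

```python
import numpy as np

def adaptive_proximal_gradient(grad_f, prox_g, x0, gamma0=1e-3, iters=500):
    """Sketch of a linesearch-free adaptive proximal gradient method.

    Stepsizes are derived from observed gradient differences
    (Malitsky-Mishchenko style); no Lipschitz or Hoelder constant
    is supplied and no backtracking is performed. Illustrative only.
    """
    x_prev, g_prev, gamma_prev = x0, grad_f(x0), gamma0
    # first (plain) proximal gradient step with the initial stepsize
    gamma = gamma0
    x = prox_g(x_prev - gamma * g_prev, gamma)
    for _ in range(iters):
        g = grad_f(x)
        # local curvature estimate L_k ~ ||grad difference|| / ||point difference||
        dx = np.linalg.norm(x - x_prev)
        dg = np.linalg.norm(g - g_prev)
        L = dg / dx if dx > 0 else np.inf
        # adaptive rule: grow the stepsize geometrically, capped by local curvature
        curvature_cap = 1.0 / (np.sqrt(2.0) * L) if np.isfinite(L) and L > 0 else np.inf
        gamma_new = min(np.sqrt(1.0 + gamma / gamma_prev) * gamma, curvature_cap)
        x_prev, g_prev, gamma_prev, gamma = x, g, gamma, gamma_new
        x = prox_g(x - gamma * g, gamma)
    return x

# Example use: lasso, f(x) = 0.5*||Ax - b||^2, g(x) = lam*||x||_1
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((40, 100)), rng.standard_normal(40), 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - lam * t, 0.0)
x_hat = adaptive_proximal_gradient(grad_f, prox_g, np.zeros(100))
```

Only the generic structure matters here: the stepsize update uses quantities already computed (iterates and gradients), so each iteration costs one gradient and one proximal evaluation, with no inner linesearch loop.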