Abstract
Difference-of-convex (DC) optimization problems are shown to be equivalent to the minimization of a Lipschitz-differentiable "envelope". A gradient method on this surrogate function yields a novel (sub)gradient-free proximal algorithm which is inherently parallelizable and can handle fully nonsmooth formulations. Newton-type methods such as L-BFGS are directly applicable with a classical linesearch. Our analysis reveals a deep kinship between the novel DC envelope and the forward-backward envelope, the former being a smooth and convexity-preserving nonlinear reparametrization of the latter.
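For context, the DC setting the abstract refers to minimizes a difference g(x) − h(x) of two convex functions. The sketch below is a minimal illustration of the *classical* DC algorithm (DCA), not the envelope-based method of this paper: it linearizes the concave part −h at the current iterate and minimizes the resulting convex majorizer. The choices g(x) = ½‖x − a‖² and h(x) = λ‖x‖₁ are illustrative assumptions chosen so that each subproblem has a closed form.

```python
import numpy as np

def dca(a, lam, x0, iters=50):
    """Classical DCA for min_x g(x) - h(x) with
    g(x) = 0.5 * ||x - a||^2 and h(x) = lam * ||x||_1.

    DCA step: pick v_k in the subdifferential of h at x_k, then set
    x_{k+1} = argmin_x g(x) - <v_k, x>, which here is x_{k+1} = a + v_k.
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        v = lam * np.sign(x)  # a subgradient of lam * ||x||_1 at x
        x = a + v             # closed-form minimizer of g(x) - <v, x>
    return x

# Example: a = 2, lam = 0.5, starting from x0 = 1 converges to the
# critical point x = 2.5 (where grad g(x) = x - a = 0.5 matches
# the chosen subgradient of h).
x_star = dca(np.array([2.0]), 0.5, np.array([1.0]))
```

Note the contrast with the paper's contribution: DCA needs a subgradient of h at every step, whereas the envelope-based algorithm described in the abstract is (sub)gradient-free and relies on proximal operations instead.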