Convergence rate for diminishing stepsize methods in nonconvex constrained optimization via ghost penalties

Francisco Facchinei, Vyacheslav Kungurtsev, Lorenzo Lampariello, Gesualdo Scutari

Abstract


This is a companion paper to "Ghost penalties in nonconvex constrained optimization: Diminishing stepsizes and iteration complexity" (to appear in Mathematics of Operations Research). We consider the ghost penalty scheme for nonconvex, constrained optimization introduced in that paper, coupled with a diminishing stepsize procedure. Under an extended Mangasarian-Fromovitz-type constraint qualification, we give an expression for the maximum number of iterations needed to achieve a given solution accuracy according to a natural stationarity measure, thus establishing the first result of this kind for a diminishing stepsize method for nonconvex, constrained optimization problems.
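As a generic illustration only (the specific rule and stationarity measure analyzed in the paper are not reproduced here), a diminishing stepsize sequence $\{\gamma^k\}$ in this class of methods is typically required to satisfy
\[
\gamma^k > 0, \qquad \lim_{k\to\infty} \gamma^k = 0, \qquad \sum_{k=0}^{\infty} \gamma^k = +\infty,
\]
for instance $\gamma^k = \gamma^0/(k+1)^{\alpha}$ with $\alpha \in (0,1]$. An iteration complexity statement of the kind described above then bounds the number of iterations $K(\epsilon)$ required to drive the chosen stationarity measure below a prescribed tolerance $\epsilon > 0$.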

Keywords


Constrained optimization; nonconvex optimization; diminishing stepsize; convergence rate; iteration complexity


DOI: https://doi.org/10.1478/AAPP.98S2A8

Copyright (c) 2020 Francisco Facchinei, Vyacheslav Kungurtsev, Lorenzo Lampariello, Gesualdo Scutari

This work is licensed under a Creative Commons Attribution 4.0 International License.