Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research › peer-review
Convex Optimization with Inexact Gradients in Hilbert Space and Applications to Elliptic Inverse Problems. / Matyukhin, Vladislav; Kabanikhin, Sergey; Shishlenin, Maxim et al.
Mathematical Optimization Theory and Operations Research - 20th International Conference, MOTOR 2021, Proceedings. ed. / Panos Pardalos; Michael Khachay; Alexander Kazakov. Springer Science and Business Media Deutschland GmbH, 2021. p. 159-175 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 12755 LNCS).
TY - GEN
T1 - Convex Optimization with Inexact Gradients in Hilbert Space and Applications to Elliptic Inverse Problems
AU - Matyukhin, Vladislav
AU - Kabanikhin, Sergey
AU - Shishlenin, Maxim
AU - Novikov, Nikita
AU - Vasin, Artem
AU - Gasnikov, Alexander
N1 - Funding Information: The research of V.V. Matyukhin and A.V. Gasnikov in Sects. 1,2,3,4 was supported by the Russian Science Foundation (project No. 21-71-30005). The research of S.I. Kabanikhin, M.A. Shishlenin and N.S. Novikov in the last section was supported by RFBR 19-01-00694 and by the comprehensive program of fundamental scientific research of the SB RAS II.1, project No. 0314-2018-0009. The work of A. Vasin was supported by the Andrei M. Raigorodskii Scholarship in Optimization. Publisher Copyright: © 2021, Springer Nature Switzerland AG. Copyright: Copyright 2021 Elsevier B.V., All rights reserved.
PY - 2021
Y1 - 2021
N2 - In this paper, we propose gradient descent type methods for solving convex optimization problems in Hilbert space. We apply them to the ill-posed Cauchy problem for the Poisson equation and give a comparative analysis with the Landweber iteration and the steepest descent method. The theoretical novelty of the paper consists in the development of a new stopping rule for accelerated gradient methods with an inexact gradient (additive noise). Note that up to the moment of stopping, the method “doesn’t feel the noise”; after this moment, however, the noise starts to accumulate and the quality of the solution deteriorates with further iterations.
AB - In this paper, we propose gradient descent type methods for solving convex optimization problems in Hilbert space. We apply them to the ill-posed Cauchy problem for the Poisson equation and give a comparative analysis with the Landweber iteration and the steepest descent method. The theoretical novelty of the paper consists in the development of a new stopping rule for accelerated gradient methods with an inexact gradient (additive noise). Note that up to the moment of stopping, the method “doesn’t feel the noise”; after this moment, however, the noise starts to accumulate and the quality of the solution deteriorates with further iterations.
KW - Convex optimization
KW - Gradient method
KW - Inexact oracle
KW - Inverse and ill-posed problem
UR - http://www.scopus.com/inward/record.url?scp=85111387998&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-77876-7_11
DO - 10.1007/978-3-030-77876-7_11
M3 - Conference contribution
AN - SCOPUS:85111387998
SN - 9783030778750
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 159
EP - 175
BT - Mathematical Optimization Theory and Operations Research - 20th International Conference, MOTOR 2021, Proceedings
A2 - Pardalos, Panos
A2 - Khachay, Michael
A2 - Kazakov, Alexander
PB - Springer Science and Business Media Deutschland GmbH
T2 - 20th International Conference on Mathematical Optimization Theory and Operations Research, MOTOR 2021
Y2 - 5 July 2021 through 10 July 2021
ER -
ID: 29138470