Standard

Convex Optimization with Inexact Gradients in Hilbert Space and Applications to Elliptic Inverse Problems. / Matyukhin, Vladislav; Kabanikhin, Sergey; Shishlenin, Maxim et al.

Mathematical Optimization Theory and Operations Research - 20th International Conference, MOTOR 2021, Proceedings. ed. / Panos Pardalos; Michael Khachay; Alexander Kazakov. Springer Science and Business Media Deutschland GmbH, 2021. pp. 159-175 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 12755 LNCS).

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › Peer-reviewed

Harvard

Matyukhin, V, Kabanikhin, S, Shishlenin, M, Novikov, N, Vasin, A & Gasnikov, A 2021, Convex Optimization with Inexact Gradients in Hilbert Space and Applications to Elliptic Inverse Problems. in P Pardalos, M Khachay & A Kazakov (eds), Mathematical Optimization Theory and Operations Research - 20th International Conference, MOTOR 2021, Proceedings. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 12755 LNCS, Springer Science and Business Media Deutschland GmbH, pp. 159-175, 20th International Conference on Mathematical Optimization Theory and Operations Research, MOTOR 2021, Irkutsk, Russian Federation, 05.07.2021. https://doi.org/10.1007/978-3-030-77876-7_11

APA

Matyukhin, V., Kabanikhin, S., Shishlenin, M., Novikov, N., Vasin, A., & Gasnikov, A. (2021). Convex Optimization with Inexact Gradients in Hilbert Space and Applications to Elliptic Inverse Problems. In P. Pardalos, M. Khachay, & A. Kazakov (Eds.), Mathematical Optimization Theory and Operations Research - 20th International Conference, MOTOR 2021, Proceedings (pp. 159-175). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 12755 LNCS). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-77876-7_11

Vancouver

Matyukhin V, Kabanikhin S, Shishlenin M, Novikov N, Vasin A, Gasnikov A. Convex Optimization with Inexact Gradients in Hilbert Space and Applications to Elliptic Inverse Problems. In: Pardalos P, Khachay M, Kazakov A, editors, Mathematical Optimization Theory and Operations Research - 20th International Conference, MOTOR 2021, Proceedings. Springer Science and Business Media Deutschland GmbH. 2021. p. 159-175. (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)). doi: 10.1007/978-3-030-77876-7_11

Author

Matyukhin, Vladislav ; Kabanikhin, Sergey ; Shishlenin, Maxim et al. / Convex Optimization with Inexact Gradients in Hilbert Space and Applications to Elliptic Inverse Problems. Mathematical Optimization Theory and Operations Research - 20th International Conference, MOTOR 2021, Proceedings. editor / Panos Pardalos ; Michael Khachay ; Alexander Kazakov. Springer Science and Business Media Deutschland GmbH, 2021. pp. 159-175 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)).

BibTeX

@inproceedings{40de394e64564d699105360f7623072e,
title = "Convex Optimization with Inexact Gradients in Hilbert Space and Applications to Elliptic Inverse Problems",
abstract = "In this paper, we propose gradient descent type methods to solve convex optimization problems in Hilbert space. We apply them to the ill-posed Cauchy problem for the Poisson equation and make a comparative analysis with the Landweber iteration and the steepest descent method. The theoretical novelty of the paper consists in the development of a new stopping rule for accelerated gradient methods with an inexact gradient (additive noise). Note that up to the moment of stopping, the method “doesn{\textquoteright}t feel the noise”. After this moment, however, the noise starts to accumulate and the quality of the solution worsens with further iterations.",
keywords = "Convex optimization, Gradient method, Inexact oracle, Inverse and ill-posed problem",
author = "Vladislav Matyukhin and Sergey Kabanikhin and Maxim Shishlenin and Nikita Novikov and Artem Vasin and Alexander Gasnikov",
note = "Funding Information: The research of V.V. Matyukhin and A.V. Gasnikov in Sects. 1,2,3,4 was supported by Russian Science Foundation (project No. 21-71-30005). The research of S.I. Kabanikhin, M.A. Shishlenin and N.S. Novikov in the last section was supported by RFBR 19-01-00694 and by the comprehensive program of fundamental scientific researches of the SB RAS II.1, project No. 0314-2018-0009. The work of A. Vasin was supported by Andrei M. Raigorodskii Scholarship in Optimization. Publisher Copyright: {\textcopyright} 2021, Springer Nature Switzerland AG. Copyright: Copyright 2021 Elsevier B.V., All rights reserved.; 20th International Conference on Mathematical Optimization Theory and Operations Research, MOTOR 2021 ; Conference date: 05-07-2021 Through 10-07-2021",
year = "2021",
doi = "10.1007/978-3-030-77876-7_11",
language = "English",
isbn = "9783030778750",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
publisher = "Springer Science and Business Media Deutschland GmbH",
pages = "159--175",
editor = "Panos Pardalos and Michael Khachay and Alexander Kazakov",
booktitle = "Mathematical Optimization Theory and Operations Research - 20th International Conference, MOTOR 2021, Proceedings",
address = "Germany",

}
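The abstract's central observation — that with an additively perturbed gradient the method "doesn't feel the noise" up to a certain iteration, after which the error stops improving — can be illustrated with a toy sketch. This is not the authors' accelerated method or stopping rule; it is plain gradient descent on a hypothetical strongly convex quadratic, with an assumed noise level `delta`, chosen only to show the error plateauing at the noise floor:

```python
import numpy as np

# Toy illustration (assumed problem, not from the paper): gradient descent on
# a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x, where each gradient
# query is corrupted by additive noise of magnitude ~delta.
rng = np.random.default_rng(0)
A = np.diag([1.0, 4.0])            # hypothetical SPD matrix; L = 4
b = np.array([1.0, 2.0])
x_star = np.linalg.solve(A, b)     # exact minimizer, for measuring the error

delta = 1e-3                       # assumed additive gradient-noise level
x = np.zeros(2)
errors = []
for _ in range(200):
    noise = delta * rng.standard_normal(2)
    g = A @ x - b + noise          # inexact gradient: true gradient + noise
    x = x - (1.0 / 4.0) * g        # step size 1/L
    errors.append(np.linalg.norm(x - x_star))
```

Early on the error shrinks geometrically, but `errors` bottoms out near the noise scale instead of going to zero — running more iterations past that point no longer helps, which is the behavior a noise-aware stopping rule is designed to exploit.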

RIS

TY - GEN

T1 - Convex Optimization with Inexact Gradients in Hilbert Space and Applications to Elliptic Inverse Problems

AU - Matyukhin, Vladislav

AU - Kabanikhin, Sergey

AU - Shishlenin, Maxim

AU - Novikov, Nikita

AU - Vasin, Artem

AU - Gasnikov, Alexander

N1 - Funding Information: The research of V.V. Matyukhin and A.V. Gasnikov in Sects. 1,2,3,4 was supported by Russian Science Foundation (project No. 21-71-30005). The research of S.I. Kabanikhin, M.A. Shishlenin and N.S. Novikov in the last section was supported by RFBR 19-01-00694 and by the comprehensive program of fundamental scientific researches of the SB RAS II.1, project No. 0314-2018-0009. The work of A. Vasin was supported by Andrei M. Raigorodskii Scholarship in Optimization. Publisher Copyright: © 2021, Springer Nature Switzerland AG. Copyright: Copyright 2021 Elsevier B.V., All rights reserved.

PY - 2021

Y1 - 2021

N2 - In this paper, we propose gradient descent type methods to solve convex optimization problems in Hilbert space. We apply them to the ill-posed Cauchy problem for the Poisson equation and make a comparative analysis with the Landweber iteration and the steepest descent method. The theoretical novelty of the paper consists in the development of a new stopping rule for accelerated gradient methods with an inexact gradient (additive noise). Note that up to the moment of stopping, the method “doesn’t feel the noise”. After this moment, however, the noise starts to accumulate and the quality of the solution worsens with further iterations.

AB - In this paper, we propose gradient descent type methods to solve convex optimization problems in Hilbert space. We apply them to the ill-posed Cauchy problem for the Poisson equation and make a comparative analysis with the Landweber iteration and the steepest descent method. The theoretical novelty of the paper consists in the development of a new stopping rule for accelerated gradient methods with an inexact gradient (additive noise). Note that up to the moment of stopping, the method “doesn’t feel the noise”. After this moment, however, the noise starts to accumulate and the quality of the solution worsens with further iterations.

KW - Convex optimization

KW - Gradient method

KW - Inexact oracle

KW - Inverse and ill-posed problem

UR - http://www.scopus.com/inward/record.url?scp=85111387998&partnerID=8YFLogxK

U2 - 10.1007/978-3-030-77876-7_11

DO - 10.1007/978-3-030-77876-7_11

M3 - Conference contribution

AN - SCOPUS:85111387998

SN - 9783030778750

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 159

EP - 175

BT - Mathematical Optimization Theory and Operations Research - 20th International Conference, MOTOR 2021, Proceedings

A2 - Pardalos, Panos

A2 - Khachay, Michael

A2 - Kazakov, Alexander

PB - Springer Science and Business Media Deutschland GmbH

T2 - 20th International Conference on Mathematical Optimization Theory and Operations Research, MOTOR 2021

Y2 - 5 July 2021 through 10 July 2021

ER -

ID: 29138470