Research output: Contribution to book/report/conference proceeding › Conference contribution › Research › Peer-reviewed
Weakly Supervised Regression Using Manifold Regularization and Low-Rank Matrix Representation. / Berikov, Vladimir; Litvinenko, Alexander.
Mathematical Optimization Theory and Operations Research - 20th International Conference, MOTOR 2021, Proceedings. ed. / Panos Pardalos; Michael Khachay; Alexander Kazakov. Springer Science and Business Media Deutschland GmbH, 2021. pp. 447-461 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 12755 LNCS).
TY - GEN
T1 - Weakly Supervised Regression Using Manifold Regularization and Low-Rank Matrix Representation
AU - Berikov, Vladimir
AU - Litvinenko, Alexander
N1 - Publisher Copyright: © 2021, Springer Nature Switzerland AG.
PY - 2021
Y1 - 2021
N2 - We solve a weakly supervised regression problem. By “weakly” we mean that the labels are known for some training points, unknown for others, and uncertain for the rest due to the presence of random noise or other reasons, such as a lack of resources. The solution requires optimizing an objective function (the loss function) that combines manifold regularization and low-rank matrix decomposition techniques. The low-rank approximations allow us to speed up all matrix calculations and reduce storage requirements, which is especially crucial for large datasets. Ensemble clustering is used to obtain the co-association matrix, which we treat as the similarity matrix. The use of these techniques improves the quality and stability of the solution. In the numerical section, we apply the suggested method to artificial and real datasets using Monte Carlo modeling.
AB - We solve a weakly supervised regression problem. By “weakly” we mean that the labels are known for some training points, unknown for others, and uncertain for the rest due to the presence of random noise or other reasons, such as a lack of resources. The solution requires optimizing an objective function (the loss function) that combines manifold regularization and low-rank matrix decomposition techniques. The low-rank approximations allow us to speed up all matrix calculations and reduce storage requirements, which is especially crucial for large datasets. Ensemble clustering is used to obtain the co-association matrix, which we treat as the similarity matrix. The use of these techniques improves the quality and stability of the solution. In the numerical section, we apply the suggested method to artificial and real datasets using Monte Carlo modeling.
KW - Cluster ensemble
KW - Co-association matrix
KW - Low-rank matrix decomposition
KW - Manifold regularization
KW - Weakly supervised learning
UR - http://www.scopus.com/inward/record.url?scp=85111359382&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-77876-7_30
DO - 10.1007/978-3-030-77876-7_30
M3 - Conference contribution
AN - SCOPUS:85111359382
SN - 9783030778750
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 447
EP - 461
BT - Mathematical Optimization Theory and Operations Research - 20th International Conference, MOTOR 2021, Proceedings
A2 - Pardalos, Panos
A2 - Khachay, Michael
A2 - Kazakov, Alexander
PB - Springer Science and Business Media Deutschland GmbH
T2 - 20th International Conference on Mathematical Optimization Theory and Operations Research, MOTOR 2021
Y2 - 5 July 2021 through 10 July 2021
ER -
ID: 34146129