Research output: Publications in books, reports, collections, conference proceedings › chapter/section › research › peer-reviewed
Clustering-Based Graph Neural Network in a Weakly Supervised Regression Problem. / Kalmutskiy, Kirill; Berikov, Vladimir.
Lecture Notes in Computer Science. Springer, 2025. pp. 335-347 (Lecture Notes in Computer Science; Vol. 15681 LNCS).
TY - CHAP
T1 - Clustering-Based Graph Neural Network in a Weakly Supervised Regression Problem
AU - Kalmutskiy, Kirill
AU - Berikov, Vladimir
N1 - This work was supported by the Russian Science Foundation, project 24-21-00195.
PY - 2025/7/6
Y1 - 2025/7/6
N2 - This paper presents a novel Clustering-based Graph Neural Network algorithm for weakly supervised regression. The proposed approach constructs a robust graph representation of the data using a weighted co-association matrix derived from a cluster ensemble, enabling the model to effectively capture complex relationships and reduce the impact of noise and outliers. A graph neural network is then trained on this structure, with manifold regularization via the graph Laplacian allowing the model to effectively utilize both labeled and unlabeled data. This approach improves stability and enhances robustness to label noise. Additionally, Truncated Loss is employed to mitigate the influence of outliers during training, and a Balanced Batch Sampling algorithm is introduced to ensure effective mini-batch training on the constructed graph. Numerical experiments on several real-world regression datasets demonstrate that CBGNN outperforms classical supervised, semi-supervised, and other weakly supervised learning methods, particularly in settings with significant label noise.
AB - This paper presents a novel Clustering-based Graph Neural Network algorithm for weakly supervised regression. The proposed approach constructs a robust graph representation of the data using a weighted co-association matrix derived from a cluster ensemble, enabling the model to effectively capture complex relationships and reduce the impact of noise and outliers. A graph neural network is then trained on this structure, with manifold regularization via the graph Laplacian allowing the model to effectively utilize both labeled and unlabeled data. This approach improves stability and enhances robustness to label noise. Additionally, Truncated Loss is employed to mitigate the influence of outliers during training, and a Balanced Batch Sampling algorithm is introduced to ensure effective mini-batch training on the constructed graph. Numerical experiments on several real-world regression datasets demonstrate that CBGNN outperforms classical supervised, semi-supervised, and other weakly supervised learning methods, particularly in settings with significant label noise.
KW - Cluster ensemble
KW - Graph convolutional neural network
KW - Manifold regularization
KW - Truncated loss
KW - Weakly supervised regression
UR - https://www.mendeley.com/catalogue/46169007-c2e0-33e5-9d16-46961021c7eb/
UR - https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=105010826054&origin=inward
U2 - 10.1007/978-3-031-97077-1_23
DO - 10.1007/978-3-031-97077-1_23
M3 - Chapter
SN - 9783031970764
T3 - Lecture Notes in Computer Science
SP - 335
EP - 347
BT - Lecture Notes in Computer Science
PB - Springer
ER -
ID: 68561441
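The abstract outlines two reusable building blocks: a weighted co-association matrix averaged over a cluster ensemble, and a manifold-regularization penalty via the graph Laplacian of that matrix. The sketch below illustrates both ideas under simplifying assumptions — plain k-means runs with uniform run weights stand in for the ensemble, and the function names (`coassociation_matrix`, `laplacian_penalty`) are hypothetical; the paper's actual weighting scheme, GNN architecture, Truncated Loss, and Balanced Batch Sampling are not reproduced here.

```python
# Minimal sketch (not the paper's implementation) of:
#  (1) a co-association matrix from a cluster ensemble,
#  (2) the graph-Laplacian manifold-regularization term.
import numpy as np
from sklearn.cluster import KMeans


def coassociation_matrix(X, n_runs=10, k_range=(2, 6), seed=0):
    """Average co-association over k-means runs with varying k.

    Entry (i, j) is the fraction of runs that place points i and j
    in the same cluster; here every run gets uniform weight 1/n_runs.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    H = np.zeros((n, n))
    for _ in range(n_runs):
        k = int(rng.integers(k_range[0], k_range[1] + 1))
        labels = KMeans(
            n_clusters=k, n_init=10,
            random_state=int(rng.integers(2**31)),
        ).fit_predict(X)
        H += (labels[:, None] == labels[None, :]) / n_runs
    return H


def laplacian_penalty(W, f):
    """Manifold-regularization term f^T L f with L = D - W.

    For symmetric W this equals 0.5 * sum_ij W_ij (f_i - f_j)^2,
    so predictions that differ across strongly co-clustered points
    are penalized — the smoothness prior the abstract refers to.
    """
    L = np.diag(W.sum(axis=1)) - W
    return float(f @ L @ f)
```

In the CBGNN setting, `laplacian_penalty` would be added to the supervised loss on the labeled subset, letting unlabeled points contribute through `W` alone.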