Standard

Clustering-Based Graph Neural Network in a Weakly Supervised Regression Problem. / Kalmutskiy, Kirill; Berikov, Vladimir.

Lecture Notes in Computer Science. Springer, 2025. pp. 335-347 (Lecture Notes in Computer Science; Vol. 15681 LNCS).

Research output: Publication in books, reports, collections, conference proceedings › Chapter/section › Scientific › Peer-reviewed

Harvard

Kalmutskiy, K & Berikov, V 2025, Clustering-Based Graph Neural Network in a Weakly Supervised Regression Problem. in Lecture Notes in Computer Science. Lecture Notes in Computer Science, Vol. 15681 LNCS, Springer, pp. 335-347. https://doi.org/10.1007/978-3-031-97077-1_23

APA

Kalmutskiy, K., & Berikov, V. (2025). Clustering-Based Graph Neural Network in a Weakly Supervised Regression Problem. In Lecture Notes in Computer Science (pp. 335-347). (Lecture Notes in Computer Science; Vol. 15681 LNCS). Springer. https://doi.org/10.1007/978-3-031-97077-1_23

Vancouver

Kalmutskiy K, Berikov V. Clustering-Based Graph Neural Network in a Weakly Supervised Regression Problem. In Lecture Notes in Computer Science. Springer. 2025. p. 335-347. (Lecture Notes in Computer Science). doi: 10.1007/978-3-031-97077-1_23

Author

Kalmutskiy, Kirill ; Berikov, Vladimir. / Clustering-Based Graph Neural Network in a Weakly Supervised Regression Problem. Lecture Notes in Computer Science. Springer, 2025. pp. 335-347 (Lecture Notes in Computer Science).

BibTeX

@inbook{624370b586084d8a8feb31b638bc0e64,
title = "Clustering-Based Graph Neural Network in a Weakly Supervised Regression Problem",
abstract = "This paper presents a novel Clustering-based Graph Neural Network algorithm for weakly supervised regression. The proposed approach constructs a robust graph representation of the data using a weighted co-association matrix derived from a cluster ensemble, enabling the model to effectively capture complex relationships and reduce the impact of noise and outliers. A graph neural network is then trained on this structure, with manifold regularization via the graph Laplacian allowing the model to effectively utilize both labeled and unlabeled data. This approach improves stability and enhances robustness to label noise. Additionally, Truncated Loss is employed to mitigate the influence of outliers during training, and a Balanced Batch Sampling algorithm is introduced to ensure effective mini-batch training on the constructed graph. Numerical experiments on several real-world regression datasets demonstrate that CBGNN outperforms classical supervised, semi-supervised, and other weakly supervised learning methods, particularly in settings with significant label noise.",
keywords = "Cluster ensemble, Graph convolutional neural network, Manifold regularization, Truncated loss, Weakly supervised regression",
author = "Kirill Kalmutskiy and Vladimir Berikov",
note = "This work was supported by the Russian Science Foundation, project 24-21-00195.",
year = "2025",
month = jul,
day = "6",
doi = "10.1007/978-3-031-97077-1_23",
language = "English",
isbn = "9783031970764",
series = "Lecture Notes in Computer Science",
publisher = "Springer",
pages = "335--347",
booktitle = "Lecture Notes in Computer Science",
address = "United States",

}
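The abstract describes building a graph from a weighted co-association matrix of a cluster ensemble and regularizing a GNN with the graph Laplacian. As an illustration only, the sketch below builds a plain (unweighted-vote) co-association matrix from a k-means ensemble and forms the unnormalized Laplacian; the paper's exact weighting scheme, ensemble design, and regularizer may differ, and `co_association_matrix` / `graph_laplacian` are hypothetical helper names, not the authors' code.

```python
import numpy as np
from sklearn.cluster import KMeans

def co_association_matrix(X, n_runs=10, k_range=(2, 8), seed=0):
    """Co-association matrix from a k-means cluster ensemble.

    Each run votes on whether two points fall in the same cluster;
    votes are averaged into pairwise similarities in [0, 1].
    (Illustrative sketch; the paper uses a *weighted* variant.)
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    A = np.zeros((n, n))
    for _ in range(n_runs):
        k = int(rng.integers(k_range[0], k_range[1] + 1))
        labels = KMeans(n_clusters=k, n_init=10,
                        random_state=int(rng.integers(1 << 31))).fit_predict(X)
        # Add 1 for every pair assigned to the same cluster in this run.
        A += (labels[:, None] == labels[None, :]).astype(float)
    return A / n_runs

def graph_laplacian(A):
    """Unnormalized graph Laplacian L = D - A (D = degree matrix)."""
    return np.diag(A.sum(axis=1)) - A

# Two well-separated Gaussian blobs as toy data.
X = np.vstack([np.random.randn(20, 2), np.random.randn(20, 2) + 5.0])
A = co_association_matrix(X)
L = graph_laplacian(A)

# Manifold regularization penalty: f^T L f is small when the model's
# predictions f vary smoothly over the co-association graph.
f = X[:, 0]
penalty = float(f @ L @ f)
```

In a training loop, a penalty of this form would typically be added to the supervised loss so that unlabeled points, connected through the graph, also constrain the predictions.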

RIS

TY - CHAP

T1 - Clustering-Based Graph Neural Network in a Weakly Supervised Regression Problem

AU - Kalmutskiy, Kirill

AU - Berikov, Vladimir

N1 - This work was supported by the Russian Science Foundation, project 24-21-00195.

PY - 2025/7/6

Y1 - 2025/7/6

N2 - This paper presents a novel Clustering-based Graph Neural Network algorithm for weakly supervised regression. The proposed approach constructs a robust graph representation of the data using a weighted co-association matrix derived from a cluster ensemble, enabling the model to effectively capture complex relationships and reduce the impact of noise and outliers. A graph neural network is then trained on this structure, with manifold regularization via the graph Laplacian allowing the model to effectively utilize both labeled and unlabeled data. This approach improves stability and enhances robustness to label noise. Additionally, Truncated Loss is employed to mitigate the influence of outliers during training, and a Balanced Batch Sampling algorithm is introduced to ensure effective mini-batch training on the constructed graph. Numerical experiments on several real-world regression datasets demonstrate that CBGNN outperforms classical supervised, semi-supervised, and other weakly supervised learning methods, particularly in settings with significant label noise.

AB - This paper presents a novel Clustering-based Graph Neural Network algorithm for weakly supervised regression. The proposed approach constructs a robust graph representation of the data using a weighted co-association matrix derived from a cluster ensemble, enabling the model to effectively capture complex relationships and reduce the impact of noise and outliers. A graph neural network is then trained on this structure, with manifold regularization via the graph Laplacian allowing the model to effectively utilize both labeled and unlabeled data. This approach improves stability and enhances robustness to label noise. Additionally, Truncated Loss is employed to mitigate the influence of outliers during training, and a Balanced Batch Sampling algorithm is introduced to ensure effective mini-batch training on the constructed graph. Numerical experiments on several real-world regression datasets demonstrate that CBGNN outperforms classical supervised, semi-supervised, and other weakly supervised learning methods, particularly in settings with significant label noise.

KW - Cluster ensemble

KW - Graph convolutional neural network

KW - Manifold regularization

KW - Truncated loss

KW - Weakly supervised regression

UR - https://www.mendeley.com/catalogue/46169007-c2e0-33e5-9d16-46961021c7eb/

UR - https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=105010826054&origin=inward

U2 - 10.1007/978-3-031-97077-1_23

DO - 10.1007/978-3-031-97077-1_23

M3 - Chapter

SN - 9783031970764

T3 - Lecture Notes in Computer Science

SP - 335

EP - 347

BT - Lecture Notes in Computer Science

PB - Springer

ER -

ID: 68561441