Standard

Evaluation of oil workers' performance based on surveillance video. / Lebedeva, Elena; Zubkov, Andrey; Bondarenko, Denis et al.

SIBIRCON 2019 - International Multi-Conference on Engineering, Computer and Information Sciences, Proceedings. Institute of Electrical and Electronics Engineers Inc., 2019. p. 432-435, 8958352 (SIBIRCON 2019 - International Multi-Conference on Engineering, Computer and Information Sciences, Proceedings).

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research › peer-review

Harvard

Lebedeva, E, Zubkov, A, Bondarenko, D, Rymarenko, K, Nukhaev, M & Grishchenko, S 2019, Evaluation of oil workers' performance based on surveillance video. in SIBIRCON 2019 - International Multi-Conference on Engineering, Computer and Information Sciences, Proceedings., 8958352, SIBIRCON 2019 - International Multi-Conference on Engineering, Computer and Information Sciences, Proceedings, Institute of Electrical and Electronics Engineers Inc., pp. 432-435, 2019 International Multi-Conference on Engineering, Computer and Information Sciences, SIBIRCON 2019, Novosibirsk, Russian Federation, 21.10.2019. https://doi.org/10.1109/SIBIRCON48586.2019.8958352

APA

Lebedeva, E., Zubkov, A., Bondarenko, D., Rymarenko, K., Nukhaev, M., & Grishchenko, S. (2019). Evaluation of oil workers' performance based on surveillance video. In SIBIRCON 2019 - International Multi-Conference on Engineering, Computer and Information Sciences, Proceedings (pp. 432-435). [8958352] (SIBIRCON 2019 - International Multi-Conference on Engineering, Computer and Information Sciences, Proceedings). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/SIBIRCON48586.2019.8958352

Vancouver

Lebedeva E, Zubkov A, Bondarenko D, Rymarenko K, Nukhaev M, Grishchenko S. Evaluation of oil workers' performance based on surveillance video. In SIBIRCON 2019 - International Multi-Conference on Engineering, Computer and Information Sciences, Proceedings. Institute of Electrical and Electronics Engineers Inc. 2019. p. 432-435. 8958352. (SIBIRCON 2019 - International Multi-Conference on Engineering, Computer and Information Sciences, Proceedings). doi: 10.1109/SIBIRCON48586.2019.8958352

Author

Lebedeva, Elena ; Zubkov, Andrey ; Bondarenko, Denis et al. / Evaluation of oil workers' performance based on surveillance video. SIBIRCON 2019 - International Multi-Conference on Engineering, Computer and Information Sciences, Proceedings. Institute of Electrical and Electronics Engineers Inc., 2019. pp. 432-435 (SIBIRCON 2019 - International Multi-Conference on Engineering, Computer and Information Sciences, Proceedings).

BibTeX

@inproceedings{45a5af594a194ecc8f1b67d1cc1df9f8,
title = "Evaluation of oil workers' performance based on surveillance video",
abstract = "We present our research on the applicability of computer vision techniques for extracting various oil workers' performance metrics. This paper focuses on learning two metrics associated with the workers' location. The first metric $\boldsymbol{e}_{1}$ is the percent of frames in which only some part of the crew is present. If its value is bigger than some threshold value, the crew's performance is declared inefficient. We propose to perform human detection in each video frame and count people present in order to calculate $\boldsymbol{e}_{1}$. The Faster R-CNN and single-shot detectors with several types of feature extractors were tested on a specially collected dataset. By finetuning the most accurate of them we've achieved 0.99 precision and 0.91 recall. The second metric $\boldsymbol{e}_{2}$ considers workers' distance from an automated gas control system, which is the main subject of maintenance. We propose using some markers on the uniform for worker recognition and estimation of his/her position relative to an automated gas control system. We've tested the ArUco and the RUNETag markers on synthetic data and proved that they cannot be applied to our problem. We've also carried out some preliminary research on uniform numbers detection, as they can be also considered as markers. The Connectionist Text Proposal Network (CTPN) used for text detection achieved an accuracy of 0.76. Text recognition performed by Tesseract OCR failed with 0.05 recall. However, we plan to collect a dataset for number detection and recognition in the future and test more approaches.",
keywords = "ArUco, computer vision, convolutional neural networks, CTPN, Faster R-CNN, fiducial markers, OCR, RUNETag, single-shot detectors, surveillance, Tesseract, uniform numbers, workers' performance",
author = "Elena Lebedeva and Andrey Zubkov and Denis Bondarenko and Konstantin Rymarenko and Marat Nukhaev and Sergey Grishchenko",
note = "Publisher Copyright: {\textcopyright} 2019 IEEE. Copyright: Copyright 2020 Elsevier B.V., All rights reserved.; 2019 International Multi-Conference on Engineering, Computer and Information Sciences, SIBIRCON 2019 ; Conference date: 21-10-2019 Through 27-10-2019",
year = "2019",
month = oct,
doi = "10.1109/SIBIRCON48586.2019.8958352",
language = "English",
series = "SIBIRCON 2019 - International Multi-Conference on Engineering, Computer and Information Sciences, Proceedings",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
pages = "432--435",
booktitle = "SIBIRCON 2019 - International Multi-Conference on Engineering, Computer and Information Sciences, Proceedings",
address = "United States",
}

RIS

TY - GEN

T1 - Evaluation of oil workers' performance based on surveillance video

AU - Lebedeva, Elena

AU - Zubkov, Andrey

AU - Bondarenko, Denis

AU - Rymarenko, Konstantin

AU - Nukhaev, Marat

AU - Grishchenko, Sergey

N1 - Publisher Copyright: © 2019 IEEE. Copyright: Copyright 2020 Elsevier B.V., All rights reserved.

PY - 2019/10

Y1 - 2019/10

N2 - We present our research on the applicability of computer vision techniques for extracting various oil workers' performance metrics. This paper focuses on learning two metrics associated with the workers' location. The first metric e_1 is the percent of frames in which only some part of the crew is present. If its value is bigger than some threshold value, the crew's performance is declared inefficient. We propose to perform human detection in each video frame and count people present in order to calculate e_1. The Faster R-CNN and single-shot detectors with several types of feature extractors were tested on a specially collected dataset. By finetuning the most accurate of them we've achieved 0.99 precision and 0.91 recall. The second metric e_2 considers workers' distance from an automated gas control system, which is the main subject of maintenance. We propose using some markers on the uniform for worker recognition and estimation of his/her position relative to an automated gas control system. We've tested the ArUco and the RUNETag markers on synthetic data and proved that they cannot be applied to our problem. We've also carried out some preliminary research on uniform numbers detection, as they can be also considered as markers. The Connectionist Text Proposal Network (CTPN) used for text detection achieved an accuracy of 0.76. Text recognition performed by Tesseract OCR failed with 0.05 recall. However, we plan to collect a dataset for number detection and recognition in the future and test more approaches.

AB - We present our research on the applicability of computer vision techniques for extracting various oil workers' performance metrics. This paper focuses on learning two metrics associated with the workers' location. The first metric e_1 is the percent of frames in which only some part of the crew is present. If its value is bigger than some threshold value, the crew's performance is declared inefficient. We propose to perform human detection in each video frame and count people present in order to calculate e_1. The Faster R-CNN and single-shot detectors with several types of feature extractors were tested on a specially collected dataset. By finetuning the most accurate of them we've achieved 0.99 precision and 0.91 recall. The second metric e_2 considers workers' distance from an automated gas control system, which is the main subject of maintenance. We propose using some markers on the uniform for worker recognition and estimation of his/her position relative to an automated gas control system. We've tested the ArUco and the RUNETag markers on synthetic data and proved that they cannot be applied to our problem. We've also carried out some preliminary research on uniform numbers detection, as they can be also considered as markers. The Connectionist Text Proposal Network (CTPN) used for text detection achieved an accuracy of 0.76. Text recognition performed by Tesseract OCR failed with 0.05 recall. However, we plan to collect a dataset for number detection and recognition in the future and test more approaches.

KW - ArUco

KW - computer vision

KW - convolutional neural networks

KW - CTPN

KW - Faster R-CNN

KW - fiducial markers

KW - OCR

KW - RUNETag

KW - single-shot detectors

KW - surveillance

KW - Tesseract

KW - uniform numbers

KW - workers' performance

UR - http://www.scopus.com/inward/record.url?scp=85079077543&partnerID=8YFLogxK

U2 - 10.1109/SIBIRCON48586.2019.8958352

DO - 10.1109/SIBIRCON48586.2019.8958352

M3 - Conference contribution

AN - SCOPUS:85079077543

T3 - SIBIRCON 2019 - International Multi-Conference on Engineering, Computer and Information Sciences, Proceedings

SP - 432

EP - 435

BT - SIBIRCON 2019 - International Multi-Conference on Engineering, Computer and Information Sciences, Proceedings

PB - Institute of Electrical and Electronics Engineers Inc.

T2 - 2019 International Multi-Conference on Engineering, Computer and Information Sciences, SIBIRCON 2019

Y2 - 21 October 2019 through 27 October 2019

ER -

ID: 28286594