Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research › peer-review
Evaluation of oil workers' performance based on surveillance video. / Lebedeva, Elena; Zubkov, Andrey; Bondarenko, Denis et al.
SIBIRCON 2019 - International Multi-Conference on Engineering, Computer and Information Sciences, Proceedings. Institute of Electrical and Electronics Engineers Inc., 2019. p. 432-435, 8958352 (SIBIRCON 2019 - International Multi-Conference on Engineering, Computer and Information Sciences, Proceedings).
TY - GEN
T1 - Evaluation of oil workers' performance based on surveillance video
AU - Lebedeva, Elena
AU - Zubkov, Andrey
AU - Bondarenko, Denis
AU - Rymarenko, Konstantin
AU - Nukhaev, Marat
AU - Grishchenko, Sergey
N1 - Publisher Copyright: © 2019 IEEE. Copyright 2020 Elsevier B.V., All rights reserved.
PY - 2019/10
Y1 - 2019/10
AB - We present our research on the applicability of computer vision techniques for extracting various oil workers' performance metrics. This paper focuses on estimating two metrics associated with the workers' location. The first metric, $e_1$, is the percentage of frames in which only part of the crew is present. If its value exceeds a threshold, the crew's performance is declared inefficient. To compute $e_1$, we propose detecting humans in each video frame and counting the people present. Faster R-CNN and single-shot detectors with several types of feature extractors were tested on a specially collected dataset. By fine-tuning the most accurate of them, we achieved a precision of 0.99 and a recall of 0.91. The second metric, $e_2$, considers the workers' distance from an automated gas control system, which is the main subject of maintenance. We propose placing markers on the uniform for worker recognition and for estimating the worker's position relative to the automated gas control system. We tested the ArUco and RUNETag markers on synthetic data and showed that they cannot be applied to our problem. We also carried out preliminary research on uniform number detection, since numbers can likewise serve as markers. The Connectionist Text Proposal Network (CTPN) used for text detection achieved an accuracy of 0.76, while text recognition performed by Tesseract OCR failed with a recall of 0.05. We plan to collect a dataset for number detection and recognition and to test more approaches in the future.
KW - ArUco
KW - computer vision
KW - convolutional neural networks
KW - CTPN
KW - Faster R-CNN
KW - fiducial markers
KW - OCR
KW - RUNETag
KW - single-shot detectors
KW - surveillance
KW - Tesseract
KW - uniform numbers
KW - workers' performance
UR - http://www.scopus.com/inward/record.url?scp=85079077543&partnerID=8YFLogxK
U2 - 10.1109/SIBIRCON48586.2019.8958352
DO - 10.1109/SIBIRCON48586.2019.8958352
M3 - Conference contribution
AN - SCOPUS:85079077543
T3 - SIBIRCON 2019 - International Multi-Conference on Engineering, Computer and Information Sciences, Proceedings
SP - 432
EP - 435
BT - SIBIRCON 2019 - International Multi-Conference on Engineering, Computer and Information Sciences, Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2019 International Multi-Conference on Engineering, Computer and Information Sciences, SIBIRCON 2019
Y2 - 21 October 2019 through 27 October 2019
ER -
ID: 28286594
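
For illustration only, a minimal sketch (not the authors' code) of how the crew-presence metric $e_1$ described in the abstract could be computed from per-frame person counts. The person detector is abstracted as a counting function, and the crew size and threshold are assumed parameters.

    # Minimal sketch, assuming a per-frame person-counting function (e.g. the output
    # of a fine-tuned Faster R-CNN detector) and a known crew size. Not the paper's
    # implementation; names and parameters are hypothetical.
    from typing import Callable, Iterable

    def crew_presence_metric(frames: Iterable, count_people: Callable[[object], int],
                             crew_size: int) -> float:
        """Return e_1: the percentage of frames where only part of the crew is visible."""
        total = 0
        partial = 0
        for frame in frames:
            total += 1
            n = count_people(frame)      # number of people detected in this frame
            if 0 < n < crew_size:        # some, but not all, crew members are present
                partial += 1
        return 100.0 * partial / total if total else 0.0

    # Usage: flag the crew's work as inefficient when e_1 exceeds a chosen threshold.
    # e1 = crew_presence_metric(video_frames, detector_count, crew_size=4)
    # inefficient = e1 > THRESHOLD

Whether frames with no detected people count toward $e_1$ is not specified in the abstract; the sketch above assumes they do not.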