CN116798087A - Employee on-duty state detection method and system - Google Patents

Employee on-duty state detection method and system

Info

Publication number
CN116798087A
CN116798087A
Authority
CN
China
Prior art keywords
employee
staff
similarity
checked
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210249009.0A
Other languages
Chinese (zh)
Inventor
马一骏
张岩
饶品波
徐朋朋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Mobile Communications Group Co Ltd
China Mobile Group Jiangsu Co Ltd
Original Assignee
China Mobile Communications Group Co Ltd
China Mobile Group Jiangsu Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Mobile Communications Group Co Ltd, China Mobile Group Jiangsu Co Ltd filed Critical China Mobile Communications Group Co Ltd
Priority to CN202210249009.0A priority Critical patent/CN116798087A/en
Publication of CN116798087A publication Critical patent/CN116798087A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • G06Q10/109Time management, e.g. calendars, reminders, meetings or time accounting
    • G06Q10/1091Recording time for administrative or management purposes

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • General Physics & Mathematics (AREA)
  • Strategic Management (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Educational Administration (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Image Analysis (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The invention provides an employee on-duty state detection method and system, wherein the method comprises: inputting an image of the employee to be checked into a target detection model to obtain first feature data of the image; based on a target similarity function, calculating the similarity between the first feature data and the second feature data corresponding to each employee image pre-stored in a database, so as to determine the employee identity information corresponding to the employee image with the highest similarity; and judging the on-duty state of the employee to be checked according to the employee identity information and the workstation information of the employee to be checked. The system is used for executing the method. By integrating the Euclidean distance and the cosine distance to construct the target similarity function and calculate a comparison similarity index for employees, the invention improves employee-auditing efficiency and realizes accurate identification of the employee on-duty state.

Description

Employee on-duty state detection method and system
Technical Field
The invention relates to the technical field of image processing, in particular to a method and a system for detecting an on-duty state of an employee.
Background
In the prior art, whether an employee is on duty is mainly checked manually. Typically, each workstation must upload multiple photos during a week of operation for review by an auditor, so a large number of photos collected at different workstations await auditing at the same time and the auditing workload is heavy. The existing manual auditing process has low timeliness, requires substantial manpower, and is prone to errors.
Disclosure of Invention
The employee on-duty state detection method and system provided by the invention are used to solve at least one of the problems in the prior art: by integrating the Euclidean distance and the cosine distance, a target similarity function is constructed to calculate a comparison similarity index for employees, improving auditing efficiency and realizing accurate identification of the employee on-duty state.
The invention provides a method for detecting the on-duty state of staff, which comprises the following steps:
inputting an image of a staff to be checked into a target detection model to acquire first characteristic data of the image of the staff to be checked;
calculating the similarity between the first characteristic data and second characteristic data corresponding to each employee image pre-stored in a database based on a target similarity function so as to determine employee identity information corresponding to the employee image with the highest similarity;
judging the on-duty state of the staff to be checked according to the staff identity information and the station information of the staff to be checked;
the target similarity function is determined according to a Euclidean distance and a cosine distance between the first characteristic data and the second characteristic data.
According to the staff on-duty state detection method provided by the invention, the target detection model is determined by the following modes:
pre-training a pre-constructed WIYOLOv3 model using a first data set comprising near-view and far-view employee face images, to obtain a pre-trained WIYOLOv3 model;
training the pre-trained WIYOLOv3 model using a second data set comprising real employee face images, to fine-tune the pre-trained WIYOLOv3 model;
retraining the parameters of the fully connected layer of the fine-tuned pre-trained WIYOLOv3 model based on a third data set, to obtain the target detection model;
wherein the third data set is obtained by mixing the first data set with the employee face images comprised in the second data set according to a preset ratio.
According to the staff on-duty state detection method provided by the invention, the WIYOLOv3 model is obtained by the following steps:
correcting the original intersection-over-union (IOU) ratio of the YOLOv3 model to obtain a target IOU ratio;
correcting the loss function of the YOLOv3 model based on the target IOU ratio to obtain a target loss function of the YOLOv3 model;
and taking the YOLOv3 model with the target loss function as the WIYOLOv3 model.
According to the on-duty employee status detection method provided by the invention, the similarity between the first characteristic data and the second characteristic data corresponding to each employee image pre-stored in the database is calculated based on the target similarity function, and the method comprises the following steps:
acquiring a first similarity between a first pixel value of the employee image to be checked and a second pixel value corresponding to each employee image pre-stored in a database;
calculating a second similarity between the first key point feature data of the employee image to be checked and the second key point feature data corresponding to each employee image pre-stored in a database;
and determining the similarity according to the first similarity and the second similarity.
According to the employee on-duty state detection method provided by the invention, calculating the second similarity between the first key-point feature data of the image of the employee to be checked and the second key-point feature data corresponding to each employee image pre-stored in the database comprises the following steps:
determining a Euclidean distance between the first key point feature data and the second key point feature data;
determining a cosine distance between the first key point feature data and the second key point feature data;
and determining the second similarity according to the Euclidean distance and the cosine distance.
According to the method for detecting the on-duty status of the staff provided by the invention, the on-duty status of the staff to be checked is judged according to the staff identity information and the station information of the staff to be checked, and the method comprises the following steps:
if the employee ID in the employee identity information is consistent with the station ID in the station information, determining that the employee to be checked is on duty;
and if the employee ID in the employee identity information is inconsistent with the station ID in the station information, determining that the employee to be checked is not on duty.
The invention also provides a system for detecting the on-duty state of the staff, which comprises the following steps: the system comprises a first acquisition module, a second acquisition module and an on-duty detection module;
the first acquisition module is used for inputting the images of the staff to be checked into the target detection model so as to acquire first characteristic data of the images of the staff to be checked;
the second obtaining module is used for obtaining the similarity between the first characteristic data and the second characteristic data corresponding to each employee image pre-stored in the database based on the target similarity function so as to determine employee identity information corresponding to the employee image with the highest similarity;
the on-duty detection module is used for judging the on-duty state of the staff to be checked according to the staff identity information and the station information of the staff to be checked;
the target similarity function is determined according to Euclidean distance and cosine distance.
The invention also provides an electronic device comprising a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor realizes the method for detecting the on-duty state of the staff according to any one of the above when executing the program.
The present invention also provides a processor readable storage medium storing a computer program for causing the processor to execute the employee on duty status detection method as described in any of the above.
The invention also provides a computer program product, comprising a computer program which, when executed by a processor, implements the employee on-duty state detection method as described in any of the above.
According to the employee on-duty state detection method and system provided by the invention, the Euclidean distance and the cosine distance are integrated to construct the target similarity function and calculate a comparison similarity index for employees, which improves auditing efficiency and further realizes accurate identification of the employee on-duty state.
Drawings
In order to more clearly illustrate the invention or the technical solutions of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are some embodiments of the invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of the employee on-duty state detection method provided by the invention;
FIG. 2 is a schematic diagram of calculating the intersection-over-union ratio provided by the invention;
FIG. 3 is a schematic diagram of the employee on-duty state detection system provided by the invention;
FIG. 4 is a schematic diagram of the physical structure of the electronic device provided by the invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
FIG. 1 is a flow chart of the employee on-duty state detection method provided by the invention. As shown in FIG. 1, the method comprises:
step 100, inputting an image of the staff to be checked into a target detection model to obtain first characteristic data of the image of the staff to be checked;
step 200, based on a target similarity function, obtaining similarity between the first characteristic data and second characteristic data corresponding to each employee image pre-stored in a database, so as to determine employee identity information corresponding to the employee image with the highest similarity;
step 300, judging the on-duty state of the staff to be checked according to the staff identity information and the station information of the staff to be checked;
the object similarity function is determined according to the Euclidean distance and the cosine distance.
It should be noted that, the execution subject of the above method may be a computer device.
Optionally, the target detection model is built on the basis of the YOLOv3 model: the WIYOLOv3 (Weight IOU YOLOv3) model is constructed by correcting the loss function of YOLOv3 with respect to the labeling box (Bbox), and is trained with a pre-training and mixed-training strategy to obtain the target detection model. Employee face similarity is then judged by a constructed target similarity function, which improves the generalization performance of the target detection model. Specifically:
inputting the images of the staff to be checked into a target detection model obtained through training, and obtaining first characteristic data (including first key point characteristic data such as face key point characteristics) of the staff to be checked based on an output result of the target detection model.
Each employee image pre-stored in the database is acquired, and second feature data (including second key point feature data such as face key point features) of each employee image stored in the database can be obtained by performing image processing such as face clipping, gray level conversion, and the like on each employee image stored in the database.
The key point features of a face are physiological features inherent to the face, such as iris morphology, positional relationship between facial organs (eyes, nose, mouth, ears, etc.), structure of facial organs (shape, size, etc.), skin texture, and the like.
A GC (Grey Cosine) similarity function is constructed according to the Euclidean distance and the cosine distance, and the constructed GC similarity function is taken as the target similarity function.
And calculating the similarity between the first characteristic data of the employee images to be checked and the second characteristic data corresponding to each employee image stored in the database by using the constructed GC similarity function, and selecting employee identity information corresponding to one or more employee images with highest similarity from the calculated similarity values.
And identifying the on-duty state of the staff to be checked according to the staff identity information corresponding to the obtained one or more staff images with the highest similarity and the station information of the staff to be checked.
For example, if only one employee image with the highest similarity is determined through calculation, the identification of the on-duty state of the employee to be checked can be realized only by matching the employee identity information of the employee image with the highest similarity with the station information of the employee to be checked; if a plurality of employee images with the highest similarity are calculated and determined, the employee identity information of the employee image with the highest similarity can be matched with the station information of the employee to be checked in sequence, so that the identification of the on-duty state of the employee to be checked is realized.
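The matching logic just described can be sketched as follows; the function and variable names are illustrative and not part of the invention:

```python
def judge_on_duty(matched_employee_ids, station_employee_id):
    """Decide the on-duty state from the employee IDs of the
    highest-similarity database images and the employee ID recorded
    in the workstation information of the employee to be checked.

    matched_employee_ids: employee IDs ordered by descending similarity.
    """
    # The employee is judged on duty as soon as one of the top-matching
    # employee IDs is consistent with the station's employee ID.
    for employee_id in matched_employee_ids:
        if employee_id == station_employee_id:
            return "on duty"
    return "not on duty"
```

When several database images tie for the highest similarity, their employee IDs are simply checked in turn, matching the sequential comparison described above.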
According to the employee on-duty state detection method provided by the invention, the Euclidean distance and the cosine distance are integrated to construct the target similarity function and calculate a comparison similarity index for employees, which improves auditing efficiency and further realizes accurate identification of the employee on-duty state.
Further, in one embodiment, the object detection model in step 100 is determined by:
step 1001, pre-training a pre-constructed WIYOLOv3 model by using a first data set comprising a near-view worker face image and a far-view worker face image to obtain a pre-trained WIYOLOv3 model;
step 1002, training the pre-trained WIYOLOv3 model by using a second data set comprising images of the face of a real person to fine tune the pre-trained WIYOLOv3 model;
step 1003, retraining parameters of the full connection layer of the fine-tuned pre-trained WIYOLOv3 model based on the third data set to obtain a target detection model;
the third data set is obtained by mixing the first data set with the face image of the worker included in the second data set according to a preset proportion.
Further, in one embodiment, the WIYOLOv3 model in step 1001 is obtained by:
step 10010, correcting the original cross-over ratio of the YOLOv3 model to obtain a target cross-over ratio;
step 10011, correcting the loss function of the YOLOv3 model based on the target cross-correlation to obtain a target loss function of the YOLOv3 model;
step 10012, using the YOLOv3 model with the target loss function as the WIYOLOv3 model.
Optionally, determining whether an employee is on duty involves two parts, face detection and face recognition, which break down into the following steps: training a face detection model, cropping the face, and comparing face similarity. The invention mainly optimizes two of these parts, face model training and face similarity calculation, specifically realized as follows:
first, data labeling and data processing are performed. Aiming at training of a face detection model, the faces of training data need to be marked, and the complete faces need to be marked. Because the staff is on duty with less data, if only the staff's labeling data is used, the trained model is easy to fit, so that other data sets are needed for pre-training. The pre-training data set (i.e., the first data set) is composed of two parts, the first part is a face image of a person who is close-shot by the user terminal shooting device (i.e., a face image of a close-shot person), and the other part is a face image of a person who is far-shot by the user terminal shooting device (i.e., a far-shot employee image).
The user terminal may be, but is not limited to, various smartphones, tablet computers, notebook computers, desktop computers, portable wearable devices, smart speakers, etc.
The shooting device can be a camera built in the user terminal, or can be an external camera which is associated with the user terminal and is used for collecting an environment image comprising a face image of the user and position information of the user. For example, the user terminal may be connected to the image capturing device through a connection line or a network, and the image capturing device captures an environmental image including a face image of the user and location information of the user through the camera, and transmits the captured environmental image including the face image of the user and the location information of the user to the user terminal. The cameras may be monocular cameras, binocular cameras, depth cameras, 3D (three-dimensional) cameras, etc.
The reason for selecting these two parts of data is mainly that different employees place their computers at different distances: some like to place them far away, others nearby. In addition, an employee is not stationary at the workstation, and the distance between the employee and the camera varies over time, so the training data must contain both large and small employee face images to ensure that the pre-trained model is consistent with the real scene.
Secondly, a WIYOLOv3 model is constructed. The loss function loss of the original YOLOv3 model with respect to the label box is shown as follows:
$$\mathrm{loss}=\sum_{i=0}^{S^2}\sum_{j=0}^{B}\mathbb{1}_{ij}^{obj}\,(2-w_i h_i)\Big[(x_i-\hat{x}_i)^2+(y_i-\hat{y}_i)^2+(w_i-\hat{w}_i)^2+(h_i-\hat{h}_i)^2\Big]$$

wherein $S^2$ represents how many grid cells the image is cut into, $B$ represents the number of Bboxes per cell, $\mathbb{1}_{ij}^{obj}$ equals 1 if the cell contains a target and 0 otherwise, $w_i, h_i, x_i, y_i$ represent the width, height, upper-left x-coordinate and upper-left y-coordinate of the real Bbox, and $\hat{w}_i, \hat{h}_i, \hat{x}_i, \hat{y}_i$ represent the width, height, upper-left x-coordinate and upper-left y-coordinate of the predicted Bbox.
Analysis shows that the coefficient $(2-w_i h_i)$ in the loss function effectively improves the AP value of the model; however, this coefficient is mainly aimed at small targets in the COCO data set. Considering that, in the practical problem, one employee image will contain both large and small target objects, and that the detection effect is evaluated using the intersection-over-union ratio, it is necessary to incorporate both the IOU information and the object-size information into the loss function of the labeling box, as shown in FIG. 2.
Assume that the real labeling box is A, with upper-left corner coordinate $(x_1, y_1)$, width $w_1$ and height $h_1$; the prediction box is C, with upper-left corner coordinate $(x_2, y_2)$, width $w_3$ and height $h_3$; and the intersection of A and C is D, with upper-left corner coordinate also $(x_2, y_2)$, width $w_2$ and height $h_2$. The original intersection-over-union IOU of the YOLOv3 model is formulated as follows:

$$\mathrm{IOU}=\frac{w_2 h_2}{(w_1+w_3-w_2)(h_1+h_3-h_2)}$$
the original intersection ratio IOU only simply judges the coincidence degree of the real frame and the predicted frame. In order to reflect the information of the object size, the original blending ratio needs to be corrected, and the corrected blending ratio is the target blending ratio IOU, regardless of the size of the object r The following formula is shown:
The denominator of the target $\mathrm{IOU}_r$ is divided into two parts: the first part is the original union, and the second part is the geometric mean of the areas of the prediction box and the real labeling box, which introduces object-size information. $(w_1+w_3-w_2)(h_1+h_3-h_2)$ denotes the union of A and C in FIG. 2; $w_1 h_1$ denotes the area of A and $w_3 h_3$ the area of C, so $w_1 h_1 w_3 h_3$ is the product of the two areas, and $\sqrt{w_1 h_1 w_3 h_3}$ is the arithmetic square root of that product, i.e., the geometric mean of the two areas. While the denominator of the original IOU is simply the union of A and C, the denominator of the target $\mathrm{IOU}_r$ is a linear combination of that union and the geometric mean, with union coefficient $\alpha$ and geometric-mean coefficient $\beta$, where $\alpha+\beta=1$.
In one embodiment, the importance of the union is taken as 85% and that of the geometric mean as 15%, so the coefficients are set to $\alpha=0.85$ and $\beta=0.15$; the weighted combination thus integrates the IOU information with the object-size information. The target $\mathrm{IOU}_r$ is then introduced into the loss function of the labeling box: replacing the empirically chosen coefficient $(2-w_i h_i)$ with $\mathrm{IOU}_r$ corrects the loss function of the labeling box and yields the target loss function $\mathrm{loss}_r$:

$$\mathrm{loss}_r=\sum_{i=0}^{S^2}\sum_{j=0}^{B}\mathbb{1}_{ij}^{obj}\,\mathrm{IOU}_r\Big[(x_i-\hat{x}_i)^2+(y_i-\hat{y}_i)^2+(w_i-\hat{w}_i)^2+(h_i-\hat{h}_i)^2\Big]$$
The YOLOv3 model with the target loss function $\mathrm{loss}_r$ is taken as the WIYOLOv3 model. Practice shows that the AP value of the YOLOv3 model is improved by 2.11% with the modified loss function.
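Using the box definitions above (real box A, prediction box C, intersection D, with union coefficient 0.85 and geometric-mean coefficient 0.15), the target IOU can be sketched as follows; the function name is illustrative:

```python
import math

def target_iou(w1, h1, w3, h3, w2, h2, alpha=0.85, beta=0.15):
    """Weighted intersection-over-union IOU_r of a ground-truth box A
    (width w1, height h1) and a prediction box C (width w3, height h3),
    whose intersection D has width w2 and height h2.  alpha + beta = 1.
    """
    intersection = w2 * h2
    # Union of A and C, expressed with the box dimensions as in FIG. 2.
    union = (w1 + w3 - w2) * (h1 + h3 - h2)
    # Geometric mean of the two box areas, carrying object-size information.
    geo_mean = math.sqrt(w1 * h1 * w3 * h3)
    return intersection / (alpha * union + beta * geo_mean)
```

For two perfectly coincident boxes the intersection, union, and geometric mean all equal the common area, so the ratio is 1, matching the ideal case of the plain IOU.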
Note that the YOLO model solves object detection as a regression problem: based on a single end-to-end network, it maps the original input image directly to the output object locations and classes.
YOLOv3 is a third version of the YOLO series of the target detection algorithm.
It should be noted that the Intersection-over-Union (IoU) is a concept used in object detection: the overlap ratio between a generated candidate box (candidate bound) and the original ground-truth box (ground truth bound), i.e., the ratio of their intersection to their union. The ideal case is complete overlap, i.e., a ratio of 1:

$$\mathrm{IoU}=\frac{A\cap C}{A\cup C}$$

wherein $A\cap C$ represents the intersection of the two boxes and $A\cup C$ represents the union of the two boxes.
Finally, the WIYOLOv3 model is trained based on a strategy of pre-training and hybrid training. The target detection model needs to be trained in three steps based on the actual scene of whether the employee is on duty.
First, a first data set comprising near-view and far-view employee face images is input into the pre-constructed WIYOLOv3 model for pre-training, to obtain the pre-trained WIYOLOv3 model.
and secondly, correcting the model by the employee image. After the pre-training is finished, a fine tuning strategy is adopted to adjust the pre-training WIYOLOv3 model. The specific measures are that the face image of a real person is adopted as a training data set (namely a second data set), wherein each staff is required to provide a plurality of images, the angle difference of the face between each image is 10 degrees, and a bilateral symmetry mode is adopted, so that the face image is changed from 0 degree to 90 degrees. And (3) taking the parameters of the pre-training WIYOLOv3 model as initialization parameters, retraining the pre-training WIYOLOv3 model, and introducing the information of the multi-angle faces of the real staff by the trained model.
Fine-tuning of the pre-trained WIYOLOv3 model refers to fine-tuning (Fine Tune) on the basis of a given pre-trained model (pre-trained model). Compared with training from scratch (training a model from scratch), fine-tuning saves a large amount of computing resources and time, improves computational efficiency, and can even improve accuracy.
Typical scenarios for fine-tuning are: the accuracy of the built or adopted CNN model is too low; the data sets are similar but the target data set is too small; or the available computing resources are too few.
Thirdly, mixed training is performed. The second step retrains the model, which may risk overfitting due to the parameter-initialization strategy, so the fully connected layer of the pre-trained WIYOLOv3 model needs to be retrained. First, a mixed sample is constructed: part of the data from the first data set is mixed with the real employee face images of the second data set, with the ratio settable to 10:1, to obtain the third data set. Then all parameters of the model from the second step, except the output layer, are frozen. Finally, the parameters of the fully connected layer are retrained on the third data set, to obtain the target detection model.
Secondary fine-tuning of the model's fully connected layer on the 10:1 mixed data resolves the possible overfitting risk and improves model performance; practice shows that the AP value of the trained target detection model is improved by a further 1.01%.
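The construction of the 10:1 mixed sample in the third step can be sketched as follows; this is a hypothetical helper, since the patent does not prescribe a particular sampling procedure:

```python
import random

def build_mixed_dataset(first_dataset, second_dataset, ratio=10, seed=0):
    """Build the third data set by mixing pre-training images
    (first_dataset) with real employee face images (second_dataset)
    at a preset ratio (10:1 in the described embodiment)."""
    rng = random.Random(seed)
    # Draw at most ratio-times as many pre-training samples as there
    # are real employee images, then shuffle the combined pool.
    n_first = min(len(first_dataset), ratio * len(second_dataset))
    mixed = rng.sample(list(first_dataset), n_first) + list(second_dataset)
    rng.shuffle(mixed)
    return mixed
```

The mixed pool is then used only to retrain the fully connected layer, with all other parameters frozen.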
According to the employee on-duty state detection method provided by the invention, IOU information is introduced into the loss function of the labeling box, structurally compensating the overall loss information of the labeling box and correcting the error deviation caused by employee face targets that are too small; this improves the performance of the final target detection model and the accuracy of employee image auditing.
Further, in one embodiment, step 200 may specifically include:
step 2001, obtaining a first similarity between a first pixel value of an employee image to be checked and a second pixel value corresponding to each employee image pre-stored in a database;
step 2002, calculating a second similarity between the first key point feature data of the employee image to be checked and the second key point feature data corresponding to each employee image pre-stored in the database;
step 2003, determining the similarity according to the first similarity and the second similarity.
Optionally, a target similarity function (i.e., the GC similarity function) is constructed, and the face image detected by the target detection model is compared with the face images in the database to identify who the person is. Common face comparison is verified using the similarity of 128 key points. The invention proposes a new similarity function, the Grey Cosine similarity function, GC similarity function for short. The face image of each employee image in the database is converted into a grayscale image, with width and height set to 128. The GC similarity function is constructed as follows:
firstly, cutting a face image detected based on a target detection model, converting the face image into a gray level image, wherein the width and the height of the converted image are 128;
the second step, the detected pixel value of the face image of the staff image to be checked (namely the first pixel value) uses a vector E= (x) 1 ,x 2 ,...,x 128 ) The pixel value of the face image (i.e., the second pixel value) representing each employee image in the database is represented by vector f= (y) 1 ,y 2 ,...,y 128 ) Representing, calculating the similarity of the pixel values of the two, namely, a first similarity s 1
Wherein the first similarity s 1 Representing the overall similarity of the two images.
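The exact formula for $s_1$ is not reproduced in the text above; as an illustrative stand-in only, the cosine similarity between the two 128-element pixel-value vectors E and F could serve as the overall-image similarity:

```python
import math

def pixel_similarity(e, f):
    """Overall-image similarity s1 between the pixel-value vectors
    E = (x1..x128) and F = (y1..y128) of two grayscale face images.
    Cosine similarity is an assumption standing in for the patent's
    own (unreproduced) formula."""
    dot = sum(x * y for x, y in zip(e, f))
    norm = math.sqrt(sum(x * x for x in e)) * math.sqrt(sum(y * y for y in f))
    return dot / norm if norm else 0.0
```

For non-negative pixel values this yields a score in [0, 1], with 1 for identical images.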
Step three, calculating a second similarity s between the first key point feature data of the images of the staff to be audited and the second key point feature data corresponding to each staff image pre-stored in the database 2
In the fourth step, the similarity S_final between the first feature data and the second feature data corresponding to each employee image pre-stored in the database is calculated from the first similarity s_1 and the second similarity s_2:
S_final = s_1 × s_2
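The four steps above can be sketched as follows. The exact formula for s_1 is not reproduced in the extracted text, so a plain cosine similarity of the two pixel vectors is used here purely as an assumed stand-in; `first_similarity` and `final_similarity` are illustrative names, not from the patent.

```python
import math

def first_similarity(E, F):
    # s_1: overall similarity of the two 128-element gray pixel vectors.
    # Assumption: the patent's s_1 formula is not given in the extracted
    # text, so a cosine similarity is used here as a stand-in.
    num = sum(x * y for x, y in zip(E, F))
    den = math.sqrt(sum(x * x for x in E)) * math.sqrt(sum(y * y for y in F))
    return num / den if den else 0.0

def final_similarity(s1, s2):
    # Step four of the text: S_final = s_1 x s_2.
    return s1 * s2

# Identical pixel vectors give s_1 = 1, so S_final reduces to s_2.
E = [100.0] * 128
F = [100.0] * 128
print(final_similarity(first_similarity(E, F), 0.9))
```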
According to the employee on-duty detection method provided by the invention, the Euclidean and cosine distances are integrated, the target similarity function is constructed, and the comparison similarity index of the employee is calculated, so that auditing efficiency is improved, the on-duty state of the employee is accurately identified, and manual auditing cost is saved.
Further, in one embodiment, step 2002 may specifically include:
step 20021, determining Euclidean distance between the first key point feature data and the second key point feature data;
step 20022, determining cosine distance between the first key point feature data and the second key point feature data;
step 20023, determining a second similarity according to the Euclidean distance and the cosine distance.
Optionally, the similarity between the detected first key point feature data and the second key point feature data is calculated. Suppose 128 first key point feature data are detected, denoted a_i; the second key point feature data of each employee image in the database are denoted b_i, where i = 1, 2, ..., 128. The similarity of the 128 pairs of first and second key point feature data is then calculated.
First, the Euclidean distance d_E over the pairs of key point feature data is calculated as d_E = sqrt( Σ_{i=1}^{128} (a_i − b_i)² ).
Second, the cosine distance of the key point feature data is calculated from cos(A, B) = ( Σ_{i=1}^{128} a_i·b_i ) / ( sqrt(Σ_{i=1}^{128} a_i²) · sqrt(Σ_{i=1}^{128} b_i²) ), and this value is corrected to obtain the corrected cosine distance d̂_C.
Furthermore, the Euclidean distance and the corrected cosine distance are integrated linearly and nonlinearly into the second similarity s_2, where σ_1 is the coefficient of the Euclidean distance term d_E, σ_2 is the coefficient of the corrected cosine distance d̂_C, and the values of σ_1 and σ_2 may be set freely subject to σ_1 + σ_2 = 1, e.g., σ_1 = 0.6, σ_2 = 0.4.
Finally, the similarity s_2 over all 128 pairs of first and second key point feature data (i.e., the second similarity) is obtained.
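A minimal sketch of this second-similarity computation follows. The coefficients σ_1 = 0.6 and σ_2 = 0.4 come from the text; the (1 + cos)/2 correction of the cosine value and the 1/(1 + d_E) normalization of the Euclidean term are assumptions, since the patent's exact formulas are not reproduced in the extracted text.

```python
import math

def second_similarity(a, b, sigma1=0.6, sigma2=0.4):
    # a, b: the 128 first / second key point feature values (a_i, b_i).
    # Euclidean distance over the 128 key point pairs.
    d_e = math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    # Cosine similarity, then corrected into [0, 1]; (1 + cos) / 2 is an
    # assumed correction -- the patent does not give the exact mapping.
    num = sum(x * y for x, y in zip(a, b))
    den = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    cos = num / den if den else 0.0
    d_c = (1.0 + cos) / 2.0
    # Weighted integration with sigma1 + sigma2 = 1; 1 / (1 + d_e) maps the
    # Euclidean distance to a (0, 1] similarity (also an assumption).
    return sigma1 * (1.0 / (1.0 + d_e)) + sigma2 * d_c

# Identical key point data yield (approximately) the maximum value 1.0.
print(second_similarity([1.0] * 128, [1.0] * 128))
```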
According to the on-duty employee detection method provided by the invention, the Euclidean and cosine distances are integrated, the target similarity function is constructed, and the comparison similarity index of the employee is calculated; by integrating linear and nonlinear information, auditing efficiency is improved.
Further, in one embodiment, step 300 may specifically include:
step 3001, if the employee ID in the employee identity information is consistent with the station ID in the station information, determining that the employee to be checked is on duty;
step 3002, if the employee ID in the employee identity information is inconsistent with the workstation ID in the workstation information, determining that the employee to be checked is not on duty.
Optionally, the on-duty state of the employee to be checked is identified according to the employee identity information corresponding to the obtained one or more employee images with the highest similarity and the station information of the employee to be checked. Specifically:
The employee ID in the employee identity information of the one or more employee images with the highest calculated similarity is matched against the station ID in the station information of the employee to be checked:
If the employee ID in the employee identity information is consistent with the station ID in the station information, it is determined that the employee to be checked is on duty; if they are inconsistent, it is determined that the employee to be checked is not on duty.
Specifically, if only one employee image with the highest similarity is determined, the on-duty state of the employee to be checked is identified by matching the employee ID in the identity information of that image with the station ID in the station information of the employee to be checked; if a plurality of employee images with the highest similarity are determined, the employee IDs in their identity information are each matched against the station ID in the station information of the employee to be checked.
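The ID-matching rule above can be sketched as follows; `is_on_duty` and the argument names are illustrative, and the comparison follows steps 3001/3002 (an employee ID consistent with the station ID means on duty).

```python
def is_on_duty(candidate_employee_ids, station_id):
    # On duty if any of the highest-similarity employee IDs matches the
    # station ID recorded for the employee under audit (steps 3001 / 3002).
    return any(emp_id == station_id for emp_id in candidate_employee_ids)

# A single best match, and a multi-candidate case with no match.
print(is_on_duty(["E1024"], "E1024"))
print(is_on_duty(["E1024", "E2048"], "E4096"))
```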
The employee identity information may be stored in the database together with the employee image in advance, or may be obtained by querying the corresponding employee.
According to the employee on-duty state detection method provided by the invention, the Euclidean and cosine distances are integrated, the target similarity function is constructed, the comparison similarity index of the employee is calculated, the timeliness and efficiency of employee auditing are improved, and the on-duty state of the employee is accurately identified.
The employee on-duty state detection system provided by the invention is described below; the system described below and the employee on-duty state detection method described above may be referred to correspondingly with each other.
FIG. 3 is a schematic diagram of the employee on-duty status detection system provided by the present invention. As shown in FIG. 3, the system comprises:
a first acquisition module 310, a second acquisition module 311, and an on-duty detection module 312;
a first obtaining module 310, configured to input an image of the staff to be checked into the target detection model, so as to obtain first feature data of the image of the staff to be checked;
the second obtaining module 311 is configured to obtain, based on the target similarity function, a similarity between the first feature data and second feature data corresponding to each employee image pre-stored in the database, so as to determine employee identity information corresponding to an employee image with the highest similarity;
the on-duty detection module 312 is configured to determine an on-duty status of the staff to be checked according to the staff identity information and the station information of the staff to be checked;
the object similarity function is determined according to the Euclidean distance and the cosine distance.
According to the employee on-duty state detection system provided by the invention, the Euclidean and cosine distances are integrated, the target similarity function is constructed, and the comparison similarity index of the employee is calculated, so that auditing efficiency is improved and the on-duty state of the employee is accurately identified.
Further, in one embodiment, the first obtaining module 310 may be further specifically configured to:
pre-training the pre-constructed WIYOLOv3 model using a first data set comprising near-view and far-view person face images to obtain a pre-trained WIYOLOv3 model;
training the pre-trained WIYOLOv3 model using a second data set comprising real-person face images to fine-tune the pre-trained WIYOLOv3 model;
retraining the parameters of the fully-connected layer of the fine-tuned pre-trained WIYOLOv3 model based on a third data set to obtain the target detection model;
the third data set is obtained by mixing the employee face images included in the first data set and the second data set according to a preset proportion;
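The three training stages can be sketched schematically as below. `build_wiyolov3`, `train`, the `trainable` flag, and the file names are illustrative stand-ins (the patent specifies no API), and the 0.5 mixing ratio is an arbitrary example of the "preset proportion".

```python
def build_wiyolov3():
    # Stand-in for the pre-constructed WIYOLOv3 model.
    return {"stages": []}

def train(model, dataset, trainable="all"):
    # Stand-in training step: records which data trained which parameters.
    model["stages"].append((dataset, trainable))
    return model

def mix(first, second, ratio=0.5):
    # Third data set: a preset-proportion mixture of the first two data sets.
    k = int(len(first) * ratio)
    m = int(len(second) * (1.0 - ratio))
    return first[:k] + second[:m]

first = ["near_view_1.jpg", "far_view_1.jpg"]    # first data set (illustrative)
second = ["real_face_1.jpg", "real_face_2.jpg"]  # second data set (illustrative)

model = build_wiyolov3()
model = train(model, "first_data_set")               # stage 1: pre-train
model = train(model, "second_data_set")              # stage 2: fine-tune
third = mix(first, second)                           # stage 3a: mix data sets
model = train(model, third, trainable="fc_layer")    # stage 3b: retrain FC only
```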
the WIYOLOv3 model is obtained by:
correcting the original intersection-over-union (IoU) of the YOLOv3 model to obtain a target IoU;
correcting the loss function of the YOLOv3 model based on the target IoU to obtain a target loss function of the YOLOv3 model;
taking the YOLOv3 model with the target loss function as the WIYOLOv3 model.
According to the employee on-duty state detection system provided by the invention, IoU information is introduced into the loss function of the labeling box, so that the overall loss information of the labeling box is structurally compensated, the error deviation caused by overly small employee face targets is corrected, the performance of the finally obtained target detection model is improved, and the accuracy of employee image auditing is improved.
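The IoU correction itself is not spelled out in the extracted text. The sketch below shows a standard IoU computation and a 1 − IoU box-loss term purely as an illustration of introducing IoU information into the labeling-box loss; the patent's actual corrected IoU and target loss function may differ.

```python
def iou(box_a, box_b):
    # Standard intersection-over-union of two (x1, y1, x2, y2) boxes.
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def box_loss(pred, target):
    # Illustrative IoU-based labeling-box loss term (1 - IoU); the patent's
    # corrected loss function is not given in the extracted text.
    return 1.0 - iou(pred, target)

# A small (face-sized) box pair: high overlap gives a low loss.
print(box_loss((10, 10, 20, 20), (11, 11, 21, 21)))
```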
Further, in one embodiment, the second obtaining module 311 may be further specifically configured to:
acquiring a first similarity between a first pixel value of an employee image to be audited and a second pixel value corresponding to each employee image pre-stored in a database;
calculating a second similarity between the first key point feature data of the employee images to be audited and the second key point feature data corresponding to each employee image pre-stored in the database;
and determining the similarity according to the first similarity and the second similarity.
According to the employee on-duty state detection system provided by the invention, the Euclidean and cosine distances are integrated, the target similarity function is constructed, the comparison similarity index of the employee is calculated, auditing efficiency is improved, the on-duty state of the employee is accurately identified, and manual auditing cost is saved.
Further, in one embodiment, the second obtaining module 311 may be further specifically configured to:
determining Euclidean distance between the first key point feature data and the second key point feature data;
determining a cosine distance between the first key point feature data and the second key point feature data;
and determining a second similarity according to the Euclidean distance and the cosine distance.
According to the on-duty employee detection system, the Euclidean and cosine distances are integrated, the target similarity function is constructed, the comparison similarity index of the employee is calculated, and the auditing efficiency of the employee is improved by integrating the linear and nonlinear information.
Further, in one embodiment, the on-duty detection module 312 may also be specifically configured to:
if the employee ID in the employee identity information is consistent with the station ID in the station information, determining that the employee to be checked is on duty;
and if the employee ID in the employee identity information is inconsistent with the station ID in the station information, determining that the employee to be checked is not on duty.
According to the employee on-duty state detection system provided by the invention, the Euclidean and cosine distances are integrated, the target similarity function is constructed, the comparison similarity index of the employee is calculated, the timeliness and efficiency of employee auditing are improved, and the on-duty state of the employee is accurately identified.
Fig. 4 is a schematic physical structure of an electronic device according to the present invention, as shown in fig. 4, the electronic device may include: a processor (processor) 410, a communication interface (communication interface) 411, a memory (memory) 412 and a bus (bus) 413, wherein the processor 410, the communication interface 411 and the memory 412 communicate with each other through the bus 413. The processor 410 may call logic instructions in the memory 412 to perform the following method:
inputting the images of the staff to be checked into a target detection model to obtain first characteristic data of the images of the staff to be checked;
based on the target similarity function, obtaining the similarity between the first characteristic data and the second characteristic data corresponding to each employee image pre-stored in the database, so as to determine employee identity information corresponding to the employee image with the highest similarity;
judging the on-duty state of the staff to be checked according to the staff identity information and the station information of the staff to be checked;
the object similarity function is determined according to the Euclidean distance and the cosine distance.
Further, the logic instructions in the above memory may be implemented in the form of software functional units and stored in a computer-readable storage medium when sold or used as a stand-alone product. Based on this understanding, the technical solution of the present invention may be embodied essentially, or in the part contributing to the prior art, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
Further, the present invention discloses a computer program product comprising a computer program stored on a non-transitory computer-readable storage medium, the computer program comprising program instructions which, when executed by a computer, are capable of performing the employee on-duty status detection method provided by the above method embodiments, for example comprising:
inputting the images of the staff to be checked into a target detection model to obtain first characteristic data of the images of the staff to be checked;
based on the target similarity function, obtaining the similarity between the first characteristic data and the second characteristic data corresponding to each employee image pre-stored in the database, so as to determine employee identity information corresponding to the employee image with the highest similarity;
judging the on-duty state of the staff to be checked according to the staff identity information and the station information of the staff to be checked;
the object similarity function is determined according to the Euclidean distance and the cosine distance.
In another aspect, the present invention also provides a processor readable storage medium storing a computer program for causing the processor to perform the method provided in the above embodiments, for example, including:
inputting the images of the staff to be checked into a target detection model to obtain first characteristic data of the images of the staff to be checked;
based on the target similarity function, obtaining the similarity between the first characteristic data and the second characteristic data corresponding to each employee image pre-stored in the database, so as to determine employee identity information corresponding to the employee image with the highest similarity;
judging the on-duty state of the staff to be checked according to the staff identity information and the station information of the staff to be checked;
the object similarity function is determined according to the Euclidean distance and the cosine distance.
The system embodiments described above are merely illustrative; the units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units, i.e., they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement the invention without creative effort.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus a necessary general hardware platform, or of course by means of hardware. Based on this understanding, the foregoing technical solution may be embodied essentially, or in the part contributing to the prior art, in the form of a software product, which may be stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk, or an optical disk, and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform the method described in the various embodiments or in some parts of the embodiments.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. An employee on-duty detection method, comprising:
inputting an image of a staff to be checked into a target detection model to acquire first characteristic data of the image of the staff to be checked;
based on a target similarity function, obtaining the similarity between the first characteristic data and second characteristic data corresponding to each employee image pre-stored in a database so as to determine employee identity information corresponding to the employee image with the highest similarity;
judging the on-duty state of the staff to be checked according to the staff identity information and the station information of the staff to be checked;
the target similarity function is determined according to Euclidean distance and cosine distance.
2. The employee on-duty status detection method of claim 1, wherein the target detection model is determined by:
pre-training the pre-constructed WIYOLOv3 model using a first data set comprising near-view and far-view person face images to obtain a pre-trained WIYOLOv3 model;
training the pre-trained WIYOLOv3 model using a second data set comprising real-person face images to fine-tune the pre-trained WIYOLOv3 model;
retraining parameters of the fully-connected layer of the fine-tuned pre-trained WIYOLOv3 model based on a third data set to obtain the target detection model;
the third data set is obtained by mixing the employee face images included in the first data set and the second data set according to a preset proportion.
3. The employee on-duty status detection method of claim 2, wherein the WIYOLOv3 model is obtained by:
correcting the original intersection-over-union (IoU) of the YOLOv3 model to obtain a target IoU;
correcting a loss function of the YOLOv3 model based on the target IoU to obtain a target loss function of the YOLOv3 model;
and taking the YOLOv3 model with the target loss function as the WIYOLOv3 model.
4. The employee on-duty status detection method according to claim 1, wherein the obtaining, based on a target similarity function, a similarity between the first feature data and second feature data corresponding to each employee image pre-stored in a database comprises:
acquiring a first similarity between a first pixel value of the employee image to be checked and a second pixel value corresponding to each employee image pre-stored in a database;
calculating a second similarity between the first key point feature data of the employee image to be checked and the second key point feature data corresponding to each employee image pre-stored in a database;
and determining the similarity according to the first similarity and the second similarity.
5. The employee on-duty status detection method according to claim 4, wherein the calculating a second similarity between the first key point feature data of the employee image to be checked and the second key point feature data corresponding to each employee image pre-stored in the database comprises:
determining a Euclidean distance between the first key point feature data and the second key point feature data;
determining a cosine distance between the first key point feature data and the second key point feature data;
and determining the second similarity according to the Euclidean distance and the cosine distance.
6. The method for detecting an on-duty status of an employee according to any one of claims 1 to 5, wherein the determining the on-duty status of the employee to be checked according to the employee identity information and the station information of the employee to be checked includes:
if the employee ID in the employee identity information is consistent with the station ID in the station information, determining that the employee to be checked is on duty;
and if the employee ID in the employee identity information is inconsistent with the station ID in the station information, determining that the employee to be checked is not on duty.
7. An employee on-duty status detection system, comprising: a first acquisition module, a second acquisition module, and an on-duty detection module;
the first acquisition module is used for inputting the images of the staff to be checked into the target detection model so as to acquire first characteristic data of the images of the staff to be checked;
the second obtaining module is used for obtaining the similarity between the first characteristic data and the second characteristic data corresponding to each employee image pre-stored in the database based on the target similarity function so as to determine employee identity information corresponding to the employee image with the highest similarity;
the on-duty detection module is used for judging the on-duty state of the staff to be checked according to the staff identity information and the station information of the staff to be checked;
the target similarity function is determined according to Euclidean distance and cosine distance.
8. An electronic device comprising a processor and a memory storing a computer program, wherein the processor, when executing the computer program, implements the employee on-duty status detection method of any one of claims 1 to 6.
9. A processor-readable storage medium, wherein the processor-readable storage medium stores a computer program for causing the processor to perform the employee on-duty status detection method of any one of claims 1 to 6.
10. A computer program product comprising a computer program which, when executed by a processor, implements the employee on-duty status detection method of any one of claims 1 to 6.
CN202210249009.0A 2022-03-14 2022-03-14 Employee on-duty state detection method and system Pending CN116798087A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210249009.0A CN116798087A (en) 2022-03-14 2022-03-14 Employee on-duty state detection method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210249009.0A CN116798087A (en) 2022-03-14 2022-03-14 Employee on-duty state detection method and system

Publications (1)

Publication Number Publication Date
CN116798087A true CN116798087A (en) 2023-09-22

Family

ID=88034937

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210249009.0A Pending CN116798087A (en) 2022-03-14 2022-03-14 Employee on-duty state detection method and system

Country Status (1)

Country Link
CN (1) CN116798087A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117251556A (en) * 2023-11-17 2023-12-19 北京遥领医疗科技有限公司 Patient screening system and method in registration queue



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination