CN112036744A - Method and device for determining working condition - Google Patents

Method and device for determining working condition

Info

Publication number
CN112036744A
CN112036744A CN202010891917.0A CN202010891917A
Authority
CN
China
Prior art keywords
dimension
group
working
value
index value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010891917.0A
Other languages
Chinese (zh)
Inventor
邱文
曾可
卢道和
罗锶
黄耿冬
鲁东东
郭江涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
WeBank Co Ltd
Original Assignee
WeBank Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by WeBank Co Ltd filed Critical WeBank Co Ltd
Priority to CN202010891917.0A priority Critical patent/CN112036744A/en
Publication of CN112036744A publication Critical patent/CN112036744A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0639Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06395Quality analysis or management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0639Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06393Score-carding, benchmarking or key performance indicator [KPI] analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q40/00Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q40/02Banking, e.g. interest calculation or account maintenance

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Development Economics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Software Systems (AREA)
  • Educational Administration (AREA)
  • General Business, Economics & Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Marketing (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • Tourism & Hospitality (AREA)
  • Biomedical Technology (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Finance (AREA)
  • Biophysics (AREA)
  • Accounting & Taxation (AREA)
  • Game Theory and Decision Science (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Technology Law (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a method and a device for determining a working condition. The method comprises: acquiring pieces of working image information of a worker during the working process; for each piece of working image information, obtaining a dimension index value of that working image information for each feature dimension according to the working image information and a neural network model; for each feature dimension, determining the worker's work index value in that feature dimension according to the dimension index values of all the pieces of working image information in that dimension; and determining the worker's working condition at least according to the worker's work index values for the feature dimensions. When the method is applied in financial technology (Fintech), the working condition of staff is determined more accurately.

Description

Method and device for determining working condition
Technical Field
The invention relates to the field of automatic monitoring within financial technology (Fintech), and in particular to a method and a device for determining a working condition.
Background
With the development of computer technology, more and more technologies are applied in the financial field, and the traditional financial industry is gradually shifting toward financial technology (Fintech). Because of the financial industry's requirements for security and real-time performance, however, higher demands are placed on these technologies. The financial industry also demands high work quality from its staff, so workers' working conditions need to be evaluated.
Currently, work quality is evaluated mainly from static data, such as workers' system operation records and work results, using relatively fixed and simple rules; for example, work quality may be determined by comparing the number of system operation records against a threshold. Such methods can hardly reflect a worker's state during the actual working process, and therefore cannot accurately reflect the worker's working condition. How to determine the working condition of a worker accurately is thus an urgent problem in the prior art.
Disclosure of Invention
The invention provides a method and a device for determining a working condition, which address the problem that prior-art ways of determining a worker's working condition are not accurate enough.
In a first aspect, the present invention provides a method for determining a working condition, comprising: acquiring pieces of working image information of a worker during the working process; for any piece of working image information, obtaining a dimension index value of that working image information for each feature dimension according to the working image information and a neural network model, the neural network model being obtained by machine learning training on an image training data set, where any item of image training data in the set comprises image training information and that information's dimension index values for each feature dimension; for any feature dimension, determining the worker's work index value in that feature dimension according to the dimension index values of each piece of working image information in that dimension; and determining the working condition of the worker at least according to the worker's work index values for the feature dimensions.
In the above manner, each piece of working image information of the worker during the working process is first obtained; each piece reflects image information of the worker at work. A dimension index value of each piece of working image information for each feature dimension can then be obtained from the working image information and a neural network model. Because the neural network model is trained by machine learning on an image training data set, it can analyze the working image information using the historically learned relationship between image training information and dimension index values. The work index value of the worker in each feature dimension, determined from these dimension index values, is therefore more accurate than one obtained from a fixed rule, and so the working condition of the worker, determined at least from the work index values for the feature dimensions, is determined more accurately.
Optionally, obtaining a dimension index value of the working image information for each feature dimension according to the working image information and the neural network model comprises: performing gray-level processing on the working image information to obtain gray image information; performing a convolution operation and/or a pooling operation on the gray image information to obtain dimension-reduced image information; and inputting the dimension-reduced image information into the neural network model to obtain the dimension index values of the working image information for each feature dimension.
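A minimal sketch of this preprocessing step, assuming NumPy, a luminance-weighted grayscale conversion and 2x2 max pooling (the patent fixes neither the conversion nor the pooling kernel; these are common choices):

```python
import numpy as np

def to_grayscale(rgb):
    """Gray-level processing: luminance-weighted average of the RGB
    channels (the exact conversion is an assumption)."""
    return rgb @ np.array([0.299, 0.587, 0.114])

def max_pool_2x2(gray):
    """Pooling operation: 2x2 max pooling to obtain dimension-reduced
    image information."""
    h, w = gray.shape
    return gray[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

rgb = np.zeros((4, 4, 3))
rgb[0, 0] = [1.0, 1.0, 1.0]          # one bright pixel in the top-left block
reduced = max_pool_2x2(to_grayscale(rgb))
# `reduced` (2x2) would then be fed to the neural network model.
```

The dimension-reduced output is a quarter of the original pixel count, which lowers the input size of the model without discarding the strongest local responses.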
Optionally, the neural network model is iteratively trained as follows: for any item of image training data in the image training data set, inputting the image training information of that item into an intermediate training model, and obtaining a predicted value of the image training information for each feature dimension according to the training parameters of the intermediate training model; determining whether the intermediate training model meets a preset convergence condition according to the differences between the predicted values for the feature dimensions and the corresponding dimension index values of the image training information; if the convergence condition is met, ending the iterative training and taking the intermediate training model as the neural network model; if it is not met, determining the gradient value of each training parameter from those differences, updating each training parameter according to its gradient value so as to update the intermediate training model, and continuing the iterative training.
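The iterative training described above can be sketched as follows; a linear model with squared-error loss stands in for the neural network, and the learning rate, tolerance and all names are illustrative:

```python
import numpy as np

def train(dataset, n_dims, lr=0.1, tol=1e-4, max_iter=2000):
    """Iterative training sketch: predict dimension index values, test a
    convergence condition on the prediction error, otherwise update the
    training parameters along the gradient (gradient descent, as in the
    patent's backpropagation description, but on a linear stand-in)."""
    n_feat = dataset[0][0].shape[0]
    W = np.zeros((n_feat, n_dims))          # training parameters
    for _ in range(max_iter):
        total_err = 0.0
        for x, y in dataset:                # image training info, index values
            pred = x @ W                    # predicted value per feature dimension
            diff = pred - y                 # difference vs. the dimension index value
            total_err += float(diff @ diff)
            W -= lr * np.outer(x, diff)     # gradient step on each parameter
        if total_err < tol:                 # preset convergence condition
            break
    return W

data = [(np.array([1.0, 0.0]), np.array([1.0])),
        (np.array([0.0, 1.0]), np.array([0.0]))]
model = train(data, n_dims=1)
```

After convergence, `np.array([1.0, 0.0]) @ model` is close to the labeled index value 1.0; a real implementation would replace the linear map with the trained network.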
Optionally, the feature dimensions are divided into a plurality of feature dimension groups according to preset feature categories. Determining the working condition of the worker at least according to the worker's work index value for each feature dimension then comprises: determining a group index value of a feature dimension group according to the work index values of the feature dimensions in that group; determining a group index value of a work record dimension group according to the work record data of the work record dimensions in that group, where the work record dimension group is any one of a plurality of work record dimension groups divided according to preset work record categories; and determining the working condition of the worker according to the group index values and the group weight values of the plurality of dimension groups, the plurality of dimension groups comprising the feature dimension groups and the work record dimension groups.
Optionally, determining the group index value of a feature dimension group according to the work index values of the feature dimensions in the group comprises: determining the group index value from the work index value and the feature sub-weight value of each feature dimension in the group for each piece of working image information. Determining the group index value of a work record dimension group according to the work record data of the work record dimensions in the group comprises: determining the group index value from the work record data and the work record sub-weight value of each work record dimension in the group.
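A weighted-sum reading of this optional step; the patent says only that the group index value is determined from the index values and sub-weight values, so the summation itself is an assumption:

```python
def group_index_value(work_index_values, sub_weights):
    """Group index value of a dimension group: weighted sum of the work
    index value of each dimension in the group, using its sub-weight
    value (the aggregation form is an assumption)."""
    return sum(v * w for v, w in zip(work_index_values, sub_weights))

# e.g. a feature dimension group with three feature dimensions
value = group_index_value([0.8, 1.0, 0.5], [0.5, 0.3, 0.2])
```

The same helper applies unchanged to a work record dimension group, with work record data in place of work index values.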
Optionally, the work record sub-weight value of any work record dimension in a work record dimension group is adjusted as follows: acquiring historical work record data of the work record dimension; determining a historical index value of the work record dimension according to the historical work record data and the work record sub-weight value; and if the historical index value is not less than a preset threshold, reducing the work record sub-weight value and returning to the step of determining the historical index value of the work record dimension according to the historical work record data and the work record sub-weight value.
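The sub-weight adjustment loop can be sketched as below; the multiplicative reduction factor and the sum-times-weight form of the historical index value are assumptions, since the patent specifies neither:

```python
def adjust_sub_weight(history, sub_weight, threshold, decay=0.9):
    """Sub-weight adjustment sketch for one work record dimension:
    compute the historical index value from the historical work record
    data and the sub-weight, and while it is not less than the preset
    threshold, reduce the sub-weight and recompute."""
    while True:
        historical_index = sum(history) * sub_weight   # historical index value
        if historical_index < threshold:               # stop once below threshold
            return sub_weight
        sub_weight *= decay                            # reduce the sub-weight value

w = adjust_sub_weight([3.0, 2.0], sub_weight=1.0, threshold=2.0)
```

The loop terminates because each reduction shrinks the historical index value, so a dimension that historically dominates is gradually given less weight.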
Optionally, the group weight values of the plurality of dimension groups are adjusted as follows: obtaining a component value for each dimension group according to its group index value and its group weight value; determining, among the plurality of dimension groups, the dimension group whose component value is the maximum and the dimension group whose component value is the minimum within a preset value interval; increasing the group weight value of the dimension group with the maximum component value and/or decreasing the group weight value of the dimension group with the minimum component value; and if the group weight values of the plurality of dimension groups still meet the preset adjustment condition, returning to the step of obtaining the component value of each dimension group.
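A sketch of this group-weight adjustment; the fixed step size and fixed round count stand in for the patent's unspecified preset adjustment condition, and the preset value interval is omitted for simplicity:

```python
def adjust_group_weights(group_index, group_weight, step=0.05, rounds=3):
    """Group-weight adjustment sketch: compute each dimension group's
    component value (group index value x group weight value), then raise
    the weight of the group with the largest component value and lower
    the weight of the group with the smallest one."""
    weights = dict(group_weight)
    for _ in range(rounds):
        components = {g: group_index[g] * weights[g] for g in weights}
        g_max = max(components, key=components.get)
        g_min = min(components, key=components.get)
        weights[g_max] += step                              # boost the top group
        weights[g_min] = max(0.0, weights[g_min] - step)    # damp the bottom group
    return weights

w = adjust_group_weights({"behaviour": 0.9, "hygiene": 0.2},
                         {"behaviour": 0.5, "hygiene": 0.5})
```

With equal starting weights, the group that contributes most to the overall score ends up weighted more heavily, which matches the adjustment direction described above.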
Optionally, the method further includes: for any dimension group in the plurality of dimension groups, determining a group index value of a work group for that dimension group according to the group index values of the workers in the work group for that dimension group; and generating a visual ranking result of a plurality of work groups for each dimension group according to the group index values of the plurality of work groups for each of the dimension groups.
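A sketch of this optional work-group ranking; averaging the per-worker group index values is an assumption (the patent leaves the aggregation open), as is the "hygiene" dimension group name:

```python
def rank_work_groups(group_scores, dim_group):
    """Visual ranking sketch: a work group's group index value for one
    dimension group is aggregated (here: averaged) from the group index
    values of the workers in that work group, and work groups are then
    sorted by it in descending order."""
    averaged = {wg: sum(per_worker) / len(per_worker)
                for wg, per_worker in group_scores[dim_group].items()}
    return sorted(averaged.items(), key=lambda kv: kv[1], reverse=True)

# per-worker group index values for a hypothetical "hygiene" dimension group
scores = {"hygiene": {"shift_a": [0.9, 0.7], "shift_b": [0.6, 0.6]}}
ranking = rank_work_groups(scores, "hygiene")
```

The ordered pairs can then be rendered directly as the visual ranking result per dimension group.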
In a second aspect, the present invention provides an apparatus for determining a working condition, comprising: an acquisition module, configured to acquire pieces of working image information of a worker during the working process; and a processing module, configured to, for any piece of working image information, obtain a dimension index value of that working image information for each feature dimension according to the working image information and a neural network model, the neural network model being obtained by machine learning training on an image training data set in which any item of image training data comprises image training information and its dimension index values for each feature dimension; for any feature dimension, determine the worker's work index value in that feature dimension according to the dimension index values of each piece of working image information in that dimension; and determine the working condition of the worker at least according to the worker's work index values for the feature dimensions.
Optionally, the processing module is specifically configured to: carrying out gray level processing on the working image information to obtain gray level image information; performing convolution operation and/or pooling operation on the gray image information to obtain dimension reduction image information; and inputting the dimension reduction image information into the neural network model to obtain dimension index values of the working image information to each characteristic dimension.
Optionally, the processing module is further configured to: inputting image training information in the image training data into an intermediate training model aiming at any image training data in the image training data set, and obtaining a predicted value of the image training information on each characteristic dimension according to each training parameter of the intermediate training model; determining whether the intermediate training model meets a preset convergence condition or not according to the difference value between the predicted value of each characteristic dimension and the index value of the corresponding dimension of the image training information; if the preset convergence condition is met, ending the iterative training, and taking the intermediate training model as the neural network model; if the preset convergence condition is not met, determining the gradient value of each training parameter according to the difference value between the predicted value of each characteristic dimension and the index value of the corresponding dimension of the image training information; and updating each training parameter according to the gradient value of each training parameter, so as to update the intermediate training model, and continuously executing the iterative training.
Optionally, the feature dimensions are divided into a plurality of feature dimension groups according to preset feature categories, and the processing module is specifically configured to: determine a group index value of a feature dimension group according to the work index values of the feature dimensions in that group; determine a group index value of a work record dimension group according to the work record data of the work record dimensions in that group, the work record dimension group being any one of a plurality of work record dimension groups divided according to preset work record categories; and determine the working condition of the worker according to the group index values and the group weight values of the plurality of dimension groups, which comprise the feature dimension groups and the work record dimension groups.
Optionally, the processing module is specifically configured to: determining a group index value of the feature dimension group according to the work index value and the feature sub-weight value of each feature dimension in the feature dimension group of each piece of work image information; and determining a group index value of the work record dimension group according to the work record data and the work record sub-weight value of each work record dimension in the work record dimension group.
Optionally, the processing module is further configured to: acquiring historical work record data of the work record dimensionality; determining a historical index value of the working record dimension according to the historical working record data and the working record sub-weight value; and if the historical index value is not less than a preset threshold value, reducing the sub-weight value of the work record, and returning to the step of determining the historical index value of the dimension of the work record according to the historical work record data and the sub-weight value of the work record.
Optionally, the processing module is further configured to: acquiring component values of the dimension groups according to the group index value of each dimension group in the dimension groups and the group weight value of the dimension group; obtaining a dimension group with the component value being the maximum value and a dimension group with the minimum value in a preset value interval from the multiple dimension groups; increasing the group weight value of the dimensionality group with the maximum score and/or decreasing the group weight value of the dimensionality group with the minimum score; and if the group weight values of the multiple dimension groups meet the preset adjustment condition, returning to the step of acquiring the component value of each dimension group according to the group index value of each dimension group in the multiple dimension groups and the group weight value of the dimension group.
Optionally, the processing module is further configured to: for any dimension group in the plurality of dimension groups, determine a group index value of a work group for that dimension group according to the group index values of the workers in the work group for that dimension group; and generate a visual ranking result of a plurality of work groups for each dimension group according to the group index values of the plurality of work groups for each of the dimension groups.
In a third aspect, the present invention provides a computer device comprising a program or instructions for performing the method of the first aspect and the alternatives of the first aspect when the program or instructions are executed.
In a fourth aspect, the present invention provides a storage medium comprising a program or instructions which, when executed, is adapted to perform the method of the first aspect and the alternatives of the first aspect.
These and other aspects of the invention are apparent from and will be elucidated with reference to the embodiments described hereinafter.
Drawings
In order to illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed for describing the embodiments are briefly introduced below. The drawings described below are only some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic flowchart corresponding to a method for determining a working condition according to an embodiment of the present invention;
FIG. 2 is a diagram illustrating a system architecture to which a method for determining a working condition is applicable, according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating feature extraction and image classification according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating the partitioning of a pixel matrix according to an embodiment of the present invention;
FIG. 5 is a diagram illustrating dimension index values of the working image information for each feature dimension according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram illustrating the training of a neural network model according to an embodiment of the present invention;
FIG. 7 is a diagram illustrating a relationship between a dimension group and a weight value according to an embodiment of the present invention;
FIG. 8 is a diagram illustrating a visual ranking result in a method for determining a working condition, according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of an apparatus for determining a working condition according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention will be described in further detail with reference to the accompanying drawings, and it is apparent that the described embodiments are only a part of the embodiments of the present invention, not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The following first lists abbreviations that appear in embodiments of the present application.
ITSM: IT Service Management.
IMS: Intelligent Monitor System, an intelligent monitoring platform.
IBMS: Intelligent Behavior Monitoring System.
When financial institutions (banking, insurance or securities institutions) carry out business (such as a bank's loan and deposit business), the financial industry demands high work quality from its staff, so workers' working conditions need to be evaluated. At present, work quality evaluation can hardly reflect the worker's state during the actual working process, and therefore cannot accurately reflect the worker's working condition. This fails to meet the requirements of financial institutions such as banks and cannot guarantee the efficient operation of their services.
To this end, the present invention provides a method for determining a working condition, as shown in FIG. 1.
Step 101: acquire pieces of working image information of a worker during the working process.
Step 102: for any piece of working image information, obtain a dimension index value of that working image information for each feature dimension according to the working image information and a neural network model.
Step 103: for any feature dimension, determine the worker's work index value in that feature dimension according to the dimension index values of each piece of working image information in that dimension.
Step 104: determine the working condition of the worker at least according to the worker's work index values for the feature dimensions.
In steps 101 to 104, the neural network model is obtained by machine learning training on an image training data set; any item of image training data in the set comprises image training information and that information's dimension index values for each feature dimension.
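Steps 101 to 104 can be sketched end to end as follows; the toy stand-in model, the averaging in step 103 and the weighted combination in step 104 are all illustrative assumptions rather than the patent's fixed choices:

```python
import numpy as np

def dimension_index_values(image, model):
    """Step 102: map one piece of working image information to a
    dimension index value per feature dimension (stand-in model)."""
    return model(image)

def determine_working_condition(images, model, dim_weights):
    # Step 101: each element of `images` is one piece of working image info.
    # Step 102: per-image dimension index values, shape (n_images, n_dims).
    per_image = np.array([dimension_index_values(img, model) for img in images])
    # Step 103: work index value per feature dimension, aggregated over images
    # (averaging is an assumption; the patent does not fix the aggregation).
    work_index = per_image.mean(axis=0)
    # Step 104: working condition as a weighted combination of work index values.
    return float(work_index @ dim_weights)

# Toy stand-in model yielding two "feature dimensions" per image.
toy_model = lambda img: np.array([img.mean(), img.max()])
images = [np.ones((4, 4)) * v for v in (0.2, 0.4)]
score = determine_working_condition(images, toy_model, np.array([0.5, 0.5]))
```

In the full method the stand-in model is replaced by the trained neural network, and the weighted combination is refined by the dimension groups described later.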
In the scenario of monitoring the work quality of IT service desk personnel, the method of steps 101 to 104 can be applied within the IT service desk work quality management framework shown in fig. 2, specifically as follows:
Monitoring camera: a monitoring camera deployed in the IT service desk working area serves as the data input of the IBMS system, recording video of the service desk environment and the workers; the video quality can be set to high definition or ultra high definition. The video consists of many pieces of image information, from which each piece of working image information of a worker during the working process can be obtained and input into the IBMS system.
IBMS system: classifies, identifies and processes the working image information input by the monitoring camera, and outputs the result to the quality inspection processor. Within the IBMS system, the data processing procedure may include: image information input, feature extraction, mapping from features, and output.
IMS module: provides service desk alarm handling state data as work record data. The quality inspection processor can obtain this data through the IMS interface; its fields may include: alarm level, alarm start time, alarm end time, processing state, ITSM dispatch, alarm handler, and remarks.
ITSM module: provides service request processing flow data as work record data. The quality inspection processor can obtain the service request flow data handled by the service desk through the ITSM interface; its fields may include: service request type, request title, request order number, request time, processing step, handler, processing time, and current status.
Service desk management module: provides the duty roster of each service desk post and the shift handover time points. The quality inspection processor can obtain these through the service desk management system interface; the fields may include: duty shift, duty group, duty shift leader name, alarm post name, on-site post name, shift takeover time, and shift handover time.
The quality inspection processor is divided into two modules: a quality inspection rule control module and an execution module.
Quality inspection rule control module: manages the quality inspection rules through parameters and automatically learns and adjusts the optimal parameters according to execution results. Starting from preset rule parameters, the module can learn and adjust the optimal parameters automatically; when the system's automatic learning algorithm performs poorly, it can fall back to the preset configuration parameters.
Execution module: integrates the input data of each module according to the preset rules of the rule controller, and produces classified statistics of the various working conditions during each duty period, such as quality inspection result data, broken down by shift and by duty personnel. For example, shifts and duty personnel can be ranked and scored along different feature dimensions, from which the total score of each shift is obtained.
The quality inspection report module visualizes the quality inspection results and the detailed data, and provides functions for storing, querying, and correcting quality inspection reports.
It should be noted that before step 101, the feature extraction and the image classification may be performed in the following manner, as shown in fig. 3, and the specific flow is as follows:
the dimension index values of the feature dimensions are set first and labeled. A sufficient number of classified working images for the feature dimensions below are input as the training/test set, and the IBMS forms a "discriminator" for each classification feature through machine learning. The IBMS then captures the working images fed in by the cameras, recognizes and counts the dimension index values of the feature dimensions, maps them to the corresponding labels, and outputs the result. The feature dimension classifications may include:
(1) environmental sanitation of the service desk:
labeling: tidy 1, untidy 0;
characteristic dimension: bottles/cans E1, waste paper E2, chair regularity E3. Each feature dimension can take an index value; for example, E1 = 0 means no bottle is present and E1 = 1 means a bottle is present. A rule can be set: any image exhibiting one or more of the above features is labeled untidy.
(2) The image of the working personnel is as follows:
labeling: the requirement 1 is met, and the clothes are not standardized by 0;
characteristic dimension: face F, shorts I2, vest I3, slippers I4. Each feature dimension can take an index value; for example, I1 = 0 means long trousers are worn and I1 = 1 means they are not. A rule can be set: service desk staff are first picked out by their face feature data, and any image matching one or more of the non-face features is labeled as non-standard dress.
(3) Staff behavior:
labeling: the requirement 1 is met, and the behavior is not normalized 0;
characteristic dimension: face F, non-work content on a mobile phone screen B1, non-work content on a computer screen B2, absent from post during working hours B3. Each feature dimension can take an index value; for example, B1 = 0 means the screen shows work content and B1 = 1 means it does not. A rule can be set: service desk staff are first picked out by their face feature data, and any image matching one or more of the non-face features is labeled as non-standard behavior.
In an alternative embodiment, step 102 is specifically performed as follows:
step (2-1): and carrying out gray level processing on the working image information to obtain gray level image information.
Step (2-2): and performing convolution operation and/or pooling operation on the gray image information to obtain dimension reduction image information.
Step (2-3): and inputting the dimension reduction image information into the neural network model to obtain dimension index values of the working image information to each characteristic dimension.
The specific processes from step (2-1) to step (2-3) may be as follows:
for any piece of working image information, for example working image information in RGB format obtained from a camera, grayscale processing is performed on it to reduce the subsequent computation load of the system. The maximum of the three color-component intensities is taken as the gray value of the grayscale image to obtain the grayscale image information, expressed by the formula:
Gray(i,j)=max{R(i,j),G(i,j),B(i,j)}。
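As a minimal illustration (not part of the patent's claimed implementation), the max-of-channels grayscale step can be sketched in Python with NumPy:

```python
import numpy as np

def to_gray_max(rgb):
    """Gray(i, j) = max{R(i, j), G(i, j), B(i, j)}: take the channel-wise
    maximum over the last axis of an H x W x 3 RGB array."""
    return rgb.max(axis=-1)

img = np.array([[[10, 200, 30], [5, 5, 5]],
                [[0, 0, 255], [100, 90, 80]]], dtype=np.uint8)
gray = to_gray_max(img)
```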
further, the convolution operation may be as follows:
The Gray matrix obtained in the previous step is convolved. The convolution kernel G is a 3 × 3 Gaussian kernel; the Gray matrix is decomposed into n × m sub-matrices X1n,m of size 3 × 3, each of which is multiplied elementwise with G, where n and m are positive integers. A ReLU function gives the image data its nonlinear characteristic, and a feature map Y1 is output. The functions are expressed as:
G(x, y) = e^(−(x² + y²) / (2σ²)) / (2πσ²); (formula one)
convolution:
X1n,m * G = Σ (i = 1..3) Σ (j = 1..3) X1n,m(i, j) · G(2−i, 2−j); (formula two)
characteristic diagram: y1(n, m) max (0, X1)n,mG); (formula three)
The convolutional layer aims to highlight the edges of objects in the image and thereby extract features. The above operation process is exemplified as follows:
suppose the Gray-scale graph Gray (i, j) is a pixel map of 5 × 5 (as shown in Table 1)
Gray(1,1) Gray(1,2) Gray(1,3) Gray(1,4) Gray(1,5)
Gray(2,1) Gray(2,2) Gray(2,3) Gray(2,4) Gray(2,5)
Gray(3,1) Gray(3,2) Gray(3,3) Gray(3,4) Gray(3,5)
Gray(4,1) Gray(4,2) Gray(4,3) Gray(4,4) Gray(4,5)
Gray(5,1) Gray(5,2) Gray(5,3) Gray(5,4) Gray(5,5)
TABLE 1
As shown in fig. 4, the Gray(i, j) pixel matrix above is divided into nine 3 × 3 sub-matrices X1n,m: X1(1,1), X1(1,2), X1(1,3), X1(2,1), X1(2,2), X1(2,3), X1(3,1), X1(3,2), X1(3,3). Among them, X1(1,1), shown in Table 2, is:
Gray(1,1) Gray(1,2) Gray(1,3)
Gray(2,1) Gray(2,2) Gray(2,3)
Gray(3,1) Gray(3,2) Gray(3,3)
TABLE 2
The matrices X1(1,2), X1(1,3), and so on follow by analogy; the subscript of each X1 equals the index of the Gray pixel in the top-left corner of that sub-matrix. The convolution kernel G(x, y) is a 3 × 3 matrix. Taking its center point as the origin, i.e. x = 0 and y = 0, the coordinates of the 9 points are laid out accordingly; taking σ = 1 and substituting into formula one, the value of each point of the convolution kernel can be calculated, as shown in Table 3 below:
G(−1,1) ≈ 0.0585   G(0,1) ≈ 0.0965   G(1,1) ≈ 0.0585
G(−1,0) ≈ 0.0965   G(0,0) ≈ 0.1592   G(1,0) ≈ 0.0965
G(−1,−1) ≈ 0.0585   G(0,−1) ≈ 0.0965   G(1,−1) ≈ 0.0585
TABLE 3
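To make the kernel values concrete, formula one can be evaluated at the nine integer offsets with σ = 1; this sketch (an illustration, not the patent's implementation) computes the Gaussian kernel values:

```python
import math

def gaussian_kernel_3x3(sigma=1.0):
    """Evaluate formula one, G(x, y) = exp(-(x^2 + y^2) / (2*sigma^2)) / (2*pi*sigma^2),
    at the nine integer offsets x, y in {-1, 0, 1}, with the center point as origin."""
    return [[math.exp(-(x * x + y * y) / (2.0 * sigma ** 2)) / (2.0 * math.pi * sigma ** 2)
             for x in (-1, 0, 1)] for y in (1, 0, -1)]

k = gaussian_kernel_3x3(sigma=1.0)
# center value is 1/(2*pi) ~ 0.1592, edge values ~ 0.0965, corner values ~ 0.0585
```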
After the calculation of formula two and formula three, a 3 × 3 feature map Y1(n, m) is output, as shown in Table 4 below:
Y1(1,1) Y1(1,2) Y1(1,3)
Y1(2,1) Y1(2,2) Y1(2,3)
Y1(3,1) Y1(3,2) Y1(3,3)
TABLE 4
The above characteristic values are obtained by substituting the following formula for operation:
Y1(1,1) = X1(1,1) * G
= Gray(1,1)·G(1,1) + Gray(1,2)·G(1,0) + Gray(1,3)·G(1,−1) + Gray(2,1)·G(0,1) + Gray(2,2)·G(0,0) + Gray(2,3)·G(0,−1) + Gray(3,1)·G(−1,1) + Gray(3,2)·G(−1,0) + Gray(3,3)·G(−1,−1)
The other feature values of the Y1 feature map are calculated by analogy.
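A sketch of the convolution of formula two and formula three (illustrative only; NumPy assumed):

```python
import numpy as np

def conv3x3_relu(gray, kernel):
    """Y1(n, m) = max(0, X1_{n,m} * G): slide a 3x3 window over the gray map
    with stride 1 and no padding, multiply elementwise with the kernel, sum,
    and apply ReLU. (Because the Gaussian kernel is symmetric, the plain
    elementwise pairing equals the flipped-index pairing used in the
    expansion of Y1(1,1) above.)"""
    h, w = gray.shape
    out = np.zeros((h - 2, w - 2))
    for n in range(h - 2):
        for m in range(w - 2):
            out[n, m] = max(0.0, float((gray[n:n + 3, m:m + 3] * kernel).sum()))
    return out

# With a kernel that picks out the window center, the 5x5 map's interior remains.
center = np.zeros((3, 3))
center[1, 1] = 1.0
y1 = conv3x3_relu(np.arange(25, dtype=float).reshape(5, 5), center)
```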
Further, the pooling operation may be as follows:
Maximum pooling is applied to the Y1 feature map to further highlight and extract edge lines, and a feature map Y2 is output. The function is expressed as: Y2(x, y) = max(Y1(2x−1, 2y−1), Y1(2x−1, 2y), Y1(2x, 2y−1), Y1(2x, 2y)). The length and width of Y2 are half those of Y1, and each pixel (x, y) of Y2 is the maximum of the 4 pixels of Y1 at the corresponding subscripts.
Pooling the 3 × 3 feature map Y1 from the above step yields a 2 × 2 feature map Y2, which, as shown in Table 5, can be written as:
Y2(1,1) Y2(1,2)
Y2(2,1) Y2(2,2)
TABLE 5
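The maximum pooling step can be sketched as follows; handling of the odd border (pooling a 3 × 3 map down to 2 × 2) is done here by repeating the last row/column, which is an assumption not specified in the text:

```python
import numpy as np

def max_pool_2x2(y1):
    """Y2(x, y) = max(Y1(2x-1, 2y-1), Y1(2x-1, 2y), Y1(2x, 2y-1), Y1(2x, 2y)).
    An odd-sized map is first padded by repeating its last row/column so that
    a 3x3 map pools down to 2x2 (this border handling is an assumption)."""
    h, w = y1.shape
    padded = np.pad(y1, ((0, h % 2), (0, w % 2)), mode="edge")
    ph, pw = padded.shape
    return padded.reshape(ph // 2, 2, pw // 2, 2).max(axis=(1, 3))

y2 = max_pool_2x2(np.array([[1., 2., 3.], [4., 5., 6.], [7., 8., 9.]]))
```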
Each feature value of the feature map Y2 is fed into the trained fully-connected neural network, which automatically calculates the similarity of the working image information to each feature dimension classification, as shown in fig. 5.
In the implementation of steps (2-1) to (2-3), the dimension-reduced image information is obtained by grayscale-processing the working image information and then applying convolution and/or pooling operations to the grayscale image, which reduces the amount of information in the working image; after the dimension-reduced image is fed into the neural network model, the computation load drops, improving the efficiency of obtaining the dimension index value of the working image for each feature dimension.
The method can be applied to the IBMS: after intelligent image recognition is introduced into the IBMS, the organic combination of grayscale processing, convolutional layers, pooling layers, and fully-connected layers reduces the IBMS's computation pressure and improves efficiency.
In an alternative embodiment, the neural network model performs iterative training in the following manner:
step (3-1): and inputting image training information in the image training data into an intermediate training model aiming at any image training data in the image training data set, and obtaining the predicted value of the image training information on each characteristic dimension according to each training parameter of the intermediate training model.
Step (3-2): and determining whether the intermediate training model meets a preset convergence condition or not according to the difference value between the predicted value of the image training information for each characteristic dimension and the index value of the corresponding dimension.
Step (3-3): and if the preset convergence condition is met, ending the iterative training, and taking the intermediate training model as the neural network model.
Step (3-4): if the preset convergence condition is not met, determining the gradient value of each training parameter according to the difference value between the predicted value of each characteristic dimension and the index value of the corresponding dimension of the image training information; and updating each training parameter according to the gradient value of each training parameter, so as to update the intermediate training model, and continuously executing the iterative training.
Specifically, as shown in fig. 6, the implementation of the steps (3-1) to (3-4) is as follows:
The neural network model can be obtained by training, with gradient descent, on a large number (e.g. 10000) of labeled working images as the training set. The training process comprises the following steps:
Forward propagation: the feature extraction and image classification process described above, which extracts the image features and outputs the predicted value; at this stage the weights in the model take random values. The function expression: Y_predict = f(X).
Back propagation: the gradient of the output error is calculated for each weight; the function expression can be:
g_wk = −∂E(total) / ∂w_k, where E(total) = ½ · (Y_actual − Y_predict)²
(Y_actual is the true value, w_k is the k-th weight).
The weight update process may be as follows:
The function expression: w_k = w_k + α · g_wk (α is the gradient-descent step size). After the weights are updated, the process returns to the forward propagation step; forward propagation, back propagation, and weight updating are repeated until the error falls within expectation.
An example of a training procedure for a neural network model is as follows:
A fully-connected neural network with one hidden layer is initialized, with two inputs I1 and I2 and two outputs O1 and O2; each connection has a corresponding weight w1, w2 … w8, and each layer has a corresponding bias b1 and b2. The initial values of the weights and biases are set randomly.
The forward propagation may specifically be as follows:
that is, from the inputs I1 and I2, the outputs O1 and O2 are calculated through the intermediate nodes H1 and H2, as follows:
First the total input of H1 is calculated: in(H1) = I1·w1 + I2·w3 + b1; the output of H1 is then obtained with the sigmoid activation function:
out(H1) = 1 / (1 + e^(−in(H1)))
Similarly, the total input of H2 is calculated: in(H2) = I1·w2 + I2·w4 + b1, and the output of H2:
out(H2) = 1 / (1 + e^(−in(H2)))
Similarly, the outputs of O1 and O2 are calculated:
out(O1) = 1 / (1 + e^(−in(O1)))
out(O2) = 1 / (1 + e^(−in(O2)))
in the forward propagation process, the input is the working image information in the training set, and the output is the predicted value.
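The forward propagation of the example network can be sketched as below; the wiring of w5–w8 and b2 into the output layer is assumed by analogy with the hidden layer, since the text leaves it implicit:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(I1, I2, w, b1, b2):
    """w is indexed 1..8 (w[0] unused); the hidden-layer wiring follows the
    text, the output-layer wiring (w5..w8, b2) is an assumption by analogy."""
    h1 = sigmoid(I1 * w[1] + I2 * w[3] + b1)   # out(H1)
    h2 = sigmoid(I1 * w[2] + I2 * w[4] + b1)   # out(H2)
    o1 = sigmoid(h1 * w[5] + h2 * w[7] + b2)   # out(O1)
    o2 = sigmoid(h1 * w[6] + h2 * w[8] + b2)   # out(O2)
    return h1, h2, o1, o2

outputs = forward(1.0, 1.0, [0.0] * 9, 0.0, 0.0)
```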
The back propagation may specifically be as follows:
firstly, calculating the difference between the predicted value and the true value:
The error between out(O1) and the actual value O1_actual:
E(O1) = ½ · (O1_actual − out(O1))²
The error between out(O2) and the actual value O2_actual:
E(O2) = ½ · (O2_actual − out(O2))²
Total error: E(total) = E(O1) + E(O2);
The error is not computed simply as O1_actual − out(O1) because the total error requires an absolute magnitude; squaring is computationally more efficient than taking an absolute value, and the coefficient 1/2 simplifies the derivative calculation.
For the weight w1, the effect of a change in w1 on the total error needs to be calculated, so the total error is differentiated with respect to w1; the result is denoted g_w1, i.e.:
g_w1 = −∂E(total) / ∂w1
(by the chain rule of differentiation). Similarly, the other 7 weights are differentiated, giving g_w2, g_w3 … g_w8.
The weight update may specifically be as follows:
Setting the gradient-descent step α = 0.01, the 8 weights are updated: w1 = w1 + α·g_w1, …, w8 = w8 + α·g_w8.
After the weights are updated, the process returns to forward propagation and the calculation loops again until the total error E(total) stops within the expected range, completing the training of the neural network model.
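The forward-propagate / back-propagate / update loop can be sketched end-to-end; here the chain-rule gradients are replaced by numerically estimated ones purely for brevity (an illustrative shortcut, not the analytic derivation above):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(x, w, b1, b2):
    # Forward propagation: inputs -> hidden H1, H2 -> outputs O1, O2
    # (output-layer wiring assumed by analogy with the hidden layer).
    h1 = sigmoid(x[0] * w[1] + x[1] * w[3] + b1)
    h2 = sigmoid(x[0] * w[2] + x[1] * w[4] + b1)
    o1 = sigmoid(h1 * w[5] + h2 * w[7] + b2)
    o2 = sigmoid(h1 * w[6] + h2 * w[8] + b2)
    return o1, o2

def total_error(x, target, w, b1, b2):
    # E(total) = E(O1) + E(O2), each E = 1/2 * (actual - predicted)^2.
    o1, o2 = predict(x, w, b1, b2)
    return 0.5 * (target[0] - o1) ** 2 + 0.5 * (target[1] - o2) ** 2

def train(x, target, w, b1, b2, alpha=0.5, steps=500, eps=1e-6):
    # Loop: estimate g_wk = -dE/dwk by central differences, then update
    # w_k = w_k + alpha * g_wk, as in the weight-update step above.
    for _ in range(steps):
        for k in range(1, 9):
            w[k] += eps
            e_plus = total_error(x, target, w, b1, b2)
            w[k] -= 2 * eps
            e_minus = total_error(x, target, w, b1, b2)
            w[k] += eps
            w[k] += alpha * -(e_plus - e_minus) / (2 * eps)
    return w

w = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]  # w[0] unused
x, target, b1, b2 = (0.5, 0.8), (0.9, 0.1), 0.1, 0.1
e_before = total_error(x, target, w, b1, b2)
train(x, target, w, b1, b2)
e_after = total_error(x, target, w, b1, b2)
```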
In the implementation of steps (3-1) to (3-4), after the predicted value of the image training information for each feature dimension is obtained, if the preset convergence condition is not satisfied, the gradient value of each training parameter is determined from the difference between the predicted value for each feature dimension and the corresponding dimension index value, and each training parameter is updated according to its gradient value, thereby updating the intermediate training model; iterating through the gradient values of the training parameters in this way lets the neural network model converge in fewer iterations with better performance.
The specific implementation of step 103 may be as follows:
The working index value of the worker in a feature dimension is determined from a statistic of the dimension index values of the individual working images in that feature dimension. The statistic may be the average of those dimension index values, their sum, or the like; alternatively, a dimension index threshold may be set, and the statistic may be the proportion of working images whose dimension index value in the feature dimension exceeds that threshold.
For example, the feature dimension is bottles/cans E1, whose dimension index value is 0 or 1: E1 = 0 means no bottle is present and E1 = 1 means a bottle is present. There are 100 pieces of working image information, the average of their E1 dimension index values is 0.7, and 0.7 is taken as the worker's working index value for bottles/cans E1.
Or, as another example: the feature dimension is chair regularity E3, the dimension index threshold is 0.6, and the dimension index value of E3 lies in the interval [0, 1], representing how neatly the chairs are arranged. If, among 200 pieces of working image information, 150 have a dimension index value in chair regularity E3 greater than 0.6, then the proportion exceeding the threshold, 150/200 = 0.75, may be taken as the worker's working index value for chair regularity E3.
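The two statistics described above (the average, and the proportion above a threshold) can be sketched as:

```python
def average_index(values):
    # Work index as the mean of the per-image dimension index values.
    return sum(values) / len(values)

def ratio_above(values, threshold):
    # Work index as the share of images whose index value exceeds the threshold.
    return sum(1 for v in values if v > threshold) / len(values)
```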
In an alternative embodiment (hereinafter referred to as a multiple dimension group embodiment), each feature dimension is divided into multiple feature dimension groups according to a preset feature category; step 104 is specifically performed as follows:
step (4-1): and determining a group index value of the characteristic dimension group according to the working index value of each characteristic dimension in the same characteristic dimension group.
Step (4-2): and determining a group index value of the working record dimension group according to the working record data of each working record dimension in the same working record dimension group.
Step (4-3): and determining the working condition of the staff according to the group index values of the multiple dimension groups and the group weight values of the multiple dimension groups.
The work record dimension group is any one of a plurality of work record dimension groups; the plurality of work record dimension groups are dimension groups divided according to preset work record categories; the plurality of dimension groups includes the plurality of feature dimension groups and the plurality of work record dimension groups.
In the implementation modes of the step (4-1) to the step (4-3), on one hand, a group index value of the same feature dimension group divided according to a preset feature category is determined, so that the working condition of the feature dimension group is represented, on the other hand, a group index value of the same work record dimension group divided according to a preset work record category is determined, so that the working condition of the work record dimension group is represented, so that the working conditions of the feature dimension group and the work record dimension group are comprehensively considered, and the working condition of the worker is more accurately determined.
In an optional implementation manner (hereinafter, referred to as an implementation manner of sub-weight values) based on the implementation manner of the multiple dimension groups, the implementation manner of determining the group index value of the feature dimension group according to the operation index value of each feature dimension in the same feature dimension group may specifically be:
and determining a group index value of the characteristic dimension group according to the working index value and the characteristic sub-weight value of each characteristic dimension in the characteristic dimension group of each piece of working image information.
Correspondingly, according to the work record data of each work record dimension in the same work record dimension group, the embodiment of determining the group index value of the work record dimension group may specifically be:
and determining a group index value of the work record dimension group according to the work record data and the work record sub-weight value of each work record dimension in the work record dimension group.
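One plausible reading of the sub-weighted aggregation, a weighted sum of the per-dimension index values, can be sketched as below; the exact combination formula is an assumption, since the text only states that the index values and sub-weight values together determine the group index value:

```python
def group_index(index_values, sub_weights):
    """Group index value of one dimension group as the sub-weighted sum of
    the index values of its member dimensions (assumed aggregation form)."""
    return sum(v * w for v, w in zip(index_values, sub_weights))

# Hypothetical group with two dimensions, index values 1.0 and 0.5,
# sub-weight values 0.6 and 0.4.
g = group_index([1.0, 0.5], [0.6, 0.4])
```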
Specifically, when the framework shown in fig. 2 is applied, the quality inspection rules can be parametrically managed by the quality inspection rule control module, and the optimal parameters automatically learned and adjusted according to the execution results.
The rule control module mainly comprises the following 3 parts:
a. the rule set R_i and the group weight value W_ij of the corresponding dimension group;
b. the sub-rule set r_x and the sub-weight value w_xy of each dimension in the corresponding dimension group;
c. the automatic parameter-tuning module.
The following "quality inspection rules" are preset; as shown in Table 6, R1 to R5 are the dimension groups (also referred to as quality inspection items), R1 to R2 are work record dimension groups, and R3 to R5 are feature dimension groups, corresponding to parts a and b of the rule control module above:
TABLE 6
The relationship between the "quality inspection rule" and the rule set (dimension group) and the parameter dictionary (weight value) in the rule control modules a and b is shown in fig. 7.
In the above manner, the group index value of the feature dimension group is determined by the work index value and the feature sub-weight value of each feature dimension in the feature dimension group, the importance degree of each feature dimension in the feature dimension group to the index value of the feature dimension group is considered, the group index value of the work record dimension group is determined according to the work record data and the work record sub-weight value of each work record dimension in the work record dimension group, and the importance degree of each work record dimension in the work record dimension group to the index value of the work record dimension group is considered, so that the accuracy of determining the work condition of the worker is further increased.
In an optional implementation manner, the work record sub-weight value of any work record dimension in the work record dimension group is adjusted as follows:
step (5-1): and acquiring historical work record data of the work record dimensionality.
Step (5-2): and determining a historical index value of the working record dimension according to the historical working record data and the working record sub-weight value.
Step (5-3): and if the historical index value is not smaller than a preset threshold value, reducing the sub-weight value of the work record, and returning to the step (5-2).
Further, the work record sub-weight values (the tunable parameters) are adjusted in the following manner:
step 1: for each work record sub-weight value w_xy, set a corresponding suggested optimizable parameter variable t_xy = w_xy;
step 2: for each work record dimension's sub-weight value w_xy, calculate the historical work record data of that dimension (for example, the on-time completion rate p_xy over three months);
step 3: if the p_xy of the historical work record data is 100% for every value group, set t_xy = t_xy − 0.02, recalculate the historical index value p_xy of the work record dimension using t_xy, and repeat step 3 until at least one value group's p_xy is less than 100%;
step 4: output the current sub-weight value t_xy of the work record dimension; a quality inspector may then decide whether to tighten the sub-weight value of that work record dimension.
In the implementation of steps (5-1) to (5-3), after the historical index value of a work record dimension is determined from the historical work record data and the work record sub-weight value, comparing the historical index value with the preset threshold shows whether that threshold was easily reached by the historical work record data; a dimension whose threshold is easily reached carries less reference value. If the historical index value is not smaller than the preset threshold, the work record sub-weight value is reduced, weakening the importance of a work record dimension on which a high index value is too easily achieved, so that the working condition of the work record dimension group it belongs to is characterized more accurately.
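Steps 1–4 of the sub-weight tuning can be sketched as follows; `recompute_rates` is a hypothetical callback standing in for recalculating the historical completion rates p_xy from three months of work record data:

```python
def suggest_sub_weight(w_xy, recompute_rates, step=0.02, floor=0.0):
    """Start from t_xy = w_xy; while every value group's recomputed historical
    completion rate p_xy is still 100%, tighten t_xy by `step`. Stop once at
    least one p_xy drops below 100% (or the floor is reached) and output the
    current t_xy for a quality inspector to review."""
    t_xy = w_xy
    while t_xy - step >= floor and all(p >= 1.0 for p in recompute_rates(t_xy)):
        t_xy -= step
    return t_xy
```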
Based on the multiple dimension group embodiment, in an optional embodiment, the group weight values of the multiple dimension groups are adjusted as follows:
step (6-1): obtaining the score value of each of the plurality of dimension groups according to its group index value and its group weight value.
Step (6-2): obtaining, from the plurality of dimension groups, the dimension group whose score value is the maximum and the dimension group whose score value is the minimum within a preset score interval.
Step (6-3): increasing the group weight value of the dimension group with the maximum score and/or decreasing the group weight value of the dimension group with the minimum score.
Step (6-4): and (5) if the group weight values of the multiple dimension groups meet the preset adjusting condition, returning to the step (6-1).
The implementation scheme of the steps (6-1) to (6-4) can be as follows:
step 1: on the last day of each month, calculate the current month's total score of each rule (dimension group) in the rule set (the plurality of dimension groups) R_i; the total score of each rule is denoted S_i;
step 2: within a preset score interval, e.g. (0.05, 0.50), take the subscript x of the rule with the maximum score max(S_i) and the subscript y of the rule with the minimum score min(S_i);
step 3: add 1% to the weight value of the maximum-score rule and subtract 1% from the weight value of the minimum-score rule, namely:
W_xj = W_xj + 0.01;
W_yj = W_yj − 0.01;
step 4: if the group weight values of the plurality of dimension groups still satisfy the preset adjustment condition (for example, the weight values have not converged or have not reached the boundary of the calculation interval), recalculate each rule's score for the month with the latest weight values and return to step 1.
Adjusting to the optimal weight values within the current month in this way highlights key inspection of the indexes where problems occur.
In the implementation of steps (6-1) to (6-4), after obtaining the dimension group whose score value is the maximum and the dimension group whose score value is the minimum within the preset score interval, the group weight value of the maximum-score dimension group is increased and/or the group weight value of the minimum-score dimension group is decreased until the preset adjustment condition is no longer satisfied, so that the dimension groups with large influence receive more attention and stand out more prominently.
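One monthly adjustment round of steps (6-1) to (6-3) can be sketched as below; the score-interval bounds are the example values from the text, read as (0.05, 0.50):

```python
def adjust_group_weights(scores, weights, low=0.05, high=0.50, delta=0.01):
    """Among dimension groups whose total score S_i lies inside the preset
    score interval, raise the weight of the highest-scoring group by delta
    and lower the weight of the lowest-scoring group by delta."""
    in_band = [i for i, s in enumerate(scores) if low < s < high]
    if not in_band:
        return list(weights)
    x = max(in_band, key=lambda i: scores[i])  # subscript of max(S_i)
    y = min(in_band, key=lambda i: scores[i])  # subscript of min(S_i)
    adjusted = list(weights)
    adjusted[x] += delta
    adjusted[y] -= delta
    return adjusted

# Four hypothetical dimension groups; 0.6 falls outside the score interval.
new_w = adjust_group_weights([0.1, 0.3, 0.6, 0.2], [0.25, 0.25, 0.25, 0.25])
```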
With the implementations of steps (5-1) to (5-3) and steps (6-1) to (6-4), flexible parameterized management of the quality inspection rules is achieved: optimization suggestions for the quality inspection rules are analyzed and provided automatically through automatic parameter tuning, continuously improving the working quality of the service desk staff.
In the method described in steps 101 to 104, the following optional embodiments may also be performed:
for any dimension group in the plurality of dimension groups, determining a group index value of each work group for that dimension group according to the group index values of the workers in that work group for the dimension group; and generating, from the group index values of the plurality of work groups for each of the plurality of dimension groups, a visual ranking result of the plurality of work groups for each dimension group. The visual ranking result can be as shown in fig. 8.
In the above embodiment, for any one of the dimension groups, a group index value of each working group for the dimension group is determined, and further, a visual ranking result of the working groups for each of the dimension groups is generated, so that a comparative ranking condition of the working groups under each dimension group is displayed more intuitively, and a working condition of the worker is represented more intuitively.
As shown in fig. 9, the present invention provides an apparatus for determining an operating condition, comprising: an obtaining module 901, configured to obtain information of each working image of a worker in a working process; a processing module 902, configured to obtain, for any one of the pieces of working image information, a dimension index value of the working image information for each feature dimension according to the working image information and a neural network model; the neural network model is obtained by performing machine learning training according to an image training data set; any image training data in the image training data set comprises: the dimension index values of the image training information and the image training information for each feature dimension; aiming at any characteristic dimension, determining a working index value of the worker in the characteristic dimension according to the dimension index value of each working image information in the characteristic dimension; and determining the working condition of the worker at least according to the working index value of the worker for each characteristic dimension.
Optionally, the processing module 902 is specifically configured to: carrying out gray level processing on the working image information to obtain gray level image information; performing convolution operation and/or pooling operation on the gray image information to obtain dimension reduction image information; and inputting the dimension reduction image information into the neural network model to obtain dimension index values of the working image information to each characteristic dimension.
Optionally, the processing module 902 is further configured to: inputting image training information in the image training data into an intermediate training model aiming at any image training data in the image training data set, and obtaining a predicted value of the image training information on each characteristic dimension according to each training parameter of the intermediate training model; determining whether the intermediate training model meets a preset convergence condition or not according to the difference value between the predicted value of each characteristic dimension and the index value of the corresponding dimension of the image training information; if the preset convergence condition is met, ending the iterative training, and taking the intermediate training model as the neural network model; if the preset convergence condition is not met, determining the gradient value of each training parameter according to the difference value between the predicted value of each characteristic dimension and the index value of the corresponding dimension of the image training information; and updating each training parameter according to the gradient value of each training parameter, so as to update the intermediate training model, and continuously executing the iterative training.
Optionally, each feature dimension is divided into a plurality of feature dimension groups according to a preset feature category; the processing module 902 is specifically configured to: determining a group index value of the characteristic dimension group according to the working index value of each characteristic dimension in the same characteristic dimension group; determining a group index value of the working record dimension group according to the working record data of each working record dimension in the same working record dimension group; the work record dimension group is any one of a plurality of work record dimension groups; the plurality of work record dimension groups are dimension groups divided according to preset work record categories; determining the working condition of the staff according to the group index values of the multiple dimension groups and the group weight values of the multiple dimension groups; the plurality of dimension groups includes the plurality of feature dimension groups and the plurality of work record dimension groups.
Optionally, the processing module 902 is specifically configured to: determining a group index value of the feature dimension group according to the work index value and the feature sub-weight value of each feature dimension in the feature dimension group of each piece of work image information; and determining a group index value of the work record dimension group according to the work record data and the work record sub-weight value of each work record dimension in the work record dimension group.
Optionally, the processing module 902 is further configured to: acquiring historical work record data of the work record dimensionality; determining a historical index value of the working record dimension according to the historical working record data and the working record sub-weight value; and if the historical index value is not less than a preset threshold value, reducing the sub-weight value of the work record, and returning to the step of determining the historical index value of the dimension of the work record according to the historical work record data and the sub-weight value of the work record.
Optionally, the processing module 902 is further configured to: obtain a component value of each dimension group according to the group index value and the group weight value of the dimension group; obtain, from the plurality of dimension groups, the dimension group whose component value is the maximum and the dimension group whose component value is the minimum within a preset value interval; increase the group weight value of the dimension group with the maximum component value and/or decrease the group weight value of the dimension group with the minimum component value; and, if the group weight values of the plurality of dimension groups satisfy a preset adjustment condition, return to the step of obtaining the component value of each dimension group according to the group index value of each dimension group in the plurality of dimension groups and the group weight value of the dimension group.
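A minimal sketch of the weight-adjustment iteration described above, assuming an additive step, a [0, 1] value interval, and a fixed round limit as the stop rule (none of these specifics are stated in the patent):

```python
def adjust_group_weights(group_index_values, group_weights,
                         value_interval=(0.0, 1.0), step=0.05, max_rounds=20):
    """Raise the weight of the dimension group with the largest component
    value and lower that of the group with the smallest component value
    inside the interval, repeating while rebalancing is still possible."""
    lo, hi = value_interval
    for _ in range(max_rounds):
        # component value = group index value * group weight value
        components = [v * w for v, w in zip(group_index_values, group_weights)]
        in_range = [i for i, c in enumerate(components) if lo <= c <= hi]
        if len(in_range) < 2:
            break  # nothing left to rebalance
        i_max = max(in_range, key=lambda i: components[i])
        i_min = min(in_range, key=lambda i: components[i])
        if i_max == i_min:
            break
        group_weights[i_max] += step
        group_weights[i_min] = max(0.0, group_weights[i_min] - step)
    return group_weights
```

One design choice worth noting: keeping the step additive and clamping at zero prevents a weight from going negative while the dominant group's weight grows.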
Optionally, the processing module 902 is further configured to: for any dimension group of the plurality of dimension groups, determine a group index value of a working group for that dimension group according to the group index values of the individual workers in the working group for that dimension group; and generate a visual ranking result of a plurality of working groups for each dimension group according to the group index values of the plurality of working groups for each dimension group in the plurality of dimension groups.
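The per-group aggregation and ranking might be outlined as below; the dictionary input shape and the use of a plain mean over workers are assumptions:

```python
def rank_working_groups(per_worker_values):
    """Average each working group's per-worker index values for one
    dimension group, then rank the groups from highest to lowest."""
    means = {group: sum(vals) / len(vals)
             for group, vals in per_worker_values.items()}
    return sorted(means.items(), key=lambda item: item[1], reverse=True)

ranking = rank_working_groups({
    "team_a": [0.9, 0.7],  # hypothetical per-worker index values
    "team_b": [0.6, 0.6],
})
print(ranking)  # highest-scoring working group first
```

Running this once per dimension group yields one ordered list per group, which is what the visual ranking result would be built from.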
For the beneficial effects of the above apparatus for determining a working condition and of each optional apparatus, reference may be made to the beneficial effects of the corresponding method for determining a working condition and of each optional method, which are not repeated here.
Based on the same inventive concept, embodiments of the present invention further provide a computer device comprising a program or instructions which, when executed, perform the method for determining a working condition and any of the optional methods provided by the embodiments of the present invention.
Based on the same inventive concept, embodiments of the present invention further provide a computer-readable storage medium comprising a program or instructions which, when executed, perform the method for determining a working condition and any of the optional methods provided by the embodiments of the present invention.
It should be apparent to those skilled in the art that embodiments of the present invention may be provided as a method, a system, or a computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, and optical storage) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications to those embodiments may occur to those skilled in the art once they learn of the basic inventive concept. Therefore, the appended claims are intended to be interpreted as including the preferred embodiments and all alterations and modifications that fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (11)

1. A method of determining an operating condition, comprising:
acquiring pieces of working image information of a worker during a working process;
for any piece of the working image information, obtaining dimension index values of the working image information for each characteristic dimension according to the working image information and a neural network model, wherein the neural network model is obtained by machine learning training on an image training data set, and any image training data in the image training data set comprises image training information and the dimension index values of the image training information for each characteristic dimension;
for any characteristic dimension, determining a working index value of the worker for the characteristic dimension according to the dimension index values of the pieces of working image information for the characteristic dimension;
and determining the working condition of the worker at least according to the working index values of the worker for each characteristic dimension.
2. The method of claim 1, wherein obtaining dimension index values for the working image information for each feature dimension based on the working image information and a neural network model comprises:
performing grayscale processing on the working image information to obtain grayscale image information;
performing a convolution operation and/or a pooling operation on the grayscale image information to obtain dimension-reduced image information;
and inputting the dimension-reduced image information into the neural network model to obtain the dimension index values of the working image information for each characteristic dimension.
3. The method of claim 1, wherein the neural network model performs iterative training in the following manner:
for any image training data in the image training data set, inputting the image training information in the image training data into an intermediate training model, and obtaining a predicted value of the image training information for each characteristic dimension according to the training parameters of the intermediate training model;
determining whether the intermediate training model satisfies a preset convergence condition according to the difference between the predicted value for each characteristic dimension and the corresponding dimension index value of the image training information;
if the preset convergence condition is satisfied, ending the iterative training and taking the intermediate training model as the neural network model;
if the preset convergence condition is not satisfied, determining a gradient value of each training parameter according to the difference between the predicted value for each characteristic dimension and the corresponding dimension index value of the image training information, updating each training parameter according to its gradient value so as to update the intermediate training model, and continuing the iterative training.
4. A method according to any one of claims 1 to 3, wherein the feature dimensions are divided into a plurality of feature dimension groups according to preset feature categories; and the determining the working condition of the worker at least according to the working index value of the worker for each feature dimension comprises:
determining a group index value of the characteristic dimension group according to the working index value of each characteristic dimension in the same characteristic dimension group;
determining a group index value of the working record dimension group according to the working record data of each working record dimension in the same working record dimension group; the work record dimension group is any one of a plurality of work record dimension groups; the plurality of work record dimension groups are dimension groups divided according to preset work record categories;
determining the working condition of the staff according to the group index values of the multiple dimension groups and the group weight values of the multiple dimension groups; the plurality of dimension groups includes the plurality of feature dimension groups and the plurality of work record dimension groups.
5. The method of claim 4, wherein determining a group index value for the set of feature dimensions based on the operational index values for each feature dimension in the same set of feature dimensions comprises:
determining the group index value of the feature dimension group according to, for each piece of working image information, the working index value and the feature sub-weight value of each feature dimension in the feature dimension group;
the determining a group index value of the work record dimension group according to the work record data of each work record dimension in the same work record dimension group includes:
and determining a group index value of the work record dimension group according to the work record data and the work record sub-weight value of each work record dimension in the work record dimension group.
6. The method of claim 5, wherein the work record sub-weight value of any work record dimension in the work record dimension group is adjusted as follows:
acquiring historical work record data of the work record dimension;
determining a historical index value of the work record dimension according to the historical work record data and the work record sub-weight value;
and if the historical index value is not less than a preset threshold, reducing the work record sub-weight value and returning to the step of determining the historical index value of the work record dimension according to the historical work record data and the work record sub-weight value.
7. The method of claim 4, wherein the group weight values of the plurality of dimension groups are adjusted as follows:
obtaining a component value of each dimension group according to the group index value of the dimension group and the group weight value of the dimension group;
obtaining, from the plurality of dimension groups, the dimension group whose component value is the maximum and the dimension group whose component value is the minimum within a preset value interval;
increasing the group weight value of the dimension group with the maximum component value and/or decreasing the group weight value of the dimension group with the minimum component value;
and if the group weight values of the plurality of dimension groups satisfy a preset adjustment condition, returning to the step of obtaining the component value of each dimension group according to the group index value of each dimension group in the plurality of dimension groups and the group weight value of the dimension group.
8. The method of claim 4, wherein the method further comprises:
for any dimension group of the plurality of dimension groups, determining a group index value of a working group for the dimension group according to the group index values of the workers in the working group for the dimension group;
and generating a visual ranking result of a plurality of working groups for each dimension group according to the group index values of the plurality of working groups for each dimension group in the plurality of dimension groups.
9. An apparatus for determining an operating condition, comprising:
the acquisition module is configured to acquire pieces of working image information of a worker during a working process;
the processing module is configured to: for any piece of the working image information, obtain dimension index values of the working image information for each characteristic dimension according to the working image information and a neural network model, wherein the neural network model is obtained by machine learning training on an image training data set, and any image training data in the image training data set comprises image training information and the dimension index values of the image training information for each characteristic dimension; for any characteristic dimension, determine a working index value of the worker for the characteristic dimension according to the dimension index values of the pieces of working image information for the characteristic dimension; and determine the working condition of the worker at least according to the working index values of the worker for each characteristic dimension.
10. A computer device comprising a program or instructions that, when executed, perform the method of any of claims 1 to 8.
11. A computer-readable storage medium comprising a program or instructions which, when executed, perform the method of any of claims 1 to 8.
CN202010891917.0A 2020-08-28 2020-08-28 Method and device for determining working condition Pending CN112036744A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010891917.0A CN112036744A (en) 2020-08-28 2020-08-28 Method and device for determining working condition

Publications (1)

Publication Number Publication Date
CN112036744A true CN112036744A (en) 2020-12-04

Family

ID=73586293

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109145843A (en) * 2018-08-29 2019-01-04 上海萃舟智能科技有限公司 A kind of full vehicle information identification system of bayonet high definition camera and method
CN109635697A (en) * 2018-12-04 2019-04-16 国网浙江省电力有限公司电力科学研究院 Electric operating personnel safety dressing detection method based on YOLOv3 target detection
CN109948890A (en) * 2019-01-30 2019-06-28 红云红河烟草(集团)有限责任公司 Behavior evaluation method and system for workshop machine station operators
CN109948437A (en) * 2019-02-01 2019-06-28 广州玖分半网络科技有限公司 A kind of kitchen management method for campus
CN111178212A (en) * 2019-12-23 2020-05-19 深圳供电局有限公司 Image recognition method and device, computer equipment and storage medium
CN111353452A (en) * 2020-03-06 2020-06-30 国网湖南省电力有限公司 Behavior recognition method, behavior recognition device, behavior recognition medium and behavior recognition equipment based on RGB (red, green and blue) images
CN111582073A (en) * 2020-04-23 2020-08-25 浙江大学 Transformer substation violation identification method based on ResNet101 characteristic pyramid



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination