CN115456421A - Work order dispatching method and device, processor and electronic equipment - Google Patents


Info

Publication number
CN115456421A
Authority
CN
China
Prior art keywords
target
work order
word
level
words
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211131146.0A
Other languages
Chinese (zh)
Inventor
路民超
熊俊杰
邱宏旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial and Commercial Bank of China Ltd ICBC
Original Assignee
Industrial and Commercial Bank of China Ltd ICBC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial and Commercial Bank of China Ltd ICBC filed Critical Industrial and Commercial Bank of China Ltd ICBC
Priority to CN202211131146.0A
Publication of CN115456421A
Legal status: Pending

Classifications

    • G06Q 10/063118 Staff planning in a project environment
    • G06F 16/353 Clustering; Classification into predefined classes
    • G06F 16/9027 Trees (indexing; data structures therefor)
    • G06F 16/90344 Query processing by using string matching techniques
    • G06F 40/242 Dictionaries
    • G06F 40/253 Grammatical analysis; Style critique
    • G06F 40/289 Phrasal analysis, e.g. finite state techniques or chunking
    • G06F 40/30 Semantic analysis
    • G06N 3/08 Learning methods (neural networks)
    • G06Q 40/02 Banking, e.g. interest calculation or account maintenance


Abstract

The application discloses a work order dispatching method and device, a processor, and electronic equipment, and relates to the field of artificial intelligence. The method comprises the following steps: acquiring first latent features of a plurality of target words in a target work order, wherein the target work order is a work order to be dispatched and the target words are the words in the work order other than stop words; computing a plurality of probability values from the first latent features using a fully connected layer of a neural network and a normalized exponential function (softmax), wherein each probability value is the probability that the target work order corresponds to a given institution level; determining a target institution level corresponding to the target work order according to the probability values and a first preset threshold; and determining a target institution corresponding to the target work order based on the target institution level, and dispatching the target work order to that institution. The application thereby addresses the low work order dispatching accuracy of the related art.

Description

Work order dispatching method and device, processor and electronic equipment
Technical Field
The application relates to the field of artificial intelligence, in particular to a work order dispatching method and device, a processor and electronic equipment.
Background
In the related art, work orders are generally passed down the institution hierarchy manually, level by level, which inevitably increases processing time. Manual dispatching not only demands considerable domain expertise but also consumes a great deal of effort, and handlers must check for pending work orders at fixed times every day to keep response times low, which places an additional burden on the staff involved. An intelligent work order dispatching scheme is therefore needed. However, most intelligent dispatching solutions in the related art fail to capture the deep semantic information of the work order text, which leads to low dispatching accuracy.
No effective solution to the problem of low work order dispatching accuracy in the related art has yet been proposed.
Disclosure of Invention
The present application mainly aims to provide a work order assignment method and apparatus, a processor, and an electronic device, so as to solve the problem of low accuracy in assigning work orders in the related art.
In order to achieve the above object, according to one aspect of the present application, a work order dispatching method is provided. The method comprises the following steps: acquiring first latent features of a plurality of target words in a target work order, wherein the target work order is a work order to be dispatched and the target words are the words in the work order other than stop words; computing a plurality of probability values from the first latent features using a fully connected layer of a neural network and a normalized exponential function, wherein each probability value is the probability that the target work order corresponds to a given institution level; determining a target institution level corresponding to the target work order according to the probability values and a first preset threshold; and determining a target institution corresponding to the target work order based on the target institution level, and dispatching the target work order to the target institution.
Further, determining the target institution level corresponding to the target work order according to the probability values and the first preset threshold comprises: determining the two highest of the probability values; taking the difference between these two values as a target value; judging whether the target value is greater than the first preset threshold; if it is, taking the institution level corresponding to the maximum probability value as the target institution level; if it is not, acquiring second latent features of the target words in the target work order, and determining the target institution level from the second latent features using the fully connected layer of the neural network and the normalized exponential function.
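The two-stage decision above can be sketched as follows. This is a minimal illustration, not the patented implementation; the function name `predict_level` and the exact threshold semantics are assumptions based on the description.

```python
import numpy as np

def predict_level(probs, threshold):
    """Decide the institution level from softmax probabilities.

    Returns the index of the most probable level when the gap between the
    two highest probabilities exceeds the threshold, otherwise None to
    signal that the fallback path (second latent features) should run.
    """
    probs = np.asarray(probs, dtype=float)
    top2 = np.sort(probs)[-2:]        # two largest probability values
    gap = top2[1] - top2[0]           # best minus runner-up
    if gap > threshold:
        return int(np.argmax(probs))  # confident: take the arg-max level
    return None                       # ambiguous: defer to the second branch
```

A large gap means the classifier is confident; a small gap triggers the more expensive second feature extraction described below.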
Further, acquiring the first latent features of the plurality of target words in the target work order comprises: filtering target characters out of the target work order to obtain a first work order, wherein the target characters are special characters and/or useless characters; filtering stop words out of the first work order based on a stop word list to obtain a second work order; segmenting the second work order into words based on a first word dictionary to obtain a plurality of word vectors; processing the word vectors to obtain a word embedding matrix; and feeding the word embedding matrix into an ELMo model to obtain the first latent features of the target words.
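The character- and stop-word-filtering steps can be sketched minimally as below, using English tokenization for illustration; the original would segment Chinese text against a word dictionary (e.g. with a segmentation tool), and the stop-word list here is a placeholder.

```python
import re

STOP_WORDS = {"the", "a", "an", "of"}   # placeholder stop-word list

def preprocess(work_order: str):
    """Filter special characters, then stop words, then tokenize."""
    # 1. filter special / useless characters -> "first work order"
    first = re.sub(r"[^\w\s]", " ", work_order)
    # 2. drop stop words -> tokens of the "second work order"
    return [t for t in first.split() if t.lower() not in STOP_WORDS]
```

The surviving tokens are the "target words" whose vectors form the word embedding matrix fed to the ELMo model.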
Further, acquiring the second latent features of the plurality of target words in the target work order comprises: obtaining a weight matrix for each institution level based on the occurrences in the target work order of the target words from each second word dictionary and on the TF-IDF matrix of that level, wherein a second word dictionary is the word dictionary built from the historical work orders of one institution level; taking the dot product of each level's weight matrix with the word embedding matrix to obtain a weighted matrix for that level; constructing an image matrix from the weighted matrices of all levels and the first latent features; and convolving the image matrix and applying max pooling to obtain the second latent features of the target words.
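The stacking-and-pooling step can be illustrated with a toy sketch. The shapes and the direct max pooling are assumptions; a real implementation would interpose learned convolution filters before pooling.

```python
import numpy as np

def second_latent_features(first_feat, embed, level_weights):
    """Illustrative fallback feature path.

    first_feat    : (n_words, d) first latent features from the ELMo stage
    embed         : (n_words, d) word embedding matrix
    level_weights : list of (n_words,) TF-IDF-derived weight vectors,
                    one per institution level
    Stacks the per-level weighted embeddings with the first latent
    features into an "image" tensor, then max-pools over the word axis
    (standing in for the convolution + max-pooling stage).
    """
    channels = [first_feat] + [w[:, None] * embed for w in level_weights]
    image = np.stack(channels)        # (n_levels + 1, n_words, d)
    return image.max(axis=1)          # max pool over words, per channel
```

Each channel of the "image" carries one institution level's view of the work order, so the pooled result encodes level-specific salience.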
Further, feeding the word embedding matrix into the ELMo model to obtain the first latent features comprises: processing the target work order with the forward LSTM layer and the backward LSTM layer of the ELMo model to obtain a third latent feature and a fourth latent feature respectively; and taking a weighted sum of the word embedding matrix, the third latent feature, and the fourth latent feature to obtain the first latent features of the target words.
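The weighted summation can be sketched as below; the mixing weights are placeholders (in ELMo these scalars are learned per downstream task).

```python
import numpy as np

def elmo_mix(embed, fwd, bwd, weights=(0.2, 0.4, 0.4)):
    """Weighted sum of the embedding, forward-LSTM, and backward-LSTM
    layer outputs, with the mixing weights normalized to sum to 1."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return w[0] * embed + w[1] * fwd + w[2] * bwd
```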
Further, before the per-level weight matrices are computed, the method further comprises: acquiring a plurality of historical work orders from which special characters, useless characters, and stop words have been filtered; assigning each historical work order to its corresponding institution level; segmenting the historical work orders of each level into words and building the plurality of second word dictionaries; and computing a TF-IDF matrix for each institution level from its second word dictionary using the TF-IDF formula.
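A stdlib-only sketch of building a per-level TF-IDF weighting from segmented historical work orders. The function and variable names are illustrative; it treats each level's concatenated history as one "document", and a production system would likely use a library implementation with smoothing.

```python
import math
from collections import Counter

def tfidf_matrix(level_docs):
    """level_docs maps an institution level name to the tokenized words of
    its historical work orders. Returns, per level, a word -> TF-IDF map;
    the IDF term down-weights words shared by every level."""
    n = len(level_docs)
    vocab = {w for toks in level_docs.values() for w in toks}
    # document frequency: in how many levels' histories each word appears
    df = {w: sum(w in toks for toks in level_docs.values()) for w in vocab}
    out = {}
    for level, toks in level_docs.items():
        counts = Counter(toks)
        total = len(toks)
        out[level] = {w: (counts[w] / total) * math.log(n / df[w])
                      for w in counts}
    return out
```

Words that appear in every level's history (e.g. generic banking terms) get a zero weight, so only level-discriminative words contribute.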
Further, determining the target institution corresponding to the target work order based on the target institution level comprises: extracting from the target work order the key information containing the target institution level; matching the key information against the institution tree of that level by text similarity to obtain a similarity score; judging whether the similarity is greater than a second preset threshold; if it is, determining the target institution from the institution tree; if it is not, taking the highest-level institution as the target institution for the work order.
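The similarity-matching fallback can be sketched with stdlib string similarity; the patent does not specify the similarity measure, so `SequenceMatcher` and the names here are assumptions.

```python
from difflib import SequenceMatcher

def match_institution(key_info, institution_names, threshold, fallback):
    """Match extracted key information against institution names from the
    institution tree. Returns the best match when its similarity exceeds
    the threshold, otherwise the fallback (the highest-level institution)."""
    best, best_sim = None, 0.0
    for name in institution_names:
        sim = SequenceMatcher(None, key_info, name).ratio()
        if sim > best_sim:
            best, best_sim = name, sim
    return best if best_sim > threshold else fallback
```

Falling back to the highest-level institution mirrors the rule above: an unmatchable work order is escalated rather than misrouted.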
In order to achieve the above object, according to another aspect of the present application, a work order dispatching device is provided. The device includes: a first acquisition unit, configured to acquire first latent features of a plurality of target words in a target work order, wherein the target work order is a work order to be dispatched and the target words are the words in the work order other than stop words; a first calculation unit, configured to compute a plurality of probability values from the first latent features using a fully connected layer of a neural network and a normalized exponential function, wherein each probability value is the probability that the target work order corresponds to a given institution level; a first determination unit, configured to determine a target institution level corresponding to the target work order according to the probability values and a first preset threshold; and a first processing unit, configured to determine a target institution corresponding to the target work order based on the target institution level and dispatch the target work order to the target institution.
Further, the first determination unit includes: a first determination module, configured to determine the two highest of the probability values; a first calculation module, configured to take the difference between these two values as a target value; a first judgment module, configured to judge whether the target value is greater than the first preset threshold; a second determination module, configured to take, if the target value is greater than the first preset threshold, the institution level corresponding to the maximum probability value as the target institution level; a first acquisition module, configured to acquire, if the target value is not greater than the first preset threshold, second latent features of the target words in the target work order; and a third determination module, configured to determine the target institution level from the second latent features using the fully connected layer of the neural network and the normalized exponential function.
Further, the first acquisition unit includes: a first filtering module, configured to filter target characters out of the target work order to obtain a first work order, wherein the target characters are special characters and/or useless characters; a second filtering module, configured to filter stop words out of the first work order based on a stop word list to obtain a second work order; a first processing module, configured to segment the second work order into words based on a first word dictionary to obtain a plurality of word vectors; a second processing module, configured to process the word vectors into a word embedding matrix; and a third processing module, configured to feed the word embedding matrix into an ELMo model to obtain the first latent features of the target words.
Further, the first acquisition module comprises: a first determination submodule, configured to obtain a weight matrix for each institution level based on the occurrences in the target work order of the target words from each second word dictionary and on the TF-IDF matrix of that level, wherein a second word dictionary is the word dictionary built from the historical work orders of one institution level; a first operation submodule, configured to take the dot product of each level's weight matrix with the word embedding matrix to obtain a weighted matrix for that level; a first construction submodule, configured to construct an image matrix from the weighted matrices and the first latent features; and a second determination submodule, configured to convolve the image matrix and apply max pooling to obtain the second latent features of the target words.
Further, the third processing module comprises: a first processing submodule, configured to process the target work order with the forward LSTM layer and the backward LSTM layer of the ELMo model to obtain a third latent feature and a fourth latent feature respectively; and a third determination submodule, configured to take a weighted sum of the word embedding matrix, the third latent feature, and the fourth latent feature to obtain the first latent features of the target words.
Further, the device also comprises: a second acquisition unit, configured to acquire, before the per-level weight matrices are computed, a plurality of historical work orders from which special characters, useless characters, and stop words have been filtered; a first division unit, configured to assign each historical work order to its corresponding institution level; a first establishing unit, configured to segment the historical work orders of each level into words and build the plurality of second word dictionaries; and a second determination unit, configured to compute a TF-IDF matrix for each institution level from its second word dictionary using the TF-IDF formula.
Further, the first processing unit includes: a first extraction module, configured to extract from the target work order the key information containing the target institution level; a fourth determination module, configured to match the key information against the institution tree of the target institution level by text similarity to obtain a similarity score; a second judgment module, configured to judge whether the similarity is greater than a second preset threshold; a fifth determination module, configured to determine, if the similarity is greater than the second preset threshold, the target institution from the institution tree; and a sixth determination module, configured to take, if the similarity is not greater than the second preset threshold, the highest-level institution as the target institution for the work order.
To achieve the above object, according to another aspect of the present application, a processor is provided for executing a program, wherein the program, when run, performs the work order dispatching method of any one of the above.
To achieve the above object, according to another aspect of the present application, an electronic device is provided, including one or more processors and a memory for storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the work order dispatching method of any one of the above.
Through the present application, the following steps are adopted: acquiring first latent features of a plurality of target words in a target work order, wherein the target work order is a work order to be dispatched and the target words are the words in the work order other than stop words; computing a plurality of probability values from the first latent features using a fully connected layer of a neural network and a normalized exponential function, wherein each probability value is the probability that the target work order corresponds to a given institution level; determining a target institution level corresponding to the target work order according to the probability values and a first preset threshold; and determining a target institution corresponding to the target work order based on the target institution level, and dispatching the target work order to the target institution. This solves the problem of low work order dispatching accuracy in the related art: the probability of each institution level is computed from the acquired latent features of the words in the pending work order via the fully connected layer and the normalized exponential function, the institution level is decided from these probabilities and a preset threshold, the institution itself is then determined from that level, and the work order is dispatched to it, thereby improving dispatching accuracy.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the application and, together with the description, serve to explain the application and are not intended to limit the application. In the drawings:
FIG. 1 is a flow chart of a method for dispatching work orders provided according to an embodiment of the present application;
FIG. 2 is a flow diagram of text pre-processing module training in an embodiment of the present application;
FIG. 3 is a flow chart of a prediction phase in an embodiment of the present application;
FIG. 4 is a flow chart of a word embedding module in an embodiment of the present application;
FIG. 5 is a flow diagram of a work order potential dependency learning module in an embodiment of the present application;
FIG. 6 is a flow diagram of a dispatch level prediction module in an embodiment of the present application;
FIG. 7 is a flow chart of a decision module in an embodiment of the present application;
FIG. 8 is a flow diagram of a reinforcement learning module in an embodiment of the present application;
FIG. 9 is a flow chart of an institution matching module in an embodiment of the application;
FIG. 10 is a flow diagram of an automated work order assignment module in an embodiment of the present application;
FIG. 11 is a flow chart of an alternative work order assignment method provided in accordance with an embodiment of the present application;
FIG. 12 is a schematic diagram of a work order dispatching device provided in accordance with an embodiment of the present application;
FIG. 13 is a schematic diagram of an electronic device provided according to an embodiment of the application.
Detailed Description
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings; obviously, the described embodiments are only some of the embodiments of the present application, not all of them. All other embodiments that can be derived by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description, claims, and drawings of this application are used to distinguish between similar elements and not necessarily to describe a particular sequential or chronological order. It should be understood that data so used may be interchanged under appropriate circumstances so that the embodiments described herein can be practiced in orders other than those illustrated. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be noted that relevant information (including but not limited to user equipment information, user personal information, etc.) and data (including but not limited to data for presentation, analyzed data, etc.) referred to in the present disclosure are information and data that are authorized by the user or sufficiently authorized by various parties. For example, an interface is provided between the system and the relevant user or organization, before obtaining the relevant information, an obtaining request needs to be sent to the user or organization through the interface, and after receiving the consent information fed back by the user or organization, the relevant information is obtained.
For convenience of description, some terms or expressions referred to in the embodiments of the present application are explained below:
the ELMo model (Embeddings from Language Models) is a bidirectional language model that produces deep contextualized word representations. It uses bidirectional LSTMs as its basic building block and can model both the complex characteristics of words (such as syntax and semantics) and how word usage varies across linguistic contexts (i.e., it can model polysemous words). Each word vector is a function of the internal states of a bidirectional language model (biLM) pre-trained on a large text corpus. The notion of word vectors recalls word2vec, whose word vector concept brought great advances to natural language processing (NLP). ELMo works by first training a complete language model and then running the text to be processed through that model to generate the corresponding word vectors; as a result, ELMo can produce different vectors for the same word in different sentences.
The Long Short-Term Memory network (LSTM) is a recurrent neural network designed specifically to address the long-term dependence problem of ordinary recurrent neural networks (RNNs); like all RNNs, it has the form of a chain of repeating neural network modules.
The normalized exponential function, or softmax function, "compresses" a K-dimensional vector z of arbitrary real numbers into another K-dimensional real vector σ(z) such that each element lies in the range (0, 1) and all elements sum to 1. The function is commonly used in multi-class classification problems.
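For illustration, a numerically stable softmax (a standard formulation, not taken from the patent):

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax: shift by max(z) before exponentiating
    so large inputs cannot overflow."""
    z = np.asarray(z, dtype=float)
    e = np.exp(z - np.max(z))
    return e / e.sum()
```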
A fully connected layer is a layer in which every node is connected to all nodes of the previous layer; it integrates the features extracted by earlier layers. Because of this dense connectivity, a fully connected layer typically holds the most parameters in a network.
TF-IDF (term frequency-inverse document frequency) is a weighting technique commonly used in information retrieval and data mining; TF is the term frequency and IDF is the inverse document frequency. TF-IDF is a statistical method for assessing how important a word is to a document in a corpus: a word's importance increases in proportion to the number of times it appears in the document but decreases in inverse proportion to its frequency across the corpus. Search engines often apply various forms of TF-IDF weighting to measure or rank the relevance between a document and a user query.
Word2vec is a family of related models for generating word vectors, trained to reconstruct the linguistic contexts of words.
TextCNN is an algorithm for classifying text using a convolutional neural network.
CNN stands for Convolutional Neural Network, a class of neural networks that extracts features through convolution operations and was originally applied to image recognition.
RPA stands for Robotic Process Automation. It is a digital technology that automates the operation of programs and systems by imitating a user's actions on a computer.
The present invention is described below with reference to preferred implementation steps, and fig. 1 is a flowchart of a work order assignment method provided in an embodiment of the present application, and as shown in fig. 1, the method includes the following steps:
Step S101, obtaining first potential features of a plurality of target words in a target work order, wherein the target work order is a work order to be assigned, and the target words are the words in the target work order other than stop words.
For example, the first potential features may be the potential syntactic-semantic features of the words in the work order. The stop words in the work order to be assigned are filtered out, and the potential syntactic-semantic information of the remaining words in the work order is obtained.
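A minimal sketch of this preprocessing step. The stop-word list and the whitespace tokenizer are stand-ins for illustration; the embodiment itself uses a stop-word table and the jieba segmenter for Chinese text.

```python
STOP_WORDS = {"the", "a", "of", "is", "my", "please"}   # illustrative stop-word list

def target_words(work_order_text):
    """Tokenize a work order and drop stop words, leaving the target words."""
    tokens = work_order_text.lower().split()   # the embodiment uses jieba for Chinese
    return [t for t in tokens if t not in STOP_WORDS]

words = target_words("Please freeze my lost card")
# words == ["freeze", "lost", "card"]
```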
And step S102, calculating a plurality of probability values based on the first potential features by combining a full connection layer of a neural network with the normalized exponential function, wherein the probability values are the probabilities that the target work order corresponds to each mechanism level.
For example, based on the potential syntactic-semantic information of the words remaining after stop-word removal, the probability of each processing level corresponding to the work order is output through the full connection layer followed by softmax normalization. In addition, the target work order may be a work order to be assigned within a financial institution, and the mechanism levels may be the branch level, the sub-branch level, and the network-point level.
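Step S102 can be sketched in pure Python as a single fully connected layer followed by softmax; the feature vector, weights, bias, and the three mechanism levels (branch / sub-branch / network point) are invented for illustration:

```python
import math

def fully_connected(features, weights, bias):
    """One dense layer: each output node sums over all input features."""
    return [sum(w * x for w, x in zip(row, features)) + b
            for row, b in zip(weights, bias)]

def softmax(z):
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    s = sum(exps)
    return [e / s for e in exps]

# Toy potential-feature vector of a work order and toy parameters
features = [0.5, -0.2, 0.8, 0.1]
weights = [[0.3, 0.1, 0.4, 0.0],   # branch level
           [0.1, 0.5, 0.2, 0.3],   # sub-branch level
           [0.2, 0.2, 0.1, 0.6]]   # network-point level
bias = [0.1, 0.0, -0.1]

level_probs = softmax(fully_connected(features, weights, bias))
```

With these toy numbers the branch level receives the highest probability; in the method these probabilities feed the threshold test of step S103.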
And step S103, determining a target mechanism level corresponding to the target work order according to the probability values and a first preset threshold value.
For example, the level to which the work order to be assigned belongs is determined according to the output probability value of each processing level and a preset threshold; that is, it is specifically determined whether the work order to be assigned is assigned to a branch, a sub-branch, or a network point.
And step S104, determining a target mechanism corresponding to the target work order based on the target mechanism level, and distributing the target work order to the target mechanism.
For example, after it is determined that the work order is to be assigned to the branch level, it is further determined to which specific branch the work order is assigned. Assignment at the sub-branch and network-point levels is similar: after determining that the work order is to be assigned to the sub-branch level, it is determined to which specific sub-branch the work order is assigned; after determining that the work order is to be assigned to the network-point level, it is determined to which specific network point the work order is assigned.
Through steps S101 to S104, the probability value of the target work order corresponding to each mechanism level is calculated, based on the acquired potential features of a plurality of words in the work order to be processed, by combining the full connection layer of the neural network with the normalized exponential function. The mechanism level corresponding to the work order is then determined from the calculated probability values and a preset threshold, the mechanism corresponding to that level is determined, and the work order is assigned to that mechanism, thereby achieving the effect of improving the accuracy of work order assignment.
In order to quickly and accurately determine the TF-IDF matrix of each mechanism level, in the work order dispatching method provided in the embodiment of the application, the TF-IDF matrix of each mechanism level may also be determined through the following steps: acquiring a plurality of historical work orders from which special characters, useless characters, and stop words have been filtered out; dividing each historical work order into its corresponding mechanism level; performing word segmentation on the historical work orders of each mechanism level and establishing a plurality of second word dictionaries; and obtaining the TF-IDF matrix of each mechanism level from each second word dictionary by applying the TF-IDF formula.
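The steps above can be sketched as follows; the three mechanism levels, the tiny historical corpus, and the smoothed IDF variant are invented for illustration, and tokenization is assumed to have already been done (the embodiment uses jieba):

```python
import math
from collections import Counter

def build_level_tfidf(history):
    """history: {level: [tokenized historical work orders]}.
    Returns, per level, a word dictionary with a TF-IDF score per word."""
    matrices = {}
    for level, orders in history.items():
        all_words = [w for order in orders for w in order]
        counts = Counter(all_words)               # the level's second word dictionary
        total = len(all_words)
        scores = {}
        for word, n in counts.items():
            tf = n / total
            df = sum(1 for order in orders if word in order)
            idf = math.log(len(orders) / (1 + df)) + 1   # smoothed IDF (assumption)
            scores[word] = tf * idf
        matrices[level] = scores
    return matrices

history = {
    "branch":     [["loan", "rate", "complaint"], ["loan", "approval"]],
    "sub-branch": [["card", "lost"], ["card", "limit"]],
    "network":    [["queue", "long"], ["atm", "broken"]],
}
tfidf_by_level = build_level_tfidf(history)
```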
For example, an automated work order assignment scheme may include a text preprocessing module, a word embedding module, a work order potential dependency capture module, a decision making module, a reinforcement learning module, an assignment level prediction module, a mechanism matching module, and an automated work order assignment module. In addition, fig. 2 is a flowchart of the training of the text preprocessing module in the embodiment of the present application. As shown in fig. 2, the text preprocessing module first filters special and useless characters in the work order, removes stop words based on the stop word list, and then performs word segmentation on the historical work orders using the jieba word segmentation tool to establish a word list. The work orders are divided into the branch, sub-branch, and network-point levels according to the actual processing level of each historical work order, and the TF-IDF matrix within each level is calculated by the TF-IDF calculation method for use by the subsequent reinforcement learning module.
By the scheme, the word list and the TF-IDF matrix in each level can be established according to the historical work order, so that the subsequent word embedding module and the reinforcement learning module can be conveniently used.
In order to quickly and accurately obtain the first potential features of the multiple target words in the target work order, in the assignment method for the work order provided by the embodiment of the application, the first potential features of the multiple target words in the target work order can be obtained through the following steps: filtering target characters in the target work order to obtain a first work order, wherein the target characters are at least one of the following characters: special characters and useless characters; filtering the stop words in the first work order based on the stop word list to obtain a second work order; based on the first word dictionary, performing word segmentation processing on each word in the second work order to obtain a plurality of word vectors; processing each word vector to obtain a word embedding matrix; and inputting the word embedding matrix into an ELMO model for processing to obtain first potential characteristics of a plurality of target words in the target work order.
For example, fig. 3 is a flowchart of the prediction stage in an embodiment of the present application. As shown in fig. 3, special and useless characters in a newly input work order are filtered, stop words are removed based on the stop word list, and word vectors are then created by performing word segmentation on the newly input work order with the jieba word segmentation tool. In addition, fig. 4 is a flowchart of the word embedding module in the embodiment of the present application. As shown in fig. 4, the word vectors are segmented according to the established word list, and the segmented word vectors are then represented as a work order embedding matrix based on the word2vec model, to facilitate subsequent modeling and learning. Finally, the work order embedding matrix is input into the trained ELMO model to obtain the potential syntactic-semantic features of the words in the newly input work order.
Through the scheme, the potential syntactic semantic features of the words in the newly input work order can be quickly and accurately obtained through the work order embedding matrix and the ELMO model.
In order to quickly and accurately obtain the first potential features of the multiple target words in the target work order, in the assignment method for the work order provided by the embodiment of the application, the first potential features of the multiple target words in the target work order can be obtained through the following steps: respectively processing the target work order by utilizing a forward LSTM layer and a backward LSTM layer in the ELMO model to obtain a third latent feature and a fourth latent feature; and carrying out weighted summation on the word embedding matrix, the third potential feature and the fourth potential feature to obtain first potential features of a plurality of target words in the target work order.
For example, fig. 2 is a flowchart of the training of the text preprocessing module in the embodiment of the present application. As shown in fig. 2, before modeling the potential dependencies of the work order text, the work order potential dependency capture module loads an ELMO model that has been pre-trained on a large amount of Chinese corpora, retrains the ELMO pre-trained model for a specified number of rounds using outbound-call dialogue and historical work order data, and fine-tunes the relevant parameters of the pre-trained model. The preprocessed work order text is then embedded and input into the fine-tuned ELMO model to generate a contextual semantic representation, so as to model the potential dependencies of the text and help the assignment level prediction module learn the similarity of work orders at the same assignment level.
In addition, ELMO is constructed on a bidirectional language model (biLM) using a bidirectional long short-term memory network (BiLSTM) structure, and its training target is to maximize the probability of correct prediction by the forward and backward language models. Specifically, given a sequence of length N, (t_1, ..., t_N), the forward language model uses forward LSTM modeling to learn to predict, from the sequence of historical positions (t_1, ..., t_{k-1}), the probability that the next position is t_k; the backward language model is the reverse, using backward LSTM modeling to learn to predict t_k from the sequence of future positions (t_{k+1}, ..., t_N). The training goal of the ELMO model is therefore to maximize the joint log-likelihood:

$$\sum_{k=1}^{N}\Big[\log p\big(t_k \mid t_1,\ldots,t_{k-1};\,\Theta_x,\overrightarrow{\Theta}_{\mathrm{LSTM}},\Theta_s\big)+\log p\big(t_k \mid t_{k+1},\ldots,t_N;\,\Theta_x,\overleftarrow{\Theta}_{\mathrm{LSTM}},\Theta_s\big)\Big]$$

wherein $\Theta_x$ denotes the parameters of the embedding vectors input to the ELMO model, $\overrightarrow{\Theta}_{\mathrm{LSTM}}$ and $\overleftarrow{\Theta}_{\mathrm{LSTM}}$ denote the parameters of the forward and backward LSTM layers that output the latent vector representations, and $\Theta_s$ denotes the context matrix normalized by the softmax function.
Moreover, fig. 5 is a flowchart of the work order potential dependency learning module in the embodiment of the present application. As shown in fig. 5, in the ELMO model, the segmented work order is processed by word2vec to generate an embedding vector, which is then input into a two-layer BiLSTM structure to model syntactic-semantic features; the model output is the weighted average of the work order embedding vector, the hidden states output by the forward LSTM, and the hidden states output by the backward LSTM. In addition, to better represent the text semantics of the work orders, a corpus built from financial knowledge corpora and historical work order data is input into the ELMO pre-trained model in advance, and a small number of training rounds is determined based on experimental results. The ELMO model is then retrained for the determined number of rounds, the relevant parameters are fine-tuned, and the trained model is saved. When a new work order appeal is input into the module, the fine-tuned model is loaded and the potential semantic representation features of the work order text are output.
In conclusion, the ELMO model is trained, and the pre-trained ELMO model is used to obtain the potential syntactic semantic features of the words in the newly input work order.
In order to quickly and accurately determine the target mechanism level corresponding to the target work order, in the work order dispatching method provided in the embodiment of the present application, the target mechanism level corresponding to the target work order may also be determined through the following steps: determining the two highest of the plurality of probability values; obtaining a target value as the difference between these two highest probability values; judging whether the target value is larger than a first preset threshold; if the target value is larger than the first preset threshold, determining the maximum probability value among the probability values and taking the mechanism level corresponding to the maximum probability value as the target mechanism level; if the target value is not larger than the first preset threshold, obtaining second potential features of a plurality of target words in the target work order; and determining the target mechanism level corresponding to the target work order based on the second potential features by combining the full connection layer of the neural network with the normalized exponential function.
For example, fig. 6 is a flowchart of a dispatch level prediction module in an embodiment of the present application, and as shown in fig. 6, the dispatch level prediction module is composed of a fully connected layer and a softmax function, and is responsible for mapping captured potential dependency expressions to dispatch levels of a work order and generating a prediction of a model. It should be noted that when the assignment level prediction module receives the input from the work order text potential dependency capture module, the output of the module is not necessarily used as the assignment level finally predicted by the model, but the weighted value and the prediction level processed by softmax are transmitted to the decision module, and after being determined by the decision module, the decision module determines whether to directly output the level (the mechanism level with the maximum output probability value is used as the level corresponding to the work order to be assigned) or enter the reinforcement learning module.
In addition, fig. 7 is a flow chart of the decision module in the embodiment of the present application. As shown in fig. 7, the decision module judges the output of the dispatch level prediction module. If the judgment result is credible, the dispatch level predicted by the dispatch level prediction module is output directly, taking the mechanism level with the maximum probability value as the level corresponding to the work order to be assigned; otherwise, the reinforcement learning module is entered for further classification. Specifically, a confidence function $f(s_1, s_2)$ is defined:

$$f(s_1, s_2) = \begin{cases} 1, & |s_1 - s_2| \geq \sigma \\ 0, & |s_1 - s_2| < \sigma \end{cases}$$

wherein $s_1$ and $s_2$ are respectively the two values with the maximum probability calculated by the softmax function in the assignment level prediction module, and $\sigma$ is a threshold. When $|s_1 - s_2| \geq \sigma$, $f(s_1, s_2)$ outputs 1, i.e., the result of the model's prediction is clear, and the assignment level output by the model can be used directly for classification. If the output is 0, i.e., the prediction confidence of the model is low, the relation between the keywords in the work order and the classification level is further mined through the reinforcement learning module.
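The confidence function and the resulting decision rule can be sketched directly; the threshold value and probability vectors are illustrative:

```python
def confidence(s1, s2, sigma=0.3):
    """1 if the gap between the two highest softmax probabilities is at
    least sigma (prediction is trusted), else 0."""
    return 1 if abs(s1 - s2) >= sigma else 0

def decide(level_probs, sigma=0.3):
    """Return the predicted level index if the prediction is trusted,
    else None (meaning: hand over to the reinforcement learning module)."""
    ranked = sorted(range(len(level_probs)),
                    key=lambda i: level_probs[i], reverse=True)
    s1, s2 = level_probs[ranked[0]], level_probs[ranked[1]]
    return ranked[0] if confidence(s1, s2, sigma) == 1 else None

clear = decide([0.80, 0.15, 0.05])      # large margin -> trusted, index 0
unclear = decide([0.40, 0.35, 0.25])    # small margin -> None (escalate)
```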
By the scheme, the mechanism level corresponding to the work order to be assigned can be quickly and accurately determined according to the probability value.
In order to quickly and accurately obtain the second potential features of the multiple target words in the target work order, in the assignment method for the work order provided by the embodiment of the application, the second potential features of the multiple target words in the target work order may also be obtained through the following steps: obtaining a weight matrix of each mechanism level based on the occurrence condition of each target word in the target work order in each second word dictionary and the TF-IDF matrix of each mechanism level, wherein the second word dictionary is a word dictionary corresponding to the historical work order of each mechanism level; performing dot product operation on the weight matrix of each mechanism level and the word embedding matrix respectively to obtain a weighting matrix of each mechanism level; constructing an image matrix according to the weighting matrix and the first potential features of each mechanism level; and performing convolution processing on the image matrix, and obtaining second potential features of a plurality of target words in the target work order based on a maximum pooling method.
For example, fig. 8 is a flowchart of a reinforcement learning module in an embodiment of the present application. As shown in fig. 8, in the reinforcement learning module, the degree of influence of each word in the work order on the assignment level is calculated by the TF-IDF method, and a different weight matrix is established for each assignment level. When the main model cannot sufficiently distinguish an input work order, the learned work order embedding representation is weighted with the weight matrix of the corresponding level, further strengthening the importance of strongly discriminative words to the model's classification and helping the model better distinguish the assignment level of the work order.
Specifically, the filtered historical work orders are grouped based on their assignment level, all terms are extracted from the corresponding group $i$ ($i \in \{1, 2, 3\}$), and a TF-IDF matrix $M_i$ of the corresponding class is established. The elements of the matrix characterize the degree of importance of each word in category $i$ and are calculated by the following formulas:

$$\mathrm{tf}(t, i) = \frac{n_t}{|d_i|}, \qquad \mathrm{idf}(t, i) = \log\frac{|O_i|}{|\{o \in O_i : t \in o\}|}, \qquad M_i[t] = \mathrm{tf}(t, i) \cdot \mathrm{idf}(t, i)$$

wherein, for any group $i$, $n_t$ and $|d_i|$ respectively denote the number of occurrences of the word $t$ and the total number of words in the group, $|O_i|$ denotes the number of work orders $o_i$ in the group, and $|\{o \in O_i : t \in o\}|$ denotes the number of work orders in the group that contain the word $t$.
Based on the TF-IDF matrix $M_i$, the weight matrix for an input work order may be generated by the following rules. When a word in the input work order appears in the work orders of the corresponding category (dispatch level) $i$, it is assigned the maximum value of the column of the corresponding word position in the category's TF-IDF matrix $M_i$. If the word does not appear in the work orders of category $i$ but exists in the work orders of another category $k$ ($k \neq i$), it is assigned the minimum of all column maxima of $M_i$. If the word does not appear in any existing TF-IDF matrix $M$, it is assigned the average value of the recognized words in the weight matrix and is treated as a "no-influence" word. In this way, a corresponding weight matrix is established for each input work order; the weight matrix is then combined by dot product with the embedding vectors output by the word embedding module to obtain, for each word, the weighted work order embedding under the different categories.
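The three assignment rules just described can be sketched as follows; the per-category TF-IDF scores are toy values, and a flat per-word score stands in for the column maxima of the patent's matrices (a simplification, labeled as such in the comments):

```python
def word_weight(word, category, tfidf_by_level):
    """Weight of `word` for dispatch level `category` under the three rules."""
    own = tfidf_by_level[category]
    if word in own:                                  # rule 1: seen in this category
        return own[word]
    seen_elsewhere = any(word in m
                         for lvl, m in tfidf_by_level.items() if lvl != category)
    if seen_elsewhere:                               # rule 2: min of this category's
        return min(own.values())                     #         maxima (simplified)
    scores = [v for m in tfidf_by_level.values() for v in m.values()]
    return sum(scores) / len(scores)                 # rule 3: "no-influence" word

tfidf_by_level = {
    "branch":     {"loan": 0.9, "rate": 0.4},
    "sub-branch": {"card": 0.8, "lost": 0.5},
    "network":    {"queue": 0.7},
}
w_seen   = word_weight("loan",  "branch", tfidf_by_level)   # rule 1
w_other  = word_weight("card",  "branch", tfidf_by_level)   # rule 2
w_unseen = word_weight("hello", "branch", tfidf_by_level)   # rule 3
```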
The output of the ELMO layer and the multi-class weighted embedding matrices established above are then stacked and jointly input into a multi-channel textCNN to capture the differing importance of words to the different classes. The learned potential dependency representation is reduced in dimension by max pooling, passed to the assignment level prediction module, and the predicted assignment level is output.
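A minimal pure-Python sketch of the convolution-plus-max-pooling idea behind textCNN: a single 1-D filter slides over a toy sequence of word vectors, and max pooling keeps the filter's strongest response. A real textCNN uses many filters of several widths over stacked channels; the sequence and kernel values here are invented.

```python
def conv1d(seq, kernel):
    """Slide a kernel of width k over a sequence of word vectors,
    producing one activation per window (valid convolution, no padding)."""
    k = len(kernel)
    outs = []
    for i in range(len(seq) - k + 1):
        window = seq[i:i + k]
        outs.append(sum(w * x
                        for kv, vec in zip(kernel, window)
                        for w, x in zip(kv, vec)))
    return outs

def max_pool(activations):
    """Keep the strongest response of the filter over the whole work order."""
    return max(activations)

sequence = [[0.1, 0.2], [0.9, 0.1], [0.3, 0.7], [0.0, 0.4]]   # 4 words, dim 2
kernel = [[1.0, 0.0], [0.0, 1.0]]                             # width-2 toy filter

feature = max_pool(conv1d(sequence, kernel))
```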
In conclusion, the potential features of the words in the work order to be assigned after modeling can be rapidly and accurately output through the reinforcement learning module.
In order to quickly and accurately determine a target mechanism corresponding to a target work order, in the work order dispatching method provided in the embodiment of the present application, the target mechanism corresponding to the target work order may also be determined through the following steps: extracting key information containing a target organization level in a target work order; performing text similarity matching on the key information and a mechanism tree of a target mechanism level to obtain similarity; judging whether the similarity is greater than a second preset threshold value or not; if the similarity is larger than a second preset threshold value, determining a target mechanism corresponding to the target work order from the mechanism tree; and if the similarity is not greater than a second preset threshold, taking the mechanism with the highest level as a target mechanism corresponding to the target work order.
For example, the processed work order content and the mechanism tree are passed into the text similarity calculation module. The work order content is processed with regular expressions to extract key information, which is then matched against the mechanism tree by similarity calculation. Similarity matching uses the SequenceMatcher method (which compares two strings and returns a score according to their similarity); if the calculated similarity is greater than a specified threshold, the corresponding mechanism name is output.
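The similarity matching can be sketched with Python's standard-library `difflib.SequenceMatcher`; the mechanism tree, the key-information regex, and the threshold here are invented for illustration:

```python
import re
from difflib import SequenceMatcher

MECHANISM_TREE = ["Chaoyang Sub-branch", "Haidian Sub-branch", "Xicheng Sub-branch"]

def match_mechanism(work_order_text, tree, threshold=0.5):
    """Extract key info with a regex, then return the tree entry whose
    similarity ratio to the key info is highest and above the threshold."""
    m = re.search(r"at (.+?) (?:branch|sub-branch)", work_order_text, re.I)
    key = m.group(0) if m else work_order_text        # fall back to the full text
    best, best_score = None, 0.0
    for name in tree:
        score = SequenceMatcher(None, key.lower(), name.lower()).ratio()
        if score > best_score:
            best, best_score = name, score
    return best if best_score > threshold else None

hit = match_mechanism("Long queue at Haidian sub-branch yesterday", MECHANISM_TREE)
```

When no entry clears the threshold the function returns `None`, mirroring rule (4) below, where an unmatched work order falls back to the default department.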
In addition, fig. 9 is a flowchart of the mechanism matching module in the embodiment of the present application. As shown in fig. 9, the mechanism matching module matches the work order content to a processing mechanism using different methods depending on the assignment level output by the model. The specific rules are as follows:
(1) If the mechanism level input to the module is the branch level, the work order is screened according to the established expert rules and the business category to which the work order belongs; if the screening succeeds, the corresponding department name is output according to the screened category.
(2) If the input level is the sub-branch level, the module first extracts key information containing sub-branch names from the work order by means such as regular expressions and matches it, through the text similarity calculation module, against the tree of sub-branch mechanisms subordinate to the branch. If a sub-branch name is matched and output, the department of the sub-branch to which the work order belongs is further screened according to the expert rules, the business category of the work order, and the like; if the screening succeeds, the corresponding sub-branch department name is output.
(3) If the input level is the network-point level, the module extracts key information containing the network point using the same processing method and performs text similarity calculation against the tree of network points subordinate to the sub-branch; if the matching succeeds, the corresponding network point name is output.
(4) If the module fails to match any mechanism, the name of the department that handles branch-level work orders is output by default.
In addition, fig. 10 is a flowchart of the automated work order assignment module in an embodiment of the present application. As shown in fig. 10, the automated work order assignment module uses the RPA to receive the output content from the mechanism matching module, locate the positions of the assignment operation elements, and then automatically assign the work order to the corresponding mechanism level for processing.
Through the scheme, the work orders can be rapidly and accurately dispatched to the corresponding processing mechanisms according to the mechanism levels corresponding to the work orders to be dispatched. Meanwhile, the RPA technology can replace manpower to finish work order allocation operation, thereby reducing the time for business personnel to process work orders and reducing the burden of business.
For example, fig. 11 is a flowchart of an optional work order assignment method provided according to an embodiment of the present application. As shown in fig. 11, the work order content is embedded through word2vec, and multiple layers of BiLSTM are used to capture the deep potential semantic and grammatical features of the embedded work order, resolving the influence of word ambiguity on the model's assignment of work orders. A decision mechanism is introduced that, based on a judgment of the model's prediction confidence for the work order assignment level, establishes a weighted embedding matrix for each assignment level; the weighted embedding matrices and the potential semantic-grammatical representation of the multi-layer BiLSTM structure are stacked and jointly passed to textCNN for learning and modeling. Finally, the processing level corresponding to the work order is output through full connection layer and softmax normalization processing, and manual work order handling is simulated by means of the RPA technology, realizing intelligent assignment of work orders, thereby saving the time of service personnel and reducing the burden of the business.
In summary, according to the work order assignment method provided by the embodiment of the application, first potential features of a plurality of target words in a target work order are obtained, wherein the target work order is the work order to be assigned, and the target words are words in the target work order except stop words; calculating and obtaining a plurality of probability values based on the first potential characteristics by combining a full connection layer and a normalization index function of the neural network, wherein the probability values are the probability values of the target work order corresponding to each mechanism level; determining a target mechanism level corresponding to the target work order according to the probability values and a first preset threshold; and determining a target mechanism corresponding to the target work order based on the target mechanism level, and assigning the target work order to the target mechanism, so that the problem of low accuracy of assigning the work order in the related technology is solved. By combining a full connection layer and a normalization index function of a neural network, calculating to obtain a probability value of a target work order corresponding to each mechanism level based on the acquired potential characteristics of a plurality of words in the work order to be processed, determining the mechanism level corresponding to the work order to be processed according to the calculated probability value and a preset threshold value, determining the mechanism corresponding to the work order to be processed according to the mechanism level corresponding to the work order to be processed, and assigning the work order to be processed to the mechanism, thereby achieving the effect of improving the accuracy of assigning the work order.
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system such as a set of computer-executable instructions and that, although a logical order is illustrated in the flowcharts, in some cases, the steps illustrated or described may be performed in an order different than here.
The embodiment of the present application further provides a work order dispatching device, and it should be noted that the work order dispatching device in the embodiment of the present application can be used for executing the work order dispatching method provided in the embodiment of the present application. The following describes a work order dispatching device provided in an embodiment of the present application.
FIG. 12 is a schematic diagram of a work order dispatching device according to an embodiment of the present application. As shown in fig. 12, the apparatus includes: a first acquisition unit 1201, a first calculation unit 1202, a first determination unit 1203, and a first processing unit 1204.
Specifically, the first obtaining unit 1201 is configured to obtain first potential features of a plurality of target words in a target work order, where the target work order is a work order to be assigned, and the target words are words in the target work order except stop words;
the first calculating unit 1202 is configured to calculate, based on the first potential feature, a plurality of probability values by combining a full connection layer of the neural network and the normalized index function, where the probability values are probability values of the target work order corresponding to each mechanism level;
a first determining unit 1203, configured to determine a target mechanism level corresponding to the target work order according to the multiple probability values and a first preset threshold;
the first processing unit 1204 is configured to determine a target facility corresponding to the target work order based on the target facility hierarchy, and assign the target work order to the target facility.
In summary, according to the dispatching device for the work order provided by the embodiment of the present application, the first obtaining unit 1201 obtains the first potential features of a plurality of target words in the target work order, where the target work order is the work order to be dispatched, and the target words are words in the target work order except stop words; the first calculating unit 1202 calculates and obtains a plurality of probability values based on the first potential features by combining the full connection layer and the normalization index function of the neural network, wherein the probability values are probability values of each mechanism level corresponding to the target work order; the first determining unit 1203 determines a target mechanism level corresponding to the target work order according to the multiple probability values and a first preset threshold; the first processing unit 1204 determines a target organization corresponding to the target work order based on the target organization hierarchy, and assigns the target work order to the target organization, thereby solving the problem of low accuracy of assigning work orders in the related art. By combining a full connection layer and a normalization index function of a neural network, calculating to obtain a probability value of a target work order corresponding to each mechanism level based on the acquired potential characteristics of a plurality of words in the work order to be processed, determining the mechanism level corresponding to the work order to be processed according to the calculated probability value and a preset threshold value, determining the mechanism corresponding to the work order to be processed according to the mechanism level corresponding to the work order to be processed, and assigning the work order to be processed to the mechanism, thereby achieving the effect of improving the accuracy of assigning the work order.
Optionally, in the assignment device for work orders provided in this embodiment of the present application, the first determining unit includes: a first determining module for determining the highest two of the plurality of probability values; the first calculation module is used for subtracting the two maximum probability values in the probability values to obtain a target value; the first judgment module is used for judging whether the target value is larger than a first preset threshold value or not; the second determining module is used for determining the maximum probability value in the probability values if the target value is greater than a first preset threshold value, and taking the mechanism level corresponding to the maximum probability value as the target mechanism level; the first obtaining module is used for obtaining second potential characteristics of a plurality of target words in the target work order if the target value is not larger than a first preset threshold value; and the third determining module is used for determining a target mechanism level corresponding to the target work order based on the second potential characteristic by combining the full connection layer and the normalized exponential function of the neural network.
Optionally, in the work order dispatching device provided in an embodiment of the present application, the first obtaining unit includes: a first filtering module, configured to filter target characters in the target work order to obtain a first work order, where the target characters are at least one of the following: special characters and useless characters; a second filtering module, configured to filter stop words in the first work order based on a stop word list to obtain a second work order; a first processing module, configured to perform word segmentation processing on each word in the second work order based on a first word dictionary to obtain a plurality of word vectors; a second processing module, configured to process each word vector to obtain a word embedding matrix; and a third processing module, configured to input the word embedding matrix into an ELMO model for processing to obtain the first potential features of the plurality of target words in the target work order.
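The filtering and dictionary-lookup stages of the first obtaining unit might look like the following sketch; the stop-word set, the first word dictionary, and the regular expression standing in for "special and useless characters" are all placeholder assumptions:

```python
import re

STOP_WORDS = {"the", "a", "of"}                            # assumed stop-word list
FIRST_WORD_DICT = {"printer": 0, "branch": 1, "error": 2}  # assumed first word dictionary

def preprocess(work_order):
    # Filter special/useless characters: keep only word characters and whitespace.
    cleaned = re.sub(r"[^\w\s]", " ", work_order)
    # Filter stop words.
    tokens = [t for t in cleaned.lower().split() if t not in STOP_WORDS]
    # Map each remaining word to its index in the first word dictionary,
    # ready for an embedding lookup.
    return [FIRST_WORD_DICT[t] for t in tokens if t in FIRST_WORD_DICT]

print(preprocess("Printer error @ the branch!"))  # [0, 2, 1]
```

The resulting index sequence would then be turned into the word embedding matrix that feeds the ELMO model.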
Optionally, in the work order dispatching device provided in an embodiment of the present application, the first obtaining module includes: a first determining submodule, configured to obtain a weight matrix of each mechanism level based on the occurrences, in each second word dictionary, of each target word in the target work order and the TF-IDF matrix of each mechanism level, where each second word dictionary is the word dictionary corresponding to the historical work orders of one mechanism level; a first operation submodule, configured to perform a dot product operation between the weight matrix of each mechanism level and the word embedding matrix to obtain a weighting matrix of each mechanism level; a first construction submodule, configured to construct an image matrix from the weighting matrix of each mechanism level and the first potential features; and a second determining submodule, configured to perform convolution processing on the image matrix and obtain the second potential features of the plurality of target words in the target work order based on a maximum pooling method.
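One way to picture the weighting-and-pooling pipeline of this module, under assumed dimensions (5 words, embedding size 4, 3 mechanism levels) and random stand-in values for the TF-IDF scores, embeddings, and first potential features:

```python
import numpy as np

rng = np.random.default_rng(1)
n_words, dim, n_levels = 5, 4, 3

embed = rng.normal(size=(n_words, dim))       # word embedding matrix
tfidf = rng.random(size=(n_levels, n_words))  # per-level TF-IDF weight of each word

# Weight every word's embedding by its TF-IDF score for each level, yielding
# one weighting matrix per mechanism level.
weighted = np.stack([tfidf[k, :, None] * embed for k in range(n_levels)])

# Stack the per-level weighting matrices with the first potential features
# into a multi-channel "image", then reduce over the word axis by max pooling
# (a convolution stage would normally precede the pooling; omitted for brevity).
first_potential = rng.normal(size=(n_words, dim))
image = np.concatenate([weighted, first_potential[None]], axis=0)
second_potential = image.max(axis=1)          # one pooled vector per channel
print(second_potential.shape)                 # (4, 4)
```

This is only a shape-level sketch: it shows how level-specific TF-IDF weighting produces channels that can be pooled into a single second-feature representation.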
Optionally, in the work order dispatching device provided in this embodiment of the present application, the third processing module includes: a first processing submodule, configured to process the target work order by using a forward LSTM layer and a backward LSTM layer in the ELMO model, respectively, to obtain a third potential feature and a fourth potential feature; and a third determining submodule, configured to perform weighted summation on the word embedding matrix, the third potential feature and the fourth potential feature to obtain the first potential features of the plurality of target words in the target work order.
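The weighted summation of the embedding layer with the forward and backward LSTM outputs — the standard ELMo-style layer combination — can be sketched as below; the equal layer weights and constant inputs are purely illustrative:

```python
import numpy as np

def elmo_combine(embedding, forward_out, backward_out, weights):
    """Weighted sum of the three representation layers, ELMo-style."""
    w = np.array(weights, dtype=float)
    w = w / w.sum()                      # normalize the layer weights
    return w[0] * embedding + w[1] * forward_out + w[2] * backward_out

emb = np.ones((5, 4))                    # word embedding matrix (stand-in)
fwd = 2 * np.ones((5, 4))                # forward LSTM output (third potential feature)
bwd = 3 * np.ones((5, 4))                # backward LSTM output (fourth potential feature)
first_potential = elmo_combine(emb, fwd, bwd, [1.0, 1.0, 1.0])
print(first_potential[0, 0])             # ≈ 2.0, the average of 1, 2 and 3
```

With equal weights the combination reduces to an average; in practice the weights would be learned alongside the classifier.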
Optionally, the work order dispatching device provided in the embodiment of the present application further includes: a second obtaining unit, configured to obtain, before the weight matrix of each mechanism level is derived from the occurrences of each target word in each second word dictionary and the TF-IDF matrix of each mechanism level, a plurality of historical work orders from which special characters, useless characters and stop words have been filtered out; a first dividing unit, configured to divide each historical work order into its corresponding mechanism level; a first establishing unit, configured to perform word segmentation processing on the historical work orders of each mechanism level and establish a plurality of second word dictionaries; and a second determining unit, configured to obtain the TF-IDF matrix of each mechanism level from each second word dictionary by using the TF-IDF formula.
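A minimal per-level TF-IDF computation over segmented historical work orders might look like this; the tf * log(N / df) formula is the common textbook form, and the example documents are invented:

```python
import math
from collections import Counter

def tfidf_scores(level_docs):
    """level_docs: list of token lists, one per historical work order of a level.
    Returns {word: tf-idf} using tf * log(N / df)."""
    n_docs = len(level_docs)
    df = Counter(w for doc in level_docs for w in set(doc))  # document frequency
    tf = Counter(w for doc in level_docs for w in doc)       # raw term counts
    total = sum(tf.values())
    return {w: (tf[w] / total) * math.log(n_docs / df[w]) for w in tf}

docs = [["printer", "jam"], ["printer", "offline"], ["network", "offline"]]
scores = tfidf_scores(docs)
print(scores["jam"] > scores["printer"])  # True: the rarer word scores higher
```

Running this once per mechanism level over that level's historical work orders yields the per-level TF-IDF matrices the weighting step relies on.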
Optionally, in the dispatch device for a work order provided in this embodiment of the present application, the first processing unit includes: the first extraction module is used for extracting key information containing a target mechanism level in a target work order; the fourth determining module is used for performing text similarity matching on the key information and the mechanism tree of the target mechanism level to obtain similarity; the second judgment module is used for judging whether the similarity is greater than a second preset threshold value or not; the fifth determining module is used for determining a target mechanism corresponding to the target work order from the mechanism tree if the similarity is greater than a second preset threshold; and the sixth determining module is used for taking the mechanism with the highest level as the target mechanism corresponding to the target work order if the similarity is not greater than the second preset threshold.
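The similarity-matching fallback of the first processing unit can be sketched with a simple token-overlap (Jaccard) measure standing in for whatever text-similarity metric the application actually uses; the mechanism-tree node strings, threshold, and fallback mechanism name are invented:

```python
def jaccard(a, b):
    """Token-overlap similarity between two short texts."""
    sa, sb = set(a.split()), set(b.split())
    return len(sa & sb) / len(sa | sb)

def match_mechanism(key_info, tree_nodes, threshold, top_mechanism):
    """Pick the mechanism-tree node most similar to the key information;
    fall back to the highest-level mechanism when no node clears the threshold."""
    best = max(tree_nodes, key=lambda n: jaccard(key_info, n))
    if jaccard(key_info, best) > threshold:
        return best
    return top_mechanism

nodes = ["east branch office", "west branch office", "data center"]
print(match_mechanism("east branch printer", nodes, 0.3, "head office"))
```

When the extracted key information matches no node well, routing to the highest-level mechanism keeps the work order moving rather than leaving it unassigned.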
The work order dispatching device includes a processor and a memory, the first acquiring unit 1201, the first calculating unit 1202, the first determining unit 1203, the first processing unit 1204, and the like are stored in the memory as program units, and the processor executes the program units stored in the memory to implement corresponding functions.
The processor comprises a kernel, and the kernel calls the corresponding program unit from the memory. One or more kernels may be provided, and the accuracy of dispatching work orders is improved by adjusting the kernel parameters.
The memory may include volatile memory in a computer-readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM); the memory includes at least one memory chip.
An embodiment of the invention provides a processor configured to run a program, where the program, when running, executes the above work order dispatching method.
As shown in fig. 13, an embodiment of the present invention provides an electronic device, where the device includes a processor, a memory, and a program stored in the memory and executable on the processor, and the processor executes the program to implement the following steps: acquiring first potential features of a plurality of target words in a target work order, where the target work order is a work order to be dispatched and the target words are the words in the target work order other than stop words; calculating a plurality of probability values based on the first potential features by combining a fully connected layer of a neural network with a normalized exponential function, where the probability values are the probability values of the target work order corresponding to each mechanism level; determining a target mechanism level corresponding to the target work order according to the plurality of probability values and a first preset threshold; and determining a target mechanism corresponding to the target work order based on the target mechanism level, and dispatching the target work order to the target mechanism.
The processor, when executing the program, further implements the following steps: determining the target mechanism level corresponding to the target work order according to the plurality of probability values and the first preset threshold includes: determining the two highest of the plurality of probability values; subtracting the second-highest probability value from the highest probability value to obtain a target value; judging whether the target value is greater than the first preset threshold; if the target value is greater than the first preset threshold, taking the mechanism level corresponding to the highest probability value as the target mechanism level; if the target value is not greater than the first preset threshold, obtaining second potential features of the plurality of target words in the target work order; and determining the target mechanism level corresponding to the target work order based on the second potential features by combining the fully connected layer of the neural network and the normalized exponential function.
The processor, when executing the program, further implements the following steps: obtaining first potential features of a plurality of target words in a target work order includes: filtering target characters in the target work order to obtain a first work order, where the target characters are at least one of the following: special characters and useless characters; filtering stop words in the first work order based on a stop word list to obtain a second work order; performing word segmentation processing on each word in the second work order based on a first word dictionary to obtain a plurality of word vectors; processing each word vector to obtain a word embedding matrix; and inputting the word embedding matrix into an ELMO model for processing to obtain the first potential features of the plurality of target words in the target work order.
The processor, when executing the program, further implements the following steps: obtaining second potential features of the plurality of target words in the target work order includes: obtaining a weight matrix of each mechanism level based on the occurrences, in each second word dictionary, of each target word in the target work order and the TF-IDF matrix of each mechanism level, where each second word dictionary is the word dictionary corresponding to the historical work orders of one mechanism level; performing a dot product operation between the weight matrix of each mechanism level and the word embedding matrix to obtain a weighting matrix of each mechanism level; constructing an image matrix from the weighting matrix of each mechanism level and the first potential features; and performing convolution processing on the image matrix and obtaining the second potential features of the plurality of target words in the target work order based on a maximum pooling method.
The processor, when executing the program, further implements the following steps: inputting the word embedding matrix into the ELMO model for processing to obtain the first potential features of the plurality of target words in the target work order includes: processing the target work order by using a forward LSTM layer and a backward LSTM layer in the ELMO model, respectively, to obtain a third potential feature and a fourth potential feature; and performing weighted summation on the word embedding matrix, the third potential feature and the fourth potential feature to obtain the first potential features of the plurality of target words in the target work order.
The processor, when executing the program, further implements the following steps: before the weight matrix of each mechanism level is obtained based on the occurrences of each target word in the target work order in each second word dictionary and the TF-IDF matrix of each mechanism level, the method further includes: acquiring a plurality of historical work orders from which special characters, useless characters and stop words have been filtered out; dividing each historical work order into its corresponding mechanism level; performing word segmentation processing on the historical work orders of each mechanism level and establishing a plurality of second word dictionaries; and obtaining the TF-IDF matrix of each mechanism level from each second word dictionary by using the TF-IDF formula.
The processor, when executing the program, further implements the following steps: determining the target mechanism corresponding to the target work order based on the target mechanism level includes: extracting key information containing the target mechanism level from the target work order; performing text similarity matching between the key information and the mechanism tree of the target mechanism level to obtain a similarity; judging whether the similarity is greater than a second preset threshold; if the similarity is greater than the second preset threshold, determining the target mechanism corresponding to the target work order from the mechanism tree; and if the similarity is not greater than the second preset threshold, taking the mechanism with the highest level as the target mechanism corresponding to the target work order.
The device herein may be a server, a PC, a tablet (PAD), a mobile phone, or the like.
The present application further provides a computer program product which, when executed on a data processing device, is adapted to execute a program initialized with the following method steps: acquiring first potential features of a plurality of target words in a target work order, where the target work order is a work order to be dispatched and the target words are the words in the target work order other than stop words; calculating a plurality of probability values based on the first potential features by combining a fully connected layer of a neural network with a normalized exponential function, where the probability values are the probability values of the target work order corresponding to each mechanism level; determining a target mechanism level corresponding to the target work order according to the plurality of probability values and a first preset threshold; and determining a target mechanism corresponding to the target work order based on the target mechanism level, and dispatching the target work order to the target mechanism.
When executed on a data processing device, the computer program product is further adapted to execute a program initialized with the following method steps: determining the target mechanism level corresponding to the target work order according to the plurality of probability values and the first preset threshold includes: determining the two highest of the plurality of probability values; subtracting the second-highest probability value from the highest probability value to obtain a target value; judging whether the target value is greater than the first preset threshold; if the target value is greater than the first preset threshold, taking the mechanism level corresponding to the highest probability value as the target mechanism level; if the target value is not greater than the first preset threshold, obtaining second potential features of the plurality of target words in the target work order; and determining the target mechanism level corresponding to the target work order based on the second potential features by combining the fully connected layer of the neural network and the normalized exponential function.
When executed on a data processing device, the computer program product is further adapted to execute a program initialized with the following method steps: obtaining first potential features of a plurality of target words in the target work order includes: filtering target characters in the target work order to obtain a first work order, where the target characters are at least one of the following: special characters and useless characters; filtering stop words in the first work order based on a stop word list to obtain a second work order; performing word segmentation processing on each word in the second work order based on a first word dictionary to obtain a plurality of word vectors; processing each word vector to obtain a word embedding matrix; and inputting the word embedding matrix into an ELMO model for processing to obtain the first potential features of the plurality of target words in the target work order.
When executed on a data processing device, the computer program product is further adapted to execute a program initialized with the following method steps: obtaining second potential features of the plurality of target words in the target work order includes: obtaining a weight matrix of each mechanism level based on the occurrences, in each second word dictionary, of each target word in the target work order and the TF-IDF matrix of each mechanism level, where each second word dictionary is the word dictionary corresponding to the historical work orders of one mechanism level; performing a dot product operation between the weight matrix of each mechanism level and the word embedding matrix to obtain a weighting matrix of each mechanism level; constructing an image matrix from the weighting matrix of each mechanism level and the first potential features; and performing convolution processing on the image matrix and obtaining the second potential features of the plurality of target words in the target work order based on a maximum pooling method.
When executed on a data processing device, the computer program product is further adapted to execute a program initialized with the following method steps: inputting the word embedding matrix into the ELMO model for processing to obtain the first potential features of the plurality of target words in the target work order includes: processing the target work order by using a forward LSTM layer and a backward LSTM layer in the ELMO model, respectively, to obtain a third potential feature and a fourth potential feature; and performing weighted summation on the word embedding matrix, the third potential feature and the fourth potential feature to obtain the first potential features of the plurality of target words in the target work order.
When executed on a data processing device, the computer program product is further adapted to execute a program initialized with the following method steps: before the weight matrix of each mechanism level is obtained based on the occurrences of each target word in the target work order in each second word dictionary and the TF-IDF matrix of each mechanism level, the method further includes: acquiring a plurality of historical work orders from which special characters, useless characters and stop words have been filtered out; dividing each historical work order into its corresponding mechanism level; performing word segmentation processing on the historical work orders of each mechanism level and establishing a plurality of second word dictionaries; and obtaining the TF-IDF matrix of each mechanism level from each second word dictionary by using the TF-IDF formula.
When executed on a data processing device, the computer program product is further adapted to execute a program initialized with the following method steps: determining the target mechanism corresponding to the target work order based on the target mechanism level includes: extracting key information containing the target mechanism level from the target work order; performing text similarity matching between the key information and the mechanism tree of the target mechanism level to obtain a similarity; judging whether the similarity is greater than a second preset threshold; if the similarity is greater than the second preset threshold, determining the target mechanism corresponding to the target work order from the mechanism tree; and if the similarity is not greater than the second preset threshold, taking the mechanism with the highest level as the target mechanism corresponding to the target work order.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and so forth) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer-readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, Phase-Change Memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of additional like elements in the process, method, article, or apparatus that comprises the element.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement or the like made within the spirit and principle of the present application shall be included in the scope of the claims of the present application.

Claims (10)

1. A work order assignment method, comprising:
acquiring first potential characteristics of a plurality of target words in a target work order, wherein the target work order is a work order to be dispatched, and the target words are words except stop words in the target work order;
calculating a plurality of probability values based on the first potential features by combining a fully connected layer of a neural network with a normalized exponential function, wherein the probability values are the probability values of the target work order corresponding to each mechanism level;
determining a target mechanism level corresponding to the target work order according to the probability values and a first preset threshold value;
and determining a target mechanism corresponding to the target work order based on the target mechanism level, and distributing the target work order to the target mechanism.
2. The method of claim 1, wherein determining the target facility hierarchy corresponding to the target work order according to the plurality of probability values and a first preset threshold comprises:
determining the two highest of the plurality of probability values;
subtracting the second-highest probability value from the highest probability value to obtain a target value;
judging whether the target value is greater than the first preset threshold;
if the target value is greater than the first preset threshold, taking the mechanism level corresponding to the highest probability value as the target mechanism level;
if the target value is not greater than the first preset threshold, obtaining second potential features of a plurality of target words in the target work order;
and determining a target mechanism level corresponding to the target work order based on the second potential feature by combining the fully-connected layer of the neural network and the normalized exponential function.
3. The method of claim 1, wherein obtaining a first potential characterization of a plurality of target terms in a target work order comprises:
filtering target characters in the target work order to obtain a first work order, wherein the target characters are at least one of the following characters: special characters and useless characters;
filtering stop words in the first work order based on a stop word list to obtain a second work order;
based on a first word dictionary, performing word segmentation processing on each word in the second work order to obtain a plurality of word vectors;
processing each word vector to obtain a word embedding matrix;
and inputting the word embedding matrix into an ELMO model for processing to obtain first potential features of a plurality of target words in the target work order.
4. The method of claim 3, wherein obtaining second potential features of the plurality of target terms in the target work order comprises:
obtaining a weight matrix of each mechanism level based on the occurrence condition of each target word in each second word dictionary in the target work order and the TF-IDF matrix of each mechanism level, wherein the second word dictionary is a word dictionary corresponding to the historical work order of each mechanism level;
performing dot product operation on the weight matrix of each mechanism level and the word embedding matrix respectively to obtain a weighting matrix of each mechanism level;
constructing an image matrix according to the weighting matrix of each mechanism level and the first potential characteristics;
and performing convolution processing on the image matrix, and obtaining second potential features of a plurality of target words in the target work order based on a maximum pooling method.
5. The method of claim 3, wherein inputting the word embedding matrix into an ELMO model for processing to obtain a first potential feature of a plurality of target words in the target work order comprises:
processing the target work order by using a forward LSTM layer and a backward LSTM layer in the ELMO model, respectively, to obtain a third potential feature and a fourth potential feature;
and carrying out weighted summation on the word embedding matrix, the third potential feature and the fourth potential feature to obtain first potential features of a plurality of target words in the target work order.
6. The method of claim 4, wherein prior to deriving the weight matrix for each organizational level based on occurrences of each target term in the target work order in each second word dictionary and the TF-IDF matrix for each organizational level, the method further comprises:
acquiring a plurality of historical work orders with special characters, useless characters and stop words filtered out;
dividing each historical work order into corresponding mechanism levels;
performing word segmentation processing on the historical work orders of each mechanism level, and establishing a plurality of second word dictionaries;
and obtaining a TF-IDF matrix of each mechanism level based on each second word dictionary by adopting a TF-IDF formula.
7. The method of claim 1, wherein determining, based on the target facility hierarchy, a target facility to which the target work order corresponds comprises:
extracting key information containing the target mechanism level in the target work order;
performing text similarity matching on the key information and the mechanism tree of the target mechanism level to obtain similarity;
judging whether the similarity is greater than a second preset threshold value or not;
if the similarity is greater than the second preset threshold, determining a target mechanism corresponding to the target work order from the mechanism tree;
and if the similarity is not greater than the second preset threshold, taking the mechanism with the highest level as the target mechanism corresponding to the target work order.
8. A work order dispatching device, comprising:
the system comprises a first obtaining unit, a first obtaining unit and a second obtaining unit, wherein the first obtaining unit is used for obtaining first potential characteristics of a plurality of target words in a target work order, the target work order is a work order to be dispatched, and the target words are words except stop words in the target work order;
the first calculation unit is used for calculating and obtaining a plurality of probability values based on the first potential features by combining a full connection layer and a normalized index function of a neural network, wherein the probability values are the probability values of the target work order corresponding to each mechanism level;
the first determining unit is used for determining a target mechanism level corresponding to the target work order according to the probability values and a first preset threshold;
and the first processing unit is used for determining a target mechanism corresponding to the target work order based on the target mechanism level and distributing the target work order to the target mechanism.
9. A processor configured to run a program, wherein the program when executed performs the method of dispatching a work order of any of claims 1 to 7.
10. An electronic device comprising one or more processors and memory storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of dispatching a work order of any of claims 1-7.
CN202211131146.0A 2022-09-16 2022-09-16 Work order dispatching method and device, processor and electronic equipment Pending CN115456421A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211131146.0A CN115456421A (en) 2022-09-16 2022-09-16 Work order dispatching method and device, processor and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211131146.0A CN115456421A (en) 2022-09-16 2022-09-16 Work order dispatching method and device, processor and electronic equipment

Publications (1)

Publication Number Publication Date
CN115456421A true CN115456421A (en) 2022-12-09

Family

ID=84304955

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211131146.0A Pending CN115456421A (en) 2022-09-16 2022-09-16 Work order dispatching method and device, processor and electronic equipment

Country Status (1)

Country Link
CN (1) CN115456421A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116051122A (en) * 2022-12-30 2023-05-02 速达非物流(深圳)有限公司 Work order software system based on logistics scene
CN116051122B (en) * 2022-12-30 2024-03-29 速达非物流(深圳)有限公司 Work order software system based on logistics scene
CN118095794A (en) * 2024-04-23 2024-05-28 国网辽宁省电力有限公司丹东供电公司 Work order information extraction method and system based on regular algorithm

Similar Documents

Publication Publication Date Title
CN106991085B (en) Entity abbreviation generation method and device
CN110598206A (en) Text semantic recognition method and device, computer equipment and storage medium
CN115456421A (en) Work order dispatching method and device, processor and electronic equipment
CN110458324B (en) Method and device for calculating risk probability and computer equipment
CN107844533A (en) A kind of intelligent Answer System and analysis method
CN111339249B (en) Deep intelligent text matching method and device combining multi-angle features
CN109598517A (en) Commodity clearance processing, the processing of object and its class prediction method and apparatus
CN112732871A (en) Multi-label classification method for acquiring client intention label by robot
US10824808B2 (en) Robust key value extraction
CN116720515A (en) Sensitive word auditing method based on large language model, storage medium and electronic equipment
Estevez-Velarde et al. AutoML strategy based on grammatical evolution: A case study about knowledge discovery from text
CN112256863A (en) Method and device for determining corpus intentions and electronic equipment
CN112579781B (en) Text classification method, device, electronic equipment and medium
CN113157757A (en) Data recommendation method and device, electronic equipment and storage medium
CN113761192A (en) Text processing method, text processing device and text processing equipment
CN112445914A (en) Text classification method, device, computer equipment and medium
CN111967253A (en) Entity disambiguation method and device, computer equipment and storage medium
CN116701752A (en) News recommendation method and device based on artificial intelligence, electronic equipment and medium
CN113792131B (en) Keyword extraction method and device, electronic equipment and storage medium
CN112270189B (en) Question type analysis node generation method, system and storage medium
CN113987536A (en) Method and device for determining security level of field in data table, electronic equipment and medium
CN114398482A (en) Dictionary construction method and device, electronic equipment and storage medium
CN113408263A (en) Criminal period prediction method and device, storage medium and electronic device
CN112819205B (en) Method, device and system for predicting working hours
CN113792163B (en) Multimedia recommendation method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination